Adherence to the declaration would prohibit researchers from working on robots that perform search-and-rescue operations, or in the new field of “social robotics.” One of Dr. Bethel’s research projects is developing technology that uses small humanlike robots to interview children who have been abused, sexually assaulted, trafficked, or otherwise traumatized. In one of her recent studies, 250 children and adolescents interviewed about bullying were often willing to confide information to a robot that they would not share with an adult.
Thus, having an investigator to “run”
“You have to understand the problem area before you can talk about robotics and police work,” she said. “They make a lot of generalizations without a lot of information.”
Dr. Crawford is among the signatories of both “No Justice, No Robots” and the Black in Computing open letter. “And you know, whenever something like this happens, or awareness is created, especially in the community in which I operate, I try to make sure I support it,” he said.
Dr. Jenkins declined to sign the “No Justice” statement. “I thought it was worth considering,” he said. “But in the end, I thought the bigger problem really was representation in the room – in the research lab, in the classroom, on the development team, in management.” Ethical discussions, he said, should be rooted in that first fundamental civil rights issue.
Dr. Howard has not signed any of the statements. She reiterated her point that biased algorithms are in part the result of the skewed demographic – white, male, able-bodied – that designs and tests the software.
“If we, as people with ethical values, are not working with these law enforcement agencies, then who is?” she said. “When you say ‘no,’ others will say ‘yes.’ It’s not good if there is no one in the room to say, ‘Um, I don’t think the robot should kill.’”