The issue is not A.I. technology itself but its application

Police departments and government agencies around the world have been deploying facial recognition technologies to identify potential criminals.

Kristian J. Hammond, a Northwestern University artificial intelligence (A.I.) researcher, is an expert on the ethics of A.I.

Hammond is the Bill and Cathy Osborn Professor of Computer Science in Northwestern’s McCormick School of Engineering and director of Northwestern’s Master of Science in Artificial Intelligence program. He also is co-founder of Narrative Science, a startup that uses A.I. and journalism to turn raw data into natural language.

Hammond said:

“As new technologies are developed that allow us to scale machine intelligence, it is important that we give thought to how those technologies are applied and where they may be powerful tools for social good and where they might threaten basic human rights and dignity. The most recent instance of this is the rise of facial recognition technologies.”

“On one hand, facial recognition can be applied to law enforcement in the form of instantaneous identification of possible perpetrators based on a photo and a corpus of labeled faces, freeing staff to work on the human side of policing. On the other hand, we have already seen it used outside the United States as a tool for oppression. While the former could streamline an existing process, the latter brings up fears of privacy invasion at scale.”

“Concerns about privacy (and issues of bias and error) have led to an almost reflexive call to ban the technology altogether.”

“The issue, however, is not the technology itself but its application. We contend that the conversation concerning the regulation of this technology — as well as other instances of A.I. and machine learning — should be at the level of exactly how they are applied and how we can develop guidelines and regulations aimed at blocking those applications that violate basic human rights while allowing them to be used in those cases where they serve the social good.”

“We need to be able to use this technology at scale in cases such as the recovery of missing children while making sure that it is not used to create a surveillance state. The only way to do this is to support conversations that include both legal and policy thinkers and technologists to explore the length and breadth of the application of these technologies as they arise.”
