"Algorithms reproduce the patterns they see in society, and if these patterns are unjust, racist or sexist, they will make unfair, racist or sexist decisions"

Lisa Herzog, philosopher at the Technical University of Munich, visited the Institute for Culture and Society to give a seminar within the Religion and Civil Society project

Photo: Lisa Herzog (courtesy image)
05/12/18 19:08 Elena Beltrán

"Algorithms reproduce the patterns they see in society, and if these patterns are unjust, racist or sexist, they will make unfair, racist or sexist decisions," according to Lisa Herzog, philosopher at the Technical University of Munich, during a seminar organized by the Religion and Civil Society project at the Institute for Culture and Society. In today's society, there is a tendency to entrust decisions to computers, the expert warns, "Because we have aprejudice that they are better."

However, she maintains that it is not a good idea to delegate important decisions to them. "In some states of the United States, algorithms are used to assess the probability of granting probation," Herzog explained. The program takes into account data such as age, occupation and where the subject lives, although it is not known exactly how it weighs them, because "the data is from private companies."

This resulted in a "racist" program because the algorithms found a correlation between certain characteristics associated with black people and certain crimes. "Assuming you can predict something based on past events, as if human beings were not able to change, is problematic," the philosopher stressed.

Another example she mentioned is Google's algorithm, which turned out to be sexist because it offered higher-paying jobs to men. "Since there are more men in positions of power, the algorithm took that as a pattern," which ends up reinforcing the situation. "Programs continue with the pattern they detect, regardless of whether it is unfair."

According to Lisa Herzog, Hannah Arendt's philosophy diverges from the way programs based on algorithms make decisions. "Arendt is very interesting because she focuses a lot on individuality; she does not believe that we are just data." As Arendt sees people, they can always start over, make different decisions or reinvent themselves. Digital programs, by contrast, tend to be based on the past.

What can computers do?

The expert believes that computers can be trusted with decisions in which all the relevant knowledge is present, all the facts are known and nobody wants to make the decision. She suggests these be more trivial tasks, such as a robot that cleans the floor or cooks, where the decisions and criteria involved are not problematic.

However, she also argues that it is fine to consult these programs for important decisions. "I'm not saying they do not have a place, but a human being should make the final decision," she says. Moreover, when people make these decisions themselves, they develop skills for reaching agreements and living in society. "By making decisions, we learn to see others as equals, which is very important in democratic societies," she noted.

"Human beings have a tendency to trust computers; it is easier to accept a computer’s solution and not have to think of one for yourself," Herzog lamented. Some people consider computers "more efficient," but the philosopher believes that "some values ​​are too important to be replaced by efficiency."
