Post by account_disabled on Feb 18, 2024 6:02:21 GMT -5
It is a tool, not an end in itself. "It is very difficult for many people to understand that we are still very far from robotics. If a person has been the victim of a legal injustice and thinks that an algorithm can replace judges, they will believe the algorithm is less fallible than a magistrate." "What we have seen so far is that technical systems often combine the worst of human bias with the worst of technological bias." "There is candor in many technological discourses," says Gemma Galdón, founder of Eticas Research & Consulting. When it is said that an algorithm can embed discriminatory logic, we are talking about the process by which computer automation can leave a human being without a public social benefit. "The teams that build algorithms are mostly white men from the north.
It doesn't have to be a problem, but historically they have had a harder time seeing who is not there." Gemma is referring to the fact that for women, or for people "with a different ethnic background," it would be more natural for "a little light" to turn on, prompting a simple question when they create an algorithm: "And what happens to people like me?" There have been many examples in which the faces of Black people were underrepresented in the data used to train the neural networks that form part of an AI-based facial recognition system. The real threat of artificial intelligence is the people behind the algorithms, according to an engineer from Google's scientific department. "A team of white men trained in fields such as sociology or anthropology would not have that problem," Galdón adds. "Engineers codify reality," after all.
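The underrepresentation problem mentioned above can be illustrated with a minimal, entirely hypothetical sketch (the groups, features, and classifier here are invented for illustration, not taken from any real system): a simple threshold classifier tuned to minimize overall training error will happily sacrifice accuracy on a group that barely appears in the training set.

```python
import random

random.seed(0)

# Hypothetical 1-D "face feature" per group; real systems are far more complex.
def make_samples(center, n):
    return [random.gauss(center, 1.0) for _ in range(n)]

# Group A is heavily overrepresented in training; group B is nearly absent.
train_a = make_samples(0.0, 500)
train_b = make_samples(2.0, 10)

def train_errors(t):
    # Classifier: predict "A" if x < t, else "B". Count training mistakes.
    return sum(x >= t for x in train_a) + sum(x < t for x in train_b)

# Pick the threshold that minimizes OVERALL training error -- which the
# 500 majority samples dominate, pushing the boundary deep into B's region.
best_t = min(sorted(train_a + train_b), key=train_errors)

# Evaluate on balanced test sets: per-group accuracy diverges sharply.
test_a = make_samples(0.0, 1000)
test_b = make_samples(2.0, 1000)
acc_a = sum(x < best_t for x in test_a) / len(test_a)
acc_b = sum(x >= best_t for x in test_b) / len(test_b)
print(f"accuracy on majority group A: {acc_a:.2f}")
print(f"accuracy on minority group B: {acc_b:.2f}")
```

The overall error rate looks excellent, which is exactly why the disparity goes unnoticed unless someone thinks to measure accuracy per group, the "little light" Galdón describes.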
Despite these issues, Galdón insists there has not yet been enough "sociotechnical" training, so today there are few people in positions of responsibility capable of addressing these nuances. "The discourse of technological neutrality is still very widespread, and I think there is a lot of candor in some technological discourses. There is very little understanding of how technology really works." "Your product may be neutral, but it exists in a context that is not. Never." "Is a knife good or bad? It can be used to cut chains or to kill a person. But it is evident that the knife appears when human beings begin to consume meat: it is the reflection of a socioeconomic dynamic," she explains. "You can say that a knife is neutral, but you cannot deny that the society that shapes a knife is a society that needs a knife, so from that very moment neutrality is false."