
Timnit Gebru: “It's not just Facebook, social networks spread hatred with impunity”

Timnit Gebru's dismissal from Google made headlines around the world a year ago. The Ethiopian engineer (Addis Ababa, 38 years old), co-director of the multinational's Artificial Intelligence (AI) Ethics team, was informed by email that her request to leave the company had been accepted. She says she never made such a request; what she did do was complain to colleagues that the company wanted to censor a then-unpublished scientific article of hers describing how algorithmic language-processing models, used to autocomplete searches or hold conversations with computer programs, are heavily biased and harm minorities.

Gebru announced last week, exactly one year after the scandal, the launch of her own research center, the Distributed AI Research Institute (DAIR). Funded with 3.8 million dollars (3.3 million euros) from various foundations, including Rockefeller, MacArthur and the Open Society, it will be dedicated to studying the harm that technology causes to the groups it most marginalizes. To that end, she has assembled a multidisciplinary team in which computer scientists and specialists in data science and computer vision work alongside sociologists, social workers and other humanities profiles. “I have seen how academia and industry work, and both have incentive structures that do not align with what I want to do,” she explains to Newsfresh by video call. “My idea is to focus the questions and research in directions that can be useful to people who are often ignored by mainstream research institutions.”

The echoes of Gebru's firing still resonate through the industry, and specifically at Google, whose commitment to its stated ethical principles has been questioned ever since. Getting rid of a well-known scientist like her, at a company whose workforce is only 1.6% Black women and where white men occupy almost 50% of management positions, caused an internal outcry. Doing so, moreover, as an apparent punishment for pointing out flawed practices in a sector as scrutinized as technology drew attention from all sides. Several US congressmen and senators demanded that the company explain the firing and reveal the full content of Gebru's report. More than 2,500 Google employees and 4,000 academics from around the world signed a letter in support of the scientist. Sundar Pichai, CEO of Alphabet, Google's parent company, had to send an email to employees apologizing for the dismissal and promising that the process would be reviewed.

And, in fact, it was reviewed, as Gebru herself confirms. “A few months after I was fired, someone from Google contacted me saying that they were investigating the dismissal and asking me to answer some questions,” she says. “Of course, I said no,” she adds with a laugh. “I know how these things go: they wanted to find something to hold onto, something I had done that could justify it. After being so disrespectful to me, I didn't want anything to do with them.”

Engineers Timnit Gebru, Rediet Abebe, Joy Buolamwini and Alicia Chong Rodriguez pose at a 2018 Bloomberg party in New York. Noam Galai (Getty Images for Bloomberg Business)

Perhaps the most striking thing about the Gebru case is that it was foreseeable that the scientist's work would not be to Google's liking. The company hired her as a prestigious researcher. She earned a doctorate in computer vision at Stanford University under the direction of the renowned professor Fei-Fei Li, and rose to fame in academia after publishing in 2018, with her colleague Joy Buolamwini of MIT, a now-classic article demonstrating that the error rate of facial recognition systems is 1% for white men but worsens for women and Black people. For Black women, these systems are wrong 35% of the time. The stir caused by the study was such that some companies, such as Amazon and IBM, revised their algorithms to try to correct this defect.

Google executives wanted her to lead the algorithmic ethics area. And when she got to work, they didn't like what they saw. “The interesting thing about it is that I did not do anything groundbreaking; I really think the article would not have received much public attention if I had not been fired. What my case shows is how little stomach Google has for accepting any kind of criticism,” she emphasizes.

She thinks they used her. “That's what they have us for. When Congress and the Senate asked about my firing and the article, Jeff Dean said the company cares a lot about ethics, that it has 500 articles written on the subject and several research groups working on ethics. That's how they use our work in front of regulators. ‘Why did you do these horrible things to Black people?’ ‘Oh no, we do lots of things for Black people! We even spend a lot of money on this research that makes us look good…’”

Gebru knew what she was getting into. She never intended to change Google from within, although she did think she could create a space in the company where people could express their opinions. It was not to be. “From the moment I was hired I wondered how long I would last at Google. Every month I would say to myself: wow, man, can I just keep doing this for another three months? Well, at the end of the term I'll be evaluated, but until then, keep working. And then I'd wonder again whether I could go on. I was like that for two years,” she recalls.

Apartheid and genocide

The scientist and her DAIR team already have two projects underway. The first aims to analyze the legacy of spatial apartheid in South Africa using computer vision techniques, a field of machine learning that teaches machines to understand images. Gebru and her team are studying how to make the database public and how to develop visualizations that people can interact with. “The good thing about being independent is that we can invest more resources in polishing the result before disseminating it, if we want to. In university research you are required to churn out papers for conferences; in corporate research you are subject to the will of whoever is in charge, as happened to me at Google. Now we can control the process from start to finish,” she says enthusiastically.

The researcher Timnit Gebru, in an image taken shortly after her dismissal from Google. Medium

The second project, at a more embryonic stage, aims to measure the impact of social networks in countries that the platforms themselves do not consider relevant. “In Ethiopia, where I was born and raised, at war since November of last year, the absence of content moderators has been particularly noticeable,” she laments. The spread of hate speech and hoaxes has had devastating effects, she says. Two months ago, Gebru and other colleagues reported to Facebook what they considered “a clear call for genocide.” The post was published in Amharic, one of the languages spoken in the African country. The company responded that the publication did not violate its rules, although it withdrew it after repeated complaints from journalists and after it had been shared and commented on by many users.

Although Mark Zuckerberg's company gets most of the blame, Gebru warns that other social networks are equally dangerous in spreading this type of content. “We tend to focus on Facebook when we talk about content moderation, but YouTube is a widely used channel too, and one that gets hardly any attention in this context. Clubhouse, TikTok, Telegram… All these networks are used to spread hate messages with impunity. I am trying to see what tangible things I could do in this field that would be of help,” she points out.

Gebru is aware that her institute is an ant compared to the technology giants. But she is not alone. The Algorithmic Justice League, founded by her friend Joy Buolamwini; the AI Now Institute, co-founded by Australia's Kate Crawford; and Data & Society, among others, work in the same direction, each in its own niche. Still, she believes that society's first line of defense is to promote strong labor protections at the companies that develop the technology. “It is imperative that workers can complain when they see something wrong without fear of losing their job. In the US, your healthcare is tied to your employer; your visa, if you are a foreigner, too. If they kick you out, you are left with nothing, and the government does not support you. If we don't change that, companies will continue to have the upper hand.”
