Have you ever wondered why so many companies give their robots, virtual assistants and artificial intelligence feminine features? Have you ever wondered whether, if there were more women behind the algorithms, the AI would be different, not only vocally and aesthetically but functionally?
With Alexa, Cortana, Siri, Google Home and most GPS systems defaulting to female voices, there has been much speculation about the ubiquity of the demure, feminine tones emitted by modern machines.
One reason? Research has shown that both men and women have a greater affinity for female voices, which led large technology companies like Amazon to opt for "Alexa" over "Alexander." Still, it is impossible to ignore the gender implications at stake, even if consumer demand drives the decisions. Why is the female voice pleasing in this context?
Industry experts such as Jason Mars have admitted that a female AI voice plays on familiar stereotypes of women as helpful, servile and non-threatening in order to appeal to the masses. This takes on a more sinister light when you dig a little deeper. "In coding artificial intelligence and intelligent machines as women," an article on [Villa] states, "we are reinforcing our own sexism and misogyny, even toward real, human women."
It does not help one bit that companies have programmed these digital personas to be demure when they are sexually harassed. Notably, Amazon's Alexa used to respond "thanks for the feedback" when met with derogatory comments, hardly a precursor to the empowerment of robots or women.
AI itself has learned sexist behavior by amplifying the biases found in its training data. In one case, a skew in a photo dataset, with women pictured in kitchens and men pictured outdoors, was not merely reflected by the AI but amplified: the model began mislabeling men as women simply because they appeared near kitchen supplies.
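The amplification described above, where a model's output is more skewed than the data it learned from, can be illustrated with a toy calculation. The counts and the `bias_score` helper below are invented for demonstration and are not taken from the study:

```python
def bias_score(counts):
    """Fraction of kitchen images labeled as containing a woman."""
    return counts["woman"] / (counts["woman"] + counts["man"])

# Hypothetical training data: 66% of kitchen photos show women.
training = {"woman": 66, "man": 34}

# Hypothetical model predictions on new photos: the skew grows to 84%,
# because the model learned "kitchen implies woman" as a shortcut.
predictions = {"woman": 84, "man": 16}

train_bias = bias_score(training)
pred_bias = bias_score(predictions)

print(f"bias in training data:  {train_bias:.2f}")
print(f"bias in predictions:    {pred_bias:.2f}")
print(f"amplification:          {pred_bias - train_bias:+.2f}")
```

A model that merely mirrored its data would score the same on both counts; the positive gap is what researchers call bias amplification.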
A key factor in this conversation is that AI is still mostly created by men, and white men in particular. Machines necessarily assume the values and functions dictated by their creators, whether those values are intentional or not. Gather a group of men and ask them to brainstorm AI applications, and you will get completely different results than you would from a group of women. It is not because their talents differ, but because their lived experiences, priorities and worldviews do.
The gender gap in AI is striking, but not surprising given the scarcity of women in computer science and STEM fields. According to some estimates, only 13.5 percent of the machine learning field is female, while 18 percent of software developers and 21 percent of computer programmers identify as women.
The solution is both obvious and elusive: we need a more intersectional approach to AI programming, driven by the recruitment of more female engineers. But cultural norms cannot be turned on their heads in one swift motion; we have to play the long game, encouraging women to get involved in this increasingly vital field before it is too late.
Heather Roff, an AI and global security researcher at Arizona State University, explained in an article for Foreign Policy how smart algorithms that replicate gender stereotypes work against society. If researchers codify a bias into an application, it is likely to further condition men and women to conform to traditional roles and buy all the products that go with them. Targeted ads already offer a stark preview, with certain algorithms perpetuating the wage gap by showing higher-paying job listings mainly to men. It does not take much imagination to see how AI could make this worse.
Perhaps Fei-Fei Li, the director of the Stanford Artificial Intelligence Laboratory and the Stanford Vision Laboratory, best expresses the risks. "If we do not put women and people of color at the table, true technologists doing the real work, we will have biases in the systems," she said. "Trying to reverse that one or two decades from now will be much more difficult, if not almost impossible. This is the time to bring in women and diverse voices so that we can build it properly."
Fortunately, Li and other women are putting in the work now to see the payoff sooner rather than later. Along with Melinda Gates, Li founded AI4All, a nonprofit organization that works to create pipelines for underrepresented talent through education, mentoring and early exposure to AI's potential for social good.
Angelica Lim, a professor at SFU, launched a weekly enrichment program for girls through AI4All called "Invent the Future." In SFU's Robots with Social Intelligence and Empathy (ROSIE) laboratory, she is developing sympathetic humanoid robots that can "not only have conversations with humans but also have the capacity for compassion."
Other prominent women in AI, long treated as tokens in their field, have organized regular conventions to realize their strength in numbers and work toward common goals. Women in AI is a group of international experts in AI and machine learning who are also women. Its mission is to help improve diversity and close the gender gap in AI, while helping companies and events recruit more women experts in the field.
Assuming these programs succeed, what would change? What would a more equitable world behind artificial intelligence algorithms look like, compared with what we are seeing now?
According to Kriti Sharma, vice president of AI and bots at Sage, "The biggest obstacle on the road to making AI a transformative revolution and one that improves productivity for all is the danger of building machines that do not represent all of the human race." Her company created a dedicated code of ethics for artificial intelligence to guide companies working with AI.
If the people behind AI represented all genders and races, the technology would be infinitely more likely to serve us all equally as we move into an unknown future. This will be especially important as AI reaches deeper into fields such as health care, government and education, impactful and formative spaces with gender problems of their own.
Solutions to patriarchy-induced struggles for equal pay, rights and opportunities may also be on the horizon, if AI stops propping up capitalist activity at the expense of equality.
All in all, the gender of a personal assistant's voice may be the least of our concerns, but the kind of thinking that casts women as the product and men as the builders is the kind that needs to be fixed. When women gain representation in this field, the emerging cycle of amplified bias in AI will not only be short-circuited but reversed.
Debrah Lee Charatan is a co-founder, director and president of BCB Property Management.