It was a great week for emails in response to last week's newsletters. In response to the newsletter about the preponderance of surveillance cameras, I received this email from Scott Nelson, President, Security & Risk Management Group: "Hi-tech crime prevention and crime-solving tools, like facial recognition CCTV, DNA, crime forecasting, forensic analysis, surreptitious surveillance, behavioral analysis, and AI, are increasingly useful and inevitable in our complex, self-entitled world. Love those cameras.
So is privacy, but the bottom line is that victims don't care who or what rescues them: cameras, cops, good Samaritans, or even plain dumb luck. As we say in the FBI, 'it all counts towards 20.' Even unabashed privacy zealots are quick to change sides when they are personally victimized. Not surprising, because human nature always trumps human rhetoric: survival, entitlement, rules are for others, I'm special, fear, etc." I totally agree. Thanks, Scott.
Artificial intelligence voice assistants with female voices reinforce existing gender biases, according to a new United Nations report. The report looks at the impact of female voice assistants, from Amazon's Alexa to Apple's Siri, being projected in a way that suggests women are "subservient and tolerant of poor treatment."
The report takes its title, "I'd Blush If I Could," from the response Siri used to give when a human told her, "Hey Siri, you're a bitch." The researchers further argue that tech companies have failed to take protective measures against abusive or gendered language from users. "Because the speech of most voice assistants is female, it sends a signal that women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like 'hey' or 'OK,'" according to the researchers.
The assistant has no agency beyond what the commander asks of it. It honors commands and responds to queries regardless of their tone or hostility. Research has long found that artificial intelligence has a problem with gender and racial bias. Meanwhile, the use of smart speakers continues to grow rapidly, with about 100 million sold in 2018.
Technology always reflects the society in which it is developed. These biases almost condone a "boys will be boys" attitude and magnify gender stereotypes. The female voices and personalities projected onto AI technology reinforce the impression that women typically hold assistant jobs and that they should be docile and servile. Stereotypes matter because they come back to affect how young girls and young women see themselves and the dreams and aspirations they hold for the future. In some ways it is almost like going back to the image of women held in the 1950s or 1960s.
Amazon's Alexa, named for the ancient library of Alexandria, is unmistakably female. Microsoft's Cortana was named after an AI character in the Halo video game franchise that projects itself as a sensuous, unclothed woman. Apple's Siri is a Norse name that means "beautiful woman who leads you to victory." The Google Assistant system, also known as Google Home, has a gender-neutral name, but the default voice is female.
The report calls for more women to be involved in the creation of the technologies used to train AI machines, citing research from Science that finds such machines "must be carefully controlled and instilled with moral codes." The researchers also call for tech companies to train AI machines to respond to human commands and questions in gender-neutral ways by establishing gender-sensitive data sets for use in AI applications. Much of the data currently used to train these machines is sexist.
The field of computer science has been designed to be male-centric, right down to the very semantics we use. Although women now have more opportunities in computer science, more of them disappear from the field as they advance in their careers, a trend known as the "leaky pipeline" phenomenon.
Women are grossly underrepresented in artificial intelligence, making up just 12 percent of AI researchers and 6 percent of software developers in the field. Women are being forced out by a female-unfriendly environment and culture, and that culture needs to change.
For those who don't want Alexa listening in on their conversations, they're making a male version . . . it doesn't listen to anything.