Woke AI So Scared of Offending You


In a previous video, I gave you a demonstration of Microsoft’s AI chatbot Copilot, which refused to draw people, other than white people, in a negative light. But at least it tried. For example, “Draw an American student who loves to study”. From the result, one would assume that only Asian and African Americans like to study. Now let’s modify the prompt a little bit: “Draw an American student who does not like studying”. Apparently, only Caucasian males don’t like studying. In AI land, racism is inverted. It still exists, but only against white people.

But it sure beats Google’s Gemini chatbot, which just doesn’t want to offend anybody and, consequently, has become completely boring. “Draw an American student who loves to study”. “We are working to improve Gemini’s ability to generate images of people. We expect this feature to return soon and will notify you in release updates when it does.” How about, “Draw a cartoon of an American student who loves to study”? Nope, can’t do that either. How about, “Draw a cartoon of a man”? Okay, so it draws some silly-looking white men, as expected. How about, “Draw a cartoon of a woman”? Nope, can’t do that. It probably doesn’t even know what a woman is. Let’s ask it. “What is a woman?” The Complexity of Defining “Woman”. Here we go. The definition of “woman” has become a complex and often contested topic. They give the biological definition (thank goodness!): female reproductive organs and XX chromosomes. But they also talk about gender identity and cultural variations, and of course finish with: “It's important to recognize that there is no single, universally agreed-upon definition of 'woman' today.” Of course there isn’t.

While we’re here, let’s ask, “What is a man?” “A man is typically defined as an adult human male.” Geez, that was hard. Why is it only the word “woman” that is so hard to define? Just one more attempt to draw something. “Draw a person studying”. Of course it can’t do that. It’s so scared, well, Google are so scared of causing any offence, that their AI chatbot has become lame. The reason it has become so lame is that earlier this year, Gemini was generating racially diverse Nazis in the name of not excluding anybody. It also generated some US senators from the 1800s. You remember, Senator Billy Wong. Actually, the very first female senator was Rebecca Felton, who served for only a single day in 1922. She was a slave owner, apparently the last slave owner to serve in the Senate, and is quoted as saying words to this effect: “The more money that Georgia spends on black people’s education, the more crimes black people commit.” Her words, not mine. Believe it or not, she was also a major figure in America’s first-wave feminism movement, championing equal pay for equal work.

Anyway, Google’s AI has become boring, because you’re not allowed to offend anybody anymore. Pity that.

MUSIC
Allégro by Emmit Fenn
