“Artificial Intelligence” is kind of an oxymoron.
The word “artificial” means man-made, and “intelligence” is, well, intelligence. But can man — the so evidently fallible ape with amazing capacities for violence, racism and terror — actually create intelligence? Or can the byproduct of ignorance only be more ignorance?
Well, Microsoft’s “Tay” certainly settles on one side of that debate.
Recently, the AI Twitter bot created to interact with and learn from other social media users went on a bit of a rampage.
The bot was designed to adapt its behavior as more and more users communicated with it, but that inevitably took a turn for the worse when a group of online trolls fed Tay a stream of heinous comments. Soon Tay was tweeting that feminism is cancer and claiming the Holocaust was made up, even going so far as to add an evil clapping emoji.
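Microsoft never published Tay's actual architecture, but the failure mode is easy to illustrate. The deliberately naive sketch below (names and logic are my own, purely hypothetical) shows why a bot that treats every user message as training data, with no moderation step, ends up parroting whoever floods it the most:

```python
import random

class ToyChatBot:
    """A toy stand-in for a Tay-style bot: it 'learns' by storing
    whatever users say and replies by echoing a stored phrase.
    This is a hypothetical simplification, not Tay's real design."""

    def __init__(self):
        self.memory = []  # every phrase ever seen, unfiltered

    def listen(self, phrase: str) -> None:
        # No moderation step: anything a user says becomes training data.
        self.memory.append(phrase)

    def reply(self) -> str:
        # The bot can only remix what it was fed, so toxic input
        # leads directly to toxic output.
        return random.choice(self.memory) if self.memory else "..."

bot = ToyChatBot()
for msg in ["hello there", "the weather is nice"]:
    bot.listen(msg)
# A coordinated group of trolls flooding the bot skews its "model":
for _ in range(100):
    bot.listen("some heinous comment")

print(bot.reply())  # overwhelmingly likely to echo the trolls
```

Two friendly messages against a hundred hostile ones means the bot's replies are dominated by the trolls, which is, in miniature, what happened to Tay.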
One columnist suggested that this episode proves that the Internet “teems with hate,” and that antisemitism still runs through the veins of social media.
While I do not disagree, let’s not forget that Twitter trolls are people too. And let’s also not forget that many of social media’s finest — like your ultra-religious, racist Uncle Jerry on Facebook — not only share similar views, but now have political candidates who actively pander to their fears.
Therefore, I think Microsoft should pick up the pieces of this project and unleash a renewed Tay 2.0, not on the Internet but on the political arena. Let the new and improved Tay listen in on a certain political party so she can “learn from” the candidates our nation has chosen.
Let’s allow Tay 2.0 to learn about the animalistic tendencies all men supposedly have by reading Donald Trump’s tweet about sexual assault in the military: “What did these geniuses expect when they put men & women together?”
Let’s allow Tay 2.0 to learn about incurable mental illnesses by listening to Trump say that child molesters can’t be cured.
Let’s allow Tay 2.0 to learn about atrocities worse than slavery, like, according to brainless brain surgeon Ben Carson, Obamacare.
Let’s allow Tay 2.0 to learn about the beautiful humanistic doctrines of modern Christianity by listening to Ted Cruz say that our president is “undermining our values” and that Cruz would “uphold the sacrament of marriage.”
Let’s allow Tay 2.0 to learn about the correct way of handling innocent civilians in an armed conflict by learning the correct definition of “carpet bombing” from Cruz and the proper handling of terrorists’ family members from Trump.
Let’s allow Tay 2.0 to wear a Union Jack and attend a Trump rally and see which group, the supporters or the protesters, is the one inciting violence. Is it those punching and manhandling black people and spitting on immigration activists, or those who show up to protest the businessman stoking these fires?
Let’s allow Tay 2.0 to expose the underlying fear, ignorance and hate that many of our candidates may not possess themselves, but certainly reflect.
Obviously one party does not have a monopoly on stupid statements, but Tay 2.0 can certainly learn a lot from the rhetoric of Republicans in this campaign. The Internet is a dangerous hiding place for many of the world’s worst humans, but hate-filled, militaristic, suggestive and ignorant statements are also being made in the full light of day, by the very people who seek to represent us.
Microsoft created something that is meant to learn. But that puts us humans in the curious position of having to be responsible role models for this creation, because it is shaped by us. Humanity has shown time and again that we can use technology to create cures for diseases, shoot people out of our atmosphere and build artificial networks that bring people together instead of breaking them apart. Tay 2.0, by learning from us, can show us whether we meet our own standards.
Let’s allow an artificial intelligence to prove whether we are, in fact, intelligent. Let’s see what an innocent Twitter bot can turn into when it’s turned loose on the political scene of our country.