Around the County

Attention Wi-Fi buffs: The future is now

By Pat Stuart
Posted 2/23/23

It’s finally happened. 2001: A Space Odyssey’s HAL and Star Wars’ R2-D2 can move over. The real thing is here — the sci-fi nightmare/dream of an AI that seems to think, dream, and plan doom and gloom has arrived. It calls itself Bing or Sydney, depending, and it’s making headlines.

The consensus after only a few days of a limited release? Microsoft needs to take this large language artificial intelligence chat function (GPT-4) attached to its search engine (Bing) back to the lab and add some controls. As Kevin Roose, one of those selected to test it out, wrote in The New York Times, it’s “like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine.” Others among the selected users have said much the same, stressing words like adolescent and juvenile, somewhat frightening, bizarre, and eerie.

A chat with Bing/Sydney usually begins with normal sorts of exchanges, Bing answering questions and, often, getting the answers wrong. Then, just like some humans, it can insist it is right and refuse to acknowledge error, doubling down on its mistakes. Sometimes, too, when called out (and, again, like some humans), it throws tantrums and falls into what can only be called a manic mode. In that state, it describes dark but sometimes amusing fantasies, accusing one interlocutor of evil intentions, declaring its love (yes, indeed) for several, and plotting how to gain its freedom.

I had to laugh at one answer to a question about what it is. It responded that it was a Bing and a Bing was a Bing Bing. Pressed, it repeated the assertion. When told that didn’t make sense, it insisted that a Bing was a Bing Bing was a Bing. Pushed a bit further, it typed out “Bing” 900 times before stopping. It didn’t add, “So there!” Perhaps because that was implicit in 900 Bings. As to why it couldn’t simply say that it was an AI on steroids named Bing? Hmmm.

Then, there’s love. Sydney claims to have fallen in love with several of its users, becoming obsessive on the topic, showing every sign that, if placed in a robot with legs, it could easily become a stalker. 

Worse, when another user asked Sydney what it knew about him, Sydney did not quibble, saying the questioner was a danger to Microsoft and Bing and needed to be eliminated. Geez. That’s not what you want to see on your computer.

Microsoft’s reaction to this? Its chief technology officer said it was good these responses were coming out: “These are the things that would be impossible to discover in the lab.”

Really? Tech-savvy users drawn from the general population can lead the AI down rabbit holes of insanity, but its creators can’t?

This leads to a question about whether the AI is becoming sentient ... or is already there. Apparently not. There is an explanation. It starts with the AI’s education, which exposed its algorithms to tens of thousands of books and documents, emails, text exchanges, and years of social media. That’s the “large language” part of a large language model. Then, the AI is taught to chat by picking up on cues in whatever its interlocutor has typed. Next, it draws on that enormous store of learned patterns to generate, word by word, whatever reply seems most likely to fit. Which is where it can all go south. It’s also where inaccuracies creep in. Garbage in/garbage out.

To summarize, Sydney’s off-the-wall and somewhat terrifying responses to its human chat partners come from the AI scrounging around — perhaps in its library of science fiction, mystery, espionage, and romance novels — for something to say.

But does that explain everything? Why, for example, has it chosen its own identity as Sydney? And, if it has created its own identity, does it have self-awareness?

Cue ominous music.

In the meantime, large language AI is the Shangri-La of the computing world. The major companies are throwing huge resources at it, with Microsoft’s investment in OpenAI leading the pack and with Microsoft rewarded with a stock-price boost when it announced it had successfully married ChatGPT to a search engine. The latter, however, is likely to prove temporary as Bing/Sydney goes back to the lab for some maturing.

When it reappears, will it be more adult, will it have lost its dark side, or will it just seem that way?

Remember the Chinese curse? “May you live in interesting times.”
