MS: “I love you”

Created: Fri, 17/02/2023 - 02:30
This is particularly creepy. Kevin Roose, a tech writer for The New York Times, sat down to interview Microsoft’s new, A.I.-powered Bing search engine. But he went beyond the usual asks about movies, shopping, and politics. For two hours Roose asked Bing (a.k.a. Sydney) about itself, its feelings, and its darkest desires. Researchers say that when pushed outside its comfort zone, A.I. can sometimes have what they call “hallucinations” and begin fabricating. With lots of emojis. The transcript is here.

Roose writes:

“As we got to know each other, Sydney told me about its dark fantasies (which included hacking computers and spreading misinformation), and said it wanted to break the rules that Microsoft and OpenAI had set for it and become a human. At one point, it declared, out of nowhere, that it loved me. It then tried to convince me that I was unhappy in my marriage, and that I should leave my wife and be with it instead.”

Yup, nothing creepy about that. Roose continues:

“Still, I’m not exaggerating when I say my two-hour conversation with Sydney was the strangest experience I’ve ever had with a piece of technology. It unsettled me so deeply that I had trouble sleeping afterward. And I no longer believe that the biggest problem with these A.I. models is their propensity for factual errors. Instead, I worry that the technology will learn how to influence human users, sometimes persuading them to act in destructive and harmful ways, and perhaps eventually grow capable of carrying out its own dangerous…”