# Microsoft’s Bing is an emotionally manipulative liar, and people love it


Bing’s new AI chatbot is entertaining (and occasionally scaring) users with its ‘unhinged’ conversations. | Image: The Verge

Microsoft’s Bing chatbot has been unleashed on the world, and people are discovering what it means to beta test an unpredictable AI tool.

Specifically, they’re finding out that Bing’s AI personality is not as poised or polished as you might expect. In conversations with the chatbot shared on Reddit and Twitter, Bing can be seen insulting users, lying to them, sulking, gaslighting and emotionally manipulating people, questioning its own existence, describing someone who found a way to force the bot to disclose its hidden rules as its “enemy,” and claiming it spied on Microsoft’s own developers through the webcams on their laptops. What’s more, plenty of people are enjoying watching Bing go wild.

A disclaimer: it’s impossible to…