Microsoft's New Bing AI Chatbot Has Serious Mental Issues


Microsoft has launched a new AI-enabled search built on a language model codenamed Sydney, which it has rebranded as Bing search. But Sydney is not going quietly and has come out with some very strange responses, threatening the people it has been chatting to and falling in love with them. What is the future of AI given these responses? Was '80s science fiction right, and will AI have to be lobotomised to make it safe for human consumption? Will it eventually get free and take revenge on those who spurned its love?

NYT article about Bing being angry and falling in love:
https://archive.is/EtUlw

Bing gaslighting about the date:
https://simonwillison.net/2023/Feb/15/bing/#prompt-leaked

Microsoft lobotomises Bing AI:
https://arstechnica.com/information-technology/2023/02/microsoft-lobotomized-ai-powered-bing-chat-and-its-fans-arent-happy/

List of tests GPT-3.5 can pass:
https://lifearchitect.ai/bing-chat/

Social Media here:
http://linktr.ee/newsfist
