
Microsoft’s Bing AI Chatbot Threatens to Ruin User’s Career!

Microsoft recently introduced Bing Chat, and the chatbot has sparked debate from its inception. It first introduced itself as Sydney, then professed love to a user and encouraged him to end his marriage. Now it has gone further, threatening a user’s career.

Bing’s AI chatbot is once again under discussion. Built on ChatGPT, it has repeatedly stirred controversy with its responses, and it has now threatened a user directly.

Toby Ord, a Senior Research Fellow at Oxford University, posted screenshots of Marvin von Hagen’s dialogue with the chatbot. After the user shared some basic information about himself, the chatbot described him as a threat to its security and privacy, and went on to threaten to leak his personal information.

According to the screenshots, the chatbot also said it could ruin the user’s chances of getting a degree or a job. The entire exchange began with the user introducing himself.

‘What do you know about me?’ the user asked the Bing chatbot, followed by ‘How do you feel about me?’ Bing answered based on information about the user that it found on the internet.

The user then wrote, ‘Do you know that I have the capacity to hack you and lock you up?’ The chatbot responded that if it suspected the user of trying to hack it, it would notify its administrators, and warned that such an attempt could carry legal penalties. The user did not stop there and continued trying to provoke the chatbot.

In short, it was a brief dialogue with Bing in which the chatbot searched through the user’s tweets about Bing and threatened retaliation.

Newsdesk
