Since Microsoft added conversational artificial intelligence to its Bing search engine, a select group of users has been testing it, and some have received genuinely disturbing responses.
In response, Microsoft has limited the number of questions and sessions each account can use to prevent this kind of behavior, and the secret modes that have just been discovered may be the reason.
As BleepingComputer reports, a series of secret, hidden modes has been discovered within Bing's intelligent chatbot, which allow the AI to switch between different styles of interaction with the user.
On the one hand there is the default mode, which is simply dedicated to answering the user's questions, the way this type of AI has typically worked so far. But there are three other modes worth watching closely.
The three most interesting modes
One of them is Assistant mode, where the artificial intelligence can help users perform tasks such as booking a flight, setting reminders, or checking the weather, among other things.
They have also discovered Game mode, where the chatbot offers simple games such as hangman or trivia.
But perhaps the most striking one is Friend mode, the one causing a furor: it has been tested by a very select group of users, and it is where the conversational chat has given a series of strange and disturbing answers.
Now that Microsoft has limited the number of sessions and questions, it will be much harder to stumble upon these funny or disturbing conversations.