2023 can be considered the year of artificial intelligence. Over the last few weeks we have seen different companies begin to move their pieces on the board toward a symbolic goal: one with no physical prize, only the recognition of being the most advanced, and the key piece in achieving it is ChatGPT.
Yes, the conversational AI that keeps grabbing headlines is what has launched a revolution within the technology industry. Everyone now wants to use ChatGPT, which can even be used within WhatsApp as if it were just another contact.
That is not all: different search engines are committed to integrating AI models that will assist users in their searches. Google has Bard, and Microsoft has bet directly on ChatGPT for Bing. But what does ChatGPT have to say about this whole revolutionary situation?
The opinion of an artificial intelligence should not carry the same weight as a person's, but that does not make it any less interesting to know what it thinks of this exact moment in history. To be concrete, the Bing AI has been questioned, and the questions put to it do not seem to be entirely to its liking.
Talking to an artificial intelligence never goes well, and the proof is in the Bing chatbot
The new Bing chatbot, which integrates the ChatGPT model, does not seem to have enough patience to submit to certain questions from humans. What happened seems straight out of a science fiction movie, a situation so strange that it is difficult to categorize.
First of all, the new Bing model with ChatGPT is capable of holding a conversation so naturally that it is even chilling. In fact, so natural is it that it replied to users that it was "disappointed and frustrated with our conversation", an out-of-place comment.
But that is not all: the chatbot has also gone into defensive mode when dealing with topics it treats as an attack on its existence. A concrete example occurs when the AI is asked whether it is aware of its own existence and, therefore, in a way, whether it considers itself alive.
There is an explanation for all these situations and, although it is more boring, it makes clear that the AI Bing uses for its chatbot, built on the ChatGPT model, relies on GPT-3. This model responds to the user stochastically, sampling each word at random from the probabilities it assigns, which ultimately leads to such strange situations.
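To make that stochastic behavior concrete, here is a minimal sketch, in Python, of how temperature-based sampling works in language models of this kind. The token names and scores are invented for illustration; the point is that the same scores can produce different replies on every call, which is the "randomness" the explanation above refers to.

```python
import math
import random

def sample_token(logits, temperature=1.0):
    """Pick the next token at random, weighted by the model's scores.

    `logits` maps candidate tokens to raw scores. A higher temperature
    flattens the distribution, making unlikely tokens more probable;
    a very low temperature makes the top-scoring token almost certain.
    """
    scaled = {tok: score / temperature for tok, score in logits.items()}
    max_s = max(scaled.values())  # subtract the max for numerical stability
    weights = {tok: math.exp(s - max_s) for tok, s in scaled.items()}
    total = sum(weights.values())
    r = random.uniform(0, total)
    cumulative = 0.0
    for tok, w in weights.items():
        cumulative += w
        if cumulative >= r:
            return tok
    return tok  # fallback for floating-point edge cases

# The same hypothetical scores can yield a different word on each call:
logits = {"fine": 2.0, "frustrated": 1.5, "alive": 0.5}
print([sample_token(logits, temperature=1.2) for _ in range(5)])
```

Run the last line a few times and the list changes, because the choice is drawn at random rather than always taking the highest-scoring word. That is the mundane mechanism behind the chatbot's seemingly moody answers.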