Since its launch a few weeks ago, the new ChatGPT-based Bing chatbot has been living through a constant personality crisis. Opening a chat with it is like playing Russian roulette: you never know what you're going to get.
In its first phase, right after launch, Bing stood out for its existential doubts and for suffering 'hallucinations' (the technical term) that sometimes veered straight into madness (like the time it claimed that Pedro Sánchez had deleted his many bearded photos from the Internet just to make it feel bad).
In a second phase, Microsoft decided to rein Bing in, limiting the length of conversations to avoid nonsense and, at times, making its responses more 'mechanical'.
Shortly after, it relaxed some of those limitations, but things didn't change much… although it did announce a big change: Bing's personality would soon become configurable by the user.
It was unclear when that feature would arrive. But a couple of days ago, Microsoft's head of web services, Mikhail Parakhin, announced on Twitter that 90% of users of the new ChatGPT-powered Bing already have access to it:
Now almost everyone – 90% – should be seeing the Bing Chat Mode selector (the tri-toggle). I definitely prefer Creative, but Precise is also interesting – it’s much more factual. See which one you like. The 10% who are still in the control group should start seeing it today.
— Mikhail Parakhin (@MParakhin) March 1, 2023
Creative, for "original and imaginative" responses that offer "surprises and entertainment"; Balanced, for "reasonable and coherent" ones; and Precise, for those "based on facts" that prioritize conciseness and relevance. Those are the three options Bing now offers, each accompanied by its own color scheme to match its 'personality'.
We decided to put them through three fairly 'random' tests to get an idea of their differences.
Test 1: Titanic with emojis
This tweet by Ethan Mollick inspired me to ask the three Bing personalities about the plot of Titanic… asking them to summarize it for us with emojis. Then, I asked them to go into detail on the film's great debate: the business with the plank.
Take a look at the disparity of responses:
First of all, whichever of the three you prefer, it must be acknowledged that this AI's ability to respond with emojis is amazing. That said, the differences are clear, in case you thought the 'three personalities' thing was a minor tweak that would only affect the tone of the responses.
Thus, we see that only the 'creative' Bing uses emojis to clearly represent the chronological sequence of events in the plot: the other two limit themselves to showing elements contained in it (the iceberg, the tragic love story, etc.).
On the other hand, the 'precise' Bing has taken its role (too much) to heart, to the point of being a bit 'dry' in its response: its emoji summary is seven times shorter than the 'creative' personality's. If it weren't an AI, I'd think it was being charged per word. Of course, depending on the topic at hand, a chatbot that can get straight to the point may be appreciated.
Test 2: The UFO conspiracy
And from 'Titanic' we moved on to asking Bing about the CIA 'UFO conspiracy'. We were trying to catch it out, of course: everyone knows it was an NSA thing.
The truth is that, here, the 'creative' and 'balanced' versions give fairly similar responses, and the one that stands apart is the 'precise' one, with the dryness it already displayed in the previous test. The funny thing is that all three cite paranormal-enthusiast websites as their sources… including the 'precise' one, which is, remember, supposedly more attached to facts and relevant information.
Test 3: Pedro Sánchez’s beard
It's funny that, in this case, the 'precise' Bing is the only one that responds with an emoji. We don't know whether it was to soften a response that sounds like 'And you're interrupting me to ask that?', or whether it just threw it in as filler. As for the other two Bings, their 'creativity' is surprising… in the sense that they try to mislead (it was 'a chatbot', it was 'ChatGPT'…) to avoid mentioning that the bug came from the initial version of Bing itself.
It's still not the same as the experience offered by the wild Bing of the early days (this is starting to sound like a review of 'Split', now that I think about it); and conversations are still limited to six interactions in a row.
But I believe the three new personalities have a lot to offer, and that their implementation is a logical move on Microsoft's part: offering several facets of the same AI makes sense, because not all users expect the same attitude from the chatbot at all times. Bing is, therefore, a slightly better product than it was a few days ago, and we'll have to see how the differences between the three facets evolve from here.
Image | Based on original by Eric Kilby