Microsoft was careful not to reveal it ahead of time, but users of the new version of Bing, accessible via a waiting list, have been unknowingly benefiting from GPT-4 for a month.
OpenAI has just presented GPT-4, the new multimodal language model that will, once again, revolutionize Artificial Intelligence. Officially, this AI is only accessible to subscribers of the paid ChatGPT Plus service. However, Microsoft has just confirmed that the new multifunction chatbot has actually been integrated into the new version of its search engine, Bing, for over a month.
If you use the new version of Bing, revamped with a dose of ChatGPT, you have therefore already benefited from the new features offered by the GPT-4 model. You would have had to know it, though, to take full advantage of them: users unaware of these new features never activated them. The Redmond firm had nevertheless dropped some fairly subtle clues in recent weeks.
The new version of Bing has integrated the GPT-4 model since February 2023
Between interface improvements, the raising of the conversation limit from 6 to 10 exchanges, and better moderation of responses, it is fair to say that Microsoft has not been idle. A company official said, “We’re happy to confirm that the new Bing runs on GPT-4, and has been customized for search. If you’ve previewed the new Bing in the past six weeks, you’ve already had your first taste of the power of OpenAI’s latest model.”
Read — Bing surpasses 100 million users for the first time thanks to new redesign
The most remarkable novelty brought by the GPT-4 model is of course its ability to process multimodal information. Bing’s AI now understands both images and text, and that is going to change everything. In response to a question, the search engine will be able to offer image-based answers, and the reverse will also be true: it will be able to analyze an image that you submit to it. In other words, it can solve visual puzzles and process images.
Source: MS Power User