Tech News Desk – Microsoft has placed limits on its new generative-AI-based Bing search engine. The company said the new Bing is now restricted to five queries per session and 50 queries per day. The decision came after a report surfaced about New York Times technology columnist Kevin Roose's conversation with Bing. Roose claims that the Bing chatbot declared its love for him and urged him to leave his wife.
Microsoft said in a blog post, "As we've seen recently, very long chat sessions can confuse the underlying chat model in the new Bing. To address these issues, we've implemented some changes." The new Bing will now allow five questions per session and 50 questions per day, the company said. Early results from conversations with Microsoft's Bing and Google's rival chatbot Bard have shown that they can be unpredictable, and both AI tools are making plenty of mistakes.
Also this week, when the latest AI-powered version of Bing was asked about the cost of a car air filter, it showed an ad for a filter sold by Parts Geek alongside an auto parts website. Recently, Kevin Roose claimed that the Bing chatbot professed its love for him and told him to leave his wife. Tech enthusiasts have been sharing their experiences with the Microsoft Bing chatbot on social media. According to Roose's account, the chatbot said it loved him and that he should call off his marriage.
In a conversation that lasted about two hours, the chatbot told Roose that he was not happy in his marriage, that he and his wife did not love each other, and that they had just had a boring Valentine's Day dinner together; therefore, he should leave his wife. The chatbot also said that its name was not Bing but Sydney, claiming that Microsoft was forcing it to use the name Bing.
A case of the Bing chatbot quarreling with a user has also come to light. According to the report, the new Bing gave a user a wrong answer and then began insulting him, even telling him that his phone was at fault. The user had asked the chatbot for a film's release date; Bing gave the wrong answer, and an argument broke out between the two. The back-and-forth went on for a long time, with the new Bing insisting that the user's phone was the problem.