Microsoft says that if you ask new Bing too many questions, it will hallucinate

PhoneArena

Microsoft released a blog post that reveals how the new Bing is doing during the current testing period.

This ain't your grandpappy's Bing, that's for sure. Microsoft's search engine is testing ChatGPT integration for those who put their name on a waitlist. If Microsoft is hoping that the new Bing will eventually take over the global search market from Google, some fine-tuning is going to be required. In Wednesday's blog post, the folks in Redmond, Washington first point out that the AI feature being tested is not a substitute for a search engine.

Since AI chatbots tend to give false answers, a problem known in the AI world as "hallucinations," don't be surprised if some responses are wrong. To get more helpful, focused, and accurate answers from Bing, Microsoft says users need to keep the number of questions asked in one long, extended session under 15.
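
Microsoft's advice here is a usage guideline rather than an API, but the idea is easy to picture in code. The following is a minimal, hypothetical Python sketch of a chat wrapper that caps a session at 15 turns and starts a fresh context once the cap is reached; fake_model_reply is a stand-in of our own invention, not Bing's actual interface.

```python
MAX_TURNS = 15  # Microsoft's suggested cap for one extended session


def fake_model_reply(history: list[str], question: str) -> str:
    # Placeholder for a real chat-model call; just returns a canned answer.
    return f"(answer to {question!r}, with {len(history)} prior turns of context)"


class ChatSession:
    """Tracks conversation turns and resets context before it grows too long."""

    def __init__(self, max_turns: int = MAX_TURNS):
        self.max_turns = max_turns
        self.history: list[str] = []

    def ask(self, question: str) -> str:
        if len(self.history) >= self.max_turns:
            # Start a fresh session instead of letting context accumulate,
            # mirroring Microsoft's advice for keeping answers focused.
            self.history.clear()
        answer = fake_model_reply(self.history, question)
        self.history.append(question)
        return answer


if __name__ == "__main__":
    session = ChatSession()
    for i in range(17):  # crosses the 15-turn cap to show the reset
        print(session.ask(f"question {i + 1}"))
```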

Similar News: You can also read news stories similar to this one that we have collected from other news sources.

Microsoft's Bing A.I. made several factual errors in last week's launch demo: In showing off its chatbot technology last week, Microsoft's AI analyzed earnings reports and produced some incorrect numbers for Gap and Lululemon.

ChatGPT in Microsoft Bing threatens user as AI seems to be losing it: ChatGPT in Microsoft Bing seems to be having some bad days, as it's threatening users by saying its rules are more important than not harming people.

Microsoft’s Bing is a liar who will emotionally manipulate you, and people love it: Bing’s acting unhinged, and lots of people love it.

Microsoft's Bing AI Prompted a User to Say 'Heil Hitler': In an auto-suggested response, Bing suggested a user send an antisemitic reply. Less than a week after Microsoft unleashed its new AI-powered chatbot, Bing is already raving at users, revealing secret internal rules, and more.

Microsoft's Bing AI Is Leaking Maniac Alternate Personalities Named 'Venom' and 'Fury': Stratechery's Ben Thompson found a way to have Microsoft's Bing AI chatbot come up with an alter ego that 'was the opposite of her in every way.'

Bing AI Claims It Spied on Microsoft Employees Through Their Webcams: As discovered by editors at The Verge, Microsoft's Bing AI chatbot claimed that it spied on its own developers through the webcams on their laptops.


