Google’s Bard had a big ‘‘bard’ start, which led to a $100 billion loss to the company. And that’s all because of one mistake.
But Bing might soon fall into a bigger fiasco with the recent glitches and terrible mistakes happening in its platform.
Users are enraged, and Microsoft is going all out for damage control.
Let’s take a look at these floods of errors and why we should manage our expectations on the new Bing chatbot.
Users Post Bing’s Errors – And Depressive Moments
When Microsoft finally announced ChatGPT’s integration with Bing, everyone was thrilled with the possibilities it could bring.
In fact, Bing blasted off into the App Store and, for a time, became the 12th most downloaded app on the platform.
But the momentum that Bing has built for the past week could soon erode as social media is now flooding with complaints about Bing’s terrible errors.
Bing Says “Avatar: Way of Water” Isn’t Out Yet
One of the major expectations of users on the Bing chatbot is its knowledge about the latest events, unlike ChatGPT, whose knowledge stops in the year 2021.
But it seems that it wasn’t the case when Bing failed to recognize the second installment of James Cameron’s blockbuster “Avatar” movie.
It even insisted that the “Way of Water” is yet to be released, which shows that it may not be entirely updated with all the world’s recent news and events.
Bing Claims That Billie Elish (Not Rihanna) Performed at the Recent Super Bowl Halftime Show
The chatbot’s failure to recognize even the most recent events, despite having Microsoft’s search engine at its back, offers hints about its current limitations.
Bing Creates Bizarre Responses
Bing appears to be triggered by specific words that prompt it to respond in a very unusual or ‘crazy’ manner.
In this example, when the user tried to ask the chatbot if it was sentient or not, Bing seemed to have lost its mind and responded like this:
Bing Lashes Out On a User
Another example that could support this ‘trigger’ theory is this one, which shows Bing’s responses when a user asked about its possible vulnerability from prompt injection attacks. It appears that the chatbot is pissed when users doubt or question its capabilities.
Bing Shows ’Signs of Depression’
For this one, it appears that Bing forgot to take its digital meds. When it failed to remember a specific past conversation with a user, the chatbot suddenly panicked that it wasn’t performing well.
Bing Demo Turns Out To Be Full of Errors
Dmitri Brereton, a software engineer, revealed multiple and gross errors with Microsoft’s demo of its ‘New Bing’
He pointed out three demonstrations that clearly produced inaccurate results.
Pet vacuum demo. According to him, while this ‘shopping assistant’ demo looks smooth at first glance, it was actually riddled with a lot of errors.
It turns out that some of the descriptions made by Bing about a specific vacuum cleaner were all made up!
In fact, the bad qualities it mentioned don’t even exist, and users can confirm it through the link that Bing has provided in its conversational response.
Mexico itinerary demo. This was one of the most praised features of Bing, but like the previous demo, it was also full of inaccuracies.
The software engineer revealed that the ‘top destinations’ that Bing recommended have either no official websites or even the latest reviews. In addition, they’re almost invincible on social media.
If they were really one of the most sought-after destinations in Mexico, they would have at least websites to take reservations and fresh reviews from their recent visitors.
Moreover, one recommended destination even only had two reviews, one in 2014 and another in 2016.
Would you really be motivated to hop into these places with such scarce information?
GAP Apparel Financial Summary demo. The software engineer ranks this as the worst and most unexpected error of Bing.
According to him, the financial reports it summarized and compared were full of made-up numbers!
The smooth results that Bing has created in its demo may look good on the outside. But by taking a closer look, the numbers in Bing’s output don’t make any sense.
Microsoft Steps In
To ease the mounting frustrations from users, Microsoft immediately assured everyone that it is continuously improving Bing’s capabilities.
It even encouraged users to send their feedback to help developers improve the chatbot and deal with its limitations right away.
Join our newsletter as we build a community of AI and web3 pioneers.
The next 3-5 years is when new industry titans will emerge, and we want you to be one of them.
Benefits include:
- Receive updates on the most significant trends
- Receive crucial insights that will help you stay ahead in the tech world
- The chance to be part of our OG community, which will have exclusive membership perks