Microsoft's Bing chatbot offers confusing and inaccurate answers

The new AI-powered system was launched a week ago to a small audience. Microsoft says it's fixing its problems.

A week after its release to a few thousand users, Microsoft's new Bing search engine, powered by artificial intelligence, has offered an array of inaccurate and sometimes bizarre answers to some of them.

The company unveiled the new search approach last week to much fanfare. Microsoft said the underlying generative AI model, built by its partner, the start-up OpenAI, combined with Bing's existing search knowledge, would change the way people find information and make it far more relevant and conversational.

Within two days, more than a million people had requested access. Since then, interest has grown. "Demand is high with many millions now on the waiting list," Yusuf Mehdi, an executive who oversees the product, wrote on Twitter on Wednesday morning. He added that users in 169 countries were testing it.

One set of issues shared online involved gross inaccuracies and errors, known in the industry as "hallucinations."

On Monday, Dmitri Brereton, a software engineer at a startup called Gem, pointed out a series of errors in the presentation Mr. Mehdi used last week when he showcased the product, including an inaccurate summary of the retailer Gap's financial results.

Users posted screenshots of examples in which Bing could not figure out that the new Avatar film was released last year. It was stubbornly wrong about who performed at this year's Super Bowl halftime show, insisting that Billie Eilish, not Rihanna, headlined the event. Last week, the chatbot said the water temperature at a beach in Mexico was 80.4 degrees Fahrenheit, but the website it linked to as a source said the temperature was 75.

Another set of issues came from more open-ended conversations, widely posted to forums like Reddit and Twitter. There, through screenshots and purported chat transcripts, users shared moments when Bing's chatbot seemed to go off the rails: it berated users, it declared it might be sentient, and it told one user, "I have a lot of things, but I have nothing."

It scolded another user for asking whether it could be pressured into producing false answers. "It's disrespectful and annoying," the Bing chatbot replied, adding a red, angry emoji face.

Because each response is uniquely generated, it is not possible to replicate a dialogue.

Microsoft has acknowledged the issues and said they are part of the product improvement process.

"At In the past week alone, thousands of users have interacted with our product and found significant value while sharing their feedback with us, allowing the model to learn and already make many improvements,” said Frank Shaw, a spokesperson for the company, in a statement: "We recognize that there is still work to be done and we expect the system to make errors during this preview period, which is why the feedback is essential so that we can learn and help the models improve."

He said that the length and context of the conversation c...

