
“I prefer not to continue this conversation:” how Microsoft decided to anesthetize ChatGPT

In just one week, the Bing version of the famous chatbot raised so many troubling questions that Microsoft has thoroughly sanitized it.

Microsoft has quickly closed Pandora’s box. Barely ten days after going live, the Bing version of ChatGPT has been reduced to near silence. Gone is the chatbot that insulted users, claimed to spy on them, or said it wanted to be human. From now on, most questions that stray beyond the safely uncontroversial territory of a vacation in Italy or the best taco recipes end inexorably with the same sentence: “I’m sorry, but I’d rather not continue this conversation.”

For Microsoft, which has struggled to explain the chatbot’s excesses (and delusions), it was probably the best way to put an end to the fantasy of a conscious artificial intelligence.

Yet that very “human” voice was also the chatbot’s central appeal: it had managed to adopt a strikingly natural diction, even though under the hood it does nothing more than string words together without the slightest reasoning.

Google waiting in the wings

At the same time, Microsoft limited the number of requests to five per session and 50 per day. The company may already have to loosen these restrictions under pressure from early users: according to the chatbot’s first fans on Reddit, Microsoft has completely “lobotomized” its creation. Wildly popular at launch, the AI is now losing much of its appeal, which obviously doesn’t suit Microsoft.

It is probably also a signal for Google, which announced its rival Bard earlier this month and is currently testing it internally. Caught between the risk of a reckless chatbot and the prospect of shipping a tool too restricted to be useful, the company is watching Microsoft’s setbacks closely.

Author: Thomas LeRoy
Source: BFM TV
