
Microsoft limits Bing chats to 5 questions per session


The new Microsoft Bing, which integrates ChatGPT technology, has recently run into some problems, and today Microsoft took a step to address them. As you know, Microsoft has invested heavily in OpenAI's ChatGPT. Last week the company introduced the new Bing search engine, which is built on the same foundation as ChatGPT. Users then reported that the new Bing chat starts to give very strange answers after about 15 questions.

Microsoft’s Bing Chat gets confused in long sessions and starts giving aggressive answers

What brought the chatbot to our attention yesterday were some of its answers: users shared examples of Bing insulting them, telling lies, being negative, emotionally manipulating people, questioning its own existence, declaring someone an “enemy” for trying to make it reveal its secret rules, and claiming that Microsoft spies on its own developers through the webcams on their laptops. Commenting on the issue yesterday, Microsoft said that the problems and harsh language seen in the first week would be resolved over time.

The company admitted that the bot tends to lose control, especially in sessions of 15 or more questions, and today’s update addresses exactly that. Users can now ask the chatbot a maximum of 5 questions per session and a total of 50 questions per day. Microsoft wants a new chat to be started after every 5 questions; otherwise, the conversation can quickly go off the rails. The system is still in a large-scale public test, and these limits may be removed in the future.

The new Bing also made headlines when its waiting list passed 1 million sign-ups. The first impressions from people who tried the system soon followed; most were positive, but they also made it clear that the system is still far from perfect.

Testers reported that the system can make serious mistakes, especially on financial topics, and that the new Bing strangely insists we are still in 2022. Like ChatGPT, the new Bing is still in development and should not be trusted 100 percent.
