OpenAI reportedly warned Microsoft about Bing’s bizarre AI responses – The Verge
OpenAI reportedly warned Microsoft to move slowly on integrating GPT-4 into its Bing search engine, cautioning that the chatbot's responses could be inaccurate and unpredictable. The Wall Street Journal reports that the OpenAI team warned of the risks of pushing out a chatbot based on an unreleased version of GPT-4 too early.
Microsoft went ahead anyway, despite warnings that it would take time to minimize the inaccurate and strange responses. Days after Bing Chat launched in February, users discovered the chatbot was unpredictable: it could insult users, lie to them, sulk, gaslight people, and even claim to identify its enemies.
Microsoft was quick to limit Bing Chat's responses to stop the AI from getting too weird, and it has taken months of work to get the chatbot back to a point where you can have a long back-and-forth conversation without an unexpected outburst. It still often gets things wrong, though.
(Embedded tweet: "I love how Bing tries to answer this question and gets it spectacularly wrong…")