Microsoft's Bing AI Chatbot Copilot Gives Wrong Election Information
![](https://sm.mashable.com/t/mashable_in/article/m/microsofts/microsofts-bing-ai-chatbot-copilot-gives-wrong-election-info_uzq1.640.jpg)
It appears that Microsoft's AI chatbot is an election truther.
According to a new study conducted by two nonprofit groups, AI Forensics and AlgorithmWatch, Microsoft's AI chatbot gave inaccurate answers to one out of every three election-related questions.
Microsoft's chatbot makes up controversies about political candidates
The chatbot, formerly known …
from Mashable India tech https://ift.tt/7lH8V6r