Microsoft's Bing AI Chatbot Copilot Gives Wrong Election Information

New study finds that the AI chatbot provides made-up stories about political candidates.

It appears that Microsoft's AI chatbot is an election truther.

According to a new study conducted by two nonprofit groups, AI Forensics and AlgorithmWatch, Microsoft's AI chatbot answered one in three election-related questions incorrectly.

Microsoft's chatbot makes up controversies about political candidates

The chatbot, formerly known …

from Mashable India tech https://ift.tt/7lH8V6r

