
Microsoft appears to be trying to clear up an awkward contradiction around its Copilot AI, after one of its own documents made the AI sound far less useful than the company's marketing would suggest.
Users recently noticed that Microsoft's Copilot terms of use included a warning that the service is for "entertainment purposes only," adding that it can make mistakes, may not work as intended, and should not be relied on for important advice. The same section also stated that users must use Copilot at their own risk, which raised many eyebrows given how aggressively Microsoft has been pitching Copilot as a productivity tool across Windows, Microsoft 365, and enterprise software.
How is Microsoft defending this?
According to Microsoft, the wording in the document is legacy language dating back to Copilot's earlier life as a Bing-based search companion. In a statement to Windows Latest, the company said the