License No. 4978/GP-TTĐT issued by the Hanoi Department of Information and Communications on 14 October 2019 / Amended and supplemented ICP License No. 2107/GP-TTĐT issued by the Hanoi Department of Information and Communications on 13 July 2022.
© 2026 Index.vn
Microsoft is facing renewed controversy over the terms of use for its Copilot AI assistant, after disclosures circulated widely on social media highlighted language that says the tool is “for entertainment purposes only” and should not be relied on for important decisions.
In content updated late last year, Microsoft states that Copilot can make errors and “may not operate as expected.” The company advises users not to rely on AI responses for important advice and emphasizes that use of Copilot is “at your own risk.”
The terms also indicate that Microsoft makes no claims regarding the accuracy or legality of content generated by Copilot. Microsoft says it cannot guarantee that AI outputs will not infringe copyrights, invade privacy, or affect individuals or other organizations. If users share or publish content generated by Copilot, they will bear full responsibility.
The disclosures have drawn criticism from some users who argue that the wording does not match Microsoft’s broader positioning of Copilot as a productivity tool for individuals and businesses. Some forum discussions question the product’s reliability, arguing that if the provider itself does not guarantee accuracy, users have little reason to trust AI-generated results.
In particular, the phrase “for entertainment purposes only” has been compared to disclaimers typically seen in sensitive services, with some users suggesting it reflects a legal-risk approach to AI accountability.
In response to the backlash, a Microsoft spokesperson said the phrase is “old language” from a period when Copilot functioned as an expanded search tool on Bing. Microsoft said the wording no longer reflects how Copilot is used today and that it will be updated in upcoming releases.
Despite the planned update, Microsoft maintains that Copilot should be treated as an assistant tool rather than a decision-support system. The company advises users to verify information from multiple sources and to exercise caution when handling sensitive or important data.
The issue is not unique to Microsoft. Other AI developers, including OpenAI, Google, and Anthropic, include similar warnings in their terms of use, disclaiming guarantees of accuracy and shifting some responsibility onto users.
Overall, the episode reflects a wider trend in the tech industry: as AI becomes more integrated into work and daily life and is increasingly sold to enterprise customers, providers are re-evaluating how responsibility and risk are handled in their terms.