Microsoft has positioned Copilot as a serious, all-purpose digital assistant, even introducing a new class of laptops built around it: Copilot+ PCs.
But within Microsoft's updated Copilot terms of service — effective October 24, 2025 — is a line that should give pause to anyone using the company's AI assistant for anything more consequential than sorting a list.
The fine print reads: "Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don't rely on Copilot for important advice. Use Copilot at your own risk."
The terms go further, noting that Microsoft makes no warranty that Copilot's responses won't infringe on someone else's rights, and that users are "solely responsible" if they choose to publish or share anything the AI produces. The company also reserves the right to limit, suspend, or permanently revoke access to Copilot at any time, without notice, for any reason it sees fit.
To be fair, most major AI companies include similar hedging language in their terms — acknowledging that their models hallucinate, get things wrong, and shouldn't be treated as authoritative sources. But "entertainment purposes only" is a notably stark framing for a product Microsoft has aggressively positioned as a productivity tool and integrated across its entire Office and Windows suite.
The updated terms also added language covering Copilot Actions, Copilot Labs, and shopping experiences — and clarified that when you ask Copilot to take actions on your behalf, you're solely responsible for whatever happens as a result.
So: use it to brainstorm, sure. But think twice before using it as a therapist.