Microsoft's 'privacy-preserving' analysis shows widespread Copilot use for health and caregiving

Microsoft's analysis of more than half a million Copilot conversations reveals widespread symptom interpretation, care navigation and proxy caregiving usage.

by Defused News Writer
Photo by Simon Ray / Unsplash

Copilot, Microsoft's generative assistant, handled more than half a million health and well-being conversations during a month-long analysis undertaken by the company.

Microsoft said the research used a privacy-preserving workflow: conversations were de-identified at source and processed by automated topic and intent extraction, and no human reviewers read user chats. Within the research cohort, nearly 1 in 5 health conversations involved users describing their own symptoms, interpreting test results, or managing chronic conditions.

Around 10.9% of health queries sought symptom interpretation, 9% asked for personalized lifestyle and fitness guidance, and 5.8% related to healthcare navigation, insurance, or benefits.

Mobile conversations showed twice the rate of symptom and condition questions compared with desktop, and 75% more emotional well-being queries, while desktop skewed 3x toward research and academic work.

Across symptom and condition management questions, 1 in 7 conversations were on behalf of someone else, often children or ageing parents, raising issues around consent and escalation.

Microsoft said its consumer health team is working on richer clinical context, stronger clinical reasoning, and care navigation features, including provider directories and appointment booking in the US, with plans to expand globally. The company also noted that its broader products handle substantial health traffic.

At the same time, Microsoft noted by way of disclaimer that Copilot is "not intended to diagnose, treat, or prevent diseases or other conditions and is not a substitute for professional medical advice."

The recap

Analysis covers over half a million Copilot health conversations.

Nearly 1 in 5 conversations involve symptoms or condition management.

Microsoft will expand provider directories and improve clinical reasoning.
