A request for an independent analysis of Microsoft Copilot's handling of health data, to establish its privacy protections and credibility, before it is integrated into healthcare systems.
[https://microsoft.ai/news/health-check-how-people-use-copilot-for-health/](https://microsoft.ai/news/health-check-how-people-use-copilot-for-health/)

**My commentary:** Microsoft has essentially written an advertisement for Copilot, in a similar vein to OpenAI's ChatGPT Health, Anthropic's Claude, Amazon, and xAI's Grok: an algorithm that outputs health information, with unclear privacy protections and the credibility problems inherent to an LLM.

1. I want to see an independent analysis before I'd put health records onto a commercial product like Copilot.

2. "In nearly 1 in 5 conversations, people describe their own symptoms, get help interpreting their own test results, or managing their own conditions....Around 40% of questions focus on understanding symptoms, medical conditions, and treatments." That seems a gray area, especially since Copilot has no firsthand knowledge of why a physician ordered a test or chose a management strategy. It could lead laypersons to start firing professionals who are held accountable through their licenses (e.g., lawyers) in favor of the sycophantic outputs of an unlicensed LLM.

3. "In a landscape where information asymmetry and health misinformation remain widespread, people want trusted and easy to understand explanations drawn from credible sources." By design, LLMs cannot understand concepts the way humans do. They are susceptible to fabricating sources because a plausible-sounding citation is often the most statistically likely continuation of a user's medical question (see the sketch after this list).

4. "People also use Copilot to navigate the healthcare system (5.8% of health questions touch on healthcare navigation, insurance, or benefits)." This seems like a band-aid, especially for navigating the chaotic web of federal, state, and private insurance plus prior authorizations. A human who has worked in the local system, and who can ask the right questions, can likely give much better advice for guiding a specific patient through that mess.

5. "Across symptom and condition management questions, 1 in 7 conversations are on behalf of someone else. These queries often involve children’s wellbeing, aging parents’ medications, or a partner’s test results." That's concerning, especially because, as Microsoft rightly points out, this is such a gray area in health privacy, consent, and management. Secondhand information, even from a spouse or primary caregiver, carries a higher risk of misrepresenting a patient's situation and decisions than firsthand information.
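To make point 3 concrete, here is a minimal toy sketch in Python. Everything in it is an assumption for illustration: the probability table is invented, no real model or vendor API is involved, and the journal names are just plausible-looking strings. It shows why greedy next-token prediction surfaces whatever continuation is statistically likeliest, with no step that checks whether a "citation" actually exists:

```python
# Toy illustration only: a hypothetical probability table a language model
# might assign to continuations of the prompt
# "According to a study published in ..." after a user's medical question.
# None of these probabilities or sources come from a real model.
next_token_probs = {
    "the New England Journal of Medicine,": 0.46,  # citation-shaped, not verified
    "JAMA,": 0.31,
    "a 2019 meta-analysis,": 0.18,
    "[I could not find a source]": 0.05,  # honest abstention is rarely the likeliest continuation
}

# Greedy decoding: emit the highest-probability continuation.
# Note there is no lookup against any database of real publications;
# "most likely" and "true" are never compared.
choice = max(next_token_probs, key=next_token_probs.get)
print(choice)
```

Running this prints the journal-shaped string, not the abstention: the decoding step optimizes for likelihood alone, which is the mechanism behind fabricated sources.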