What you need to know about ChatGPT Health as a secure hub for personal health information launches

OpenAI has introduced ChatGPT Health, a dedicated experience designed to help people better understand and manage their health by combining personal health data with AI, while placing privacy, security and clinical oversight at the centre.

by Ian Lyall

Photo by Christopher Campbell / Unsplash

A new health-focused experience inside ChatGPT

OpenAI has unveiled ChatGPT Health, a purpose-built health experience that brings together users’ medical information and ChatGPT’s analytical capabilities in a single, secure environment.

Health builds on one of ChatGPT’s most common use cases. OpenAI says more than 230 million people globally ask health and wellness-related questions on ChatGPT every week, ranging from interpreting symptoms to understanding test results and lifestyle choices. ChatGPT Health is designed to meet that demand in a more structured and protected way, helping users feel informed and prepared when navigating healthcare.

Crucially, Health is positioned as a support tool rather than a replacement for clinicians. It is not intended to diagnose or treat conditions. Instead, it aims to help people understand information, spot patterns over time and prepare for conversations with healthcare professionals.

Bringing scattered health data together

One of the core problems ChatGPT Health seeks to address is fragmentation. Health information is often spread across patient portals, PDFs, wearable apps and handwritten notes, making it difficult to see a complete picture.

With Health, users can securely connect medical records and wellness apps so conversations are grounded in their own data. Supported integrations include Apple Health, MyFitnessPal and Function. This allows ChatGPT to help explain recent lab results, summarise trends from wearables, assist with diet and exercise planning or help users prepare questions ahead of a doctor’s appointment.

Medical record access in the United States is enabled through a partnership with b.well, which connects to a wide range of healthcare providers. Users remain in control at all times and can remove access instantly via settings.

Designed to support, not replace, care

OpenAI has been explicit that ChatGPT Health is not a diagnostic system. Its role is to support everyday health understanding, not to make clinical decisions.

By focusing on trends, context and preparation, Health is designed to help users feel more confident engaging with healthcare providers. For example, it can translate complex lab reports into plain language, summarise care instructions or highlight questions worth raising at a follow-up appointment.

If a health-related discussion begins in standard ChatGPT, users will be prompted to move the conversation into Health to benefit from its additional protections.

Privacy and security as a foundation

Health has been built as a separate space within ChatGPT, with enhanced protections for sensitive data. Conversations, connected apps and files are stored independently from other chats and use separate memory systems.

While ChatGPT may occasionally draw limited context from non-health conversations to improve relevance, information created within Health never flows back into general chats. Users can view or delete Health memories at any time through settings.

All conversations are encrypted at rest and in transit, as with the rest of ChatGPT. Health adds additional layers, including purpose-built encryption and isolation. Importantly, conversations in Health are not used to train OpenAI’s foundation models. Users can further protect access by enabling multi-factor authentication.

Third-party apps included in Health must meet stricter privacy and security standards than standard ChatGPT integrations, collect only minimal data and pass additional security reviews.

Built in collaboration with physicians

ChatGPT Health has been developed in close collaboration with clinicians worldwide. Over two years, OpenAI has worked with more than 260 physicians across 60 countries and dozens of specialties.

These clinicians have reviewed and provided feedback on model outputs more than 600,000 times, helping shape how Health communicates risk, urgency and uncertainty. This input influences how the system explains information, when it encourages follow-up with a clinician and how it balances clarity with safety.

The underlying model is evaluated using HealthBench, an assessment framework built with physician input. Rather than relying on exam-style tests, HealthBench uses clinician-written rubrics that reflect real-world judgement, focusing on safety, clarity and appropriate escalation of care.

Who can access it and what comes next

Access to ChatGPT Health is rolling out gradually. OpenAI is starting with a small group of early users to refine the experience before wider release. Users on Free, Go, Plus and Pro plans are eligible, though availability is currently limited to users outside the European Economic Area, Switzerland and the United Kingdom.

Medical record integrations and some app connections are US-only for now, and Apple Health integration requires iOS. OpenAI plans to expand access on web and iOS in the coming weeks.

Why it matters

ChatGPT Health reflects a broader shift in how people engage with healthcare. As data from tests, wearables and apps continues to grow, tools that help individuals understand and contextualise that information are becoming increasingly important.

By combining personal health data with AI, while placing strict boundaries around privacy and clinical responsibility, OpenAI is positioning ChatGPT Health as a bridge between raw information and informed care conversations.
