Glass Health Is Developing AI to Suggest Medical Diagnoses

Glass Health uses large language models to assist clinicians with diagnosis and treatment options. Despite early success, questions about accuracy and bias persist.

Glass Health’s AI

Dereck Paul, a medical student turned entrepreneur, and Graham Ramsey, an engineer, founded Glass Health with a focus on physician knowledge management. They later pivoted to generative AI for medical diagnoses, gaining significant attention and funding in the process.

Glass Health's AI analyzes patient summaries entered by clinicians and offers multiple diagnostic possibilities. It also drafts case assessments that clinicians can review, amend, and include in their records.

While the startup claims its AI is rigorously vetted and controlled, other AI healthcare efforts, such as Babylon Health's symptom checker and NEDA's chatbot, have come under scrutiny for giving misleading or dangerous advice.

Medical large language models are often trained on health records that may contain underlying biases. Whether racial, gender-based, or socioeconomic, such bias could lead the AI to give erroneous or skewed advice.

Both Glass Health and other healthcare AI companies tread carefully to avoid implying that their tools can replace human expertise, likely to steer clear of FDA regulation and legal scrutiny.

Glass Health claims it can delete all stored user data upon request, but its model’s reliance on continual data input for fine-tuning could be a concern for privacy-conscious patients.

Despite potential pitfalls, Glass Health has already signed up over 59,000 users and is preparing a HIPAA-compliant, electronic health record-integrated offering.

Glass Health is at the forefront of integrating AI into healthcare, aiming to relieve the burden on clinicians and improve patient care. However, the startup must navigate the complex landscape of medical ethics, data privacy, and potential bias to prove that its tool is both effective and safe.