Making Chart Notes is Just The Beginning

AI Copilots Listening to Your Sessions Are Just the Beginning

Over the past two years, AI copilots have been among the most purchased and utilized digital health tools. If you’re not using one yet, you’re probably evaluating solutions or partnering with whatever your EHR may be building.

There are a number of companies in this space that have launched into the limelight: Abridge, Ambience, Nabla, Suki, Microsoft. Then there are specialty-specific solutions like Upheal and Eleos in mental health, and solo-provider/small-practice solutions like Freed.

If you’re in the process of evaluating these solutions, odds are you’re focused on reducing provider burnout by having AI complete, or mostly complete, tasks like writing chart notes based on listening to the visit. Today, many of these companies are quite similar. Evaluations may show some are a few percentage points better than others, but with time, they’ll all become extremely similar. Where they will differ is in where they choose to excel. Abridge has already announced its Epic partnership, and there’s a clear focus there. So they’re clearly not going to be the solution of choice for your practice of 10 mental health therapists.

But, what’s the end game?

If you think about it, listening to audio from a visit and turning it into a chart note can only get so differentiated. There’s a natural progression of functionality to expect: moving off OpenAI, Gemini, and the like onto their own AI models; providing more customization in chart notes; serving multiple languages; EHR integrations; and so on.

The end game is right in the name of this category: AI Copilots.

You’re already seeing some tools offer pre-charting capabilities that give providers context before the session starts. Essentially, they summarize what’s in the patient’s chart to make it easier for the provider to prep. There’s a deeper purpose to this: informing the AI-created chart note. If the AI is using ONLY audio from the current appointment, it’s acting in isolation. It isn’t using previous visits or other information about the patient to help create the note; it’s limited to the audio and the models it was trained on. But what happens when the AI is also ingesting the rest of the patient’s chart? Lab results from last month, a history of previous surgeries and diagnoses: the AI starts to become more informed about this individual patient.

But again, reading the patient’s chart is only the beginning.

So what happens when the AI has access to the patient chart? As with the history of healthcare integrations, things start as “read” and eventually become “write.” This is where the term “copilot” really comes into play.

Writing the chart note becomes only the beginning. The copilot begins to take over administrative items that are monotonous not just for providers but for non-clinical staff as well. If the provider determines a lab order is needed, the AI will place the order. If the provider needs to write a prescription, the AI will draft it. The provider becomes the “approver.”

It reminds me of taking driving lessons when I was 15 or 16. I remember getting into the Nissan Sentra to go around town, and the driving instructor had a brake pedal on their own side. That’s the provider’s role here: the AI does the driving, and the provider keeps a foot over the brake.

So, the future of AI copilots looks a lot like workflow automation that gets more and more clinical, and the automation doesn’t stop at the point of care. For that reason, my personal opinion is that we’ll see more and more specialization across specialties, as we already do with a number of mental-health-specific copilots. Just as we have Eleos and Upheal in mental health today, we’ll see copilots pop up in specialties like nephrology, cardiology, dermatology, and more.
