Artificial intelligence in medicine: Beyond the buzzwords 

Emma Breck | Epic and technology

What exactly is artificial intelligence (AI) in healthcare? You’ve heard the term everywhere, but as a physician or APP, you need to know what’s actually happening behind the scenes, and why it matters for your practice.

AI isn’t just one thing

In healthcare, AI refers to several technologies designed to assist and augment your clinical work: machine learning, natural language processing, predictive analytics and generative AI. Working in combination, these tools analyze data, understand language, predict outcomes and generate content.

Here are the key types you’ll encounter:

  • Natural language processing (NLP) reads and understands human language. This includes ambient note-taking technology that listens to your patient conversations and creates documentation, as well as tools that pull key information from discharge summaries and convert dictated notes into structured data.
  • Predictive analytics uses machine learning to forecast outcomes from historical data. Clinical decision support systems that calculate deterioration indices and alert you to significant changes in patient status are one example of predictive analytics at work.
  • Generative AI/large language models (LLMs), such as ChatGPT, generate human-like text and can assist with clinical documentation, patient education materials and research summaries.

For a quick visual guide to these AI types and their applications, please refer to the AI in healthcare: Types and applications infographic.

Here’s what AI is not: It’s not the same as basic automation (like automatic appointment reminders) or simple analytics (like calculating average length of stay). Real AI learns, adapts and makes complex decisions.

Why does this matter to you?

AI is already in your daily workflow, whether you realize it or not. AI pre-screens your radiology reads, analyzes your lab values for critical patterns and powers the algorithms behind your clinical decision support tools.

The physicians and APPs who understand these tools and know how to use them safely will provide better care more efficiently. Those who don’t risk being left behind as healthcare rapidly evolves. There is also a risk in using these tools naively: if you aren’t aware of their limitations, you can be misled by inaccurate, misinterpreted or confabulated output.

We’re currently piloting generative draft response technology and other AI tools. Our group of Epic SmartUsers and Builders will be the first to test these tools and provide feedback on their effectiveness.

Ready to explore AI tools safely?

Consider becoming an Epic SmartUser or Builder. Upon completion of certification, you’ll be invited to join our pilot groups for new technologies:

Epic SmartUser courses (free for physicians and APPs)

  • Format: Virtual sessions
  • Tracks available for: Ambulatory providers, surgeons and obstetricians
  • Courses on: Efficient ordering, Slicer Dicer usage and InBasket management
  • Access:
    • Log in to Epic UserWeb (if you don’t have an account yet, follow this link: SmartUser).
    • Go to the Training tab.
    • Search for SmartUser or Physician Builder.
  • Remember to use HonorHealth Co-Pilot and never enter PHI into AI tools. Need a refresher on our policy? Check out last month’s article.

Want to shape how we use these technologies?

Join the Clinician Technology Experience Council (CTEC). Meetings are every other month on the fourth Wednesday at 5 p.m. Sign up here to have meetings added to your calendar.

Coming next month: We’ll dive into the data that feeds these AI systems and why your documentation is more critical than ever.

Was this article too much information, just right or not enough detail?

Help us improve future content by letting us know. Send an email to [email protected] with your feedback.