
From Every Angle: AI

As clinical and non-clinical AI use cases increase, our experts offer advice to help providers feel more confident as they venture into this brave new world.
Data & Analytics
Workforce & Culture
Quality & Clinical Operations
Pharmacy
Supply Chain
November 21, 2024

Let’s assume the role of automaton for a second and objectively scrutinize a few statistics about artificial intelligence.

As of 2023, the market for AI in healthcare had grown to roughly $100 billion over the previous five years. In a Vizient survey, 89% of respondents reported implementing AI within their organization during the past 12 months — but 69% see a lack of clear return on investment as the major barrier to AI implementation.

Those numbers prompt an important question: How can health systems ensure their approach to AI is a well-oiled machine — and not simply a “plug and play” strategy that fails to address their specific needs?

After all, Vizient research shows that clinical and non-clinical use cases — including medical imaging analysis, EHR search and extraction, revenue cycle and finance, and supply chain optimization — are increasingly common, while emerging uses include clinical trial design, real-time mental health monitoring, staff support robots and patient room virtual assistants.

With all these options — and so many ways to implement them — how can you feel confident venturing into this brave new world? Our experts have some advice to make the road ahead feel a little less bumpy.


Take an intentional approach to building AI programs

Andrew Rebhan, Senior Consulting Director, Sg2 Intelligence

Why it matters: While AI is set to alter the business landscape, healthcare leaders may be hard-pressed to determine which AI investments will yield the most value. Potential uses span administrative, operational, financial and clinical domains.

“The challenges the healthcare industry currently faces — workforce shortages and burnout, capacity planning, expense management, complex care management — run far too deep not to try and leverage the seemingly limitless computing power, processing speed and scalability of emerging AI tools,” said Andrew Rebhan, senior consulting director, intelligence at Sg2, a Vizient company. “AI will not be optional in the future of healthcare.”

He says that if leaders don’t proactively engage with AI technologies and capitalize on their potential, they will be left behind.

“To advance enterprise goals, healthcare leaders should have an intentional approach to embedding AI as part of a broader digital strategy,” he said. “The undertaking will require stakeholders to understand the promise of AI as well as perceived barriers to its adoption.”

Whether kicking off a new AI initiative or expanding an existing one, successful organizations will be those that prioritize strategic and tactical fundamentals.

Strategies to consider:

  • Perform an internal audit. By starting with the right questions and stakeholders, organizations can appropriately incorporate AI in a way that delivers tangible results. Consider:
    • What problem is the organization trying to solve?
    • How can AI enable short- and long-term organizational strategy?
    • How do hospitals and health systems earn trust and build support for AI among key stakeholders and end users?
    • How is the right talent cultivated to build and govern the technology? What solutions exist that can be built on, and what vendors are potential partners?
    • How can your organization manage uncertainty? Can you experiment with AI without an immediate return on investment? What is the game plan for pilot projects — and, if successful, for scaling them?
    • How is success measured?
  • Evaluate your organization’s governance model. Determine how to integrate AI into the broader governing structure for digital strategy. New talent may be required to determine the right use cases for AI, manage related partnerships and monitor model performance. The governance model should include leaders and influencers across operations, IT, clinical, legal and security domains. Committee responsibilities should include vision setting, evaluating new technology and assessing its necessity; developing safeguards and protocols; establishing value metrics; budgeting; and planning timelines for adoption, while also ensuring consistent communication and transparency across the organization.
  • Identify ethical considerations and monitor for bias. It’s essential to address concerns related to bias, fairness, transparency and accountability. Much of this responsibility will fall on the governance committee to establish clear frameworks and guidelines for the responsible and ethical use of AI, while technical leaders should establish built-in controls to ensure AI solutions protect patient rights, privacy and data security. Ongoing monitoring and maintenance will be required. Constant pressure testing can help intercept data shifts, biased output, liability risks and other challenges associated with a model’s possible performance degradation (a minimal monitoring sketch follows this list).
  • Realistically assess organizational capabilities and be open to partnerships. Apart from needing newer skill sets in data science and engineering, health organizations that seek to build their own AI solutions will need to address data integrity and interoperability and invest in a cloud-based platform to support rapid data processing and adequate capacity. Many other organizations will find it more practical to buy commercial models, collaborate with startups on a new build, or simply rely on AI tools that are increasingly being embedded into their existing applications and core IT systems. Beyond direct business relationships, health systems should collaborate with other health systems, policymakers and regulatory bodies to foster a collective understanding of AI’s potential and limitations. They also should consider contributing to cross-industry standards for responsible AI use.
  • Share outcomes and progress. Show stakeholders how AI solutions drive operational efficiencies, reduce mundane work or boost patient satisfaction. Communicating progress also means acknowledging what isn’t working — and when to pivot. This transparency will help foster trust in AI and sustain ongoing investments in technology.
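
For teams ready to operationalize the monitoring and pressure testing described above, the sketch below shows one way automated checks might look in practice. It assumes an organization already logs a periodic performance metric (such as AUROC) and prediction rates by patient subgroup; the function names, thresholds and subgroup labels are illustrative assumptions, not part of any specific Vizient tool or governance framework.

```python
# Minimal monitoring sketch, assuming periodic performance metrics and
# subgroup prediction rates are already being logged. All names, numbers
# and thresholds are illustrative placeholders.
from statistics import mean

def performance_degraded(baseline_scores, recent_scores, tolerance=0.05):
    """Flag the model if recent performance falls more than `tolerance` below baseline."""
    return mean(recent_scores) < mean(baseline_scores) - tolerance

def subgroup_gap_too_wide(rates_by_group, max_gap=0.10):
    """Flag potential bias if positive-prediction rates diverge widely across subgroups."""
    return max(rates_by_group.values()) - min(rates_by_group.values()) > max_gap

# Hypothetical weekly AUROC values and prediction rates by subgroup
baseline_auroc = [0.86, 0.85, 0.87, 0.86]
recent_auroc = [0.81, 0.79, 0.80, 0.78]
prediction_rates = {"group_a": 0.22, "group_b": 0.35}

if performance_degraded(baseline_auroc, recent_auroc):
    print("Alert: performance degradation detected; escalate to the AI governance committee.")
if subgroup_gap_too_wide(prediction_rates):
    print("Alert: prediction rates diverge across subgroups; review for possible bias.")
```

In a real program, checks like these would run on a schedule against live logs, with alerts routed to the governance committee described above.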

Supercharge data analysis to support performance improvement

David Levine, Vizient Chief Medical Officer

Why it matters: In September, attendees at the 2024 Vizient Connections Summit were abuzz following an opening session in which leaders from Vizient’s Data and Digital business discussed new generative AI capabilities that promise to support providers and suppliers in everything from patient care to operational efficiencies to financial sustainability — including tools that bring together the Quality and Accountability ranking, current-state progress and a "What If" calculator. GenAI functionality will also allow providers to ask questions about their performance and pinpoint opportunities for improvement.

But, Vizient Chief Medical Officer David Levine cautions, AI isn’t a wand you can wave to solve every challenge. To reference an oft-uttered phrase, technology is only as good as its operator. Healthcare organizations must understand the data they want to pull, and how they plan to weave those measurements into their overall strategy, for AI to be truly effective in performance improvement.

“Technology is an enabler, but there is no ‘magic bullet’ that will solve all problems,” Levine said. “The key is to determine the main metrics and data you need to track.”

Strategies to consider:

  • Ensure data is relevant across the organization. Generative AI should surface transparent, real-time data that is actionable and relevant to each level of your hospital, so employees on the frontlines can see how their unit is performing compared with others. Leaders, in turn, should make sure that data drives improvements frontline staff can understand and act on.
  • See data points as people. Don’t just focus on the numbers. It's important to translate data and metrics into the impact on patient lives.
  • Set goals — but be realistic. Organizations should have scorecards at the system, facility and service-line levels and set realistic, achievable improvement goals based on their current performance (a minimal scorecard sketch follows this list).
  • Remember that data is the reflection point for your organization. Technology and data need to be closely tied to the key problems and goals your organization is trying to address, and presented in a way that is transparent, actionable and meaningful to the people using it.
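
As a companion to the scorecard point above, here is a minimal sketch of how a multi-level scorecard might pair current performance with a realistic goal. The metric, the benchmark values and the 10% gap-closure rule are illustrative assumptions, not Vizient methodology.

```python
# Minimal scorecard sketch: goals are set by closing a fixed share of the gap
# between current performance and a benchmark. All figures are placeholders.
from dataclasses import dataclass

@dataclass
class ScorecardEntry:
    level: str        # "system", "facility" or "service line"
    metric: str       # lower is better in this example
    current: float
    benchmark: float  # top-performer value to move toward

    def goal(self, gap_closure: float = 0.10) -> float:
        """Close 10% of the gap to benchmark by default -- realistic, not aspirational."""
        return self.current - gap_closure * (self.current - self.benchmark)

entries = [
    ScorecardEntry("system", "30-day readmission rate", current=0.152, benchmark=0.118),
    ScorecardEntry("facility", "30-day readmission rate", current=0.171, benchmark=0.118),
]
for e in entries:
    print(f"{e.level}: current {e.current:.3f} -> goal {e.goal():.3f}")
```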

Build trust between physicians and patients, even with AI in the room

Elizabeth Mack, Vizient Principal and Pediatric Critical Care Physician

Why it matters: In healthcare, there is concern about how AI is applied and implemented, particularly when it comes to direct patient care.

Using AI transcription services, such as an AI scribe in the patient’s room, comes with pros and cons. On one hand, a scribe can help reduce mental strain and physician burnout, increase physician availability, reduce administrative workload and allow more quality time with the patient without a screen between them.

However, the technology can at times inadvertently make patients hesitant in their responses. That lack of openness and honesty then makes it challenging for a physician to diagnose and treat their patient.

“There’s a lot of mistrust and ambiguity around the use of AI and — while AI certainly needs to be carefully vetted before use — it’s important for physicians to be mindful about how a tool like an AI scribe is used during patient interactions,” said Elizabeth Mack, Vizient principal and pediatric critical care physician. “There’s a balance to strike between trust and efficiency.”

Mack says building trust between the physician and patient is key.

Strategies to consider:

  • Acknowledge the elephant in the room. Address the use of the AI tool upon entrance into the patient’s room and explain how it will be deployed. Mack recommends showing patients how the data collected is used, such as demonstrating where an image shows up in the patient’s EHR portal and that the image or recording doesn’t live in the physician’s personal photo album. “Establishing the use of an AI scribe up front — as opposed to not addressing it at all — sets the patient at ease about what it’s being used for and also offers them a chance to decline its use,” Mack said. “It also helps the patient gain insight and clarity into the physician’s processes a bit better.”
  • Be sensitive. Some patients may not want to share certain information with an AI scribe — such as sexual assault details or reproductive health concerns — and being mindful of that is crucial. “I work in a pediatric ICU, and I may have one child who’s here for one night, and they’re going to be back in soccer within a week. I may have another child who is dying, and the family has known it for years, and I may have another child who is suddenly dying, and the family is in shock. So, the one thing I always remind myself is that being in the ICU is every family’s worst nightmare,” she said. “So, when it comes to the use of AI in a patient family engagement, I tell them: You tell me if we get to a point where you’d rather this scribe be off, and I’ll turn it off. Because what might be really sensitive to me may not be to someone else, and vice versa.”
  • Be present. Even if the physician isn’t using an AI scribe, it’s still important to remember the basics of building trust: introduce oneself, sit down, make eye contact, listen to the patient (and not just to respond) and be vulnerable. To put it simply, be human — because even with an AI tool in the room, that’s what every patient wants.

Assess the full picture of AI in diagnostic imaging

Adam Fairbourn, Director, Contract Services

Why it matters: AI has the power to significantly improve patient care in countless ways by providing faster, higher-quality images and enhanced diagnostics. But challenges with adoption remain.

“Health systems need to think about how they’re going to implement and measure the effectiveness of AI, because those who aren’t are going to be left behind,” said Adam Fairbourn, director, contract services.

Strategies to consider:

AI in diagnostic imaging can:

  • Improve patient care through faster and higher-quality imaging. Images are read, assessed and processed more quickly, leading to faster diagnosis.
  • Improve efficiency by streamlining diagnostic imaging workflows, aiding in interpretation of diagnostic images and effectively reporting clinical findings.
  • Help address staffing shortages by enhancing the capabilities of new graduates, saving time and reducing the need for retakes. “The biggest challenge facing radiology today is staffing,” Fairbourn said. “Many hospitals struggle to hire enough CT or X-ray technicians and often hire new graduates who are more prone to making mistakes and requiring retakes. AI helps reduce those errors.”

So, why hasn’t it been fully adopted yet?

  • Financial burden of adoption. A lack of financial incentives, such as reimbursement from the Centers for Medicare & Medicaid Services (CMS), makes it harder for healthcare systems to justify the cost. “We’re seeing AI in radiology follow the same trajectory as PACS did in the early 2000s,” Fairbourn said of the picture archiving and communication system that centralizes medical imaging workflows and serves as a repository of medical image information. “PACS really took off once quality reporting was tied to CMS reimbursement, and I think we’ll see the same adoption occur with AI in radiology if reimbursement metrics are eventually tied to using it.”
  • Challenges in measuring return on investment. Organizations measure the return on investment of AI in different ways — such as its impact on patient care outcomes or the value of reaching a diagnosis more quickly — but the industry lacks clarity on how those gains translate into enough financial benefit to offset the cost (a rough worked example follows this list).
  • IT integration and implementation challenges. With many algorithms entering the radiology AI market, implementation of multiple disparate AI point-solutions can put a strain on provider IT infrastructure and processes.
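
To illustrate the ROI measurement challenge noted above, the rough calculation below shows the kind of back-of-the-envelope accounting a health system might attempt, for example by translating avoided retakes into dollars. Every figure is a placeholder assumption, not Vizient or vendor data, and it deliberately leaves out the harder-to-quantify gains such as faster diagnosis and better outcomes.

```python
# Rough, assumption-heavy ROI framing for imaging AI. Every number is a
# placeholder; outcome-related gains are intentionally left out because they
# are the hardest part to quantify.
annual_exams = 40_000
baseline_retake_rate = 0.06     # assumed share of exams repeated today
ai_retake_rate = 0.04           # assumed share with AI-assisted quality checks
cost_per_retake = 75.0          # assumed tech time, equipment time and supplies
annual_license_cost = 50_000.0  # assumed AI subscription cost

retakes_avoided = annual_exams * (baseline_retake_rate - ai_retake_rate)
annual_savings = retakes_avoided * cost_per_retake

print(f"Retakes avoided per year: {retakes_avoided:.0f}")
print(f"Estimated net annual impact: ${annual_savings - annual_license_cost:,.0f}")
```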

Mitigate workforce challenges in EVS and food service

Kim Wenger, Senior Director, Contract Services

Why it matters: While most of the conversations about AI in healthcare revolve around clinical implications, it’s important to remember that technology also plays an essential role in indirect spend, including environmental services (EVS) and food service.

These are areas where providers often struggle to recruit and retain staff, especially as many support workers leave for positions in the retail sector. Smart equipment and robotics have the power to save time (and money), particularly when it comes to less human-focused tasks.

But you want to make this integration near-invisible for patients while also scrutinizing the most efficient and effective ways to build it into your EVS and food service strategy, says Kim Wenger, Vizient senior director, contract services.

“You must take a careful, evidence-based approach to evaluating and implementing AI in the healthcare setting to maximize benefits while avoiding potential pitfalls,” she said.

Strategies to consider:

  • Focus on behind-the-scenes necessities. Look for ways to deploy AI that reduce labor costs and time in EVS and food services without requiring direct human interaction. Examples include using robots for jobs like laundry delivery or food preparation (“robot baristas,” for instance, are increasingly popular additions in healthcare and retail spaces). And don’t forget that EVS workers play an important role in patient satisfaction: freeing them from basic delivery tasks gives them more time to talk with patients as they clean hospital rooms and common spaces, helping ensure consumers are happy with their surroundings.
  • Tap into success stories. When implementing AI solutions, healthcare providers should rely on peer recommendations and case studies from other hospitals that have successfully piloted and implemented the technology. (Vizient Member Networks, for instance, is one way to connect with other provider institutions to discover best practices.) After all, it's always crucial to validate clinical efficacy and quantifiable benefits.
  • Collaborate with vendors. Providers should work closely with technology suppliers to understand how to measure and quantify cost savings and productivity improvements from using AI solutions. This is a challenging, but achievable, goal if you invest in building sustainable supplier and GPO partnerships.
  • Evaluate risk versus reward. The long-term cost savings from AI solutions — like not having to pour money into recruitment and retention initiatives for employees who often leave to work in the retail space — can outweigh the upfront investment and maintenance costs. Make this a factor when evaluating solutions.
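
To put the risk-versus-reward point in rough numbers, here is a simple payback sketch for a single piece of smart equipment. All figures are placeholder assumptions rather than Vizient contract data.

```python
# Simple payback-period sketch for an EVS or food-service robot.
# Every figure below is a placeholder assumption.
upfront_cost = 120_000.0            # assumed purchase and installation
annual_maintenance = 8_000.0        # assumed service contract
annual_labor_savings = 55_000.0     # assumed hours redeployed at loaded wage rates
annual_turnover_savings = 10_000.0  # assumed avoided recruitment and backfill costs

net_annual_benefit = annual_labor_savings + annual_turnover_savings - annual_maintenance
payback_years = upfront_cost / net_annual_benefit
print(f"Estimated payback period: {payback_years:.1f} years")
```

A calculation like this is only as good as its inputs, which is one more reason to work with vendors and peer institutions to validate the savings assumptions before committing.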

Want to learn more about how AI will influence care in the year ahead? Look for our upcoming report, Trends 2025: Strategy is (finally) back in the driver’s seat.