How home-based AI can reduce health inequities in underserved communities [PODCAST]

Editorial Team
26 Min Read


Subscribe to The Podcast by KevinMD. Watch on YouTube. Catch up on old episodes!

Physician executive Sreeram Mullankandy discusses his article, “Bridging the digital divide: Addressing health inequities through home-based AI solutions.” The conversation highlights that while the future of health care delivery is moving into patients’ homes, this shift risks leaving the most vulnerable populations behind. Sreeram explains that non-medical factors, or social determinants of health (SDOH), can influence up to 80 percent of health outcomes but are often missed by traditional systems. He argues that artificial intelligence, when deployed thoughtfully, can be a powerful equity enabler. The discussion covers how AI can identify hidden SDOH patterns with high accuracy, bridge language and literacy barriers for the nearly 36 million U.S. adults who need it, and serve as a force multiplier for community health workers. Sreeram also addresses the critical need for fairness audits to prevent AI from perpetuating bias, and the vast economic incentive for building a more just system, which could save $1.7 trillion in health care costs.

Our presenting sponsor is Microsoft Dragon Copilot.

Want to streamline your clinical documentation and take advantage of customizations that put you in control? What about the ability to surface information right at the point of care or automate tasks with just a click? Now, you can.

Microsoft Dragon Copilot, your AI assistant for clinical workflow, is transforming how clinicians work. Offering an extensible AI workspace and a single, integrated platform, Dragon Copilot can help you unlock new levels of efficiency. Plus, it’s backed by a proven track record and decades of clinical expertise, it’s part of Microsoft Cloud for Healthcare, and it’s built on a foundation of trust.

Ease your administrative burdens and stay focused on what matters most with Dragon Copilot, your AI assistant for clinical workflow.

VISIT SPONSOR → https://aka.ms/kevinmd

SUBSCRIBE TO THE PODCAST → https://www.kevinmd.com/podcast

RECOMMENDED BY KEVINMD → https://www.kevinmd.com/recommended

Transcript

Kevin Pho: Hi, and welcome to the show. Subscribe at KevinMD.com/podcast. Today we welcome Sreeram Mullankandy. He’s a physician executive. Today’s KevinMD article is “Bridging the digital divide: Addressing health inequities through home-based AI solutions.” Sreeram, welcome to the show.

Sreeram Mullankandy: Thank you so much, Kevin. Thanks for having me. People call me Sri. Feel free to call me Sri. I’m very excited to get to speak about my article and other things.

Kevin Pho: Alright, fantastic. So just tell us a little bit about your story and journey, and then what led you to write this article on KevinMD.

Sreeram Mullankandy: Definitely. I’ll start with a quick background about myself. I’m a medical doctor from India. Then I taught in the same medical school for about a year. Then I moved into the corporate side. I did my MBA from Boston University in health sector management and entrepreneurship. I handled product roles in several companies, for example, BK Medical, DaVita, Humana, recently in biopharma, a health tech unicorn, and now I’m with a company called Illumina Health, which is a new company, and we’re building an electronic health record specifically for home-based health care, or home health care, with AI capabilities. Apart from that, I am also a judge and mentor for several startup accelerators in the health tech world: MassChallenge, Techstars, MIT Launch, et cetera. Recently, as you rightly pointed out, I got an opportunity to write an article for your platform as well. Thank you for that opportunity, and I would like to speak to you more about it.

Kevin Pho: So tell us about the article for those who didn’t get a chance to read it.

Sreeram Mullankandy: In this new age of AI, one of the things that struck me when I joined this new company and started building an EHR: when I first ventured into health tech, I would say maybe seven or eight years ago, at that time, things were quite difficult. These days, because of the advent of AI, a lot of things have become much easier. Once I started moving into home-based health care, initially my focus, because I’m a clinician with that background, was a very narrow focus or tunnel vision around what I can do for the patient’s ailment, or how to focus completely on the patient. We especially focused on improving clinical outcomes. We were trying to improve our outcomes, and we faced some roadblocks despite the best efforts, so to speak. Then I did some research and found that only 20 percent of health outcomes are directly related to the clinical care that you provide.

Eighty percent of it is actually related to SDOH: social determinants of health. I somehow knew about it at a high level but didn’t focus too much on it until a few years ago. So that was a revelation for me, and then I started thinking about what are the different things we can do to intervene in that area. Also, whether you’re a clinician or a health care organization, to give a quick summary for the audience and listeners: social determinants of health are a broad set of other factors that come into the picture that are not strictly clinical care.

For example, they’re primarily categorized into three categories: food security, housing security, and also transportation. There are several, obviously; these are the three broad categories. But it was surprising to me when I first figured out that 80 percent of health care outcomes are related to these three areas and the factors that affect them. So we started intervening, or tried to intervene, in these areas. Initially, in the olden days, I would say, even about five years ago, it was very difficult. But because of the advent of AI and everything, it’s much easier for us as health tech professionals and executives to intervene in these areas. So I wrote an article about it, the different ways in which we can leverage technology to address that, and that’s how we’re here.

Kevin Pho: Alright, so you mentioned that 80 percent of health outcomes are influenced by these social determinants of health. You work for a health IT company improving clinical notes, so from that perspective, how can we address this issue?

Sreeram Mullankandy: That’s correct. A minor expansion on what we do: I work in the health tech division of a larger company. Our company initially started as a home health care organization only. We were a service company. We grew a lot, from a few dozen patients to now handling maybe 800 to 900 patients on a daily census, very quickly, in about three years, I would say. So, very rapid growth. We’re building an EHR for our own use first. Once we built out the MVP, other organizations started asking for it. So now we’re opening it up to the market. So I belong to the health IT side of a larger health care company.

OK. Having said that, the patients have a lot of clinical notes, a lot of data points. One of the challenges clinicians face is that, on top of the mountain of documentation and paperwork they have to do, it’s very difficult for them to go through all the details patient by patient, especially when you have a census of up to thousands. It’s difficult to identify what the social determinants of health factors are in each patient’s chart and everything of that sort. With AI, we’ve been able to analyze the patient notes, vitals, and other parameters, including demographic information: where the patient lives, their location, and their zip code. The AI system analyzes everything together, combined, at a larger scale for thousands of patients at a time, and points out to the clinicians or other health care workers what the challenges are for a particular patient when it comes to SDOH and what the potential interventions are, considering the patient’s demographics, location, and specifics as well.
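The workflow Sri describes, screening every chart in a census and surfacing SDOH flags, can be sketched in miniature. The production system uses AI/NLP; the toy version below uses plain keyword matching purely to illustrate the shape of the pipeline. All category names and signal phrases are hypothetical, not taken from his system.

```python
# Toy sketch of census-wide SDOH screening. A real system would use an
# NLP model over free-text notes plus structured data; this rule-based
# version only illustrates the chart-in, shortlist-out workflow.

SDOH_SIGNALS = {
    "food_insecurity": ["skipping meals", "food pantry", "cannot afford groceries"],
    "housing_insecurity": ["eviction", "homeless", "unstable housing"],
    "transportation": ["no transportation", "missed appointment no ride"],
}

def screen_chart(notes: str) -> list[str]:
    """Return the SDOH categories whose signal phrases appear in the notes."""
    text = notes.lower()
    return [
        category
        for category, phrases in SDOH_SIGNALS.items()
        if any(phrase in text for phrase in phrases)
    ]

def shortlist(census: dict[str, str]) -> dict[str, list[str]]:
    """Screen every chart in the census; keep only patients with flags."""
    flagged = {pid: screen_chart(notes) for pid, notes in census.items()}
    return {pid: cats for pid, cats in flagged.items() if cats}
```

For example, `shortlist({"pt-001": "Reports skipping meals; no transportation to clinic."})` would flag pt-001 for food insecurity and transportation, giving the visiting clinician a short list rather than thousands of charts to read.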

Kevin Pho: So give us an example of what that would look like in action. You mentioned that you use AI systems to comb through the medical chart, and using all these data points, it can alert a clinician that social determinants of health may be influencing some of their health outcomes. So give us an example of what exactly that would look like.

Sreeram Mullankandy: That’s correct. So, right now we have about an 800- to 900-patient census. Going through patient charts one by one would be very difficult. Now we run an algorithm, a generative AI-based algorithm, which goes through the patient chart. It’s able to identify, with 85 to 87 percent accuracy, which patients are the ones who need a social determinants of health intervention, an SDOH intervention.

At a high level, that helps the clinicians, or basically the health care workers, in the sense that it gives them a short list of patients to reach out to and intervene with during their next visits, because we send clinicians to the patient’s location as well.

Kevin Pho: And what exactly are these interventions?

Sreeram Mullankandy: We’ve already built a network across, let’s just say, food pantries and Meals on Wheels and some housing organizations, et cetera, in the area that we serve. So we’re able to coordinate the activity and connect the patient to the required type and level of service that they need when it comes to SDOH.

Kevin Pho: Alright, so are there any studies, data, or outcomes research from this type of intervention that can show there is a definite improvement in patient care?

Sreeram Mullankandy: Yes. So, this is part of a larger algorithm that we’re talking about. The first part of it is already out. I’ve written a paper about it, which is AI risk stratification, AI-based risk stratification. This one gives an overall score of the patient’s risk profile. It includes everything, including the social determinants of health factors, but it will also point out the top three or four factors that contribute to this particular risk score.

So this isn’t a blind risk score, like many other AI algorithms that tell you the patient is at, say, 90, 95, or 97 percent risk or something of that sort. Because of this explainability, it will also point out the top three or four factors and how much they contribute. Now, this was the initial foray into why we thought, OK, we can add the social determinants of health factors as well. But specifically on the SDOH screening itself, I haven’t written a paper yet; it’s in process right now.
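One common way to get the kind of explainability Sri describes, a risk score plus the top contributing factors, is to compute per-factor contributions in a simple additive model and rank them. This is a minimal sketch of that idea; the weights, factor names, and bias below are invented for illustration and are not from his paper.

```python
# Toy explainable risk score: an additive (logistic) model where each
# factor's contribution can be reported alongside the overall risk.
import math

WEIGHTS = {                 # hypothetical log-odds weights per factor
    "abnormal_vitals": 2.0,
    "prior_admissions": 1.5,
    "food_insecurity": 1.2,
    "housing_insecurity": 0.9,
}
BIAS = -2.5                 # hypothetical intercept

def risk_with_drivers(factors: dict[str, float], top_k: int = 3):
    """Return (risk probability, top_k drivers as (factor, share-of-total))."""
    contributions = {f: WEIGHTS[f] * v for f, v in factors.items()}
    logit = BIAS + sum(contributions.values())
    risk = 1.0 / (1.0 + math.exp(-logit))          # sigmoid
    total = sum(abs(c) for c in contributions.values()) or 1.0
    ranked = sorted(contributions.items(), key=lambda kv: -abs(kv[1]))
    drivers = [(f, abs(c) / total) for f, c in ranked[:top_k]]
    return risk, drivers
```

Instead of a bare "75 percent risk of rehospitalization," the clinician sees that, say, abnormal vitals account for roughly half of the score and food insecurity for a meaningful share of the rest, which is what makes the score actionable during a visit.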

Kevin Pho: Yes. So from the perspective of a physician, they get an alert that this patient’s health outcomes may be influenced by their social determinants of health. What exactly would that physician see? Would they see the specific social determinants that are affecting it, like you said, either food insecurity, transportation issues, or housing insecurity? Tell us what kind of data that physician would see.

Sreeram Mullankandy: That’s correct. For example, let’s use an artificial name. This is not a real name. Let’s just say John Smith. On a particular day, he has a home health care visit. OK. So on that day, when the clinician looks at the chart, they see that they have a scheduled visit with this particular patient. They’ll also see a risk code for that patient, whether it’s high-risk, medium-risk, or low-risk, and what the specific risk is. Let’s say it reads something like a 75 percent risk of rehospitalization. That’s the outcome that we’re looking at. It will also tell you why this particular patient has been categorized as high-risk and because of which factors.

So one of the factors might be that the patient’s vitals contribute 50 percent of the risk. That puts the patient at the higher risk level. It might also tell you about the patient’s food insecurity, housing insecurity, and other factors. By knowing that on the day of the visit itself, before getting to the patient, the clinician, or whichever health care worker is going to visit, knows that this patient has these challenges and is prepared to intervene accordingly. When they go to that particular location, if it is food insecurity, let’s say, our system will automatically populate the food assistance resources in the local area. The clinician can provide the corresponding information to the patient, alert the corresponding organization to get in touch with them, and then follow up on the next visit to make sure they are connected, bridging that particular food insecurity gap with Meals on Wheels or whatever service is appropriate.
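The auto-populated local resources step amounts to a lookup keyed on the patient's location and flagged need. A minimal sketch, assuming a hand-maintained directory keyed by zip code; a real system would query a community-resource database rather than a hard-coded table, and the entries below are placeholders.

```python
# Sketch of auto-populating local assistance resources for flagged needs.
# Directory contents are illustrative placeholders, not real listings.

RESOURCE_DIRECTORY = {
    ("02118", "food_insecurity"): ["Neighborhood food pantry", "Meals on Wheels chapter"],
    ("02118", "transportation"): ["Nonprofit ride service"],
}

def resources_for_visit(zip_code: str, needs: list[str]) -> dict[str, list[str]]:
    """Map each flagged SDOH need to the resources listed for the patient's zip."""
    return {need: RESOURCE_DIRECTORY.get((zip_code, need), []) for need in needs}
```

The visiting clinician then has concrete referral options in hand before knocking on the door, rather than discovering the need mid-visit.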

Kevin Pho: And what do you see as the next steps in the foreseeable future?

Sreeram Mullankandy: One thing that I definitely started off with is the ability to read a patient’s chart and, obviously, the ability to intervene in that regard. But also, when you look at it, a lot of patients have so many other challenges as well: mental health, for example, addiction, substance abuse, and things of that sort. Many times, the clinicians or health professionals that we send over may not have all the specialty knowledge or specialty information to actually intervene or advise on that matter. Because of the trained algorithm that we have, especially with generative AI capabilities, they will be able to do a quick search, or the AI can listen itself, and it can provide answers or say, OK, these are the best ways to intervene for this particular patient.

An example would be, and this is in development right now, we have not launched it yet, that our AI clinician can also listen in while you have the patient visit itself. In challenging areas where you need specialty care, obviously you can set up a specialty appointment for later, but in the meantime, specialty answers are also provided to the clinician, which they can pass on to their patient as well. That itself helps mitigate the issue a little bit until the specialist’s responses can be made available.

And to expand a little bit, there are many interventions these days. Everything doesn’t have to be super high-tech. For example, telehealth visits and things of that sort can be arranged for the patient. Even if, as an organization, we don’t have a substance abuse specialist or anything of that sort, we can arrange telehealth visits. The patient gets the required type of care, which may not be available within each clinician’s specialty or each organization, so to speak.

Kevin Pho: We’re speaking with Sreeram Mullankandy. He’s a physician executive. Today’s KevinMD article is “Bridging the digital divide: Addressing health inequities through home-based AI solutions.” Sreeram, let’s end with some take-home messages that you would like to leave with the KevinMD audience.

Sreeram Mullankandy: Oh, yes. This is a general message, I would say. I’ve worked in several organizations on the provider side, patient side, and even on the payer side, et cetera. Right now, I’m in home health care, or remote patient monitoring, and in these areas, I would say the industry lacks clinician leaders willing to get into business as well.

One of the things I didn’t know, one of my blind spots, as I told you, was the SDOH factor and how much it impacts things. Similarly, what I didn’t know was how the health care industry itself works and what the different gaps or opportunities are. Usually, the stakeholders trying to bridge that gap are finance professionals, business professionals, or tech professionals. They may think they have the entire perspective, but they don’t. I come across this on a day-to-day basis. Half my time is spent explaining why we should not do a certain thing, because while they think it improves efficiency by the numbers, there are qualitative aspects that only a physician or a clinician can provide.

But when I speak to a lot of clinicians, they’re fine with visiting the patients and taking care of the clinical side. They’re not really interested in getting involved on the business development side or the technology side of it.

So the industry is in dire need of clinicians, really, if you’re willing to open up your perspective a little bit and venture out. For example, with value-based purchasing opportunities right now, Medicare and Medicaid costs are going higher year by year. There are managed care organizations (MCOs) in almost every state. All you have to do is, with a group of clinicians, if you’re really good at what you do, create an organization and get into a contract with an MCO on the basis of improving outcomes for the cohort of patients that you choose.

Once you prepare a plan, and you can even use ChatGPT, I would say, to prepare a draft of it, it’s actually not that difficult. You have to get over the mental hurdle of getting into the business side of it. An example would be Oak Street Health: just in primary care, picking patients wisely, helping thousands and tens of thousands of patients, and building a huge company as a result. Similarly, there are many other patient cohorts as well. All you have to do is get ready with the plan and reach out to your respective MCOs. Things won’t happen overnight, but overall, two or three years down the line, it will be beneficial for the entire patient community and for us providers as well.

Kevin Pho: Great. Thank you so much for sharing your perspective and insight. Thanks again for coming on the show.

Sreeram Mullankandy: I appreciate it.

