SPEAKER 0 Well, I think I will go ahead and proceed. Welcome, everyone, to the October Grand Rounds. I see these meetings as having two functions. One is educational, and the second is really to discuss some things going on. One of the things I would like, and I'll send out a separate email about this, is that on the UNMC campus there is a growing interest in imaging, AI, and recognition. We have several clinicians who are interested in imaging and AI projects, so one of the things I hope is that we can send out something, I don't want to call it an RFP, that's way too formal, to find out if there are faculty interested in participating with clinicians on some imaging projects. We've actually had some good success with this message getting out. I just had a call today with a clinician from Florida who is interested in joining our group because he's interested in AI and, being in private practice, doesn't have access to this sort of thing. We'll also have some updates in the next meeting from Justin Cramer about the grand rounds, so please watch your email as we go forward. Today's presentation is the roadmap to a more useful and usable electronic health record. I'm pleased that I was able to work with the following authors: Tom Wendell, who is currently sitting on a beach in St. John in the Virgin Islands; Eve, who is sitting with us; Nelson, who is sitting with us; Martina Clark and Andrew Leggett; and Dr. Chang, who is on the Zoom call. The goals of this presentation are to provide a brief history of the promise of the electronic health record with a ground-truth view from clinicians, discuss the results of our initial survey of the various needs and wants of clinicians, describe how we went about using agile development and action research methodologies to deconstruct and reconstruct the clinical framework, and then perform a final evaluation.
This is part of a grant, Optimizing the Electronic Health Record for Cardiac Care. This manuscript has now been published by the Cardiovascular Digital Health Journal, and I believe I sent out that citation with the invitation. We don't have any conflicts of interest to report. So if we look back at the promise of the EHR, it has been a long time coming, and I'm not going to spend a lot of time on this. George Bush touted the benefits of the electronic health record; it was estimated that it would save $70 billion annually and grow from there because of gains in efficiency. Barack Obama, with the HITECH Act within the ARRA, actually made it law in 2009 with great fanfare, but from the clinician's perspective it's been anything but; it's been frustrating. In 2012 and 2013, our group had a grant from the National Library of Medicine looking at EHR adoption by private and academic physicians, and each group was pointing at the other, saying, well, they like it because of this. And it turns out nobody liked it, and the general sense was that it interfered with patient care. The prevailing view at the time, from our esteemed colleagues in the American Medical Informatics Association and the government, was that clinicians were just Luddites. Why cardiovascular care? I'm a cardiologist, so I know it best, but it also seemed to be a good miniature universe in which to study the problem, because cardiovascular medicine interacts with all the domains. Dr. MyPlate here is an emergency room physician, and he will tell me that cardiac visits are only five percent, but they are some of the most interesting five percent, or so I've heard. We also have the critical care units, monitored inpatient beds, and outpatient clinics. We have imaging, diagnostics, and therapeutic procedures. Cardiovascular disease is also the number one cause of mortality and morbidity, and it accounts for nearly 40 percent of outpatient primary care visits. So it's a very large-scale problem. Now for the methods.
I'll start off with the definition of usability. There are two definitions. From the IEEE, usability is defined as efficiency, effectiveness, and satisfaction with health information technology. The TURF framework from the Houston group uses the words usable, useful, and satisfying. I struggled for a long time with how to think about this, because I disliked how circular the definitions of usability become; you have to have a model. So we came up with a functional definition, and others on the call may tell me if this has been used elsewhere, because I could never find it: really, to measure and optimize the flow of data, workflow, and cognition. And we used agile development and action research, with heuristic evaluation, to try to bring things together. Dr. Adkins, I didn't realize you were going to be on the call, but I actually gave you credit for keeping us on the straight and narrow there. [crosstalk] SPEAKER 1 [crosstalk] to whatever I wanted to have highlighted besides the announcement settings. SPEAKER 0 Melissa, do you have control over the mute button? Moving on: the American College of Cardiology Informatics and Health Information Technology Task Force was really instrumental as the group looking over our shoulder and providing independent validity. We used a simulated patient, which turned out to be very, very useful. It was a complex patient, used at the initial and final evaluations, which gave uniformity across sites and eliminated the privacy and security concerns that would arise at individual sites. One of the criticisms early on was, well, you would want to actually just watch in the clinical environment. The problem is that there are a lot of privacy and security issues, and there is enough variability in patients
that you would never get uniform workflow and data. The simulated patient actually allowed us to standardize, and to use convergent parallel mixed methods to get at some ground truths in our design. I also really love convergent parallel mixed methods. I grew up as a quantitative scientist, but qualitative analysis really opens you up to new ideas and thoughts and allows you to modify things, and then you can use quantitative methods to validate. The heuristic evaluation by clinical informaticists and computer scientists really kept us on the straight and narrow, for example insisting that when you hit the tab key, it should be a single consistent action. So, the initial evaluation of understanding the needs of clinicians: we had 55 physicians and advanced practice providers at eight sites across the country. The academic centers were the University of Nebraska Medical Center, Duke University, Indiana University, and the VA Medical Center through Creighton; the private centers were Swedish Medical Center in Seattle, Parkview Health in Fort Wayne, Indiana, Ascension Health in Indianapolis, and Faith Regional Health in Norfolk, Nebraska. One of the most interesting comments, and I don't have it in the slides: for Faith Regional Health Network, the catchment area is about 6,000 square miles. When we talk about electronic health records, there are several sites where they don't exist, and the explanation is, we don't have internet access. That's why we like having the variety, so we can figure this out. The qualitative analysis really echoed our previous work: charts are cumbersome to use, there are too many clicks, notes are bloated, and it interferes with patient care. There are good studies saying that it adds 90 minutes to an average clinic day. From a qualitative story perspective: we were talking to a very seasoned cardiology nurse, and she said her average day is that she gets to work at eight o'clock, she has eight patients in the morning, an hour at noon, and eight in the afternoon.
Then she would go home, fix dinner for her family, and at nine o'clock at night she is back on the computer until 10:30, finishing off notes, and then it repeats. It's very easy to see why there's burnout. There were specific comments: there's too much effort spent searching for information, there are a lot of bloated, non-useful notes, too much time spent documenting and ordering, and too much documentation of what one person referred to as impertinent negatives. We used the System Usability Scale to understand how individuals at the eight sites scored. It's a 1-to-100 scale, and a score of less than 65 indicates poor design. Our initial survey, which included Epic, Cerner, the VA's CPRS, and Athenahealth, came in at 46.7, which is poor design. The satisfaction score, which we wanted as an independent measure, was 3.1 on a scale ranging from very unsatisfied to very satisfied. People asked whether we evaluated the differences between health systems, and the answer is no; we really did not see any obvious difference. When we looked at the comments, certain setups in certain areas were more successful than others, but there was no common thread that said, gee, this one is much better. That's actually confirmed by the groups that run this survey nationally; their numbers are still in this range. As I was at Duke giving my presentation to the faculty about the goals of this project, I said our goal was to save one to two minutes per clinical encounter, and assuming 1.2 billion visits in the US, that would save twenty-five million hours annually. I was pretty impressed with myself. And then Dr. Chang interrupts and goes, no, no, no, one to two minutes is not enough; we have to save five to eight minutes. So, talk about a challenge and going back to the redesign. The gauntlet was thrown down, so we did this and we listened to people.
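As a quick back-of-the-envelope check of that arithmetic (the visit count and per-encounter savings are the figures quoted above; the function name is just for illustration):

```python
# Back-of-the-envelope check of the time-savings goal quoted in the talk.
ANNUAL_US_VISITS = 1.2e9  # ambulatory visits per year, as cited above


def annual_hours_saved(minutes_per_encounter: float) -> float:
    """Total clinician-hours saved per year for a given per-encounter saving."""
    return ANNUAL_US_VISITS * minutes_per_encounter / 60.0


# The original one-to-two-minute goal (midpoint 1.25 min):
print(f"{annual_hours_saved(1.25):,.0f} hours")  # 25,000,000 hours
# Dr. Chang's revised five-to-eight-minute challenge (midpoint 6.5 min):
print(f"{annual_hours_saved(6.5):,.0f} hours")   # 130,000,000 hours
```

So the stated twenty-five million hours corresponds to saving about 1.25 minutes per encounter, and the revised challenge would roughly quintuple that.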
We started in parallel with talking to folks, going back to analyze what's involved. For those of you not in clinical medicine, we are really using a model put forward in the late 1800s of how you should do a history and physical examination and how you should document things. It has gone through different forms over the years, but basically the structure has been the same, and we felt it was time to figure out how to make this work. I struggled with this: yes, we do a history and physical, but we do a lot of other things in 2021. So what we laid out and confirmed is that the first thing you do, before you see the patient, is review the chart, the pertinent data and the pertinent negatives, and you begin to build your mental model of the patient and their problems. Then you interview the patient. As you do, there are really three actions going on, often in a dynamic fashion: you synthesize information, deciding what additional history you need and what additional diagnostics you need; you educate the patient and engage them in their care; and finally, you document your thoughts and your orders. When we started doing this, we also broke down the actions that occur in a clinical encounter. In direct patient care there are demands for information: demographic, quality, research, and billing administrative data; chart review; history taking; patient engagement; the medical decision-making process; medication reconciliation; decision rules; orders; and looking at external references. Clinical documentation is really done for oneself and one's partners, for the clinical team, for the primary care provider, for the patient, and for the payer. And many people would reverse that order: for them, documentation is for the payer first. One of the things, as I looked at this, was, well, how do you build a good model? And that really goes back to honoring Lawrence Weed.
When I was a third-year medical student, Dr. Droneship was the chairman of Medicine, and he mandated that every patient in the hospital had to have a problem list. Now, this was done on a piece of paper, but we had to have that problem list done by 7:00 a.m. the next morning. This is really based on the work of Lawrence Weed, and his son is actually continuing this work today. To understand problem lists: there are really three things that are covered by problems. There are symptoms: I have a fever, I feel poorly. There are diagnoses: I have a heart attack. And there are therapeutic procedures: I put in an aortic valve or a coronary stent. Then there are problem connectors, which link diagnostics and therapeutics to specific problems, and this is something that actually starts gaining efficiency. One of the things to think about is that problems are data, and we'll talk about that more as we go forward. So what we did is create a framework that looks at the flow of data in an encounter. There is data collection, both general and problem-specific: we ask some of the same questions of everyone, and then we ask specific ones. We synthesize the data to come up with a plan or a diagnosis. We store the data in the form of notes and registries, and Dr. Chang also wanted structure for the stored data. Then there is data retrieval. Patient-specific retrieval means you open that record; it can be used for building quality dashboards, and it comes back to the team the next time you see the patient. Population-based retrieval helps support guidelines and registries and provides cognitive support for clinicians. What is nice about this model is that after you do the initial encounter, you start seeing efficiencies. This led us to the concept of data persistence. Data persistence means that different problems have different timelines. A symptom, for instance, is transient.
It will not have a long persistence. But if someone has a heart transplant or a prosthetic valve, then all that data will persist as long as they live. So that starts getting into how you can curate problems. As we go across encounters, when the patient returns and you get to data review, all you have to do is confirm data integrity and check for gaps. Data collection is about filling in those gaps, synthesizing the data to see whether your original hypothesis is still holding true, and then you can modify storage and retrieval. What we've found is that this process provides a significant amount of efficiency at follow-up. Data can be thought of in two formats. Structured data is understandable as its own singular concept, such as an ejection fraction or a sodium. Unstructured data is complex; humans understand complex data, but it is difficult for computers. "The patient had a myocardial infarction in the past 12 months" is an example. We also needed to figure out the states of data, and patient data exists in three states: it is recorded, it is clarified, and it is verified. One of the advantages of this is that we included the patient in the system, so the patient can record data, but ultimately it is the clinician, the APP, fellow, or faculty member, who has to verify that it's true. The other thing we looked at is reducing intrinsic cognitive load. There are a lot of discussions about what I call the two-chamber model of intrinsic and germane, or intrinsic and extrinsic, versus the three-chamber model of intrinsic, extrinsic, and germane load, and I'm not going to get into that discussion tonight. But reducing intrinsic cognitive load is really what helps you think: supporting problem-based medications reduces cognitive load, pushing the problem list to the doctor or nurse practitioner reduces it, we found passive decision support helps reduce it, and bringing data forward for the returning patient helps.
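The framework described above, problems as data with their own persistence, problem connectors linking medications and diagnostics to problems, and patient data moving through recorded, clarified, and verified states, could be sketched roughly like this. All class and field names here are hypothetical illustrations, not the project's actual implementation:

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class DataState(Enum):
    """The three states of patient data described in the talk."""
    RECORDED = auto()   # entered, possibly by the patient themselves
    CLARIFIED = auto()  # reviewed and disambiguated
    VERIFIED = auto()   # confirmed as true by a clinician


class Persistence(Enum):
    """Different problems have different timelines."""
    TRANSIENT = auto()  # e.g. a symptom
    LIFELONG = auto()   # e.g. a heart transplant or prosthetic valve


@dataclass
class Datum:
    name: str
    value: str
    state: DataState = DataState.RECORDED

    def verify(self) -> None:
        # Only a clinician's sign-off moves data to VERIFIED.
        self.state = DataState.VERIFIED


@dataclass
class Problem:
    """Problems are data: symptoms, diagnoses, or therapeutic procedures."""
    label: str
    persistence: Persistence
    # Problem connectors: medications and diagnostics linked to this problem.
    medications: list[str] = field(default_factory=list)
    diagnostics: list[str] = field(default_factory=list)


def meds_by_problem(problems: list[Problem]) -> dict[str, list[str]]:
    """Group medications under their problems, rather than alphabetically."""
    return {p.label: p.medications for p in problems}


# Example: the problem-linked medication view described later in the talk.
afib = Problem("atrial fibrillation", Persistence.LIFELONG,
               medications=["amiodarone"])
cad = Problem("coronary artery disease", Persistence.LIFELONG,
              medications=["clopidogrel", "nitroglycerin", "rosuvastatin"])
print(meds_by_problem([afib, cad]))
```

The point of the sketch is the shape of the data: once a problem carries its own connectors and persistence, a return visit only has to confirm integrity and fill gaps rather than rebuild the record.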
Reducing extrinsic cognitive load means you have a static design with functionality that is consistent, and what we did was adopt established functionality principles. We also, as we'll talk about, had to provide ample screen real estate. This led us to the concepts that drove our designs. The first is that episodes of care are an artifact of the paper-based chart; patient care is actually a continuum, with each episode of care representing a snapshot at a point in time. The second is that the physical exam is no longer central to the clinical encounter; the better formulation is history and diagnostics. Concept three: for a clinician, the encounter breaks into review, interview, and documentation, and to save the five to eight minutes Dr. Chang demanded, you have to reduce the review and documentation burden, not the interview, which is where patient care happens. And clinicians want easily searchable data, not voluminous documents. One of the artifacts of moving from paper-based records to the electronic health record is that, in fact, all we did was scan a lot of documents, so it is still a document-based system and a file cabinet. Then we had some specific drivers that helped our design. The first is that cardiovascular medicine is practiced the same across the country, independent of the installed system. While different centers and different people adopt different workflows, the processes during the encounter showed little variance, and this was confirmed by Delphi modeling with the task force. This means that you can establish best practices. Driver number two is that clinical care is continuous; you've heard this before, and the encounter simply represents a snapshot of the patient by that specific clinician at that point in time.
So in our design we created the metaphor of the patient's medical record as a library. This actually comes from a comment by a second-year medical student, who in a virtual class asked me which books in the library she should read. I commented, it depends what you're looking for, because different people will choose different books, and it stuck with me. One of our projects early on found that different clinicians wanted and used different information, different data, even for the same patient. The use case that brought this out was comparing a liver transplant surgeon's questions with a cardiologist's questions. When it came to the GI examination, the liver transplant doctor asked what seemed like a strange question: how many Tylenol tablets do you take every day? I had never thought of asking it, but it turns out Tylenol ingestion is one of the leading causes of liver failure. So we understood that we needed domain-specific views. To reduce bloat, you could take a book off the shelf and use it or shelve it, which reduces note load but still supports all the information services and documentation needed for billing. The third driver was that data exist in three states: recorded, clarified, and verified. As we showed in our framework, this actually optimizes workflow, and including patient-entered data substantially reduces the time for review. One of the things we hit with clarification is that we had to build a chart that was understandable by patients, but there needed to be a translation, with more granularity of detail, for clinicians: patient terminology turned into medical terminology. An example: "I'm having crushing substernal chest pain" turns into an acute myocardial infarction. Driver number four: clinicians want pertinent data pushed to them, using Dr. Weed's concept of problem connectors.
It is possible to push problem data, including quality metrics and images, and the help of the task force and the Delphi modeling really supported this. Driver number five is to minimize the extraneous cognitive load imposed by the EHR and to support clinical expertise. What we found is that good design really does help reduce cognitive load. Anybody who uses the Epic system will realize that there are a thousand different ways to get to the same point, different pathways with different actions, and you have to remember which one you used. One interesting thing I did not include here: there was a short-lived model, from work at Vanderbilt, of how many items a human can remember, and from it came the idea that a clinician can only hold four or five data elements together. But when you study expertise, you realize that experts don't view the world that way. They filter through chunks of data, and then they establish schemata, which allow greater granularity of data and identification of the narrative. There is a great example: Garry Kasparov was playing 60 games of chess simultaneously, and as he went down the row and came back, he stopped at one of the players and said, you moved a piece. At first the player denied it, but finally fessed up that he had, and people were amazed that, playing simultaneously, Kasparov could figure out that a piece had moved. Well, he didn't do that by memorizing all the pieces on all the boards and their positions. He knew how to play the game so well that his mental model was tripped; he knew something was awry. An interesting aside I learned along the way: it also helps explain why Deep Blue beat him the first time. There was actually a programming error in Deep Blue that created moves he didn't understand, and he made the mistake of believing that a computer is incapable of making a mistake, so he actually panicked.
And that was his first loss to the computer. What we found is that when you have standardized actions and traits across large screen real estate, with no hidden data, that optimizes navigation and recall, and it really reduces cognitive overload. So, for the final evaluation, we put all this together, and Melissa was responsible for building out our prototype. When I went through Epic training, it took thirty-six hours; we took 15 to 20 minutes to orient testers on our prototype, and that was consistent with demonstrating the design principles in a single complex use case. We then brought in a new simulated patient who had recently been admitted through the emergency department, was discharged from the hospital, and is now coming in to set up therapy. Subjects were not involved in the design of the research. The subjects, by the way, and I didn't cover this earlier, were picked by the individual sites; we did not pick the research subjects. Another interesting moment came when we were in Indianapolis. One of the clinicians, who was actually on the advisory board for Athenahealth, said, well, of course your system works well; I've seen lots of demos where everything seems to work really well. And Tom Wendel, who was running it, said, yes, but have they ever given you the mouse and let you drive? And all of a sudden he just goes, oh. So we really wanted the individuals to drive the computer. This is an example of what we saw, if we step through it. The first thing is, we said different clinicians will want a different view, so this is the view of the patient from cardiology. Here's the current visit; if there were previous visits, they'd be there. We can see that they were hospitalized. We developed a quick menu that always lets you see what actions are available, whether it's the problem list, the personal health record, graphics, procedures, medications, et cetera.
So that was always available. You see that the patient had a history of coronary artery disease, so we have the pertinent records: the transesophageal echo, the cardiac magnetic resonance imaging, and any associated labs or medications. Most importantly, we put the quality metrics up front, and because it's data, we could see that the patient was on antiplatelet therapy and a high-intensity statin, but that there was no documentation of the angina classification or of cardiac rehabilitation being done. After we did this, we verified the data and sent it to the workspace. Then we built out the medication list, and here is one of the big steps: if you happen to use Epic or Cerner, it lists medications alphabetically. It turns out that that's not the way clinicians think about patients. We think about their problems and what's appropriate for them. So simply by linking: they have atrial fibrillation, we have them on amiodarone; they have coronary artery disease, there is Plavix, and nitroglycerin, and rosuvastatin for cholesterol lowering, okay, got it; they have hypertension, and there are two medications they're on for that. This helps you build that mental model and look for gaps. You can also still look at the classic alphabetical list. And we covered the antiplatelet therapy. The bookshelf: what we found the clinicians particularly liked was being able to see the image. So this is now on the right-hand side, and then we're starting to build our note. We brought the image in here, and the clinician can annotate it, which then carries across as an annotation in the note. After we went through the second use case in the final evaluation, we went through the results, and actually that should be 25, not 30. In the qualitative analysis, the comments were now: the layout is easy to understand.
Linking medications to problems helps identify care gaps. Pushing data forward really saves time. Love the images. We need a better way to document the transition from inpatient to ambulatory care. It's easy to demonstrate your quality measures, and it's much faster and more logical. I showed you the System Usability Scale scores before, and I'm going to skip ahead to this slide, which I think shows it better. Again, 80 and above is excellent; the Google browser gets about an 83. Our prototype got a score of 77.8. If you remember, the initial value was 47.1. Really surprising to us was the final installed-system score. This was the same EHR at each site, and obviously a lot of time had passed in the meantime. What really surprised us was that the final installed systems, two to three years after initiation, were still poor and not significantly different. And so in the literature of 2020 to 2021, there is still the issue of burnout from poorly designed EHRs. So, conclusions, and I hope this is not too repetitive: cardiovascular medicine is practiced the same across the country, therefore best practices can be established. Time savings must come from compressing the process of review, documentation, and communication, not from direct patient interaction. Structured data, concise unstructured narrative data, and access to images are important to help clinicians synthesize data. And one thing that is a thing of beauty here, and I know engineers understand this better: in many cases, effectiveness and efficiency pull in opposite directions. Here we were able to improve both effectiveness and efficiency. This really shows that if you involve users at the front end, you can create a prototype that is superior. Our hope is that this helps vendors down the road. And again, our informatics team: Martina Clark, who is at UNO,
Tom Wendell; Jeremy Trent; Brian Schussler, who is now out at BYU; and Tamara Bernard, who is our research coordinator and helped us a lot with our work. I didn't update the slide. Among others, my first graduate student, Lisa Robin Bower, along with Eve Al Terrell, Emily Chunka, Jenny Rose, and Marilyn Torres, were all involved. With that, I will stop sharing and see what questions are out there and how I can be useful. SPEAKER 1 No, this is really great work and much needed in the EHR space, as you well know, and we've had lots of conversations on this. I think my question is: is this intended to influence the major developers, or to provide hope to hopeful developers, or both? What do you think is maybe the logical way to share this information or to spur the discussion? SPEAKER 0 Thank you, David. That's obviously a great question, because you've spent your life doing this. I think the answer is both. We really want one of the major companies to see that there is a way to do it. There are certain companies whose business model is really built on the idea that if you've seen one installation, you've seen one installation; there is no consensus on how to do this, et cetera, et cetera, so they build a consulting relationship into the sale. The ONC, as you know, is making a major push to undo parts of the HITECH Act, because HITECH has had the unintended consequence of driving innovation out of the market. There were essentially only two manufacturers, Cerner and Epic, who could meet the interoperability criteria that were laid out. As well-intentioned as interoperability was, Dr. Chang, your system got disassembled, the Mayo Clinic system got disassembled, and Vanderbilt disassembled the Mass General system, because they couldn't meet the interoperability criteria. So in the new era, we also believe that as APIs and FHIR become more involved, they are going to allow front-end development that is far more functional and far more pleasing.
And that's helpful, and I think we are seeing some of that now. It's not as mature as we'd like to see yet for broad application, but I think we're starting to see some of the FHIR API applications be successful in small use cases. David, I will also say, and you understand this, that there is no evil in all of this. If we pick on Epic, they are a product of their 1980s design; there is no intention to make it overly complex, but they just have some difficulties. I will tell you, and I didn't cover it here, that I feel really good about our personal health record work; the first thing we did to get started was build that prototype, if you remember. We were able to go up to Epic and talk with those folks and present it to the user groups. A lot of it is just common sense, but if you look at their product now, they've been able to integrate a lot of the concepts we found that patients wanted. One hundred percent agree. SPEAKER 1 And I, of course, agree as well; we've talked about this for a long time. A couple of the other issues: one is that there is the novice user mode, where the user really wants simplicity, and then, as you gradually become expert, you do want all the bells and whistles, to be able to really drill into the more complicated sorts of things. The best systems address both of those needs. And of course, one of the ways of doing that is knowledge-based support, where you have knowledge-based templates that you can fill in for a specific problem. So it's kind of an object-oriented thing, where you've got something simple to start with, and then you've got complexity that can be modularized, and then it really is an evolutionary kind of thing. I have a tendency to think that the glass is half full.
I think you corrected me on this last time, but we've made really significant progress since the old days. I think we've gone down the wrong track on some things, basically because of the business requirements, and we need to take it back as clinicians and make it beneficial for us and for the patients, and let the billing be the byproduct, which is the opposite of what has happened in the recent past. SPEAKER 0 If we had an eight-hour lecture, I could add to that. All of us senior people remember when the idea came out and Jim Campbell and others started working on the charts with COSTAR and so forth; it truly was simply a file cabinet, and it replicated the paper-based record. And then the payers started saying, well, wait a minute, we can use this to validate services. The first thing that came out of that, for all of us who have been in practice a while, was the attestation statement: I saw and examined the patient and independently interviewed and assessed them. And if you didn't have exactly the right words, Medicare would throw out your claim and say you didn't really deserve to get paid. That all fell apart when dot phrases came in; you just type the attestation phrase, and oh, we're 100 percent compliant now. That's what we have to get away from. I like your point about expertise. It was really interesting to me. We went in and filled out all the details: we had past medical history, past surgical history, all of these things; we had evidence of clarification, et cetera. And I was amazed, at multiple centers, that senior clinicians, just looking at things, would pick out errors that we had missed, in the sense of, you said they had a tonsillectomy here, but over here it came across as a leg fracture. It had nothing to do with the case, but it really taught me that clinicians really do build trust in the data and the statements.
And so you know this when you walk in, and no offense to the emergency room, but as I told Rick Walker once, I spent five years trying to get you to think like a cardiologist and I failed. So what happens when people say, well, how do you do that? The answer is that you have such a deep set of data. And because we have the ability to see patients in their heart caths as well as in the outpatient clinic, we get very facile in our diagnoses and diagnostics. And yes, you're still susceptible to some cognitive bias, but for most of us, it takes 60 seconds to form our initial impression. So we need to have a chance at, eventually, and this is what you and the group are trying to get to, the adaptive decision support, which really is another way of saying what you're saying. If you're a novice, you have a certain number of questions. We were all taught to ask open-ended questions; I don't know the last time I seriously asked an open-ended question in a patient's room, besides "How are you?" Because as a clinician, I'm just trying to get to the diagnosis, jumping between questions and diagnostics to get me there. That's a long-winded answer, but it's about closing the gaps.

SPEAKER 2 John, thank you very much for reviewing five years of work and creating a forty-five-minute synopsis of it for everybody. You can imagine that there's a lot of substance behind this, but I'm going to ask you to think a little bit beyond the work that you presented, in terms of the enablement of change. What do you see for us as clinicians in five years or ten years? Or, a different way of saying it: what period of time is going to elapse before we really see substantive change in the models of our electronic health record systems?

SPEAKER 0 You know, that's a great question. It's a multi-tiered answer, so I'll start off with the economic answer. How much did you spend on Epic?
SPEAKER 2 The direct spend was north of 200 million. The indirect was twice that, so at the end of a decade, we spent more than a billion dollars.

SPEAKER 0 Correct. So if I went to the C-suite and the board of directors and said, we're throwing out Epic today and we want you to spend 200 million on the next great thing, then no matter what its efficiency and effectiveness, I don't think you'd do it. Since you're spending your whole life on data liquidity, you know, obviously I agree with you: the next big step is to eliminate ETLs and have the data be the data, doing all the things that every other industry has done in terms of enter once, use many times. But I really think the advent of APIs is going to be the leap. Honestly, I think systems will stand on top of Epic and Cerner and gradually completely replace them, because they are great data repositories, paper-based repositories, and there will be tools that sit on top of them. So I think we're actually headed there. And you know, my favorite movie to think about this with is the Tom Cruise film Minority Report; I really see that coming in the not-too-distant future, pending regulation. Because, as we saw with Meaningful Use, you dangle, what was it, $1.2 billion in front of health systems, and they'll do the minimum amount necessary to get the money, so that in Nebraska suddenly we were measuring depression scales and counseling on weight loss, because those two were the easiest ones. But I think quality will improve with the things that sit on top. How about yourself? Where do you see this, John?

SPEAKER 2 You know, you and I are oftentimes of disturbingly and surprisingly like mind.
So I would imagine that with the advent of APIs, as you're describing them, the approach that's being taken now, the use of dot phrases as they're being used now, will slowly but surely be replaced, at least by interfaces that allow us, as you described, a much more usable environment: not usable in the classic usability definition, but in the functional definition that you outlined in one of your first slides. I think that's where we're going. That's the only logical way. We are burning not millions but probably billions of hours of clinicians' time, and that is not a sustainable model for health care. So the directions that we need to go in have to acknowledge that our workforce is much better used actually taking care of patients than serving as computer data entry techs.

SPEAKER 0 Well, and I think, to the AI scientists on the call, I do see a significant role for artificial intelligence and, our favorite, Thaler's nudge capabilities, or you can go to the black swan approach: the systems should learn how we want to see patients and present the information. I mean, Amazon does this in a very minor way, but that technology will get adopted. And so when we walk in, based on patient preferences, et cetera, et cetera, I think there are going to be 20 questions to get in through the front end, and then when you open up the patient record, 90 to 95 percent of the things that you want will be there. One of the things that we didn't comment on: of the twenty-five people we interviewed, not one ever asked for a piece of information that we didn't already have filled out. Thank you, Bruce. And that is a scarily good result. Well, we are approaching the bewitching hour. If anyone has any questions, please feel free to reach out to me. I appreciate the opportunity to present to this group. I look forward to next month's presentation.