Tuesday 10 January 2017

Nursing Care Plan Software

[music playing]

>> Patricia Grady: Good afternoon, everyone. It's my pleasure to welcome you to the NINR Director's Lecture. We at the National Institute of Nursing Research are very pleased to have with us today Dr. Kathryn Bowles, who will speak to us as one of the nation's top researchers in her area of work. And this is the purpose of bringing Kathryn here today, and of all our NINR Director's Lectures: to bring to the campus some of the top researchers in the field and give them an opportunity to describe their research programs and some of the implications as we move to the future. And so we're particularly pleased to have Kathy Bowles today.

She is the van Ameringen Professor in Clinical Excellence and director of the Center for Integrative Science in Aging at the University of Pennsylvania School of Nursing. She's also the vice president and director of the Center for Home Care Policy and Research at the Visiting Nurse Service of New York. So she is a top researcher who combines her basic background and her clinical knowledge, also incorporating some of the trends that we see as we move forward: trends in big data, trends that involve use of the electronic record in clinical care, and some of the issues that we talk about after we do the research -- what happens then? How can we maximize the impact? And so we're really pleased that Kathy can be here with us today to enlighten us in some of those areas.

She leads a very interdisciplinary program at the University of Pennsylvania. You'll hear a little bit more about that as she talks about her program. She and her team have developed software to enhance healthcare providers' decision making for referral upon discharge of hospital patients. Her overarching goal is to improve quality of care and quality of health for our elder population. She's published widely in a number of areas, particularly clinical decision making, telehealth, quality of life among frail elders, and interventions to reduce health disparities. She really does exemplify how nursing science provides a critical background and the skills necessary to solve some of our challenging research problems, in addition to some of our real-world problems, so that we see a wonderful transition here between the research component of what Kathy does and its impact in real-world settings.

She continues to lend her expertise to a number of significant advisory boards, and I won't go through all of them because the list is long and would take away from her time to tell us about her program, which we're really keen to hear. Most recently she was invited to serve on the National Quality Forum's care coordination steering committee. She also works with the Centers for Medicare and Medicaid Services, CMS, on its initiative to identify the key data elements essential to safe care transitions. And as all of you know, CMS is the agency that's responsible for reimbursement in the healthcare arena, so it's very important to get in on the ground floor of some of those decisions. She is also a member of the health information technology standards panel of the care coordination committee, which was commissioned by the Office of the National Coordinator for Health Information Technology, something we hear a lot about here. Her role there is to identify standards for the electronic patient record, which is extremely important now; as we move forward in this healthcare arena, that will be one of the most significant documents, one that not only helps us get better healthcare, which is the idea, but also provides the data that feeds into the big data arena and helps us determine some of the national trends that will be occurring as we move forward. So that's going to be incredibly important.

So we look forward to hearing Kathy Bowles, Dr. Bowles, speak on innovations to improve discharge planning, and welcome her to the podium.

[applause]

>> Kathryn Bowles: Thank you, Dr. Grady. I am so thrilled to be here. I am honored to have been invited to do the Director's Lecture. It's a real privilege to be here, and thank you for that. Thank you for the introduction.

And I want to take this time to thank my team. I had a terrific team of interprofessionals who helped with this work over the years. Without them, this would not have happened. I have an amazing project manager and research associate, and dozens of students over the years have helped with this work, so I want to acknowledge all of them, as well as the hospitals that graciously de-identified and shared their data with us so that we could build the case studies that you'll hear about and run the algorithms that we developed. That involved information technology staff, case managers, nurses, and lots of people. So thank them all.

I also need to start by making a disclosure. The work I'm going to talk about did result in an invention called the D2S2, which stands for Discharge Decision Support System. We patented that and licensed it to RightCare Solutions. That's a company that I cofounded at Penn, and I held financial interest in that company until it was sold in December of 2015. The work that was done during that period was done under a conflict of interest management plan set out by the university, in which we used an independent statistician to do all of the data analysis. So I never touched the data or did any of the analysis myself, to keep it objective.

So I want you to take a look at these four ingredients, because as I give my talk, they're going to come through loud and clear as really the four essential ingredients to having success in launching a program of research and in launching the business that we did. And that's the content, the people, the timing, and the financing to make it happen.

The overall goal of our program of research is to get the right care to the right patient, at the right time, in the right place. So we're dealing with discharge planning, a process in which, as patients near discharge from a hospital, a decision is made about what to do with them next. Should they receive post-acute care, or should they go home to the care of their families? I'm going to talk with you about a seventeen-year program of research. I started it back when I first got out of the doctoral program, when I was just a kid. We did some pilot studies, got R01 funding, did a couple more proof-of-concept studies, got more NINR R01 funding, and then Phase I and Phase II SBIR grants that helped us launch the company. So I'll tell you all about that amazing journey.

This all began with a clinical question. As I said, I was just a kid working with my mentor, Mary Naylor. Mary had just finished her randomized clinical trial, in which she always enrolled patients who were high risk. These were patients who had multiple comorbid conditions, trouble at home, lack of caregivers, depression, multiple meds, difficulties with functional status. They were in her study because they were at risk for poor discharge outcomes. And yet when we were looking at her data, and we looked at the control group who got usual care, we discovered that less than half of the sample got post-acute care referrals, meaning got home care, or went to a skilled nursing facility, or got rehab. I was really surprised to see that, and appalled by it, and so I wanted to investigate why that might happen. So I did a pilot study, building preliminary data for an R01.

For those trainees in the room, this will show you how you build some pilot work to demonstrate that indeed there is a problem, and then go forth for NIH funding. So I wondered, how did those patients who got referred compare to those who did not get referred? We took Mary's data and compared them on all the characteristics that we had, which is a rich comparison of all their clinical characteristics and even their home environment. First of all, what we did was create case studies out of the 99 patients who did not get referred. This is an example of some of the content of a case study, where we knew the patient was depressed, and we knew their hospitalization history, and we knew their family caregiver availability, et cetera. I want to show the example of this case study because this person went home without home care. As an illustration of how sick they were -- they didn't have an understanding of their blood sugars, et cetera, and still went home to their family to self-care.

So we took those cases, 99 of them, and we asked nurses who were working in Mary's transitional care program -- who were experts in discharge planning and transitional care -- to take a look at these 99 cases. And they said that in 96 out of 99 of the cases they would have referred the patient for some type of services. And in 49 of those cases, they said the patients were a high priority for care and definitely in need of home care. And these 49 patients, when we took their characteristics and compared them to those who did get home care, did not look much different. They were younger, they had a shorter length of stay, but clinically, they were just as debilitated and just as sick. And yet they went home without home care. We also focused in on those 49 high-priority patients and followed them out to look at their readmission rates by six months, and discovered that nearly 50 percent of them were rehospitalized, versus 38 percent of those who did get post-acute care.

We demonstrated through this study that we're missing people who go on to have poor discharge outcomes. They were just as sick and just as needy, but somehow they got missed. And we felt that the research information we put together in these case studies gave our advanced practice nurses more information to make a better decision than what our clinicians might have when they're trying to make this decision in real life in the hospital. We published this work in the Journal of the American Geriatrics Society.

So next -- having demonstrated that we were missing people -- I wanted to know, "How does this happen?" So we took six of these case studies over to the hospital, and we asked two doctors, two nurses, and two social workers to read these cases and tell us what they would have done with these patients. They read the cases, and they referred all six of them. But in real life, none of these patients got referred. And so I went on to interview them about how this could happen. The transcripts were coded into three main themes. They told us that there were patient issues: "the patients are able to hide their needs from us," and there's a lot of cognitive impairment out there that we don't know about. So we might send a person home who is cognitively impaired and unable to manage their care. Short lengths of stay were an issue. Patients who have a common illness -- you see them over and over again, and you really kind of think, you know, we'll be seeing them again, and so you just send them home.

System and staffing issues: being too overworked, a lack of teamwork, a lack of a systematic approach to doing this. Across the nation there is no standard way to make these decisions. Education issues were also a factor: the lack of documentation of needs in the record, not asking the right questions -- they felt that might be an issue -- and insufficient understanding of what post-acute care services can do for a person might also be a reason why we miss people. This was published in "Applied Nursing Research."

So these two studies gave us the ammunition to write to NINR and ask for an R01 to try to build some decision support to fix this problem. We were fortunate to be funded, and my co-investigative team is listed there; we had decision scientists, nurses, an informatician, and a statistician on the team. And here's where I want to take a moment to thank my mentor, Mary Naylor, for her generosity, and to demonstrate the value of reusing R01 data for another study. In this study, we used data from all three of Mary's R01s describing these patients to build case studies. We used 200 patients from her records, and then we collected about 153 more prospectively, which really sped up the project.

So those hospitalized older adults were written up into case studies, and some of the factors within these case studies are listed. They were organized by Orem's self-care deficit theory. And Orem's theory, if you remember back from nursing school in those models classes we had to take, says that if patients cannot provide self-care, they need nursing care. And that made such sense to us, because we're trying to decide who, when they're on their way home, is going to need support. So we organized our case studies according to the domains of Orem's self-care theory, which were functional status, health state, caregiver availability, those types of categories.

In this study, we went forth again with that case study method, and we had four national scholars and four local clinical experts -- doctors, nurses, social workers, and physical therapists, the clinicians who make these types of decisions every day -- on a team, who read these cases, all 353 of them, and told us what they would have done. They provided us with a yes/no decision of whether or not they would refer the patient and the reasons within the case why they would refer them. And from that, we were able to run regression models. We started out with about 20 characteristics of the patients, and it boiled down to about six that were the most predictive of the need for post-acute care. We validated that algorithm against the holdout sample and achieved an area under the curve of 0.86, which is a very good model.
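As a rough illustration of the modeling step described above, here is a minimal Python sketch (not the study's actual code): fit a logistic regression on the experts' yes/no referral decisions using a handful of patient characteristics, then check discrimination on a held-out sample with the area under the ROC curve. The file name and column names are hypothetical, loosely modeled on the factors named in the talk.

```python
# Illustrative sketch only -- not the D2S2 source code.
# Assumes a CSV of case studies labeled with the experts' yes/no referral decisions.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Hypothetical columns, loosely modeled on the six factors named in the talk.
FEATURES = [
    "no_or_intermittent_caregiver",       # little or no help available at home
    "major_walking_restriction",
    "self_rated_health_below_excellent",
    "length_of_stay_days",
    "depression_score",
    "comorbidity_count",
]

cases = pd.read_csv("case_studies.csv")              # hypothetical file
X, y = cases[FEATURES], cases["expert_said_refer"]

# Hold out a validation sample, as the study did.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Discrimination on the holdout set; the study reports an AUC of 0.86.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Holdout AUC: {auc:.2f}")
```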

So these are some pictures of our team. At the top left, there's a guy with his head circled. That's Eric Heil. He was an undergraduate systems engineering student who worked with my team on his senior thesis at Penn. Take note of him, because he comes back into the story later. The picture on the right shows the experts sitting around the table, reading the cases, and making their decisions. And the bottom two are our team meetings.

So the findings of our NINR study showed that the experts referred 80 percent of the cases. They were 18 times more likely to refer than clinicians in real life. The clinicians in real life referred only 28.5 percent of the patients for post-acute care. And you can see the demographics of the sample. They were quite old, 65 to 90 years old, and 51 percent of them rated their health as fair or poor. The algorithm that resulted had these factors in it: the experts were more likely to refer patients who had no help available or only intermittent help available at home; major walking restrictions; less than excellent self-rated health; a long length of stay; higher depression scores; and more comorbid conditions. And these really made sense to us, and I think they will to you as well, in that they reflect patients who have a lot of needs. This was published in "Nursing Research."

Then we also looked at the outcomes of these patients at 12 weeks, and we compared them to patients that the experts said should get a referral but who in real life did not get one. And we saw a nearly fivefold risk of subsequent hospitalization by 12 weeks if the expert had said, "Refer this person," and they did not get it. If the expert said yes, but in real life there was no referral, the readmission rate was about 24 percent. And when the expert said, "Do not refer," and in real life they were not referred, the readmission rate was only six percent. So this demonstrated that we had an algorithm that did parse people into high-need, high-risk categories and was also not over-referring. We published this in the journal Medical Care.

So we had this beautiful algorithm, and we proved that it worked well and was associated with outcomes, but nobody cared about readmission at that time. It was before the Affordable Care Act. And so, meanwhile, I went on and did some other things. We published our findings; we changed the depression questions that were in the original algorithm from a 30-item questionnaire to two, because we wanted it to be friendly to use in clinical practice; we developed a scoring system for it; and then we tested it in a hospital setting. So the work continued, but nobody was really too excited about it, because readmissions were not a concern at that time. I went on to write and complete two randomized clinical trials, one funded by NINR, on telehealth. And then I submitted a competing renewal to carry on this work of referral.

"where should werefer people to?" and so that work isjust finishing up, and i'll be telling youabout that as well. so after that subsequentwork of making the algorithm more simple, andclinically friendly, we also developed acognitively impaired version for it, because this was forpatients who couldn't answer the questions themselves. and these questions areembedded in the nursing

admission assessment. so we all ask lots ofquestions when we admit patients, and so thesequestions are now a part of that screening. so this is the study thatwas a quasi-experimental study where we took thealgorithm into the hospital to try and test it ina real-life setting. and what we did in thisstudy was we went into three medical units at thehospital of the university

of pennsylvania, and we hadthe algorithm on paper. so we collected it in acontrol phase where we knew which patient scored inneed of post-acute care, but we did not share thatwith the clinicians. and we collected thatfor eight months. and then we went into anexperimental phase where we educated all the cliniciansabout the algorithm, what it meant, and thenwe began to share it with them on adaily basis.

so they had it as a piece ofinformation as they were making theirdischarge decisions. and this work was fundedby three sources at the university of pennsylvaniaand the kynett foundation. so this is the resultsof that study. on the left graph, it showsthe time to readmission, and the top lineshows us the patients who have high need. they scored as needing areferring on the algorithm.

and their readmissionrates at 30 days were about 23 percent. and this is when we'renot sharing the advice. and the bottom line showsus the people that are "do not refer." on the right-hand side isafter we began to share the algorithm withthe clinicians, we were able to get care forthose high-need patients and bring that time toreadmission down to a rate

of about 17 percentat 30 days. so it showed that havingthat extra information helped the cliniciansmake better decisions. we published this in "professionalcase management." and this study was criticalfor us when we launched the business to show theproof of concept. so now it's 2011 to 2013,and reimbursement is now linked toreadmission rates.

the affordable care act hascome into being and we know that in 2013 that patientsare -- hospitals are now going to start beingpenalized for readmissions, and at the bottom of thescreen there it shows you that if you reduce hospitalreimbursement by just one percent, that equates to$1.2 billion nationally. so this was a reallyimportant policy change. so knowing that,i thought, "hm, i think we might havesomething here."

So I walked over to the tech transfer office at the University of Pennsylvania, described the work to them, and told them what we had achieved in the research, and they looked at all of our publications and interviewed us about the algorithm. And they agreed that we did have something. So the university helps you start a business, and they help you find a CEO. But two weeks after I had walked over to tech transfer, that undergraduate student, Eric Heil, called me just out of the blue and said, "Kathy, have you heard about the Affordable Care Act and how people now care about readmissions? And have you thought about doing something with the D2S2? Would you like to start a business?" And I said, "Eric, I can't believe you're calling me now. I just walked over to tech transfer two weeks ago. Yes! We're looking for a CEO." Eric stepped up to be the CEO of the company, from being an undergrad eight years earlier. Pretty neat.

The picture at the top of the screen is us when we won the Janssen Connected Care Challenge. What we did when we launched the business was start entering contests to win seed money to help us get started. It also happened that Eric was in the Wharton executive MBA program at Penn at the time, and he happened to be in the class where he had to write a business plan. So he wrote our business plan in class, and when he presented it, two classmates walked forward and said to him, "Eric, if you ever start this business, we're in." They're now our chief technology officer and our chief commercial officer. So I have a dream team from the Wharton executive MBA program leading this company. Not only that, but Eric's business plan won the Wharton business plan competition, and that gave us seed money for legal fees and accounting fees. And the picture at the top right is us as a team getting to ring the closing bell at NASDAQ. So that's us. Yeah, isn't that cool? That's us live at NASDAQ in Times Square, with the RightCare Solutions banners on the eight-story building at the top. It was fantastic.

Also in 2012, we were fortunate to be funded by NINR for our Phase I SBIR grant, which was critical to building the software. We had this thing on paper. We needed a way to automate it, figure out how to get it into EHRs, and make it work in hospitals. That Phase I SBIR helped us go live at Thomas Jefferson University Hospital with the software, and it also gave us the ammunition to go forward into the shark tank and present this to venture capital people. And we were funded with $1.7 million in Series A startup funds.

The SBIR results, too, were very compelling, and this was published in Research in Nursing & Health. In the graph on the left, the dark bars are the high-risk patients during a control phase -- again, a similar quasi-experimental design where we collect the risk scores but don't share them. The white bars are after we share the results with the clinicians. So the high-risk patients are on the left. We made the largest difference with them; being able to identify them and get them care really brought down those readmission rates. And over at the far right are the combined high- and low-risk patients together, and still we see almost a five percent decrease in readmissions using the tools.

So in 2013 we went live at the Hospital of the University of Pennsylvania. They're a three-hospital system. And we got another $5 million in venture funding to help launch the business. We continued to win some prizes and recognition that gave us press, gave us credibility, gave us attention, which the venture capitalists liked, and then we continued to hire people and grow, and we went live in Houston, in hospitals in Texas. We were also fortunate to get a Phase II NINR SBIR grant, which helped us to further develop the software, as I'll talk about in a minute. We went on to develop some partnerships with post-acute care settings, because now they're interested in identifying patients early in acute care so they can get their staff ready and geared up, recognizing that these high-risk patients are coming to them. And we again closed Series B financing for another $4 million. So in all, we got about $12 million in seed money, prize money, grant money, and venture capital money to launch the business.

The Phase II SBIR, which we're finishing this year, enabled us to develop what we call "smart capabilities." Our little six-item algorithm is embedded in the EHRs of hospitals, and as nurses admit patients, they answer these questions with patients, and the algorithm runs. But what the smart technology does is follow patients through the hospital stay, and it collects information about what happened to that patient. Did they get referred? Where did they get referred to? Did they get readmitted? Did they use the E.D.? And over time, that algorithm gets smarter and smarter for the patients in the setting in which it is installed.
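As a rough sketch of what this kind of site-specific "smart" updating could look like (a hedged illustration, not RightCare Solutions' implementation), a deployed model can be periodically refit on locally observed outcomes so that it adapts to the installation site. The file, field, and function names below are hypothetical, and the choice of outcome label is an assumption.

```python
# Hedged sketch of site-specific model updating -- not the vendor's implementation.
import pandas as pd
from sklearn.linear_model import LogisticRegression

BASE_FEATURES = [
    "no_or_intermittent_caregiver", "major_walking_restriction",
    "self_rated_health_below_excellent", "length_of_stay_days",
    "depression_score", "comorbidity_count",
]

def retrain_site_model(outcomes_csv, extra_site_features=None):
    """Refit the referral model on outcomes observed at one hospital.

    The CSV is assumed to hold, per discharged patient, the admission
    screening items plus what actually happened: whether they were referred,
    where they went, and whether they were readmitted or used the E.D.
    Extra site-specific fields are assumed to be numeric or 0/1 coded.
    """
    data = pd.read_csv(outcomes_csv)                 # hypothetical outcome log
    features = BASE_FEATURES + list(extra_site_features or [])

    # Label choice is an assumption: the event the site wants to prevent,
    # here 30-day readmission or an E.D. visit after discharge.
    y = (data["readmitted_30d"] == 1) | (data["ed_visit_30d"] == 1)

    return LogisticRegression(max_iter=1000).fit(data[features], y)

# Example: a periodic job folds hospital-specific fields into the model.
site_model = retrain_site_model(
    "site_outcomes.csv",
    extra_site_features=["admitted_from_ed", "prior_admissions_count"],
)
```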

And they're also pulling in some additional in-hospital, patient-level characteristics to add to the algorithm, again using machine learning techniques to make it more sensitive and specific to the setting. The second aim of this was to connect the software to the post-acute care setting so that we could electronically refer patients and send information across that chasm from the hospital to the home care agency or to the skilled nursing facility.

So 2015 was also a busy year. We developed the smart technology, so we have achieved aim 1. We also acquired a telehealth business. This was a small startup that was available for us to purchase, to help monitor patients after they go home, and that telehealth piece fits in with other things that I've done in my career with the telehealth work. We went live in Hartford, Connecticut, in a five-hospital system there. We've grown to 26 full-time employees, and we are now installed in 32 hospitals nationwide in seven states. And then most recently, in December, the company was bought by naviHealth, which is a Cardinal Health company. It happens that they are in the post-acute care business, and we have a niche that fits right into their workflow and their software, in identifying patients and making a seamless transition into the post-acute care setting.

So the software as it stands today: the original grant developed the D2S2 algorithm, and then the Phase I SBIR enabled us to turn it into software where we have a dashboard for the users. They can see their specific patients; they can drill down by the unit, the service, the very case manager that's managing the patient. We help them find post-acute care facilities in the patient's neighborhood through maps, and we bring them the quality scores of the organizations to which they are referring, which is something very new to them. They really didn't know the readmission rates, et cetera, of where they were sending patients, and so now that's a more informed decision on their part. And the Phase II SBIR has helped us to connect to the post-acute care settings electronically and to provide outcomes back to the hospital as to how the patients are doing in the post-acute care setting in terms of readmission rates.

So back to those four essential ingredients. As you see, we had the content. We had that algorithm that was accurate, it worked, and proof of concept came through to show us that we had something that might be useful in clinical practice. We had tremendous people: a terrific team, terrific students, terrific leadership in the company to get things going.

Timing. I was patient; I sat on this for five to six years before we finally realized that the time was right, and we jumped on it. And then the financing. Without the funding that we received and the venture capital that we received, we could not have made this happen.

So now to tell you a little bit about our current study and where we're going with this. This is the study where we've already developed an algorithm for the yes/no decision to refer patients to post-acute care, and now we're working on "Where should we send them? Should we send them to home care, a skilled nursing facility, rehab, a nursing home, or hospice?" This required a much larger sample, so we worked with six hospitals to obtain EHR data from them in order, again, to build these case studies. And we used 1,500 case studies this time. We drew this EHR data from the nursing admission assessment and from the documentation as the patient went through the hospital.

But there are so many challenges to using EHR data, and it took us about two and a half years to obtain, clean, and merge this data and build these case studies. So it was an onerous process. And the reason it was so difficult is that even though the six sites had the same EHR, they had customized it. They had gone in and changed standardized data elements to their own terms, their own words. We also had difficulty finding people who could get the information back out of the EHR. You can always put it in, but it's difficult to get it back out. Free text was an issue. We have structured drop-down boxes to choose a structured answer, but there's a text box below, and the clinicians would just type something in and leave the box empty. And so we then had to reconcile what they wrote and try to code it backwards. Again, it was an onerous -- in bold, capital letters -- cleaning process to reconcile these differences.

We published our woes about this process in the Journal of Nursing Administration as an appeal to nurse leaders in acute-care settings to not allow this customization, to really think twice about it and make sure it's necessary, because it will inhibit us from doing the big data research that we want to do across settings if we're all collecting things differently.
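The kind of cleanup described here can be pictured with a small, hedged Python sketch: map each site's customized terms back to a common data element, and fall back to crude keyword coding when clinicians typed into the free-text box instead of the structured field. The term map, file name, and field names are made up for illustration.

```python
# Hedged illustration of harmonizing customized EHR fields across sites.
import pandas as pd

# Each site renamed the same standardized answer in its own words (made-up terms).
SITE_TERM_MAP = {
    "ambulates with assist":   "major_walking_restriction",
    "walks w/ walker":         "major_walking_restriction",
    "independent ambulation":  "no_walking_restriction",
}

def code_mobility(row):
    """Return a standardized mobility code for one admission record."""
    structured = str(row.get("mobility_dropdown") or "").strip().lower()
    if structured in SITE_TERM_MAP:
        return SITE_TERM_MAP[structured]

    # Clinicians often skipped the drop-down and typed free text instead,
    # so fall back to crude keyword coding of the comment box.
    free_text = str(row.get("mobility_comment") or "").lower()
    if any(word in free_text for word in ("walker", "assist", "wheelchair")):
        return "major_walking_restriction"
    return "unknown"

admissions = pd.read_csv("merged_site_extracts.csv")    # hypothetical extract
admissions["mobility_std"] = admissions.apply(code_mobility, axis=1)
```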

So, some results so far. The 1,500 case studies have been judged by 171 experts, and we put them in teams of three. There were doctors, nurses, social workers, and physical therapists from all over the country, and we assigned them so that we mixed them up by discipline and by the part of the country they were from, because referral rates and local conventions might have biased them, and we wanted to wash that out. Again, in this study -- I don't know if you recall, but in the first study the experts referred 80 percent of the sample. Once again, our experts in this study said 80 percent needed referral. But in real life, 67 percent this time had gotten referred -- much higher than the 24 percent we saw years ago, so we are getting better at recognizing these patients. But experts who have the information in front of them in a case study, with the time to sit and think about the case, I think can make better decisions. So we're really excited about this algorithm we've been able to build that way, to give more information to case managers and discharge planners, to help them make better decisions. And this work was just accepted for publication in "Applied Clinical Informatics."

So the model we've built, the yes/no referral model, again in validation is 89 percent accurate. The where-to-refer model is 87 percent accurate in a holdout validation sample. We were unable to parse out the patients who should go specifically to a SNF, a nursing home, or rehab. We were only able to say, "Send them to home care" or "Send them to some level of facility care." We did not have enough examples in the cases to build good, accurate models for that, and so we ended up with a dichotomous model -- very helpful nonetheless to differentiate between those settings. Some different factors came into play when we're looking at where to send patients. As you can see, there are some interesting ones there about fall risk and about being discharged on narcotics, which has become a real national issue. Their self-rated health, their prior history of admissions, and many, many more variables are in this more sophisticated algorithm.

This algorithm also updates itself as the patient goes through the hospital stay. It's embedded in the EHR, and it's pulling information as the nurse documents the functional status, for example, as it changes throughout the hospital stay. The algorithm updates itself twice a day. It is installed in a hospital, and it's being prospectively tested at this moment. We have finished the control phase, and we're nearly finished with the experimental phase of this study. Then we'll be looking at readmission rates and how we agreed or disagreed with what happened in real life. And I'm really pleased that we're pulling data from the nursing admission assessment and nurses' documentation.
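A minimal sketch of how a two-stage recommendation like the one described here might be structured, assuming two already-trained models (a yes/no referral model and a home-care-versus-facility model). The interface, feature handling, and thresholds are illustrative assumptions, not the DIRECT tool itself.

```python
# Hedged sketch of a two-stage discharge-referral recommendation.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Recommendation:
    refer: bool
    setting: Optional[str]  # "home care" or "facility-level care" when referred

def recommend(patient_features, feature_order, refer_model, setting_model,
              refer_threshold=0.5):
    """Stage 1: should this patient be referred to post-acute care at all?
    Stage 2: if yes, home care versus some level of facility care.
    Both models are assumed to expose predict_proba, scikit-learn style,
    and the whole thing would be re-run as documentation changes
    (for example, twice a day during the stay).
    """
    x = [[patient_features[name] for name in feature_order]]

    p_refer = refer_model.predict_proba(x)[0, 1]
    if p_refer < refer_threshold:
        return Recommendation(refer=False, setting=None)

    p_facility = setting_model.predict_proba(x)[0, 1]
    setting = "facility-level care" if p_facility >= 0.5 else "home care"
    return Recommendation(refer=True, setting=setting)
```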

This study shows the importance of what nurses collect, and I think it's high time that we use that information and put it to good use. So the next steps for this work are to complete the data analysis of our experiment to determine, as I said, the impact on readmissions and emergency use and the agreement with what happened in real life; to publish the results; and to register that algorithm again with tech transfer and see what happens to it.

For future research, I'd love to look at some interventions to try to reduce patients refusing post-acute care. We can identify who needs it, and we can offer it to them, but in our studies we've seen that about 28 percent refuse the care. They don't want someone coming to their home; they don't want to go to a skilled nursing facility. And so I'm interested in studying that to see how we can understand patients' decision making and help them make better decisions. There's also an opportunity to look at decision support once they get into post-acute care, as to what resources they need there. Do they need physical therapy, social work, O.T.? And then referrals from the emergency department. I'm really interested in picking up patients who are discharged from the emergency room rather than being admitted. These people are sent home, and I would love to study -- I've got a doctoral student interested in this -- building an algorithm that helps identify the patient leaving the E.D. who needs home care, so that we don't just send them home, but we recognize those who are going to need more support. And then finally, to translate our new tool, which we named DIRECT -- the Discharge Referral Expert Care Transitions tool -- into practice.

So I've given you a list of the publications that I've spoken about today, and I thank you so much for your attention. I'm really looking forward to a lively discussion. Thank you.

>> Patricia Grady: Thank you so much, Kathy. That was an incredibly interesting and far-ranging presentation. They say that if you scratch the surface of most people there's an entrepreneur underneath, and --

>> Kathryn Bowles: [laughs]

>> Patricia Grady: -- clearly we didn't have to look very far below your surface. So the floor is now open for questions. Please step up to the microphones on the side. If your voice is extraordinarily loud, you could speak from where you sit, but we prefer you go to the microphone and ask your questions. This is your opportunity.

>> Amelia Roberts: Hello, how are you?

>> Kathryn Bowles: Great. How are you?

>> Amelia Roberts: Hi, my name is Amelia Roberts [phonetic sp]. I'm a nurse entrepreneur as well.

>> Kathryn Bowles: Yay!

>> Amelia Roberts: Yep, as well as a care coordinator. So, question. You mentioned something very briefly -- you made it sound easy, but -- interoperability between, you know, the hospital and the care facility, information going across the chasm. Can you tell me more about that?

>> Kathryn Bowles: [affirmative] Yeah. So RightCare Solutions has developed that aspect of it, and it varies according to who they're working with. And I don't know the technical details of it, I'm sorry to say. In some cases, it's an email document being sent. In other cases, they're able to connect -- so if the home care agency has software that is already connected with the hospital system in which they're working, then they're able to do it. So it is not easy, as you said. It's difficult. But they're able to communicate live, so they've set up a feature where the case managers can talk with the discharge planners in the hospital in a live chat type of environment, rather than playing phone tag. And they can also leave messages for each other within the software.

>> Sue Wingate: Hi, good afternoon. Sue Wingate. I'm from the intramural program here. Very nice presentation, thank you. I have a question about the institutions you've gone to that have used your tool --

>> Kathryn Bowles: [affirmative]

>> Sue Wingate: -- and have you collected, or have they collected, information about whether they've been able to avoid penalties related to that?

>> Kathryn Bowles: Yeah. I've not been involved in that, because any installations that have occurred have been through the company -- but the company has been tracking that. And they had given me some slides to show here today, but I chose not to because it's commercial. It's not scientifically peer reviewed, and so I didn't feel comfortable doing that. But they are showing that when the algorithm's advice is followed, readmissions are decreased on average by 24 percent -- the slide showed 12 hospitals whose data they aggregated. And the other thing they're seeing, which is interesting, is a decrease in length of stay by about half a day, and that's because we send this advice on day one, and so the discharge planning process can begin early.

>> Male Speaker: Dr. Bowles, you've alluded to the great Mary Naylor's mentorship of yourself. You've talked about how you've mentored Eric, and you've also mentored quite a few nurse scientists. What advice would you give to your colleagues who are currently serving as mentors or who will serve as mentors, and what advice would you give to those seeking mentorship? What are the lessons learned from some pretty significant mentorship relationships in your own life?

>> Kathryn Bowles: Oh, thank you. This is a very important part of my life, so I appreciate the question. To those students seeking mentorship: roll your sleeves up and get involved, and don't wait to be asked. I mean, the best opportunities I had came from just volunteering, getting involved, trying to learn, and being helpful, and a mentor will take to that, believe me. And, you know, my best mentees are those who do just that. They're really interested in the work that we do, they look to be helpful, and they're always looking for an opportunity to ask the next question and somehow build on the work that we're doing. And for mentors: make it a team environment, and involve them -- I include them in publications, I give them opportunities to do posters and presentations of our work. So look for things like that to help them.

>> Female Speaker: Great presentation, very clear.

>> Kathryn Bowles: Thank you.

>> Female Speaker: Not being a nurse entrepreneur, or having even thought of it in my career, I wonder, how does one approach venture capitalists? I mean, to me that is like the big black box. What do you have to do to, first of all, make the connection, and then, second of all, how do you talk to those folks?

>> Kathryn Bowles: [affirmative] Do you ever watch "Shark Tank"? [laughs]

>> Female Speaker: No, I haven't. [laughter]

>> Kathryn Bowles: Well, it's not quite like "Shark Tank," but they let us sit down. We didn't have to stand. [laughs] But fortunately for us, Eric, in those eight years he was off working in industry, was in venture capital. So he had a lot of connections in the industry, and he knew who to contact and who to put us in front of. That was very helpful. But the university can help you with that as well. The technology transfer office helps you find a CEO, helps you patent, helps you find seed money, helps you find venture capital. What it was like -- I did probably about 10 presentations in rooms with men, all men. And they were not healthcare providers. And so it was up to me as a nurse to help them understand why this was important and why people would want to have this algorithm's help. And that proof-of-concept study that we did in the hospital, where we saw results, was shown to them, and that was critical, I think, in convincing them that we did indeed have something worthwhile. And then as they built the software, they continued to be able to demonstrate that, and that's how we got more and more and more.

>> Female Speaker: Thank you. That's intriguing.

>> Female Speaker: Hi. Chris [unintelligible], NINR. I'm wondering whether you think these new standards -- the Office of the National Coordinator just released the 2016 standards, supposedly to overcome some of these issues of interoperability that you encountered. Is that going to facilitate, in your opinion, more of this? Or did they not go far enough? If you have any speculation on that, I'd love to hear it.

>> Kathryn Bowles: I haven't had a chance to look at those yet, but I am thrilled that it's happening, because it's absolutely necessary. As you heard from my story of how difficult it was -- and we weren't even trying to interoperate anything; we were just trying to grab data and use it -- the live transfer of information is a real challenge. And so I think it takes the Office of the National Coordinator, someone setting standards and publishing those standards for people to pick up and follow, if this is ever going to happen.

>> Female Speaker: As NINR has done the common data element seminars and various things like that, they've been -- the Office of the National Coordinator had put this out for comment, and, you know, the comments are all over the board. I took a look at it, but I haven't had a chance to read the whole thing either, so [laughs].

>> Kathryn Bowles: Okay. Well, it's our homework.

>> Female Speaker: Hello, Dr. Bowles.

>> Kathryn Bowles: Hi.

>> Female Speaker: My name is Andrea, and I'm a nurse here at the Clinical Center at the NIH. You mentioned how the data highlights what nurses contribute in their documentation. So I was wondering, were you able to collect any data on the impact of the D2S2, or your work, on nursing perception or nursing practice?

>> Kathryn Bowles: No, I have not looked at that -- not collected any information about the impact on nursing practice. We work mainly with the case managers and discharge planners. The models vary across the country, so sometimes the staff nurse, the bedside nurse, is involved in those decisions, and most times they're not. So far, the places we've installed in have case management departments, and so the bedside nurse I've been unable to impact. And I would love to hear from the case managers that, you know, this has helped to make them more efficient. I've not heard that. And I have actually heard people say, you know, "I've been doing this for 40 years and this piece of software's going to tell me what to do," and that's not at all what we're about. We really want this to be decision support. It's support. It's not "do this or else"; it's "here's some more information for you to help you make better decisions."

>> Patricia Grady: Well, your timing is impeccable. We thank you again so much for being here, and we do have a memento for you to take back with you.

So I'm presenting Dr. Kathryn Bowles with a certificate recognizing, with grateful appreciation, her NINR Director's Lecture, March 3, 2016. So thank you again.

[unintelligible commentary]

>> Male Speaker: [unintelligible] Thank you. [inaudible]

>> Kathryn Bowles: Thank you very much.

>> Patricia Grady: And also, as we've said, this is the first of our series of NINR Director's Lectures for this year. So I would like to invite all of you, including Kathryn, to come back in person, or to watch on the website, the series of lectures this year. The next one will be Marie Nolan on May 5, who will talk about reframing shared decision making at the end of life. Ellen Goodman will provide our "Science and the Public" lecture on "The Most Important Conversation We're Not Having." That will be on September 13. And in conjunction with our anniversary in November, we will have our last lecture of the year, Dr. Sandra Millon-Underwood, who will talk about her work in reducing health disparities in patients with cancer. So again, thank you all for coming, and thank you again to Dr. Kathryn Bowles.

>> Kathryn Bowles: Thank you very much.
