Research Article - (2012) Volume 2, Issue 1
Introduction: By forging partnerships among academia, industry and other stakeholders in health care IT, it is hoped that more user-friendly and efficient HIT solutions will become available soon. Although this shared-stakeholder approach appears effective and synergistic, there has been minimal prior work describing such collaborations. Understanding this collaboration is particularly important given the different perspectives and priorities of academic and commercial stakeholders. In this report we share our collaborative experience of developing and delivering the Geriatric Enhancement Module (GEM), a software application comprising a 7-item questionnaire designed to generate discussions among staff, providers and patients regarding quality-of-life issues.
Methods: Our academic-industry collaborators worked cooperatively to select and recruit three practices for the study, iteratively developed the GEM questions and response options, and devised the method of delivering the questions within our pilot practices by participating jointly in a series of face-to-face meetings and conference calls during 12 months of the two-year study period.
Findings: One of the most important lessons learned is that, despite the vendor’s programming of the module to be delivered in a specific way for a particular set of patients, practices found ways to control or tailor its delivery for their own purposes; certain automated features, such as having the module “pop up” for all patients aged 50 or over, were not used as the vendor had intended. Therefore, we experimented with free-text software as an alternative delivery method.
In small health care organizations, such as office-based practices, health information technology has been variably adopted [1,2]. Additionally, when health IT is implemented in smaller care settings, its usefulness may be limited due to narrow functionality, inadequate training of users, excessive interference with office workflow, and variable uptake among individual practices and physicians [3-5]. This is a concern because providers have immediate needs for participating in quality improvement (QI) or other practice transformation initiatives. In the U.S. such initiatives include Medicare and Medicaid’s “meaningful use” of electronic health records, Patient Centered Medical Home (PCMH) programs supported by the National Committee for Quality Assurance (NCQA), and medical specialty boards’ certification processes. To keep up with these evolving data requirements, providers need to have HIT solutions that are accessible, cost efficient, and functional and that can be implemented feasibly within clinical practice settings.
There have been calls to bring industry leaders, skilled methodologists and systems experts together to develop solutions and strategies that can create the right technology and methods to support providers in their efforts to participate in meaningful practice improvements [6]. By forging partnerships among academia, industry and other stakeholders in health care IT, it is hoped that more user-friendly and efficient HIT solutions will become available soon. Although this shared-stakeholder approach appears effective and synergistic, there has been minimal prior work describing such collaborations. Understanding this collaboration is particularly important given the different perspectives and priorities of academic and commercial stakeholders.
In this report we share our experience of developing and delivering the Geriatric Enhancement Module (GEM), a software application comprising a 7-item questionnaire designed to generate discussions among staff, providers and patients regarding quality-of-life issues (Figure 1). The GEM was developed by a team of university-based health services researchers in conjunction with a private software vendor. The purpose of this report is to describe our experience partnering with an industry HIT vendor to develop and implement the GEM application, our expectations for the collaboration, and the impact the collaboration had on the development process. We feel that this work provided valuable lessons that can inform future collaborative EHR module development. Being able to develop and then test the GEM sequentially in three primary care settings collaboratively led to the most practical lesson learned: the discovery of an unanticipated method for delivering the GEM questions within diverse electronic health record systems.
Expectations for the collaboration
From the perspective of the academic health services researchers, collaborating with an IT vendor to develop and deliver a quality-of-life EHR module offered access to the vendor’s experience gained from many implementations. The primary expected benefit of this experience was the ability to integrate the new module smoothly within the existing EHR systems of the participating practices. Specifically, we expected the answers to the GEM items to be routed through all sections of the EHR. In addition, we anticipated the vendor would have insight into which practices should be selected as pilot sites. We also expected the vendor’s experience and knowledge of the EHR systems to be useful in selecting which items to include in the GEM.
Our academic-industry collaborators worked cooperatively to decide what types of practices to recruit for the pilot study, iteratively developed the GEM questions and response options, and devised the method of delivering the questions within our pilot practices by participating jointly in a series of face-to-face meetings and conference calls during 12 months of the two-year study period.
Site selection
To generate a list of potential pilot practices, the academic group reviewed sites in the North Carolina Network Consortium (NCNC), a statewide primary care research network [7], while the industry group reviewed a list of North Carolina practices that currently used their software products. The team subsequently discussed an aggregate list based on the following criteria: (a) geographically located within a 100-mile radius of Chapel Hill; (b) administratively stable; (c) having a practice administrator and medical director willing to participate, and; (d) having operational EHR systems. We initially targeted small private primary care practices (defined as fewer than 4 providers). We implemented the GEM in two small practices, both of which were initially contacted about the study via the industry vendor, who had previously assisted these practices in implementing various EHR “add on” question sets. The academic team then formally recruited the practices into the study via mail, phone and face-to-face contact. Of note, based on our experience of implementing the GEM software in the first two practices, we came to realize that a practice’s respective experience with using similar software products was not a key factor determining implementation success. Therefore, for the third practice we decided to pursue a radically different approach (described in detail later in this report).
All three sites received a one-time $2500 incentive to reimburse them for staff time related to the study. The study was approved by the Institutional Review Board of the University of North Carolina at Chapel Hill prior to its initiation.
Development of the GEM
The GEM questions were created by reviewing the publicly available self-reported health-related quality of life (HRQL) items that have been validated in older populations. The global domains covered by these questions included: (1) physical health; (2) emotional health; (3) physical functioning and limitations in activities of daily living/instrumental activities of daily living, and; (4) level of social support. The research-industry team then iteratively formatted and arranged the questions to maximize comprehension and field-tested the items on a convenience sample of 20 elderly patients attending a family medicine outpatient practice (Figure 1).
Integration of the GEM within EHRs
Once the final GEM questions were selected, the academic-industry team defined how the GEM questions and answers could flow through the respective practice’s electronic health record sections and billing systems during a joint webinar hosted by the software vendor. We believed that providers might be interested in having the GEM items populate the “review of systems” and “history of present illness” sections of their respective medical records, as attention to these items could not only support general documentation of the discussions, but also support billing codes, such as those itemized in the evaluation and management coding framework (E&M codes) in the U.S. Technical support to practice sites and the academic research staff during implementation of the study was provided by the industry group for the first 2 practices and accomplished via phone and Internet-based communications.
The vendor programmed the GEM items to automatically open for all patients 50 years of age and over in the first 2 practices. However, this did not occur as expected. Instead, the providers had to open the GEM questions in a separate step. This issue was handled expeditiously by having the study research assistant (RA), one of the academic investigators, an IT staff member in the academic department, and the vendor confer about the issues and identify a “work around” that seemed most efficient. The RA and/or the vendor then relayed this information to practice staff.
The GEM was installed in the first 2 sites via an internet connection by the industry partner or by direct loading of the software into office computers that did not have an internet connection. For all practices, the academic group visited the site to provide details about the study and obtained informed consent from members of the practice. After informed consent was obtained, the RA provided on-site training for the office staff and worked with them to determine the best way to recruit patients into the study, electronically deliver the questions to patients prior to the provider visit, and have the patients’ answers to the GEM questions available to the provider during the visit. The RA worked with practices to have patients read and answer the GEM questions directly into the medical record system, or do this with some assistance from a staff person during the intake portion of an office visit. Providers could choose whether or not to address the GEM items. Likewise, patients could raise issues that surfaced based upon the GEM question items. Such information could be placed into the medical record via text (directly or via dictation) in the “history of present illness” section, and items were loaded into domains within the “review of systems” section of the electronic health records in these two practices.
One of the most important lessons we learned from our first two practices is that, despite the vendor’s programming of the module to be delivered in a specific way for a particular set of patients, practices found ways to control or tailor its delivery for their own purposes; certain automated features, such as having the module “pop up” for all patients aged 50 or over, were not used as the vendor had intended. During this time we also learned of an approach that used freely downloadable “text editing software” that our own family medicine clinic had used to deliver a small research study within our own “home grown” EHR. Our academic team decided to experiment with this “text editing software” delivery option. Its reported ability to be used with virtually all electronic health record systems was of particular interest to many of us who are keenly aware of how many different electronic medical record systems exist in our state. The fact that this simple solution was free and relatively “low tech” made it particularly attractive to the academic team. Therefore, for the third practice site, we used this different delivery system, which did not require the resources of our vendor partner. Instead, the IT staff of the academic department and study personnel provided the necessary technical support to this site.
Using text-capturing software
The “text capturing” function enables desired standard text to be placed into any available “text field” that already exists in an EHR (or any electronic document). For example, if a provider wants to place smoking cessation counseling text into an EHR, he/she need only type or “cut and paste” this text into the “phrase content” field as displayed in Figure 2 and import the text into the free text field via one of three options supported by the software. These options are: (1) type a few letters that represent the standard text (e.g., “.scc”) to have the previously created “smoking cessation counseling” text automatically entered into a free text field; (2) use a “hot key” option that will populate the standard text into the chosen text field; or (3) choose the text from a menu that is available by clicking on the text.express icon. There is also an option of adding additional text to the standard text; thus, if a provider created a numbered list of smoking cessation options a patient can choose from, the provider can type in the patient’s response and have this stored as permanent textual data (Figure 3).
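The abbreviation-based substitution described in option (1) can be illustrated with a minimal sketch. The phrase table and abbreviations below are hypothetical examples for illustration only; a real text-expansion product hooks system-wide keyboard input rather than post-processing a string.

```python
# Hypothetical phrase table mapping abbreviations to standard text,
# analogous to the "phrase content" field described above.
PHRASES = {
    ".scc": "Smoking cessation counseling provided; options reviewed with patient.",
    ".gem": "GEM quality-of-life items reviewed and discussed with patient.",
}

def expand(text: str) -> str:
    """Replace any known abbreviation in the typed text with its standard phrase."""
    for abbrev, phrase in PHRASES.items():
        text = text.replace(abbrev, phrase)
    return text

# A provider types the abbreviation inside an ordinary free-text note.
note = expand("Plan: .scc Follow up in 3 months.")
print(note)
```

In the actual software, the expansion fires as the abbreviation is typed into any text field of the EHR, so no separate processing step is visible to the provider.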
To use the text.express* software, the provider must first download the software to a computer that uses a Windows operating environment. (Macintosh-compatible products exist as well; see the “Typinator” product listed below.) Instructions and short video tutorials make this process easy even for the less computer-savvy. After downloading, the text.express icon automatically appears in the bottom right-hand corner of the screen. The setup from there is straightforward and menu-driven. Of course, there are more “expert” versions for sale, but the free version was sufficient for our GEM module. Also, there are “network” versions one can buy that allow multiple computers to use the same software maintained in a central computing environment. We are purposefully not supplying exact directions for all of these options, as there are many text editing/text substitution software products that are easily located via an internet search (e.g., Texter, TextBeast, Typinator (works with some Macintosh operating systems), Rapidkey, JitBit, Breevy). Many of these can insert text and images into any application that you type in. Our research group did not experiment with any of these other products because our clinical practice site already had some experience with PhraseExpress. However, we do not have any reason to recommend one over another, and there are likely many other text editing products available.
Of course, there are limitations to this kind of “work around.” Most notably, the answers to the patient question items are not in “discrete fields,” which limits the ability to analyze aggregate data. However, one can use discrete field data that is available in some EHRs to document that certain QI items were addressed/performed and have that discrete documentation supported (i.e., “backed up”) by the standard text field created. For instance, an EHR may have a pull-down menu or radio-button field to indicate that a patient received smoking cessation counseling. This kind of discrete field data can then be “counted” by running computer queries. The content of this counseling can be captured via the text imported by text.express. Also, text capturing products using natural language processing (NLP), which can “mine” textual content, are under development. If NLP software becomes more readily available, providers should eventually be able to render text data, which was previously deemed “uncountable,” as discrete field data.
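The distinction between countable discrete fields and free-text content can be sketched as follows. The record layout and field names are hypothetical; a real EHR would run such a query against its database, and the keyword match here is only a stand-in for NLP-based text mining.

```python
# Hypothetical visit records: a discrete (countable) flag plus a free-text note.
records = [
    {"scc_done": True,  "note": "Smoking cessation counseling: patient chose option 2."},
    {"scc_done": False, "note": "Discussed diet and exercise."},
    {"scc_done": True,  "note": "Smoking cessation counseling declined by patient."},
]

# Discrete-field query: a simple count, as an EHR report would produce.
counseled = sum(1 for r in records if r["scc_done"])

# Naive free-text "mining": a keyword match standing in for NLP.
mentions = [r["note"] for r in records if "smoking cessation" in r["note"].lower()]

print(counseled, len(mentions))
```

The discrete flag supports aggregate counting for QI reporting, while the free-text note preserves the detail of what was actually discussed; the “work around” described above relies on the latter backing up the former.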
Another current limitation of using the text.express software is that the provider is not specifically prompted by this system to ask the patient questions. The system must be prompted (via the various ways described above) to insert the standard text. Staff could assist with opening up these items to remind the providers to address the various issues, but we realize the difficulty of doing this consistently in busy practice settings.
*The actual product we used is PhraseExpress (phraseexpress.com); no financial or professional relationship exists between the aforementioned research or industry groups and this company.
Implications for future collaborations
To our knowledge this report is the first description of an academic-industry partnership to implement an EHR module in primary care settings. Our experience suggests that a shared role for academic and industry partners is feasible in the pre-implementation development, subject recruitment, and execution of HIT in primary care practices. Prior work in the area of academic-industry collaboration has largely focused on conflict of interest issues, predominantly within the context of clinical trials [8-10]. Academic groups are traditionally expected to generate fundamental knowledge that industry, in turn, develops and markets [10]. In undertaking clinical research, this translates into academic partners having primary roles in several areas: study design, subject recruitment, execution and completion of the trial (with safety monitoring), and data analyses [8]. Industry, on the other hand, is generally responsible for pretrial development of the intervention and funding of the study [8]. Our study was federally funded, which may have ameliorated potential conflicts of interest and facilitated the flow of knowledge between the partners. In addition, throughout the course of the study there were recurring exchanges among different types of investigators, a key factor that has been noted to make academic-industry collaborations workable [10].
Academic health centers and industry differ in their missions, cultures, resources, and incentives, differences that need to be acknowledged and respected in approaching partnerships [10]. In considering an academic-industry collaboration, several organizational-level factors for each partner may need to be considered: (1) organizational readiness to change; (2) adaptations that each organization makes to enhance the “fit” of the intervention, and; (3) the extent to which organizational members perceive that the intervention reflects their values [11-13]. Successful future collaborations will need to be mindful of these factors along a dynamic process of discovery.
In summary, we found that an academic-industry partnership is a feasible approach for developing innovative HIT software, recruiting primary care practices and patients into an HIT implementation study, and executing HIT within the practices. In our study, the HIT software application, recruitment protocol, and execution of the HIT within the practice sites were all markedly altered due to a collaborative, heuristic process. As such, it challenged several a priori organizational expectations for the academic and industry group, but eventually led to a creative bridging of their respective roles.
This study was funded by the National Institute on Aging (R21AG030166). Christopher M. Shea is currently supported by a career development award through the North Carolina Translational and Clinical Sciences (NC TraCS) Institute at the University of North Carolina – Chapel Hill, which is funded through the NIH Clinical and Translational Science Awards (CTSA).