In collaboration with the Bergen County Prosecutor’s Office; 6 NJ/NY CLE credits. Click here for more information or to register.
Helen Oscislawski, Privacy Risk Assessments and Privacy Challenges
Helen Oscislawski is the founder of Oscislawski, LLC in Princeton. She provides legal guidance on HIPAA, HITECH, state privacy laws, electronic health information exchanges and health information technology to HIEs, RHIOs and ACOs, and counsels other healthcare clients in various matters.
Ms. Oscislawski was appointed by Governor Jon Corzine in 2008 to the New Jersey Health Information Technology Commission (NJHITC) and was reappointed to the NJHITC by Governor Chris Christie in 2010, where she also served as Chair of the Privacy and Security Committee for the NJHIT Coordinator. She is the primary author of the Update to Privacy and Security Compliance Manual, which was developed for the New Jersey Hospital Association, and, most recently, she has developed and authored several editions of the HIPAA-HITECH Helpbook, a manual that combines tools and sample forms addressing HITECH changes, state law and other considerations, Meaningful Use, and Health Information Exchanges.
Before founding Oscislawski, LLC, Ms. Oscislawski was a healthcare attorney at Fox Rothschild in Princeton, New Jersey, where she counseled healthcare clients on a wide range of legal matters. She received her BA from Rutgers University, Douglass College and her JD from Rutgers School of Law.
Frank Pasquale, Professor of Law, Seton Hall Law School, The Past, Present and Future of Health Privacy
Professor Frank Pasquale is the Schering-Plough Professor in Health Care Regulation and Enforcement at Seton Hall Law School. Professor Pasquale has taught information and health law at Seton Hall since 2004. He has published over 20 scholarly articles. His research agenda focuses on challenges posed to information law by rapidly changing technology, particularly in the health care, internet, and finance industries.
Professor Pasquale is an Affiliate Fellow of Yale Law School’s Information Society Project. He has been named to the Advisory Board of the Electronic Privacy Information Center. He has served on the executive board of the Health Law Section of the American Association of Law Schools (AALS), and has served as chair of the AALS Section on Privacy and Defamation.
Professor Pasquale received his BA from Harvard University (summa cum laude), his M.Phil. from Oxford University, and his JD from Yale Law School.
Jaime S. Pego, Director, Healthcare Advisory Services, KPMG LLP, (along with Joy Pritts, Mark Swearingen, and Frank Pasquale, Moderator) Panel Discussion: The Practical Steps Necessary to Promote Privacy and Cybersecurity in Modern Healthcare Organizations
Jaime S. Pego is a Director in the Short Hills, New Jersey, office of KPMG LLP’s Healthcare Advisory Services Practice and serves as the firm’s National HIPAA Privacy Director. She has substantial experience in healthcare regulatory compliance and healthcare-related advisory services.
Ms. Pego works with a variety of healthcare clients to assist with identifying and preventing compliance risks and complying with federal and state regulations. Her work for KPMG includes serving as lead director for OCR HIPAA audits, as well as acting as Privacy Lead for the KPMG HIPAA national service line, assisting covered entities and business associates with HIPAA compliance. She has conducted internal investigations concerning a variety of topics, including fraud and abuse, HIPAA violations, and other legal and regulatory matters, and has researched and developed compliance policies for institutions in the areas of gifting under the Anti-Kickback Statute and Stark Law, the DRA, HIPAA, EMTALA and others. She participates in the KPMG National HIPAA working group to develop tools and methodologies for client needs, and conducts and manages ICD-10 Impact Assessments at a variety of healthcare organizations to help identify gaps in ICD-10 readiness. She has also served as the firm’s lead manager for health care reform legislative analysis and research.
Prior to coming to KPMG, Ms. Pego was a Local Compliance Officer at a teaching hospital and outpatient center for one of New Jersey’s largest health care systems and has worked with some of the country’s leading health systems. She received her BA from American University and her JD from Seton Hall University School of Law, with a Concentration in Health Law, and is Certified in Healthcare Compliance (CHC) by the Health Care Compliance Association (HCCA).
Joy Pritts, Chief Privacy Officer, ONC, HHS, Meaningful Use Regulations: What Providers Need To Know To Comply
Joy Pritts joined the Office of the National Coordinator for Health Information Technology (ONC), Department of Health & Human Services in February 2010 as its first Chief Privacy Officer. Ms. Pritts provides critical advice to the Secretary and the National Coordinator in developing and implementing ONC’s privacy and security programs under HITECH. She works closely with the Office for Civil Rights and other operating divisions of HHS, as well as with other government agencies to help ensure a coordinated approach to key privacy and security issues.
Prior to joining ONC, Ms. Pritts held a joint appointment as a Senior Scholar with the O’Neill Institute for National and Global Health Law and as a Research Associate Professor with the Health Policy Institute, Georgetown University. She has an extensive background in confidentiality laws including the HIPAA Privacy Rule, federal alcohol and substance abuse treatment confidentiality laws, the Common Rule governing federally funded research, and state health information privacy laws.
Ms. Pritts received her BA from Oberlin College and her JD from Case Western Reserve University.
Anna Spencer, Esq., Sidley Austin, LLP, Data Breaches/Data Breach Notification Requirements and the Need for Encryption
Anna Spencer is a partner in Sidley Austin’s Washington, D.C. office whose practice focuses on health care. Ms. Spencer primarily works on matters involving the privacy and security of health information, and she is the firm’s global coordinator for health information privacy. She regularly counsels a broad range of clients on healthcare information privacy and security issues, including assisting clients with HIPAA and HITECH compliance, and she has significant experience in investigating and responding to data breaches and information security incidents. She has represented clients in connection with data breach reporting obligations under the HITECH regulations for breaches of protected health information and defended health care providers in investigations initiated by the Office for Civil Rights, Department of Health and Human Services.
On behalf of covered entities and entities that qualify as HIPAA business associates, Ms. Spencer has developed multiple HIPAA privacy and security compliance and training programs. She has negotiated hundreds of Business Associate Agreements on behalf of various clients.
Ms. Spencer has spoken on privacy/security matters on behalf of numerous groups such as BNA and the American Conference Institute. She has authored a variety of articles on privacy/security issues, Medicare coverage, and fraud and abuse. She is currently authoring a book for BNA on health information privacy. Ms. Spencer received her BA from Sewanee and her JD from Vanderbilt University School of Law.
Mark Swearingen, Esq., Hall, Render, Killian, Heath & Lyman, PC, HIPAA and HITECH Trends (Enforcement and Otherwise)
Mark Swearingen coordinates the HIPAA practice and provides counsel on health information privacy and security matters such as breach response and notification and the creation, use, disclosure, retention and destruction of medical records and other health information at the Indianapolis law firm, Hall, Render, Killian, Heath & Lyman, P.C. His counsel to clients also includes a variety of health care topics related to regulatory compliance, physician and clinical services contracting, risk management and Independent Review Organization services. He has provided such services to a broad spectrum of health system, hospital, physician practice, diagnostic imaging center, ambulatory surgical center and long-term care facility clients.
Mr. Swearingen has spoken and written nationally and regionally on numerous topics, including antitrust, electronic medical records and health information privacy and confidentiality. He is an adjunct professor of a course in Law and Medicine at the Indiana University School of Informatics at IUPUI.
Mr. Swearingen received his BA from Indiana University and his JD from Seton Hall Law School.
Seton Hall Professor and Health Care Regulation Expert Frank Pasquale to Present Draft White Paper Outlining Options and then Moderate a Discussion on its Pros and Cons with Fellow Academics
Washington, D.C. – On Friday, March 22, 2013, Seton Hall University School of Law hosted an academic roundtable discussion on how current healthcare law will respond to the new technology environment – in particular, how to maintain privacy for consumers as the health industry expands its adoption of cloud computing. Seton Hall Professor Frank Pasquale moderated the event, “The Future of HIPAA and The Cloud,” and also released a white paper he coauthored with Tara Adams Ragone on the challenges that cloud computing technologies pose to the Health Insurance Portability and Accountability Act (HIPAA).
As the recent HIPAA Omnibus Rule showed, regulation must both reflect and shape technological advances. As stakeholders face new challenges and opportunities, the roundtable asked: What is the future of HIPAA in the cloud? How will patient data be used? What is the role for third party vendors? And who should be held responsible for security breaches in the cloud?
White paper abstract:
This white paper examines how cloud computing generates new privacy challenges for both healthcare providers and patients, and how American health privacy laws may be interpreted or amended to address these challenges. Given the current implementation of Meaningful Use rules for health information technology and the Omnibus HIPAA Rule in health care generally, the stage is now set for a distinctive law of “health information” to emerge. HIPAA has come of age of late, with more aggressive enforcement efforts targeting wayward healthcare entities. Nevertheless, more needs to be done to assure that health privacy and all the values it is meant to protect are actually vindicated in an era of ever faster and more pervasive data transfer and analysis.
After describing how cloud computing is now used in healthcare, this white paper examines nascent and emerging cloud applications. Current regulation addresses many of these scenarios, but also leaves some important decision points ahead. Business associate agreements between cloud service providers and covered entities will need to address new risks. To meaningfully consent to new uses of protected health information, patients will need access to more sophisticated and granular methods of monitoring data collection, analysis, and use. Policymakers should be concerned not only about medical records, but also about medical reputations used to deny opportunities. In order to implement these and other recommendations, more funding for technical assistance for health privacy regulators is essential.
In her article, “Institutional Competence to Balance Privacy and Competing Values: The Forgotten Third Prong of HIPAA Preemption Analysis,” Barbara J. Evans takes on the well-settled belief, or “rumor,” as she calls it, that the HIPAA “Privacy Rule merely sets a floor of privacy protection that leaves states free to set stricter privacy standards.” (A draft of this article is available on SSRN, and it will be published in the University of California-Davis Law Review in 2013.) Although this general rule of HIPAA preemption is largely accurate, the article argues that it is wrong with respect to an enumerated “class of public health activities that Congress deemed to have high social value,” including “reporting of disease or injury, child abuse, birth, or death, public health surveillance, or public health investigation or intervention.”
Professor Evans begins with a textual argument, pointing out that HIPAA’s statutory text specifically includes a third prong, while HIPAA’s Privacy Rule, one of HIPAA’s key implementing regulations, collapses the statutory language into two prongs. The article maintains that in doing so, the “Privacy Rule ignored a clear statutory instruction to preempt state privacy law in a specific circumstance where Congress determined that individual privacy interests should give way to competing public interests.” In this specific public health context, she continues, “the HIPAA statute creates what might be called a ‘canopy,’ to shelter specific socially important data uses from more stringent privacy laws.” The author buttresses her analysis with legislative and regulatory history as well as a comparison with the structure of ERISA preemption provisions.
Noting that the statute speaks directly to this issue, Professor Evans maintains that the public health portion of the Privacy Rule is not entitled to Chevron or Skidmore deference where its interpretation is contrary to the statute and the agency did not offer a persuasive account to justify its interpretations. Rather, “the HIPAA statute preempts state privacy laws — even ones that are more stringent than the HIPAA Privacy Rule — in situations where state laws would interfere with public health surveillance and investigations.”
Professor Evans attributes the inconsistency between the Privacy Rule and HIPAA to politically savvy, rather than incompetent, agency drafting. She asserts that HHS was aware that states were afraid that their privacy laws would be preempted, and thus the agency took a modest approach in the Privacy Rule, leaving unspoken the effect of the third prong on more stringent state laws in the limited context of enumerated public health activities. The statutory text, however, reflects Congress’s choice to “trust no institution other than itself” to “strike the balance between privacy and competing public interests.” There was a conscious choice not to permit a patchwork of varying state laws to frustrate the development of the multi-state, interoperable databases needed for the enumerated public health activities.
This article breathes new life into statutory language that has been largely overlooked in the sixteen years since HIPAA’s enactment and is critical reading for anyone interested in public health surveillance, investigation, and privacy law. Professor Evans argues that facilitating access to large-scale, multi-state, interoperable databases of health-related data for tens or even hundreds of millions of people could speed “the detection of drug safety risks, unmask ineffective or wasteful treatments, and understand disparities in health outcomes among various population subgroups,” while warning that “unduly restrict[ing] access to data and biospecimens can very literally kill people.”
The article closes with an invitation to scholars for further “dialogue about [HIPAA]’s forgotten preemption provision,” an invitation the health law community would be wise to accept. While she readily acknowledges that her conclusions are unorthodox, they will undoubtedly generate substantial and serious academic discussion.
Another important article for interoperability policymaking is Leslie P. Francis‘s article, “Skeletons in the Family Medical Closet: Access of Personal Representatives to Interoperable Medical Records,” which recently was posted to SSRN and was published in volume 4, issue 2 of the 2011 Saint Louis University Journal of Health Law & Policy.
With HIPAA’s Privacy Rule and the HITECH Act, federal law now grants patients the right to access their own medical records, including EHRs, with some limitations for certain records, such as psychotherapy notes. Importantly, personal representatives now generally enjoy the same rights of access to medical records that patients themselves hold, consistent with state law.
In addition, although HIPAA preempts state laws that are inconsistent with federal law, HIPAA generally (see Professor Evans’s important caveat above) does not preempt state laws that protect privacy more stringently than federal law. A state law is deemed more stringent when, for example, it provides individuals with greater access to their health information. As a result, “states may expand the individual right of access to health information, but may not contract it.”
The article points out an unintended consequence of such an expansion, however, given federal law on access: states that provide equal rights of access to patients and their representatives would expand personal representative access in step with any increased rights for patients.
But given the breadth of interoperable EHRs, patients may not want or expect their personal representatives to have access equal in scope to their own. Interoperable EHRs may very well contain records of medical care that are not directly relevant to the patients’ current care and that patients may not want their personal representatives to see. Professor Francis offers the example of an older patient being treated for a stroke who may not want her child to learn about her prior, unrelated pregnancy termination or psychiatric history – what Professor Francis calls “the metaphorical skeletons in her closet.”
The article thus explores the extent to which states may protect patient privacy and confidentiality in this legal framework by regulating personal representatives’ access to patient records. Although states generally either grant or deny personal representatives access to patient records, Professor Francis details how some have been more nuanced: some permit patients to use advance directives to define the scope of access by personal representatives, such as on a need-to-know basis, while others restrict personal representative access to mental health or substance abuse treatment records.
Given the importance of respect for private autonomy, Professor Francis then makes four recommendations:
(1) Advance directive statutes should permit competent patients to designate the scope of their personal representatives’ access to interoperable medical records, ideally with respect to specific types of information, such as mental health, substance abuse, and reproductive history, and options such as all information, information only as needed to make care decisions, or no information.
(2) When patients do not have advance directives, there should be a presumption that personal representatives only have access to records needed for decision making about their care.
(3) Interoperable medical records should be designed to permit special management of sensitive medical information, such as mental health or substance abuse treatment records, to which personal representatives would have access only when necessary for emergency care.
(4) These recommendations generally should apply regardless of whether patients have mental illness or cognitive disabilities.
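To make the interaction among the first three recommendations concrete, here is a minimal sketch of how an EHR system might gate a personal representative’s access along these lines. All names and category labels are hypothetical illustrations, not part of Professor Francis’s article or any actual standard: an advance-directive scope (recommendation 1), a default presumption limiting access to care-related records when no directive exists (recommendation 2), and emergency-only access to sensitive categories (recommendation 3).

```python
# Hypothetical sketch of personal-representative access rules.
# Category names and scope values are illustrative assumptions.

SENSITIVE_CATEGORIES = {"mental_health", "substance_abuse", "reproductive_history"}

def may_access(segment_category, needed_for_care, directive_scope=None, emergency=False):
    """Decide whether a personal representative may view a record segment.

    directive_scope: None (no advance directive), "all", "care_only", or "none".
    """
    if directive_scope == "none":
        # Patient directed that the representative see nothing.
        return False
    if directive_scope == "all":
        # Patient expressly granted full access, including sensitive records.
        return True
    if segment_category in SENSITIVE_CATEGORIES:
        # Recommendation 3: sensitive segments only in emergencies.
        return emergency
    # Recommendation 2: with no directive (or a "care_only" directive),
    # access is presumed limited to records needed for care decisions.
    return needed_for_care
```

Under this sketch, the older stroke patient in Professor Francis’s example would, by default, keep her pregnancy termination and psychiatric history out of her child’s view unless an emergency made those records necessary.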
Graduate Certificate Program in Pharmaceutical & Medical Device Law & Compliance to Start Again, October 7, 2012
Filed under: Compliance, Drugs & Medical Devices, Seton Hall Law
Seton Hall Law School’s Center for Health & Pharmaceutical Law & Policy starts classes again on October 7th for the Graduate Certificate in Pharmaceutical & Medical Device Law & Compliance. The priority application date is September 24, 2012.
The Graduate Certificate in Pharmaceutical & Medical Device Law & Compliance is a non-degree program designed for individuals who seek in-depth knowledge about legal, regulatory, and ethical issues related to the pharmaceutical and medical device industries. Taught exclusively online, it offers students nationwide a targeted immersion in key substantive issues along with the practical skills necessary to research and communicate effectively about the law.
The intensive program is geared to busy professionals who want to cover a significant amount of material in a relatively short period of time. The program is open to students who have earned a baccalaureate degree from an accredited college or university. It is specifically designed to meet the needs of mid- to senior-level professionals in the health care industry, but highly motivated students from other backgrounds are also welcome to apply. It is not necessary to have prior academic or work experience in health care in order to do well in the program.
Additional information and registration is available here.
Why study pharmaceutical and medical device law at Seton Hall School of Law?
Seton Hall Law School has specialized in health law for more than a decade, and its health law program is consistently ranked among the top ten in the nation by U.S. News & World Report. The Law School’s health law faculty specialize in a wide range of health law topics, including healthcare organizations, nonprofit governance, healthcare financing, healthcare fraud and abuse, food and drug law, research with human subjects, genetics and the law, public health law, and bioethics. In addition to training future lawyers, Seton Hall Law offers a Master’s of Science in Jurisprudence degree for individuals working in the health care industry, as well as an innovative compliance certification program for pharmaceutical and medical device professionals. Seton Hall Law is also a center for scholarship and public policy development related to health care, particularly through its Center for Health & Pharmaceutical Law & Policy, whose mission is to foster informed dialogue among policymakers, consumer advocates, the medical profession, and industry.
I have hinted at problems with uniform trade secrecy laws in this volume and a law review article. I plan to continue that line of research in a co-authored work with Dave Levine, exploring the costs of trade secrecy in the finance, energy, and communications sectors. When it comes to “solutions,” I’m increasingly inclined to frame the issue as: how do we operationalize the insights of Michael Carroll’s “Uniformity Costs” concept? In other words, how do we shape doctrine so that it respects the unique economic conditions (and moral imperatives) related to specific industries?
One way to do so is to insist on the autonomy of a subject-matter-defined legal field (versus the trans-substantive aspirations of, say, contract, property, or intellectual property law). The “law of the horse” crowd usually assails that autonomy by warning about the distortionary effects of applying different laws to different sectors. Health law professors shared that worry for a while, debating whether health care law is a “coherent field.” But that anxiety seems to have faded as a distinct arena of health care economics develops and lawyers set to work implementing the massive HITECH and PPACA legislation passed in 2009 and 2010. The stage is now set for a distinctive law of “health information” to emerge, as third party payers and government use their leverage in the sector to tamp down counterproductive IP- and contract-based corporate strategies.
The law of health information is neither more “open” nor more “closed” than information law generally. Free access should be dictated in areas of extreme personal or societal need; in other cases, it may be right to force high payments, either ex ante via taxes, or ex post via high prices, from those with the ability to pay. Privacy should play a far more important role here than it does in the usual Wild West of internet data collection and processing. But once data is truly anonymized, the research imperative for access is perhaps more pressing than in any other area of law (except, perhaps, national security).
For a recent controversy where laws of copyright seem inappropriate in a medical setting, check out this story:
According to the New England Journal of Medicine, after thirty years of silence, authors of a standard clinical psychiatric bedside test have issued take down orders of new medical research. Doctors who use copies of the bedside test which will have been printed in some of their oldest medical textbooks are liable to be sued for up to $150,000. . . . [E]ven the ghosts of positively ancient abandoned copyrights for the very simplest of ideas can be used to block new medical work through legal bullying.
The “thirty years” of silence part makes me want to look into a laches claim. The simplicity of the test also seems to invite a merger defense. On the other hand, perhaps the best answer is compulsory licensing, which should have gotten more attention during the SOPA/PIPA flap. Whatever solution is optimal, the implication of the NEJM piece is clear: health professionals believe their field deserves some autonomy from the normal laws of intellectual property. Popular reaction against secret prices of medical devices and hospital procedures also reflects that view.
In many areas, such rebellions against pricing the priceless have translated into general skepticism about intellectual property. In health care, they may lead to something different: a health information law distinct from the IP and privacy laws of general application.
An eminence grise of cyberlaw once told me that he got into the field in the 1980s because it was one of the few areas where things were “up for grabs” enough that a creative scholar could still have an influence. An elder statesman of the IP field told me that it had gone into “normal science” mode as of 2004 or so. Perhaps those who still want “paradigm shifts” need to work in heavily regulated fields like health information law, where government policymakers are more regulators of (rather than instruments of) vendors and providers.
Health information law is a very exciting field. Lawyers, doctors, and start-ups are re-thinking health care as an information industry. I’ll be speaking on privacy and fair data practices at an upcoming conference. The relationships between privacy, “big data,” and trade secrecy will bear a great deal of attention in coming years.
Software-based automation has raised living standards dramatically. It makes factories more efficient, renders vast amounts of information accessible, and daily improves quality of life in barely noticed ways. To realize these types of advances in health care, government and NGOs have begun to catalyze better data collection, retention, and analysis. Life sciences companies need to report more data on drugs and devices. Hospitals and doctors are incentivized to use electronic health records via stimulus funding and rulemaking based on the HITECH Act’s meaningful use and certification requirements.
How will traditional intellectual property laws interact with these initiatives? Will the increasing need for cooperation and sharing of information alter the landscape of trade secrecy and other IP protections that have often siloed health data? Will providers find alternative funding sources for the collection, retention, and analysis of data, as some traditional IP protections appear increasingly outdated in a world of “big data” and market-driven transparency?
Medical privacy law has focused on assuring the privacy, security, and accuracy of medical data. The post-ACA landscape will include more concern about balancing privacy, innovation, access, and cost-control. Advanced information technology has raised a number of new questions. Beyond HIPAA and HITECH regulation, consumer protection law plays an important role in these fields. (For example, the FTC recently required firms that “score” the health status of individuals based on their pharmacy records to disclose these records to scored individuals.)
Patients are opting to personalize their health records with the help of cloud computing firms; what law governs this digital migration? There is increasing concern about the role of “incidental findings” in medical research and practice; how will regulators and professional groups address them? When employers demand access to employee health records, in what ways can they use them to profile the employee?
We also need to examine the legal aspects of data portability, integrity, and accuracy. When two health records conflict, which takes priority? What is “meaningful use” of an electronic health records system, and how will regulators and vendors assure interoperability between systems? The course will also cover innovators’ efforts to protect their health data systems using contracts, technology, trade secrecy, patents, and copyright, and improvers’ efforts to circumvent those legal and technological barriers to openness.
Finally, what are pharmaceutical companies’ past and present strategies regarding the disclosure of their research, including non-publication of adverse results and ghostwriting of positive outcomes? Will a “reproducible research” movement, popular in the hard sciences, reach pharmaceutical firms? Insurer data will also be a target of reformers (including trade-secret protection of prices paid to hospitals, conflicts over the interpretation of disclosure requirements in the ACA, and state regulation of insurer-run doctor-rating sites). Quality improvement and pilot programs will need good provider and insurer data; how will we ensure they have it?
[Ed. Note: We are pleased to welcome Ana Liggio, Esq., to HRW. She is a health care and technology lawyer, in practice over 15 years. Prior to pursuing her LL.M. in Health Law here at Seton Hall Law, she was Director, Law Department, for Sony Electronics.]
The CMS website explains that meaningful use “means providers need to show they’re using certified EHR technology in ways that can be measured significantly in quality and in quantity.” As CMS moves toward finalizing the Meaningful Use Stage 2 requirements, I would like to introduce the concept of “meaningful experience” as an essential corollary to that of “meaningful use.”
Meaningful experience takes the idea a step further, representing ways to evaluate and encourage the merits of both proposed and existing criteria as seen from the value they bring to the provider and healthcare consumer stakeholders. While “meaningful use” focuses on ensuring that the financial beneficiaries of the Medicare and Medicaid EHR Incentive Program (the “Program”), namely the Certified Electronic Health Record Technology (“CEHRT”) industry and the eligible healthcare providers (insofar as meaningful use bonus payments are at stake), continue to operate their EHR in a purposeful manner, there are additional, important stakeholders to consider. With billions of federal and state dollars earmarked for the Program and a strong interest in seeing EHR enjoy long-term success, taking a broader view of stakeholders and inserting more transparency into their experiences will help the Program thrive. Meaningful Use Stage 2 is the perfect time to look towards ensuring meaningful experience.
The Program is in full swing, with the Centers for Medicare and Medicaid Services (“CMS”) having released the NPRM on Meaningful Use, Stage 2, in the Federal Register on March 7, 2012.
The CMS blog explains: “Today’s proposed rules focus on using EHRs to improve health and health care while reducing the burden on physicians and hospitals where possible.” With early participation rates appearing strong, CMS continues to be careful to keep industry groups engaged and to seek out robust commentary through the NPRM. CMS clearly wants the healthcare industry to continue up the “EHR Escalator” without having anyone jump off for being frustrated or overwhelmed. To date, the strategy is working, as the CEHRT industry and healthcare providers appear to be embracing the Program. However, as Nicolas Terry points out in his article “Anticipating Stage Two: Assessing the Development of Meaningful Use and EMR Deployment,” growth will ultimately have to be endogenous, fueled by innovation and consumer demand.
The comprehensive NPRM for Meaningful Use, Stage 2 demonstrates CMS's commitment to considering the experiences and opinions of the interested industries. The ONC also asks data holders and non-data holders to take a pledge "to empower individuals to be partners in their health through health IT." There is no doubt that the Program is making huge strides, continuing to chip away at the difficult issues of interoperability, access, privacy and security, and pushing the United States slowly but surely toward the much higher healthcare IT standard enjoyed by many other developed nations. Moving into Stage 2, CMS seeks to enhance interoperability among different entities and to further patient involvement by requiring that patients be given increased access to their health information. That being said, the ONC's National Coordinator for Health Information Technology, Farzad Mostashari, explains that Stage 2 is meant to be more "evolutionary than revolutionary." Importantly, Stage 2 also begins an initiative to align the requirements of the Program with other complementary, ongoing healthcare reform initiatives involving national quality measures and the development of ACOs.
Reading through the NPRM, I saw a few areas where CMS could focus to help build a self-sustaining system. First, the initial iteration of the Program was clearly written with an eye toward maximizing meaningful use for family-care and general practitioners, not for other types of practices, such as pediatrics, the various specialties, and physicians whose practices involve little face-to-face patient interaction (e.g., radiologists); these practices deserve further attention. Second, while CMS provides something of a return-on-investment analysis in the NPRM, it apologetically declares it too early in the Program to be able to provide meaningful data; CMS could use the attestation process to collect the necessary data. Finally, healthcare consumers, the taxpayers who fund this Program, should be actively considered and made aware of the enhancements and improvements that comprise the Program, which will offer them a more efficient, accessible, safe and evidence-based healthcare experience; a "meaningful user" designation for CEHRT users who meet certain criteria could be developed to help providers publicize their investment in the Program and the attendant benefits it will bring to their patients. Meaningful Use, Stage 2, is the perfect time to address these issues and move the Program forward in a way that will make it self-sustaining for the long term, not because of incentive funding, but because meaningful use is providing a meaningful experience to the various EHR stakeholders.
As with early versions of the Medicare Shared Savings Program and healthcare reform generally, the focus of the Program's meaningful use objectives and criteria, initially at least, is on general practitioners and how they can use EHRs to advance the overall wellbeing of the population. This goal is laudable, of course, but the population of eligible providers extends well beyond PCPs. Certain objectives and measures allow providers to claim an exclusion if they do not apply to their practice, thereby not penalizing practitioners for whom compliance would be unnecessary and inefficient. Focusing on these different categories of practice, however, could yield alternative objectives and measures. If one were to consider meaningful experience in addition to meaningful use, the attestation would ask EPs who claim exclusions to use, and possibly attest to, alternative meaningful use standards applicable to their practices. For instance, there is a proposed measure for recording the height, weight and blood pressure of 80% of an EP's patients as structured data. There is an available exclusion, however, for EPs who do not believe that recording such vital signs is "relevant to their scope of practice." An EP who claims the exclusion simply gets a pass on this field during the attestation process. Alternatively, a required (or even optional) free-form response area could be provided in the attestation each time an EP claims an exclusion. Over time, CMS would collect data that would allow it to customize attestations, and CEHRT requirements as well, to different specialties, so that meaningful use translates into meaningful experience for those whose practices do not fit the general-practitioner mold on which the first versions of Meaningful Use were based.
Certainly the technology can rather easily accommodate modifications where appropriate, if the effort is made to ask those in the field what would be meaningful to their practices and to encourage them to use the EHR tools available to them in those ways.
Because the proposed rule is anticipated to have an annual effect of over $100 million on the economy, a Regulatory Impact Analysis (RIA) measuring its costs and benefits must be performed. While CMS does a fair job of estimating the costs to providers of implementing EHRs and the costs to taxpayers of funding the Program, it has not done much to quantify the benefits gleaned. The NPRM qualifies its analysis by pointing to various unknowns and a lack of "new data regarding rates of adoption or costs of implementation." Without specific data, it estimates various "high and low" scenarios for different practice settings and ultimately concludes that "there are many positive effects of adopting EHR," as well as various benefits for society. While I tend to agree with this conclusion as a general matter, why not collect the actual data during the attestation process? Ask the EHR attesters how much their systems cost to install and to maintain. Ask the EHR attesters where the systems are adding value to their practices and for their patients. Yes, it is a leap of faith to ask these questions because the answers may not paint a perfect picture, but they will offer an honest representation of the current state that can be addressed going forward. It is only fair to give the stakeholders an honest assessment, and it would not be difficult to collect the data. EHR is all about collecting healthcare data and crunching the numbers to spot trends and identify areas for improvement; let's apply those same principles to the EHR technology itself.
Finally, to assist providers who have made the investment and will continue to feed important data to the various government health databases, CMS could offer some type of certification that the providers could use in marketing their practices. For all the good that EHR is meant to do in terms of patient safety, efficiency of care and meaningful communication between patients and their providers, let’s devise a way to inform patients about which providers are running state-of-the-art practices. Providers who attest to meeting the meaningful use requirements could be offered the option of using a certified meaningful user designation and displaying a certain logo, all of which would indicate to the public that such providers are using the latest healthcare technology. For healthcare consumers who consider it important to have the ability to access their records or have their prescriptions transmitted electronically, for example, this designation would help lead them to the types of practices they desire. Assuming this is the future of healthcare and what the American public desires or will come to desire of its healthcare providers, such a tool would be useful to the providers and healthcare consumers alike.
At the end of the day, the success of the EHR Program, and the value it will have brought to the US healthcare system, will be measured by the experience of healthcare providers and consumers. In the best-case scenario, there will be data showing that the Program has achieved the desired results with a minimal burden placed upon providers. But what will actually entice providers to continue to make "meaningful use" of these systems is meaningful use producing an experience that they, and their patients, deem worthwhile. As such, CMS should use the attestation process and the resulting data to continuously measure actual costs and benefits and make adjustments as needed. During the attestation process, it could ask providers to suggest alternative meaningful uses for EHRs when the existing measures do not apply, and to volunteer cost data and their impressions of meaningfulness. Finally, CMS could give providers a way to publicize their commitment to using technology to enhance patient care. Some time and effort devoted to meaningful experience will allow meaningful use to translate into a self-sustaining, successful program.
[Ed. note: this piece originally ran on April 17, 2012, but was lost in the vagaries of cyberspace to a blog mishap. It's just too good to lose and so here enjoys a repeat performance]
If one jumbo jet crashed in the US each day for a week, we’d expect the FAA to shut down the industry until the problem was figured out. But in our health care system, roughly 250 people die each day due to preventable error. A vice president at a health care quality company says that “If we could focus our efforts on just four key areas — failure to rescue, bed sores, postoperative sepsis, and postoperative pulmonary embolism — and reduce these incidents by just 20 percent, we could save 39,000 people from dying every year.” The aviation analogy has caught on in health care, as patient safety advocate Lucian Leape noted in his classic 1994 JAMA article, Error in Medicine. Leape notes that airlines have become far safer by adopting redundant system designs, standardized procedures, checklists, rigid and frequently reinforced certification and testing of pilots, and extensive reporting systems. Leape, Peter Pronovost and other advocates have been pushing for the adoption of similar methods in health care for some time, and have scored some remarkable successes.
But the aviation model has its critics. The very thoughtful finance blogger Ashwin Parameswaran argues that, “by protecting system performance against single faults, redundancies allow the latent buildup of multiple faults.” While human expertise depends on an intuitive grasp, or mapping, of a situation, perhaps built up over decades of experience, technologized control systems privilege algorithms that are supposed to aggregate the best that has been thought and calculated. The technology is supposed to be the distilled essence of the insights of thousands, fixed in software. But the persons operating in the midst of it are denied the feedback that is a cornerstone of intuitive learning. Parameswaran offers several passages from James Reason’s book Human Error to document the resulting tension between our ability to accurately model systems and an intuitive understanding of them. Reason states:
[C]omplex, tightly-coupled and highly defended systems have become increasingly opaque to the people who manage, maintain and operate them. This opacity has two aspects: not knowing what is happening and not understanding what the system can do. As we have seen, automation has wrought a fundamental change in the roles people play within certain high-risk technologies. Instead of having ‘hands on’ contact with the process, people have been promoted “to higher-level supervisory tasks and to long-term maintenance and planning tasks.” In all cases, these are far removed from the immediate processing. What direct information they have is filtered through the computer-based interface. And, as many accidents have demonstrated, they often cannot find what they need to know while, at the same time, being deluged with information they do not want nor know how to interpret.
A stark choice emerges. We can either double down on redundant, tech-driven systems, or we can try to restore smaller scale scenarios where human judgment actually stands a chance of comprehending the situation. We will need to begin to recognize this regulatory apparatus as a “process of integrating human intelligence with artificial intelligence.” (For more on that front, the recent “We, Robot” conference at U. Miami is also of great interest.)
Another recent story emphasized the importance of filters in an era of information overload, and the need to develop better ways of processing complex information. Kerry Grens’s article “Data Diving” emphasizes that “what lies untapped beneath the surface of published clinical trial analyses could rock the world of independent review.”
[F]or the most part, [analysts] rely simply on publications in peer-reviewed journals. Such reviews are valuable to clinicians and health agencies for recommending treatment. But as several recent studies illustrate, they can be grossly limited and misleading. . . . [There is] an entire world of data that never sees the light of publication. “I have an evidence crisis,” [says Tom Jefferson of the Cochrane Collaboration]. “I’m not sure what to make of what I see in journals.” He offers an example: one publication of a Tamiflu trial was seven pages long. The corresponding clinical study report was 8,545 pages. . . .
Clinical study reports . . . are the most comprehensive descriptions of trials’ methodology and results . . . . They include details that might not make it into a published paper, such as the composition of the placebo used, the original protocol and any deviations from it, and descriptions of all the measures that were collected. But even clinical study reports include some level of synthesis. At the finest level of resolution are the raw, unabridged, patient-level data. Getting access to either set of results, outside of being trial sponsors or drug regulators, is a rarity. Robert Gibbons, the director of the Center for Health Statistics at the University of Chicago, had never seen a reanalysis of raw data by an independent team until a few years ago, when he himself was staring at the full results from Eli Lilly’s clinical trials of the blockbuster antidepressant Prozac.
There will be a growing imperative to open up all of the data as concerns about the reliability of publications continue to grow.
Hospital readmissions for chronic diseases such as asthma, congestive heart failure and diabetes have been estimated to account for over 80% of hospital inpatient stays. In an effort to reduce these admissions and consequently lower healthcare costs, AT&T and Intuitive Health have collaborated to pilot a home-based remote patient monitoring solution that would allow patients to spend more time at home, engaged in their own care, rather than with healthcare providers at medical facilities. Using wireless connectivity provided by AT&T, the system sends data from a patient’s unobtrusive personal health device to a secure software platform integrated into the health ecosystem through Intuitive Health’s technology, with emphasis placed on the confidentiality of the transmission of the patient’s personal information.
“Innovation is desperately needed outside the four walls of the hospital,” said Eric Rock, CEO and Founder of Intuitive Health. “In order to increase our nation’s quality of care and gain control of our healthcare spending, patients of all ages and technical ability must be given intuitive tools to improve their own health, while remaining engaged and monitored by their caregivers remotely.”
In its April 2010 position paper, “Technologies for Remote Patient Monitoring in Older Adults,” the Center for Technology and Aging hypothesized that the U.S. health care system could reduce costs by nearly $200 billion over the next 25 years if remote monitoring tools were utilized for chronic diseases. To be sure, such figures are not easily verified; the number and types of people who will choose to utilize such treatment cannot be easily predicted.
The collaboration between AT&T and Intuitive Health is not the first of its kind, and with the increasing popularity of smartphones, it is reasonable to anticipate that mobile technology will play a role in the rise of remote patient monitoring services. It is perhaps worthwhile, however, to revisit Michael Ricciardelli’s related post, written three years ago, as a way to evaluate the role technology has played, and may continue to play, in health reform.
Since HHS’s data breach notification regulations went into effect in September 2009, 385 incidents affecting 500 or more individuals have been reported to HHS, according to its website. A total of 19 million individuals have been affected by a large data breach since 2009. The regulations require a covered entity that discovers a reportable breach affecting 500 or more individuals to report the incident to the HHS Office for Civil Rights immediately. After an investigation, HHS publicly posts information about the reported incident on its website, on what has become known as the “Wall of Shame.” Of the 385 reported incidents, six separate incidents each affected a million individuals or more. In its 2011 annual report to Congress, HHS reported that covered entities notified approximately 2.4 million individuals affected by a breach in 2009 and 5.4 million the following year. The number grew in 2011 and will likely continue to grow in 2012. To date, the largest breach took place in October 2011 at Tricare, the health insurer of American military personnel; it affected 4,901,432 individuals after storage tapes containing protected health information (PHI) were stolen from a vehicle. These numbers are staggering, but fortunately more can be done, and should be done, to prevent data breaches.
Data breaches can cause great harm to the affected individuals, providers and institutions. Individuals may experience embarrassment and harassment because sensitive health information was released. Individuals are vulnerable to identity theft and financial fraud if personal information such as social security numbers were accessed. More frequently, institutions are offering credit monitoring services to affected individuals to monitor for potential fraud. Similarly, data breaches carry a very high cost for institutions that will have to spend great sums to investigate and report a breach to HHS, the media and the affected individuals. An institution or provider’s reputation can also be harmed through negative publicity and the loss of consumers. More institutions are hiring public relations teams after a breach to minimize the amount of fallout and negative publicity. The threat of litigation and class action lawsuits following a breach is also present and very real. Stanford Hospital, Tricare, and Sutter Health are all facing million and billion dollar class action lawsuits for their 2011 data breaches.
The bad news is that data breaches are impossible to predict, and it is impossible to protect against every type of breach. Unfortunately, even the strongest policies, precautions and security measures cannot protect an entity from a hacker, a thief, or an employee’s or business associate’s honest mistake. As more providers and institutions adopt electronic health record systems and digitize their records, data breaches will continue to occur and large breaches will be spotlighted by the media. Pursuant to the regulations, a covered entity must alert a prominent media outlet if a reported breach affects more than 500 residents of a state. Based on the events of last year alone, it is clear that the media loves to report on data breaches and will continue to do so. Hopefully this public exposure will serve to increase accountability to the public rather than instill fear and hurt consumer confidence in the EHR movement.
The good news is that more can be done by providers and institutions to prevent harmful and costly data breaches. Data security and patient privacy should be a focus of the industry in the upcoming years because they are just as important as meaningful use certification. The benefits flowing from the Medicare incentive payments that an institution may receive under the HITECH Act can be canceled out in the event of a large and debilitating data breach. It would be wise for covered entities to focus on preventing data breaches as much as on achieving meaningful use.
There is no easy solution to preventing breaches, but encryption is one surefire way an entity can better protect itself from a costly breach. As entities become more familiar with EHR systems and recognize the risks involved in storing and transferring PHI data, implementing encryption technology should become a top priority for each entity.
Encryption of PHI is a major step a provider or institution can take to secure its sensitive patient data. Encryption is the use of an algorithmic process to transform data into a form in which there is a low probability of assigning meaning without use of a confidential process or key. According to HHS guidance, if an entity encrypts its data in accordance with the National Institute of Standards and Technology’s encryption standards, then any breach of the encrypted data falls within a safe harbor and does not have to be reported. This is an incredibly important safe harbor that could save an entity a great deal of money. It is shocking that more entities, especially those with the means and resources to install a qualifying encryption system, do not utilize encryption technology on their electronic devices, particularly portable devices.
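To make the “low probability of assigning meaning without the key” idea concrete, here is a minimal sketch of encrypting a record with AES-256-GCM, a NIST-approved authenticated-encryption mode. This is an illustration only, not compliance advice: it assumes the widely used third-party Python `cryptography` package, and it leaves out the key-management and storage practices that the safe harbor actually turns on.

```python
# Illustrative sketch only: encrypting a (hypothetical) PHI record with
# AES-256-GCM via the third-party "cryptography" package. Real deployments
# must also handle key management per NIST guidance, which this omits.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # secret key; protect it separately
aesgcm = AESGCM(key)

record = b"Patient: Jane Doe; Dx: asthma"   # hypothetical PHI record
nonce = os.urandom(12)                      # 96-bit nonce, unique per message

ciphertext = aesgcm.encrypt(nonce, record, None)
# Without the key, the ciphertext carries no assignable meaning; with it,
# the original record is recoverable exactly.
assert aesgcm.decrypt(nonce, ciphertext, None) == record
```

The point of the sketch is that losing a laptop holding only `ciphertext` (and not `key`) is a very different event from losing one holding `record` in the clear.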
Of the 385 reported breach incidents, thirty-nine percent involved a lost or stolen laptop or other portable media device containing unencrypted PHI. A report recently released by Redspin, an IT security firm, states that data breaches stemming from employees losing unencrypted devices spiked 525 percent in the last year alone. This statistic confirms that devices, including laptops, tablets and smartphones, pose a very high risk for a data breach. Redspin reported that eighty-one percent of healthcare organizations now use smartphones, iPads, and other tablets, but forty-nine percent of respondents in a recent healthcare IT poll by the Ponemon Institute said that nothing was being done to protect the data on those devices. At the very least, these reports and the statistics on HHS’s “Wall of Shame” should encourage entities to encrypt their portable electronic devices that contain sensitive PHI.
There are, of course, costs associated with adopting encryption technology in an EHR system: costs to install the system and to maintain it with the help of an IT expert. Encryption can also slow down the processes used in sharing information; after all, one of the main goals of an EHR system is to make it easier for providers to share health information about their patients. An entity should work with an IT expert to determine what information should be encrypted in order to preserve the efficiencies of an EHR system. Despite the costs, the money and resources spent implementing encryption technology can be well worth it and are a smart investment for any entity with an EHR system. In a study published in 2011, the Ponemon Institute found that the cost of a data breach was $214 per compromised record, with the average total cost of a breach at $7.2 million. In light of the large data breaches that have been reported, it is clear that the costs of a breach can far exceed the costs of implementing encryption technology.
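A quick back-of-envelope calculation makes the comparison vivid. Only the two Ponemon figures quoted above come from the study; the $150,000 encryption budget is a placeholder of my own, not a real price quote:

```python
# Back-of-envelope breach-cost arithmetic using the Ponemon Institute
# figures cited above. The encryption budget below is a hypothetical
# placeholder chosen purely for illustration.
COST_PER_RECORD = 214          # dollars per compromised record (Ponemon, 2011)
AVG_BREACH_COST = 7_200_000    # average total cost of a breach (Ponemon, 2011)

# Number of compromised records implied by the average breach cost
implied_records = AVG_BREACH_COST / COST_PER_RECORD
print(f"Implied records per average breach: {implied_records:,.0f}")   # ~33,645

# A hypothetical $150,000 encryption rollout pays for itself if it
# prevents the compromise of only this many records:
ENCRYPTION_BUDGET = 150_000
breakeven = ENCRYPTION_BUDGET / COST_PER_RECORD
print(f"Breakeven records protected: {breakeven:,.0f}")                # ~701
```

On those numbers, preventing the exposure of even a few hundred records, a fraction of one stolen laptop's typical contents, would cover the assumed rollout cost.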
Under the HITECH Act and HHS’s interim final rule, encryption of health information is not mandatory. It remains to be seen whether HHS will impose a mandatory encryption policy on all devices or, at the very least, all portable devices capable of storing or transferring PHI, when it releases the final version of the data breach notification regulations sometime this year. The health care industry’s lack of encryption for patient information has drawn attention on Capitol Hill. At a November 2011 hearing before the Senate Judiciary Committee’s panel on Privacy, Technology and Law, Deven McGraw of the Center for Democracy and Technology testified that “we know from the statistics on breaches that have occurred since the notification provisions went into effect in 2009 that the healthcare industry appears to be rarely encrypting data.” At the hearing, Senator Tom Coburn, a physician himself, and Senator Al Franken, the chair of the panel, both voiced their concern over patient privacy protection and the current regulatory scheme. Senator Franken has said that he is contemplating legislation to encourage encryption by providers, although no action has been taken.
In the interim, it is reasonably clear that most, if not all, entities can benefit from implementing encryption technology when considering the costs and headaches associated with a data breach. When encryption is done properly, it has the potential of saving an entity a large sum of money, perhaps millions of dollars, in costs and fines — and that should be reason enough for entities to start taking this step in EHR technology.
Last year I published a piece called “Beyond Innovation and Competition,” questioning the dominance of those values. Economists celebrate innovation and competition as the main source of future growth. Innovation has become the central focus of Internet law and policy. While leading commentators sharply divide on the best way to promote innovation, they routinely elevate its importance. Business writers have celebrated search engines, social networks, and tech startups as model corporations, bringing creative destruction and “disruptive innovation” in their wake. Maximum innovation is the goal, and competition is billed as the best way of achieving it. Players in the vast and dynamic tech marketplace are supposed to constantly strive to innovate in order to attract consumers away from rivals.
In the piece, I explain how both competition and innovation can be as destructive as they are constructive. There are many social values (including privacy, transparency, predictability, and stability), and companies can compete for profits in ways that erode those values. In an era of inequality and hall-of-mirrors stock market valuations, innovations of marginal or negative impact on society at large can be vastly overvalued by a stampede of fickle investors.
The shortcomings of the innovation and competition story also play out in health information technology. Stimulus legislation in 2009 provided many carrots and sticks for doctors to digitize their recordkeeping systems, ranging from bonuses now to reimbursement haircuts later this decade if they fail to implement the technology. Congress structured the incentives to encourage a competitive and innovative marketplace in health information technology. But many doctors are shying away from implementation, in part because they fear that the fast and loose ethics of the market can’t mesh with a medical culture of constant commitment to quality care.
Susan Jaffe’s article for the Center for Public Integrity examines doctors’ fears about adopting any given software suite. According to Jaffe, “570 different electronic health systems certified by private organizations for non-hospital settings may be used to qualify for the” stimulus funds. The long-term consequences of the choice make the jam-shopping examples in Barry Schwartz’s book The Paradox of Choice seem quaint:
The systems can vary in appearance, content, organization and special features. Some can be customized by users in different ways, at no cost or some cost, or not at all. Some are compatible with other systems now, eventually or, some critics say, maybe never. . . . The costs of the systems remain daunting, despite the bonuses, particularly in areas that have been hit hard by an ailing economy.
The pricetag varies widely depending on the type and size of the medical practice, whether new computers are purchased and the extent of customization, among other things. Software alone can cost from $2,000 to $10,000 per doctor. All told, the cost jumps to roughly $20,000 per doctor, according to a regional extension center consultant who advises physicians in northeast Ohio. On top of that, manufacturers charge hefty annual fees for technical support and periodic upgrades that together can amount to about 35 percent of the upfront costs. The systems are priced in a way that does not make comparison shopping “easy or necessarily valid,” said Dottie Howe, a spokeswoman for the Ohio regional extension center. There is no basic price because each company offers different components, features, options, and level of technical support. . . .
Most manufacturers will also charge the doctors to move the information in their current system to the new one. There could be extra [ongoing, monthly] charges to connect to other systems too.
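To see what those quoted figures add up to, here is a rough multi-year cost sketch per practice. Only the $20,000-per-doctor upfront figure and the 35-percent annual fee rate come from Jaffe's reporting; the three-doctor practice size and five-year horizon are assumptions I chose for illustration, and data-migration and connection charges are left out:

```python
# Rough cost sketch built from the figures quoted in Jaffe's article:
# ~$20,000 all-in upfront per doctor, plus annual support/upgrade fees
# of about 35% of upfront costs. Practice size and time horizon are
# my own illustrative assumptions, not from the article.
UPFRONT_PER_DOCTOR = 20_000    # dollars (software, hardware, setup)
ANNUAL_FEE_RATE = 0.35         # fraction of upfront costs, charged yearly
DOCTORS = 3                    # hypothetical small practice
YEARS = 5                      # hypothetical planning horizon

upfront = UPFRONT_PER_DOCTOR * DOCTORS
annual_fees = upfront * ANNUAL_FEE_RATE * YEARS
total = upfront + annual_fees
print(f"Upfront: ${upfront:,}")                       # $60,000
print(f"Support fees over {YEARS} years: ${annual_fees:,.0f}")  # $105,000
print(f"Five-year total: ${total:,.0f}")              # $165,000
```

Note that on these assumptions the recurring fees eclipse the upfront purchase price well before year five, which helps explain why doctors treat vendor selection as a long-term commitment rather than a one-time buy.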
Doctors have also been burned by sharp operators that emphasize slick salesmanship over solid service:
[T]he Southwest Family Physicians group is worried . . . They bought an electronic health record system five years ago that is now nearly obsolete. The manufacturer was taken over by another company that provides minimal technical support . . . “The salesman said ‘you’re buying a Cadillac, this is going to be the greatest thing,’ ” [one doctor] recalled. But that system can’t display an X-Ray image or send a prescription electronically to a pharmacy. “We’ve got the Model T Ford,” he said.
It does appear that regional extension centers are doing some work to keep pricing reasonable. Jaffe’s article focuses on Ohio, where five “preferred vendors” “agreed to charge prices ‘as good as or better than’ prices offered to other regional extension centers, to provide onsite assistance when a practice turns on its electronic health record system for the first time, offer technical support for at least six years, and limit annual cost increases for continuing technical support, among other things.” But consider the bizarrely proprietary nature of pricing data:
Whether the five preferred vendors offer a better deal than their non-preferred competitors is not known because the state regional extension center doesn’t have pricing information from non-preferred vendors, said Howe, the spokeswoman for the state’s regional extension center. Pricing from the preferred vendors are confidential, she said. And despite their preferred status, the five companies do not guarantee that eligible health care providers who purchase their systems will receive the government’s bonus payments.
I discussed the troubling degree of secrecy in health care before, and I’m very sad to see it persist here. The doctors in Jaffe’s story are making reasonable demands: to be able to understand the nature of the commitment they are making, to avoid big financial losses, and not to be burned by fly-by-night operators attracted only by the government subsidy money. They want to assure that the basic health care values of access, cost-control, and quality are reflected in the software they use.
We are seeing the opening stages of a battle between a medical sector committed to maintaining its own autonomy and traditions, and a tech sector that wants to commoditize health data in as standardized a form as futures markets homogenized corn grades, or credit scores tranched residential mortgage backed securities. Commenting on the demise of Google Health, an informatics expert said that “Google is unwilling, for perfectly good business reasons, to engage in block-by-block market solutions to health-care institutions one by one, and expecting patients to actually do data entry is not a scalable and workable solution.” To be sure, the company can’t expect to make the same profit margins in the health sector as it does in the online ad business. But the “instant millions” ethos of Silicon Valley doesn’t fit well with a sector where we are in principle committed to serving everyone, regardless of ability to pay.
Economist John Van Reenen has observed that the US has a particularly innovative economy in part because our markets are so good at crushing badly run firms. It’s probably good that garden equipment suppliers, toothpaste makers, and pie bakers know they can be out of business in a month or two if they’re “off their game” for a short time. But if I had just entrusted three years of medical records to a vendor who suddenly went out of business, I’d take little comfort in the idea that a marginally better competitor had knocked it out of the market. The transition to a new vendor can be slow and costly—doctors in Jaffe’s story speak of seeing one-third to one-half fewer patients over weeks or months as they learn a new system.
At a Yale SOM Health Care conference in 2009, the Chief Medical Officer of a major player in the field remarked to me that choosing an HIT vendor is “like a marriage—you don’t end the relationship lightly.” At first I thought the remark self-serving. But the more one examines the HIT field, the more important it appears to get standard recordkeeping, support capabilities, and interoperability right at the outset, rather than leaving doctors to negotiate the wreckage of several generations of battling systems. Think about how chaotic online music sales seemed before iTunes. Perhaps Apple (whose iPads are already beloved by many docs) is going to bring a swift and highly profitable order to this field, too. I hope the ONC and other decisionmakers will effectively regulate whatever behemoth eventually emerges, vindicating the public values that competition and innovation are unlikely to promote.
Photo credits to Aleksandar Šušnjar, Jakub Halun and loki11.
I look forward to reconnecting with everyone who is attending the health law professors conference in Chicago. My presentation will be applying some of the ideas of Scott Peppet (on self-quantification and unraveling) to personal health records. I found these ideas from Peppet’s post on biometric identification particularly interesting:
The biometric technologies firm Hoyos (previously Global Rainmakers Inc.) recently announced plans to test massive deployment of iris scanners in Leon, Mexico, a city of over a million people. . . . [T]he company’s roll-out strategy is explicitly premised on the unraveling of privacy created by the negative inferences & stigma that will attach to those who choose not to participate. Criminals will automatically be scanned and entered into the database upon conviction. Jeff Carter, Chief Development Officer at Hoyos, expects law abiding citizens to participate as well, however. Some will do so for convenience, he says, and then he expects everyone to follow: “When you get masses of people opting-in, opting out does not help. Opting out actually puts more of a flag on you than just being part of the system. We believe everyone will opt-in.” (For the full interview, see Fast Company’s post on the project.)
I’ve previously looked at the limits of individualist accounts of autonomy in work on pharmaceuticals (here and here), and scholars like Robert Ahdieh are questioning individualism in law & economics generally. As Nic Terry has argued, many of the critiques of CDHC apply to PHRs, and vice versa.
As of a few years ago, “it wasn’t illegal to hire and fire people based on their smoking habits” in 21 states. I think there will be many difficult questions raised in coming years by the growth of medical records of all types, and how many secondary uses of them are permitted. For example, some dating sites will now verify the income and assets of their users. How soon before they (and other certification and evaluation intermediaries) start vouching for health profiles? Does law have a role in these situations? I’ll try to explore these questions, and I’ll post more details about the presentation after getting some feedback.
Filed under: Accountable Care Organization, Hospital Finances, Physician Compensation
One of the many $64,000 questions in the accountable care organization (ACO) debate has been who should lead these organizations. In a policy adopted in November 2010, the American Medical Association (AMA) made clear its view that ACOs must be physician-led. The American Hospital Association (AHA) refrained (at least in its public letter to CMS) from asserting its entitlement to the ACO helm, based, for example, on its management experience and pools of capital. Instead, it simply urged CMS to “defer details of the organization, such as leadership and management structure, to each ACO.”
CMS seems to have heeded the AHA’s advice because its recently released proposed rule does not directly take on this normative debate. (See Summary of CMS Proposed Rule on Accountable Care Organizations recently posted by Jordan T. Cohen for an overview of the proposed rule.) While “ACO participants must have at least 75 percent control of the ACO’s governing body” to be eligible for participation in the Shared Savings Program (proposed Section 425.5(d)(8)), the definition of “ACO participant” in the proposed rule includes physicians and hospitals, among others (proposed Section 425.4).
Similarly, the proposed rule simply requires that the “ACO’s operations must be managed by an executive, officer, manager, or general partner whose appointment and removal are under the control of the organization’s governing body and whose leadership team has demonstrated the ability to influence or direct clinical practice to improve efficiency processes and outcomes” (proposed Section 425.5(9)(ii)). The proposed rule does not address who or what would make the best such leader.
The proposed rule, however, clearly preserves a role for physicians to form and lead ACOs. For example, it recognizes that ACOs may be comprised of professionals in group practice arrangements and networks of individual practices, independent of hospitals (proposed Section 425.5(b)).
In addition, “[c]linical management and oversight [of the ACO] must be managed by a full-time senior-level medical director . . . who is a board-certified physician . . .,” and “[a] physician-directed quality assurance and process improvement committee must oversee an ongoing action-oriented quality assurance and improvement program” (proposed Sections 425.5(9)(iii) and (iv)).
The proposed rule also builds in a preference for ACOs comprised of all physicians or physician groups with fewer than 10,000 assigned beneficiaries by proposing to exempt them from the 2 percent net savings threshold adjustment under the one-sided model (proposed Section 425.9(c)(4)(i)). It also proposes to vary confidence intervals, which affect the minimum savings rate, by the size of the ACO in the one-sided model “to improve the opportunity for groups of solo and small practices to participate in the Shared Savings Program” (Preamble to proposed rule at Section II.F.10).
But on a practical level, the specifics of CMS’ proposal may — unintentionally, perhaps — give hospitals the greater chance to take the reins, at least initially. An apparently leaked CMS internal discussion document reflects some level of concern that physicians may have a hard time taking the lead with ACOs.
The proposed rule’s regulatory impact analysis estimates that the average start-up investment and first year operating expenditures for an ACO in the Shared Savings Program will be $1,755,251. In addition, the proposed rule uses a six-month claims run-out (proposed Section 425.7(a)). Presumably, that means ACOs — assuming they satisfy all program requirements — will not see a dime of shared savings for more than eighteen months. CMS also proposes to withhold 25 percent of any earned shared savings accrued in a given year to ensure repayment of any losses to the Medicare program in subsequent years of the three-year ACO agreement (proposed Section 425.5(d)(6)(iii)).
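To see how these provisions compound, here is a back-of-the-envelope sketch of the cash-flow timing, using only the figures cited in this post; the $1 million in earned savings and the exclusion of CMS processing time are illustrative assumptions, not anything in the proposed rule:

```python
# Back-of-the-envelope ACO cash-flow sketch using figures cited in the post.
# The earned-savings figure below is hypothetical, chosen for illustration.

START_UP_COST = 1_755_251   # CMS-estimated average start-up + first-year operating cost
WITHHOLD_RATE = 0.25        # 25% of earned shared savings withheld each year
RUN_OUT_MONTHS = 6          # claims run-out delays payment after each performance year

def net_payout(earned_savings: float) -> float:
    """Shared savings actually paid out for a year, after the 25% withhold."""
    return earned_savings * (1 - WITHHOLD_RATE)

def months_until_first_payment(performance_year_months: int = 12) -> int:
    """Earliest a first-year payment could arrive: a full performance year
    plus the claims run-out (ignoring CMS processing time)."""
    return performance_year_months + RUN_OUT_MONTHS

# A hypothetical ACO earning $1M in shared savings in year one:
print(net_payout(1_000_000))         # 750000.0
print(months_until_first_payment())  # 18
```

Even on these optimistic assumptions, the ACO would still be roughly $1 million short of its CMS-estimated start-up outlay after the first payout arrives, some eighteen months in.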
Even if private physicians can amass the capital to make these upfront investments, there is, of course, no guarantee they will recoup their outlays. A recent study published online by the New England Journal of Medicine, as reported by the American Medical Association, found that participants in CMS’ Physician Group Practice Demonstration did not recoup, at least in the initial years of the demonstration, all of the money they invested to establish ACOs. As the AMA summarized:
Early adopters, for the most part, did not recoup their set-up costs in the first three years of operation. The 10 integrated health systems that were studied spent an average of $1.7 million to take part in the demonstration project. Eight received no shared savings payments in the first year of the project. Six got a payment in the second year, and five received a bonus in the third year.
The Everett Clinic in Washington, for example, reportedly spent approximately $1 million on infrastructure for its ACO but recouped only $129,268 in shared savings during the first four years of the demonstration project.
According to a 2007 report from the National Center for Health Statistics (NCHS), in 2003-04, 80.6 percent of office-based medical practices in the United States consisted of one or two practitioners and 94.8 percent had five or fewer practitioners. The risks associated with forming an ACO are difficult for these smaller practices to absorb, especially when, at best, the ACO will see 75 percent of its portion of any shared savings upwards of eighteen months down the road and could instead be responsible for its share of losses. It is not clear how many small practices are willing and able to assume these risks without substantial financial or management support. Not surprisingly, the AMA’s statement on the proposed ACO rule specifically identifies “the large capital requirements to fund an ACO” as a significant barrier that must be addressed if physicians in all practice sizes and settings are to be able to successfully lead and participate in ACOs.
Another aspect of the proposed rule that may present a particular challenge to independent physicians is proposed Section 425.11(b)’s requirement that “[a]t least 50 percent of an ACO’s primary care physicians must be meaningful [Electronic Health Records (EHR)] users, using certified EHR technology as defined in §495.4, in the [Health Information Technology for Economic and Clinical Health (HITECH)] Act and subsequent Medicare regulations by the start of the second performance year in order to continue participating in the Shared Savings Program.”
Physician practices indisputably have increased their use of EHR systems in recent years. According to the National Ambulatory Medical Care Survey conducted by NCHS (reported here), only 17 percent of physicians in 2008 reported that they had a “basic” EHR system (which is defined as having electronic patient demographic information, patient problem lists, patient medication lists, clinical notes, orders for prescriptions, and laboratory and imaging results). Recent NCHS data (reported here) show that the figure has since climbed by nearly 50 percent, to 24.9 percent of office-based physicians.
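The “nearly 50 percent” figure is a relative increase, which is easy to misread as a change in percentage points; a quick check using only the two survey numbers above:

```python
# Confirm that the jump from 17% to 24.9% is a ~50% *relative* increase,
# not a 50-percentage-point change.
old_share = 17.0    # percent of physicians with a basic EHR system in 2008
new_share = 24.9    # percent in the more recent NCHS data

relative_increase = (new_share - old_share) / old_share * 100
print(round(relative_increase, 1))  # 46.5 -> "nearly 50 percent"
```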
But basic use of EHRs is not sufficient under the proposed rule, which requires “meaningful use.” Survey data from the Office of the National Coordinator for Health Information Technology, as reported here, show that only 41.1 percent of office-based physicians plan to apply for the billions of federal dollars in EHR incentive payments available to Medicare and Medicaid providers under the HITECH Act, compared with 80.8 percent of acute care non-federal hospitals. Additionally, as reported here, a recent survey from the Medical Group Management Association (MGMA) found that only 13.6 percent of medical practices that have adopted EHRs and plan to apply for the EHR Meaningful Use incentives currently are able to satisfy the fifteen core criteria necessary to establish that they are meaningful users. Medical practices have a long row to hoe.
But the news is not all bad for physicians. The MGMA survey also found something that suggests this issue is far from resolved on a theoretical or practical level. As reported here, “almost 20 percent of responding independent medical practices that owned EHRs said that they had optimized their uses of EHRs” whereas “[o]nly 8.8 percent of responding hospitals — or [integrated delivery system (IDS)] — owned practices with EHRs said they had optimized their EHR use.”
Almost certainly, it is not just a coincidence that physicians are devoting their energy to becoming meaningful EHR users just as the first EHR Meaningful Use incentive payments are available. If CMS or private foundations develop additional incentive programs to help smaller practices cover the start-up costs associated with forming an ACO, the individual physician could still be in this game. Notably, the AMA’s brief statement on the proposed ACO rule reiterates its recommendation to CMS to increase access to loans and grants for small practices as part of this puzzle. It remains to be seen if any such programs are viable in this fiscal climate.
As promised, future posts will address the normative question of who should lead ACOs.
The Washington Post recently featured Lena Sun’s reporting on why many physicians are wary of adopting an electronic medical records system. As noted in the piece,
Many are aware that beginning this year, health-care professionals who effectively use electronic records can each receive up to $44,000 over five years through Medicare or up to $63,750 over six years through Medicaid. But to qualify, doctors must meet a host of strict criteria, including regularly using computerized records to log diagnoses and visits, ordering prescriptions and monitoring for drug interactions. And starting in 2015, those who aren’t digital risk having their Medicare reimbursements cut.
Deven McGraw, director of the health privacy project at the Center for Democracy & Technology, complains that, despite all these requirements, patient confidentiality concerns are being neglected:
But no federal regulations clearly require that doctors turn the data encryption on or prevent those who don’t do so from getting paid. . . . “This is a point of frustration,” said McGraw, who sits on an advisory group that sought unsuccessfully to prevent those who violate privacy regulations of the federal Health Insurance Portability and Accountability Act, or HIPAA, from getting incentive money.
Some older doctors may find it easier to retire than to get on board with new EMR systems. We frequently hear complaints about Luddite doctors resisting technology that has long been adopted by other sectors. But, as one commentator recently insisted, a doctor is not a bank. To get a sense of how frustrated doctors can become because of the new health IT (and the legal contracts that accompany it), check out this parody website for the faux firm Extormity. It announces a memorable experience for doctor clients/conscripts:
At the confluence of extortion and conformity lies Extormity, the electronic health records mega-corporation dedicated to offering highly proprietary, difficult to customize and prohibitively expensive healthcare IT solutions. Our flagship product, the Extormity EMR Software Suite, was recently voted “Most Complex” by readers of a leading healthcare industry publication.
I loved this description of a firm committed to maximizing the value of its intellectual property:
The Extormity EMR Software Suite is built on a proprietary software model renowned for its complexity. This proprietary platform and all of its components must be procured and implemented as a complete package we call the Extormity Bundle™ (which describes both our comprehensive package and its associated cost).
Operating the Extormity Bundle requires a phalanx of servers, which of course need to be replicated for redundancy. Fortunately, Extormity acts as a value-added reseller of these servers, which we pre-load with operating software. This allows us to mark-up the cost of the servers and charge for server configuration. In addition, the server software carries with it steep annual license fees.
Let’s hope the ONC’s ongoing regulatory process can help reduce the risk of Extormity-style raw deals for doctors. Given the recent flap over the FDA’s effective imprimatur for an extreme drug price increase, no DC agency should set in motion a process that could lead to prohibitively expensive fees for an essential aspect of health care.
X-Posted: Health Law Prof Blog.
Filed under: Electronic Medical Records, Private Insurance
As ACA implementation lumbers ahead, and challenges to it slouch toward the Supremes, the U.S. health care system’s arbitrary old ways continue to mystify and frustrate. Consider this story on one person’s quest to obtain insurance:
Most employees assume that if they lose their job and the health coverage that comes along with it, they’ll be able to purchase insurance somewhere. . . . My husband, teenage daughter and I were all active and healthy, and I naïvely thought getting health insurance would be simple. . . .
Then the first letter arrived — denied. . . . What were these pre-existing conditions that put us into high-risk categories? For me, it was a corn on my toe for which my podiatrist had recommended an in-office procedure. My daughter was denied because she takes regular medication for a common teenage issue. My husband was denied because his ophthalmologist had identified a slow-growing cataract. Basically, if there is any possible procedure in your future, insurers will deny you. . . .
As I filled out more applications, I discovered a critical error in my strategy. The first question was “Have you ever been denied health insurance?” Now my answer was yes, giving the new companies reason to be wary of my application. I learned too late that the best tactic is to apply simultaneously to as many companies as possible, so that you don’t have to admit to a denial.
As was recently reported, “50 to 129 million (19 to 50 percent of) non-elderly Americans have some type of pre-existing health condition.” The “health care market” is sending a strong signal: don’t step out of the system if you have any continuing need for even minor care.
But what’s more worrisome are the types of information circulating about you that you aren’t even aware of. Consider this story from Businessweek about the profiling of insurance applicants by third-party intermediaries:
Most consumers and even many insurance agents are unaware that Humana, UnitedHealth Group, Aetna (AET), Blue Cross plans, and other insurance giants have ready access to applicants’ prescription histories. These online reports, available in seconds from a pair of little-known intermediary companies at a cost of only about $15 per search, typically include voluminous information going back five years on dosage, refills, and possible medical conditions. The reports also provide a numerical score predicting what a person may cost an insurer in the future. . . .
[A] 57-year-old safety consultant in the oil and gas industry, says he tried to explain that the medications weren’t for serious ailments. The blood-pressure prescription related to a minor problem his wife, Paula, had with swelling of her ankles. The antidepressant was prescribed to help her sleep—a common “off-label” treatment doctors advise for some menopausal women. But drugs for depression and other mental health conditions are often red flags to insurers. Despite his efforts to reassure Humana, the phone interview with the company representative “just went south,” Walter recounts. He and his wife remain uninsured [as of 2008].
Health-related data from a Wild West of unregulated intermediaries may spread to employers and other decisionmakers, just as credit scores migrated from the banking context to insurance pricing, and credit histories now influence employment decisions. Sharona Hoffman has observed that “It is not uncommon for employers to obtain applicants’ and employees’ medical records. According to one source, every year, over ten million authorizations for release of medical information are signed by workers prior to the commencement of employment.” She has predicted disturbing possibilities arising out of that access to data:
Existing laws, including the ADA, GINA, HIPAA, and their state counterparts, provide important assurances to applicants and employees but are insufficient to guarantee that they will suffer no ill consequences as a result of EHR disclosure to employers. Employees may be especially concerned in times of recession, knowing that financial pressures make workers with health problems particularly unattractive to employers. Employers or their hired experts may develop complex scoring algorithms based on EHRs to determine which individuals are likely to be high-risk and high-cost workers. In addition, in times of financial difficulty, limited resources may be available to implement technology and policies that will secure EHR confidentiality.
Secondary uses of health data could be a very lucrative niche for profilers of the future.
Given these possibilities, individuals should at least have the right to access and correct the health data that intermediaries have compiled about them. The FTC recognized this right, and “forced the [insurance] industry to begin disclosing the use of prescription information under . . . the Fair Credit Reporting Act. . . . Copies of prescription reports are supposed to be available to consumers at no charge under federal law.” This is a small step forward. But if the “scores” assessing individual risk are compiled according to proprietary algorithms, the consumer may still feel “in the dark,” unable to adequately influence the presentation of herself to the insurer.
As Esther Dyson has stated in another context, mysterious data flows can jeopardize individual autonomy:
The comforting thing about the kind of data that Facebook primarily deals with is that it’s public. If your friends and other people can see it, so can you.
More troubling is the data you don’t even know about – the kind of data about your online activities collected by ad networks and shared with advertisers and other marketers, and sometimes correlated with offline data from other vendors. By and large, that’s information you can’t see – what you clicked on, what you searched for, which pages you came from and went to – and neither can your friends, for the most part. But that information is sold and traded, manipulated with algorithms to classify you and to determine what ads you see, what e-mails you receive, and often what offers are made to you. Of course, some of that information could go astray.
Online advertisers already slice and dice population segments (and distribute opportunities & exposure to ads) via marketing discrimination. Will the “e-health revolution” bring their methods out of cyberspace, and into the deadly serious business of offering employment and insurance based on estimates of health status that applicants can’t understand or challenge?