Wednesday, March 31, 2010

The ONC Whitepaper on Consent

Last week was a busy one for healthcare IT. In addition to the DEA Interim Final Rule on e-prescribing of controlled substances, the launch of NHIN Direct, and the introduction of new ONC interoperability framework processes, HHS released the Whitepaper on Consent.

The entire document and its three appendices are worth reading. The executive summary contains a great classification of consent models found throughout the world:

No consent
Health information of patients is automatically included—patients cannot opt out

Opt-out
Default is for health information of patients to be included automatically, but the patient can opt out completely

Opt-out with exceptions
Default is for health information of patients to be included, but the patient can opt out completely or allow only select data to be included

Opt-in
Default is that no patient health information is included; patients must actively express consent to be included, but if they do so then their information must be all in or all out

Opt-in with restrictions
Default is that no patient health information is made available, but the patient may allow a subset of select data to be included.

Appendix A is a very helpful list of State-Led Examples of Exchange in the U.S.

For more details about the Massachusetts efforts to date, including the educational materials we used, see my blog about patient privacy preferences.

Appendix B is an overview of Selected State Laws which can be empowering as we implement consent models.

Appendix C contains examples of Exchange in Other Developed Countries.

I've worked closely with the county council in Jonkoping, Sweden, which has a very high percentage of EHR and hospital information system adoption.

The consent whitepaper was timed perfectly to align with the HIT Standards Committee review of existing standards for storing and transmitting consent preferences.

Well done!

Tuesday, March 30, 2010

The ONC Interoperability Framework

In my summary of the March HIT Standards Committee meeting I mentioned the new ONC Interoperability Framework and the related RFPs. Here's the detail I promised in my previous blog about ONC. Thanks to Doug Fridsma for this overview and his hard work on it.

ONC announced several projects to support the Standards and Interoperability Framework and the Nationwide Health Information Network (NHIN).

Over ten requests for proposals were released in February 2010 under the existing contract vehicle: the National Institutes of Health (NIH) Information Technology Acquisition and Assessment Center (NITAAC) CIO-SP2 Task Order. The funding will support two years of activities designed to develop the standards, tools, interoperability framework, and technical infrastructure to support the overall goal of improving adoption of HIT. Key RFP areas include:

Use Case Development and Functional Requirements
ONC anticipates leveraging the National Information Exchange Model (NIEM) for health care and developing a consistent process for use case development. Working closely with consumers, providers, government organizations and other stakeholders, ONC will identify real-world needs, prioritize them through a governance process, and create explicit, unambiguous documentation of the use cases, functional requirements and technical specifications for interoperability.


Harmonization of Core Concepts
The harmonization process integrates different views of health care information into a consistent view. This process will include merging related concepts, adding new concepts, and mapping concepts from one view of health care information into another. It will also identify gaps that point the way toward development of new interoperability standards. ONC anticipates leveraging the NIEM process to support data exchange harmonization.


Standards Development
In order to meet the needs of the use cases and the increased use of HIT, there will be a need to modify or extend existing standards or develop new ones. ONC will work with standards development organizations and with research organizations to extend existing standards or develop new standards as necessary.



Tools and Standards Repository
To accelerate the development, use, maintenance and adoption of interoperability standards across the industry, and to spur innovation, ONC will develop tools to facilitate the entire standards lifecycle and maximize re-use of concepts and components, including tools and a repository for browsing, selecting, and implementing appropriate standards.


Interoperability Specifications
In order to test and implement the standards in real-life settings, they must be specified to a higher degree of detail. This project will focus on the development of interoperability specifications that are independent of specific software architecture (a platform-independent model, or PIM) as well as interoperability specifications that are specific to the NHIN architecture (a platform-specific model, or PSM).


NHIN Architecture
The NHIN architecture is a specific network architecture that realizes health information interoperability specifications based on open standards. This project will focus on the refinement and management of the NHIN architecture to meet emerging needs of the health care market.

Reference Implementation
A reference implementation is a fully instantiated software solution that is analyzed to be compliant with the standards and serves as a "reference" to other software developers of what an interoperable solution looks like. The reference implementation will be accessible as a public resource with compiled code, source code and supporting documentation.


Integration Testing
The current NHIN testing infrastructure needs to be refined to test and validate emerging needs of the network and planned NHIN capabilities as they are identified. ONC will work with NIST, which will provide testing tools to validate that a particular implementation conforms to a set of standards specifications; ONC will also support the development of an integration testing "harness" that tests how a component that has satisfied conformance testing requirements integrates into the reference implementation.
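To make the distinction concrete, a conformance test checks a single artifact against a published specification before the integration harness ever sees it. Here's a minimal sketch of that idea in Python. The schema and document file names are placeholders I've made up for illustration, and this is not a description of the NIST tooling itself:

# Minimal conformance-check sketch. Assumes the lxml library is installed and
# that a local copy of the relevant schema (e.g. CDA.xsd) is available;
# both are illustrative assumptions.
from lxml import etree

def conforms_to_schema(document_path: str, schema_path: str) -> bool:
    """Return True if the XML document validates against the schema."""
    schema = etree.XMLSchema(etree.parse(schema_path))
    return schema.validate(etree.parse(document_path))

if __name__ == "__main__":
    if conforms_to_schema("patient_summary.xml", "CDA.xsd"):
        print("Document conforms - ready for integration testing")
    else:
        print("Document failed conformance checking")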


NHIN Demonstrations and Emergent Pilots
Although a reference implementation provides value to the community through a thorough assessment of the technology, support for established standards, and vetting with HHS, consumers, and other stakeholders, it will need to be refined through real-world pilots and demonstrations. ONC will support the refinement of the reference implementation and interoperability specifications through a limited number of real-world demonstrations and pilots.


NHIN Operations and Infrastructure
This project will focus on activities related to operational and infrastructure support for the ongoing demonstrations and production pilots of health information exchange across a trusted network. 



Each project will focus on specific activities within its area as well as collaboration across all other projects, addressing the overall effectiveness of the Standards and Interoperability Framework, certification, and the NHIN that is critical to the wider adoption of HIT. ONC expects to award one contract for each project, for a two-year project period, to qualified applicants.

Monday, March 29, 2010

E-Prescribing Controlled Substances

Last week, the Drug Enforcement Administration released its long-awaited Interim Final Rule on e-Prescribing of Controlled Substances.

It's 334 pages long, but the most important portion is section § 1311.115, which describes the need for two-factor authentication when prescribing controlled substances. Here's the detail:

(a) To sign a controlled substance prescription, the electronic prescription application must require the practitioner to authenticate to the application using an authentication protocol that uses two of the following three factors:
(1) Something only the practitioner knows, such as a password or response to a challenge question.
(2) Something the practitioner is, biometric data such as a fingerprint or iris scan.
(3) Something the practitioner has, a device (hard token) separate from the computer to which the practitioner is gaining access.
(b) If one factor is a hard token, it must be separate from the computer to which it is gaining access and must meet at least the criteria of FIPS 140-2 Security Level 1, as incorporated by reference in § 1311.08, for cryptographic modules or one-time-password devices.
(c) If one factor is a biometric, the biometric subsystem must comply with the requirements of § 1311.116.

In a previous blog, I wrote about the many technologies which support strong authentication.

For e-Prescribing of controlled substances, BIDMC will investigate three approaches:

*The use of fingerprint biometrics using web-based software from Bio-Key as described in my cool technology blog.

*The use of hard tokens such as those provided by RSA.

*The use of cell phones as a two-factor authentication device, such as sending a PIN via SMS after each e-prescribing session. Anakam has a complete suite of tools to implement this workflow.
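To make the third option concrete, here's a rough sketch of what an SMS one-time PIN check looks like under the hood. This is purely illustrative - the phone number, gateway hand-off, and PIN policy are placeholders, not a description of the Anakam product or our eventual BIDMC workflow:

# Illustrative two-factor (knowledge + possession) sketch for an e-prescribing
# sign-off step. The SMS gateway call is a placeholder; a real deployment
# would use a vendor service and enforce expiration and lockout policies.
import hmac
import secrets
import time

class OneTimePin:
    def __init__(self, ttl_seconds: int = 120):
        self.pin = f"{secrets.randbelow(1_000_000):06d}"  # 6-digit random PIN
        self.expires_at = time.time() + ttl_seconds

    def verify(self, candidate: str) -> bool:
        # Check expiration, then compare in constant time
        return time.time() < self.expires_at and hmac.compare_digest(self.pin, candidate)

def send_sms(phone_number: str, message: str) -> None:
    # Placeholder: hand off to an SMS gateway or vendor API here
    print(f"SMS to {phone_number}: {message}")

# After the practitioner authenticates with factor one (something they know),
# send the PIN to the phone they carry (something they have) and verify it
# before releasing the signed prescriptions.
otp = OneTimePin()
send_sms("+1-555-0100", f"Your prescription signing PIN is {otp.pin}")
print(otp.verify(otp.pin))  # in practice, compare against the PIN the user types back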

Although there will be some burden/inconvenience imposed on clinicians through the use of two factor authentication, I believe it will ultimately save time. Why?

Today's e-prescribing workflow is fractured. I can write for Lipitor with fully electronic, NCPDP 8.1-formatted, vocabulary-controlled, end-to-end secure transactions. However, I write for Oxycontin with pen and paper. I have to split my time between a screen and a pen for the same encounter with the same patient, depending on the drug I'm writing for. In the Emergency Department, approximately 30% of all prescriptions are for controlled substances (i.e. pain control after trauma).

With fully electronic workflows, I can write for all meds, digitally sign the entire order set, get a PIN sent to my cell phone in 2 seconds, and then send the transactions to the pharmacy of the patient's choice without a pen, paper or hassle.

I look forward to our controlled substance e-prescribing pilots. Ultimately it will be a win/win/win for patients, providers, and pharmacies.

Friday, March 26, 2010

Cool Technology of the Week

Many Massachusetts homes have experienced flooding this month, so we're all a bit focused on plumbing.

I've had two plumbing issues recently, both involving interesting technology fixes.

I live in a 100 year old house with fragile plumbing and electrical infrastructure. Recently, the plumbing on two old pedestal bathroom sinks clogged to the point that no plunger or drain cleaner could clear them. In an old New England house, the bathroom sinks are often plumbed back to back together, making a plumbing snake impossible to use. The only option is to open the wall and replace the offending pipe…or so I thought until I discovered Kinetic Water Ram technology.

The idea is simple - use compressed air to create a shock wave of moving water at 5000 psi. The wave moves inside the pipe, not against the pipe walls, so it will not burst the plumbing. These devices are used by plumbers to clear very challenging clogs. Typically a plumber bills $150 for a visit. For $250, you can purchase one of your own.

Here's a video of how it is used.

The great news - I'll never need to use chemical drain cleaners or a plunger again. One device clears bathtubs, sinks, toilets etc. Clogs and accumulated corrosion deposits are both cured with a shock wave of water.

My wife and daughter thanked the home CIO for solving the problem.

In the recent floods, hundreds of basements in the Boston Metrowest area were flooded and damaged. Although my basement survived without damage, I realized that our 20 year old sump pump was a single point of failure. If the sump pump failed, we'd be flooded. If the electricity failed during a storm, we'd be flooded. Hence I investigated "disaster recovery" hardware for basements. I found the Wayne battery backup sump pump.

Last weekend, I replaced our 20 year old sump pump with a new pedestal pump and discharge hose. This weekend, I'll add the disaster recovery system.

The end result will be a 2300 gallon per hour primary pump with a 2300 gallon per hour battery backup pump that should last for a day of pumping with a 75 amp battery.
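Here's the back-of-the-envelope math, assuming the battery is a 75 amp-hour, 12-volt deep-cycle unit and the backup pump runs intermittently rather than continuously - all of these figures are my assumptions for illustration:

# Rough runtime estimate; every figure here is an assumption.
battery_capacity_ah = 75      # assumed 75 amp-hour, 12 V deep-cycle battery
pump_draw_amps = 8            # assumed current draw while the pump is running
duty_cycle = 0.25             # assumed: pump runs about 15 minutes per hour

average_draw_amps = pump_draw_amps * duty_cycle          # 2 amps averaged over time
runtime_hours = battery_capacity_ah / average_draw_amps  # ~37 hours at these assumptions

print(f"Estimated runtime: {runtime_hours:.1f} hours")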

Thus, I'll be covered for pump failure and power failure. The home CIO does for the basement what the work CIO does for the data center.

Thursday, March 25, 2010

The Girl with 2 Brains

Last Thursday I wrote about the Yin to my Yang exploring the synergy between my left brain and my wife's right brain.

My daughter Lara turns 17 next week and she's definitely the girl with 2 brains (or a whole brain).

I cannot draw a stick figure (my attempts at drawing a human look more like a dinner fork than the Venus de Milo).

My daughter took a blank piece of paper and a pencil then drew the self portrait above.

Her greatest academic strength is math. She can visualize problems involving vector forces, geometry, or trigonometric functions then break them into solvable component parts. To me, the hardest part of advanced math and engineering is setting up the problem correctly, not solving it.

She's just completed her first resume. Today's high school students are expected to master college level topics, develop disciplined work habits at an early age, and complement their academics with sports/music/art/volunteer work, which she's tried to do in a balanced way. My own experience as a student was that I was not the smartest student in the class, but I was the most persistent due to minimal sleep needs, a great tolerance for any kind of discomfort - cold/fatigue/hunger, and a sense of impatience for the future.

My daughter has a different set of skills - a whole brain that can process the analytical and visual with equal competency, an ability to think about the greater good rather than personal gain, and a sense that anything is possible. She does not believe in political half-truths. She does not judge success by a bank balance. She does not believe the ends justify the means. She believes that the nice guy (or gal) can finish first.

I would like to believe that idealists can succeed through persistence and determination, always staying true to their values. Watching day to day activities in Washington has convinced me that it's critically important to have a strong moral compass.

Her current college search criteria on CollegeBoard.com are:

Rural or Suburban location
Under 10,000 students
Strong Asian Studies/Japanese language program (for the right brain)
Strong Environmental Engineering program (for the left brain)
Studio art resources
If possible, a competitive collegiate archery team (she's ranked 6th in the US)

It's my hope that she has the best of both her parents without the downsides of either.

At the very least, she can write a college essay entitled "Why I have a whole brain"!

Wednesday, March 24, 2010

The March HIT Standards Committee Meeting

Today's HIT Standards Committee included important discussions about NHIN Direct and a new Interoperability Framework supported by several ONC RFPs.

We began the meeting with a summary of the work in progress.

The Clinical Operations Workgroup is focused on vocabulary starter sets and ensuring implementation guidance is available.

The Clinical Quality Workgroup is focused on quality measure retooling to ensure meaningful use measures are EHR friendly.

The Privacy and Security Workgroup is focused on understanding all the consent standards currently available from different Standards Development Organizations and implementation guide writers.

The Implementation Workgroup is focused on creating a starter kit to accelerate EHR adoption and interoperability. Yesterday, I summarized the Implementation Workgroup "starter kit" testimony. During today's meeting the Workgroup synthesized the 10 lessons learned from the testimony into three major themes:

*Provide transparency to all the available resources - funding, tools, and technologies
*Clarify the requirements of meaningful use data exchanges through the use of FAQs and other online resources
*Provide simple interoperability guides with enough detail and samples so that a typical IT professional could implement interoperability

We discussed the best way to include specific implementation guidance in the Interim Final Rule, realizing that legal restrictions may limit our choices. In our IFR comment letter we recommend that broad families of standards be specified along with detailed implementation guide "floors" which will be amended through guidance letters issued outside the regulation. This strategy enables short term specificity and long term evolution/innovation. If the legal interpretation is that we cannot issue implementation guidance letters outside of regulation, there are existing government models that we can consider as alternatives i.e.

*NIST issues regular updates to the Federal Information Processing Standards (FIPS)
*CMS issues regular updates to the Physician Quality Reporting Initiative (PQRI)
*Private sector organizations provide updated implementation guidance via voluntary consensus groups (i.e. CAQH, WEDI, IHE)
*Open source communities provide continuous version releases. Although not a regulation or a single solution, such work provides reference implementations that can be widely adopted by stakeholders and become de facto standards.

We'll await legal guidance to determine next steps.

Next, Doug Fridsma presented NHIN Direct. David Blumenthal offered an introduction that identified NHIN Direct as a "project," not a "product," designed to be responsive to customer requests, especially from small practices.

NHIN Direct does not replace existing NHIN standards, policies, and software. Instead NHIN Direct will explore simple data transport strategies for point to point communication. Over the next 6 months, it will explore the use of SMTP/TLS, REST, and SOAP implementations with running code. It will provide a way to transport data, not the only way.

Data exchanges required by stage one of Meaningful Use include e-prescribing, public health lab reporting, syndromic surveillance, immunization, and patient summary exchange (both provider to provider and provider to patient). The scope of NHIN Direct does not include new content/vocabulary standards, master patient indexes, or aggregations of data for quality reporting. It's complementary to existing NHIN Connect work and state HIE efforts. It is not to be feared and there is no reason for states to slow existing efforts while the NHIN Direct experiment is in process.

Next, Doug presented a framework for interoperability comprised of 7 components.

*Use Case Development and functional requirements
*Standards development
*Harmonization of Core Concepts
*Implementation specifics
*Pilot Projects
*Reference Implementation
*Conformance Testing

Several RFPs have been issued to support these efforts. They will leverage the lessons learned from HITSP and I'm confident that the HITSP efforts will be foundational to this next phase of work. I see the Harmonization of Core Concepts RFP as the evolution of HITSP and I suspect many HITSP volunteers will be involved, regardless of how the contract is awarded.

This seven-step process will use the National Information Exchange Model (NIEM) approach as a means to organize the work. Important aspects of the work ahead include:

*A bottom-up process to define requirements based on data exchanges that are needed to achieve meaningful use and meet the business priorities of stakeholders
*Delivery of fully integrated, well specified implementation guidance
*Electronic test scripts to ensure conformance and an active feedback loop to improve standards once testing has identified deficiencies

David Blumenthal emphasized that NIEM approaches, although used by the Department of Justice and Homeland Security, have absolutely no possibility of facilitating entry of healthcare data into law enforcement databases.

Carol Bean and Steve Posnack reported on the Certification NPRM temporary and permanent processes. Key points included:

*Certification applies equally to EHRs and EHR modules
*Permanent certification separates the testing lab function from the certification function
*There will be multiple testing labs and certification organizations that will compete on price and service offerings. Accreditation processes for testing labs and certification organizations will ensure consistency among service providers.
*Site certification methods will be used for self developed EHRs
*No double certification will be necessary, i.e. a site could purchase certified vendor products and self-build portions of an EHR, which will be site certified. There is no need to seek additional certification for the combination of built and bought products. Making them work together to achieve meaningful use is the responsibility of the implementing organization.

A great meeting today. I look forward to the work ahead as we continue to provide tools, technologies, and educational materials in support of meaningful use data exchanges.

Tuesday, March 23, 2010

The Implementation Workgroup Starter Kit

On March 8, the Implementation Workgroup of the HIT Standards Committee held a day of hearings as part of the effort to create an "Implementation Starter Kit" which accelerates EHR adoption and interoperability.

The goals of the hearing were to
* Describe challenges and successes that may be instructive to others.
* Provide advice to help others with implementation.
* Contribute tools and technologies that can be made available to the public and private sector, such as roadmaps, blueprints.

Here are the top 10 lessons learned from those hearings

1. Provide emerging HIE guidelines to assist providers. The Nationwide Health Information Network (NHIN) effort is posting an implementation guide that will include: policy, "trusted relationship," standards, services, and four use cases.

2. Disseminate knowledge of tools and utilities. The National Cancer Institute (NCI) is providing software developer kits with vocabularies and metadata, vendor utilities, and specification documents. The National Institute of Standards and Technology (NIST) is providing conformance testing tools.

3. Communicate details of all available funding sources, i.e.:
Regional extension centers (RECs) $643 million
Health Information Exchange $564 million
Workforce Training Programs $118 million
Beacon Communities $235 million
Strategic Health Advanced Research Projects (SHARP) $60 million
Nationwide Health Information Network/Standards and Certification $64.3 million

4. Focus on workflow challenges such as clinician friendly approaches to workflow redesign, change management, and training.

5. Lengthen the current implementation timelines in key challenging areas such as reporting of quality data.

6. Develop standards for data exchange that ensure the data will be trusted, such as rich metadata (who created the data, for what purpose, in what workflow), further definition of the transport layer, and message routing standards. Exporting the data is easy, but how do we trust inbound data coming into our systems from an external source (i.e., an HIE)? What data should be in a PHR? What is the interplay between state privacy laws and interoperability? (A rough sketch of what such metadata might look like appears after this list.)

7. Drive collaboration between the software vendors to advance ARRA. Host an EMR software vendor summit to create synergies between vendors.

8. Create detailed implementation guidance for interoperability standards. Additional standards will be required for some use cases such as HIEs communicating with other HIEs.

9. Leverage open source models, where practical, such as the work of the Veterans Administration.

10. Innovate to improve speed of adoption. New business models and innovation are required. Utilize disruptive innovation to accelerate the road to adoption of meaningful use.
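To put a little flesh on lesson 6, here's a hypothetical example of the kind of provenance metadata that could travel with an inbound document. The field names are mine, invented for illustration - a real exchange would map them to the header of whatever content standard is in use:

# Hypothetical provenance wrapper around an inbound clinical document.
# Field names and values are illustrative only.
inbound_document = {
    "payload": "<ClinicalDocument>...</ClinicalDocument>",
    "provenance": {
        "author": "Dr. Jane Smith, Example Community Hospital",
        "authoring_system": "ExampleEHR 4.2",
        "purpose": "referral",                    # why the data was created
        "workflow": "provider-to-provider push",  # the workflow it came from
        "created_at": "2010-03-08T14:30:00-05:00",
        "source_organization": "example-hospital-identifier",
    },
}

def is_trusted(doc: dict) -> bool:
    """Very rough gate: require the provenance fields we depend on before import."""
    required = {"author", "purpose", "created_at", "source_organization"}
    return required.issubset(doc.get("provenance", {}))

print(is_trusted(inbound_document))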

Tomorrow, the HIT Standards Committee will hear the report from the Implementation Workgroup and I'll post those materials, as well as a full summary of the meeting.

Monday, March 22, 2010

Massachusetts Data Protection Regulations Update

Many of you will need to explain the latest Federal and State security mandates to your organizations. Here's the letter I sent out on Friday. Feel free to use it as a template for your own communications.

-------------
In 2007, Massachusetts became one of 45 states that require companies to report the loss or theft of personal information. (For more information on the data breach law see MGL ch. 93H http://www.mass.gov/legis/laws/mgl/gl-93h-toc.htm)

In Massachusetts, personal information is defined as a state resident’s last name and first name or first initial as well as any one or more of the following:
• Social Security Number;
• Driver’s License Number or state-issued identification card number; or
• Financial account number, or credit or debit card number, with or without the necessary security code.

In addition to passing a data breach law, Massachusetts passed regulations that set out requirements for how businesses must protect personal information. (See 201 CMR 17.00 http://www.mass.gov/Eoca/docs/idtheft/201CMR1700reg.pdf) Those regulations became effective on March 1, 2010.

In general, those regulations require BIDMC to protect personal information in the same way that it already protects patient information.

Minimum Necessary Standard – We must make sure that the people with access to personal information have a legitimate need for that access based on their job functions and that the access granted is the minimum necessary to fulfill that function.

Information Security Program – We must employ an information security program that ensures that personal information (in any form) is not used by or disclosed to people who do not meet the minimum necessary standard. As we do with patient data, we must ensure that:
• Users must provide a unique user id and password to access personal information;
• Computers that are used to access or transmit personal information have up-to-date patches and anti-virus software;
• Personal information on laptops and other portable devices is encrypted; and
• Personal information transmitted wirelessly or over the Internet is encrypted.

Employee Training and Enforcement – We must conduct employee training and make sure that our community is complying with these requirements for protecting this data.

System Monitoring – We must monitor our information system to ensure that outside parties don’t gain access to personal information and that BIDMC Users with such access are active BIDMC employees who are authorized to have such access.

Incident Response – Where someone does gain unauthorized access to personal information, we must respond promptly to limit the harm caused and use the lessons learned from the incident to continually improve our information security program.

Third Parties, Vendors, Contractors – We must ensure that third parties who require access to our personal information commit to and are capable of meeting the same requirements for protecting the data.

Annual Review – Finally, we must review how well our information security program is working and make the changes necessary to appropriately protect our data.

For many of you, this new state law will simply mean that the protection we already provide for patient data must be expanded to include personal information. In most cases, the systems being used to access and transmit personal information already meet this standard.

Over the last year we have been updating our existing IS policies and creating some new policies to respond to recent changes in federal and state information security law and to better inform you about what you need to do to help us protect patient and staff data.

We expect to have these approved and available to you within the next couple of months. We will also be updating our information security training program to reflect these changes.

In the interim, if you have any questions, please contact the IS Security team at issecurity@bidmc.harvard.edu. They will be happy to answer any questions that you may have.

Protecting private and sensitive data is something the BIDMC community already takes very seriously. With your cooperation, we can ensure that our protections for personal information meet the standard already provided to patient data.


Friday, March 19, 2010

Cool Technology of the Week

My parents recently moved to a new home on a hillside in Southern California. It has a great view and a frequent gentle breeze. My father and I were talking about wind power as a means of adding green energy to their property. Green energy sounded great, but I was not sure it was ready for the mainstream of the average homeowner. It's not as if you can buy wind turbines or Bloom boxes at Home Depot.

Whoops - it never pays to bet against the rapid advancement of technology. You can buy a complete home wind turbine at Lowe's right now for under $600.

The Southwest Windpower 400 Watt Wind Generator generates 400 watts at 28 mph, has 3 carbon fiber composite blades to ensure low wind noise, and electronic torque control for overspeed protection up to 110 mph.

What can you do with 400 watts?

Remember back to physics: Watts = Amps × Volts, i.e. work is done at a rate of one watt when one ampere flows through a potential difference of one volt (1 W = 1 A × 1 V).

In my cool technology of the week on February 26, I outlined my effort to replace the light bulbs in my house with high efficiency LEDs.

I've found that I can light an entire room brightly with 50 watts of LED power (each 40-watt-equivalent bulb uses only 8 watts to generate 350 lumens of light).

That means I could easily light my entire home with wind power.
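Putting rough numbers on that claim - and assuming the turbine is actually producing near its rated output, which takes a steady 28 mph wind:

# Back-of-the-envelope lighting budget; assumes rated turbine output.
turbine_output_watts = 400
led_bulb_watts = 8            # each 40-watt-equivalent LED bulb, ~350 lumens
watts_per_room = 50           # bright lighting for one room, per my LED experiment

bulbs_supported = turbine_output_watts // led_bulb_watts   # 50 bulbs
rooms_supported = turbine_output_watts // watts_per_room   # 8 brightly lit rooms

print(f"{bulbs_supported} LED bulbs, or about {rooms_supported} brightly lit rooms")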

Hey Dad, maybe wind power from Lowe's for Father's Day?

Thursday, March 18, 2010

The Yin to my Yang

Thirty years ago this month (at 17), I won a speaking contest in a California statewide competition. Kathy Greene won a related statewide art competition. At the time I remember marveling at her use of color in oil paintings of California's missions. She recalled a geeky public speaker who could spin an interesting story.

On August 31, 1980, I was assigned to the Lagunita dorm at Stanford. So was Kathy Greene.

We started dating on September 1, 1980. We just celebrated our 25th wedding anniversary.

Within 24 hours of our time together, I realized that she was the Yin to my Yang. I was math, science, engineering, black and white, digital 0's and 1's, Zen, and monk-like asceticism. She was art, music, culture, color, analog, Victorian clutter, and Joie de vivre. I was completely left brain, she was completely right brain. Together we were a whole brain. On September 2, 1980 we agreed to support each other throughout our education - I would do her math and she would do my art. Together, we could do everything.

Back then, Stanford cost almost $15,000 per year and we needed funding after our scholarships ran out. I went to the Stanford Law library, studied the US tax code and wrote a tax computation program (call it early TurboTax) that businesses could use to write payroll checks on CP/M and early DOS computers. Kathy wrote the manual, designed the advertising, and did all the corporate graphics. We sold thousands of copies from my dorm room.

I was asked to create something special for Steve Wozniak's 33rd birthday and I designed electronic greeting cards with synchronized audio and video that ran on 1980's computers. I patented the idea and included the odd concept that someday there would be a big network connecting everyone that would enable sending of electronic greeting cards between computers. (Next time you send an e-card, you can thank me for the royalty free license!). Kathy created all the graphics and digital artwork.

We've traveled the world, survived medical education, and raised a 17 year old together. She's introduced me to the cultures of the Far East, the music of Simon and Garfunkel, and the art of Maxfield Parrish.

She's been faculty at the School of the Boston Museum of Fine Arts, faculty at Bentley College, and a studio artist in South Boston.

She recently started her own blog - Art that is Life and opened the NK Gallery in Boston's South End.

She's my best friend.

It's great to marry the first person you date - I've been able to invest all my energy in a single life relationship. I think it will last.

Wednesday, March 17, 2010

Purging Files

I was recently asked if we purge older, untouched files from our storage systems.

This is a very tricky question because of the many compliance, medical-legal, and privacy requirements of a healthcare institution.

Short answer - we do not purge data for active employees. With the number of organizations (4 hospitals, 3 physician organizations, a community health center, etc.), home directories, and department shares we have, it is almost impossible for us to determine centrally within IT what has business value and what is obsolete personal data that should be deleted.

How should organizations approach the complex problem of what data to save and what to delete?

In my opinion, the best way to manage this is to set up storage quotas and increase them as people need more space. The pro - it discourages unbridled storage growth. The con - it causes additional overhead for the Help Desk and Storage Team, and from a compliance/e-Discovery standpoint it would encourage users to permanently destroy files.

At BIDMC, we have tried desktop archiving and run into issues with archive software products not supporting all desktop clients equally (works on Windows but not on Mac or Linux). The solution we are now pursuing is to move older files to the cheapest tier of storage (although maintaining anything we find forever) with relative transparency to the end customer. We use a storage virtualization appliance from F5 (formerly Acopia) to do this.

A purging/archiving policy should state that files are archived for x years, after which files that must be kept are moved into an extended retention folder that we will archive and keep; all other folders will be periodically purged of data beyond the stated retention period.
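For illustration only, here's a minimal sketch of what an age-based archiving sweep could look like at the file-system level. The paths and retention window are made up, and in practice our tiering is handled by the F5 (Acopia) appliance rather than a script like this:

# Illustrative age-based archiving sweep. Paths and retention window are
# assumptions; production tiering is done by the storage virtualization
# appliance, not a script.
import os
import shutil
import time

RETENTION_YEARS = 3
SOURCE_ROOT = "/shares/department"   # active department share (assumed mount)
ARCHIVE_ROOT = "/archive"            # cheapest storage tier (assumed mount)

cutoff = time.time() - RETENTION_YEARS * 365 * 24 * 3600

for dirpath, _dirnames, filenames in os.walk(SOURCE_ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        if os.path.getmtime(path) < cutoff:   # untouched beyond the retention window
            target_dir = os.path.join(ARCHIVE_ROOT, os.path.relpath(dirpath, SOURCE_ROOT))
            os.makedirs(target_dir, exist_ok=True)
            shutil.move(path, os.path.join(target_dir, name))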

We have a data retention policy that governs our business records including paper and electronic for clinical, financial and administrative records. These retentions are governed by applicable law, e.g. 20 years for clinical record content. The retention schedules are included as an appendix to the policy.

We have some log content that is overwritten as storage runs out, i.e. first-in, first-out. How long we save log files is dependent on the content involved. For logs related to clinical record access, we save forever.

We do delete files and email accounts for terminated employees after a grace period. The grace period is to make sure there is no need for the data by the person's manager and the employee will not return to work at BIDMC or an affiliate. The current grace period is 270 days.

Periodically, we have a litigation hold involving a subset of our records, primarily Windows files and email. For accounts subject to the litigation hold, we retain them for whatever duration Legal requests.

We are including a capital budget request for next FY for a more robust eDiscovery capability that will allow us to index and search our backup copies of our email and Windows files.

Purging/archiving requires a great deal of thought, senior management/board sponsorship, and rigid enforcement to be effective. With the cost of storage dropping, we will continue to store everything in the short term. However, in the long term this becomes challenging to maintain, so ideally we'll use a combination of quotas and cost-effective tiering of data to balance the need for retention, compliance, and business value.

Tuesday, March 16, 2010

Partial Credit for Meaningful Use

Over the past few weeks, I've had the opportunity to review numerous NPRM comment letters from professional groups and hospitals. Although the issues vary widely depending on the size, IT sophistication, and resources of the commenting organizations, one theme is clear throughout - the desire for partial credit if meaningful use best efforts do not quite meet the threshold required for stimulus funding.

All believe that it is unfair to ask for 25 projects to be done perfectly in order to qualify for the first dollar of stimulus funding i.e. what if 23 projects are done perfectly but 2 are not achievable due to local market or infrastructure issues? What if 70% of all ambulatory prescriptions are e-prescribed instead of the required 75%?

Comments have included:
*The requirement that ALL measures be met will slow the adoption and meaningful use of EHRs
*The number of required measures is unrealistic for Stage 1
*The thresholds for measures are too high

All conclude CMS should maintain strong incentives for high levels of use, but eliminate the “all or none” thresholds for providers to qualify as meaningful users, at least for Stage 1.

I've seen two detailed proposals to address the partial credit problem - one from the HIT Policy Committee and one from the American Hospital Association.

The HIT Policy Committee has recommended a partial credit approach called the 3-1-1-1-0 proposal. You can read their recommendations on the ONC website.

The idea is that organizations should be permitted to defer fulfillment of a small number of meaningful use criteria and still qualify for incentive payment. The deferment would last until Stage 2 criteria apply. To prevent providers from bypassing an entire priority area (e.g., skip all of patient engagement), the 3-1-1-1-0 proposal allows professionals and hospitals to qualify for Stage 1 incentives if they defer no more than the specified number of objectives in each category, as indicated in this table.

The HIT Policy Committee idea includes the 2011 recommendations as they are written today and takes into account the fact that 2013 and 2015 recommendations are still a work in progress.

The American Hospital Association has recommended a different approach - suggesting that all criteria for meaningful use (stage 1,2,3) be specified now and enabling hospitals to travel a glide path of implementation from 25% to 100% until 2017 (the graphic above).

The logic is that software implementation life cycles take 24 months and it's hard to change software 3 times for 3 stages. Rather, working on all stages over a multi-year period provides time for technology, policy, and process changes to be coordinated in a phased way.

The only problem with this idea is that we really do not know what technology capabilities and policy priorities we'll have in 2017, so declaring them all now seems premature.

My opinion, aligned with the HIT Policy Committee recommendations, is that we should designate a core set of meaningful use requirements (i.e. 10 or so must haves), permit providers to select a given number of additional qualifying measures among a set of optional measures (i.e. choose any 5 from a menu of 10), and enable providers who meet substantially all of a measure to be considered meaningful users.

Furthermore, CMS could scale payment amounts to the level of use. For example, a provider who demonstrates ambulatory CPOE usage at 25% would receive partial credit for that metric. Usage at 50%, 75%, and 80% (the NPRM goal) would receive increasingly higher levels of credit.
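One way to picture that kind of scaled credit - with entirely made-up tiers, since CMS has not defined any:

# Hypothetical partial-credit schedule for one measure (ambulatory CPOE).
# The tiers and credit fractions are invented for illustration only.
def cpoe_partial_credit(usage_fraction: float) -> float:
    """Return the fraction of full credit earned for a given CPOE usage rate."""
    tiers = [(0.80, 1.00),   # meets the NPRM goal: full credit
             (0.75, 0.75),
             (0.50, 0.50),
             (0.25, 0.25)]
    for threshold, credit in tiers:
        if usage_fraction >= threshold:
            return credit
    return 0.0

for usage in (0.20, 0.25, 0.50, 0.75, 0.80):
    print(f"{usage:.0%} CPOE usage -> {cpoe_partial_credit(usage):.0%} credit")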

Regardless of the approach chosen, it's clear that small and large providers alike want some provision for partial credit. I look forward to the CMS comment disposition process which will address this theme.

Monday, March 15, 2010

In ONC I Trust

It's my nature to question authority.

Whether it's religion, politics, or even my local administrative leadership, authority figures must earn my trust.

Earning that trust is not easy. As folks who work closest with me know, I believe that much of Dilbert is based on true case studies.

Over the past year, I've worked very closely with many people at ONC - David Blumenthal, John Glaser, Judy Sparrow, Farzad Mostashari, Chuck Friedman, Carol Bean, Doug Fridsma, Chris Brancato, Jonathan Ishee, Arien Malec (on loan to ONC for 8 months), and Jodi Daniel. I've worked with HHS CTO Todd Park. I've worked with US CTO Aneesh Chopra.

They've earned my trust.

The ONC folks work long hours, nights, and weekends. They do not have a dogmatic philosophical, industry, or architectural bias. They are simply trying to move the ball forward to improve healthcare quality and efficiency using IT tools.

Meaningful Use is a brilliant construct. If it were not for meaningful use, the stimulus would simply be a hardware and software purchasing program. Clinicians would waste government dollars buying technology and never use it (or use it in limited ways such as revenue cycle automation). I've seen numerous technology programs fail because clinicians just give the technology to their kids or sell it on eBay. Meaningful use requires metrics of adoption as the measure of success. Clinicians only receive stimulus dollars AFTER they have fully adopted the technology.

NHIN Direct is a powerful idea. My blog is filled with entries suggesting that we need a reference implementation for simple transport of data packages (X12, NCPDP, HL7 v2, CDA, CCR) among payers, providers and patients. NHIN Direct will assemble energetic, well intentioned people to create open source software that solves real world transport problems. I'm serving on the NHIN Direct Implementation Group. We'll have running code, implementation guidance, and data use agreements by October.

I've enjoyed my 5 years harmonizing standards as part of HITSP. The tireless volunteers really made a difference. But there were issues. The AHIC Use Cases were overly complex. The Interoperability Specifications, which were designed to support the AHIC Use Cases, tightly coupled transport and content standards. It was challenging to use a portion of a use case to solve a limited real world problem. In HITSP's final contract year, the Tiger Teams did remarkable work creating highly reusable content, vocabulary, transport and security modules called capabilities and service collaborations that were much more aligned with ARRA and easier for implementers to understand.

The new Standards Harmonization framework being proposed by ONC using the National Information Exchange Model (NIEM) is something to be embraced, not feared. I've been misquoted saying something like "we'll extend the Department of Justice infrastructure to include healthcare." That's not at all what I said. My actual comments reflected on the wisdom of the NIEM methodology which follows the HITSP Tiger Team approach - define the business needs and find the parsimonious data content, vocabulary and transport standards to meet that need. NIEM methodology is consistent with CDA, CCR, and simple transport. It does not replace the decades of work that have already been done. Instead it provides a methodology for defining needs, selecting and developing standards, and implementing those standards in a testable, sustainable way. Over the next few weeks, I'll write about several recently issued RFPs that embrace NIEM methodologies, including:

*Office of the National Coordinator (ONC) CIO-SP2i Solicitation Number 10-233-SOL-00070 entitled "Standards and Interoperability Framework – Use Case Development and Functional Requirements for Interoperability."

*Office of the National Coordinator (ONC) CIO-SP2i Solicitation Number 10-233-SOL-00072 entitled "Harmonization of Standards and Interoperability Specifications."

*Office of the National Coordinator (ONC) CIO-SP2i Solicitation Number 10-233-SOL-00080 entitled "Standards and Interoperability Framework Standards Development."

I've written letters of support for responses to all these RFPs.

I was recently asked about the Certification NPRM and if the temporary process and permanent process might create market confusion by changing certification criteria after 2 years and requiring that clinicians replace the systems acquired under the temporary process. My answer was simple - ONC leaders would not let that happen. The people there understand that this is a journey and will ensure that change is managed as evolutionary phases, not revolutionary quantum leaps.

Finally, I trust the HIT Policy Committee and HIT Standards Committees. These folks are good people, with diverse backgrounds, and different points of view. You will not see hegemony of any single person or organization. All their calls and work are done in open public forums. They have included the best people with the greatest good of patients as their driving motivation.

We live in remarkable times, which I've called the "Greatest Healthcare IT generation" and the "Healthcare IT Good Old Days."

My advice - trust the ONC folks and Federal Advisory Committees. Join the process. Be open about your opinions. Feel free to disagree with any idea or policy. Democracy is messy, but the folks at ONC today have the right people and processes in place to harness our energy and turn it into guidance we can all embrace.

Friday, March 12, 2010

Cool Technology of the Week


I have long believed that fuel cell technology has the potential to be a high quality, green energy source that gives us alternatives to burning coal or relying on oil imports.

Of course, the promise of fuel cells has been slowed by high costs, and complex technology.

On 60 Minutes a few weeks ago, Bloom Energy introduced its next generation fuel cell technology - the Energy Server.

It's already in production at eBay, Walmart, Staples, and Google.

How does it work?

Here's a flash animation.

The company and its technology are still a bit mysterious. There are detractors who think this may be the next Cold Fusion. There are questions of reliability, maintainability and practicality.

Holy grail? Maybe. Cool Technology? Definitely.

Thursday, March 11, 2010

Subject Matter Experts

A challenge in all IT organizations is achieving a balance between central control and local/departmental autonomy. Our approach is to clearly define roles and responsibilities such that IT is responsible for infrastructure, databases, security, interfaces, and data integrity while partnering with the business owner for subject matter expertise. Here's the detail:

There are immediate and long-term elements of an application implementation and the ongoing support of the application that are beyond what the IT Department can reasonably be expected to manage, maintain or support.

These may include, but not necessarily be limited to: customer financial responsibilities associated with managing or maintaining the application; vendor or sales contacts and interactions specific to the product lifecycle, licensing or application functionality; user management issues specific to the use, functionality, workflow or impact of the application.

Some of these responsibilities may be assumed by the customer’s or department’s management staff (“Application Owner”) while others may be assumed by a staff member who possesses the depth of knowledge or expertise associated with or required to run the customer’s or department’s daily operation.

This person is usually referred to as the "Subject Matter Expert" or "SME". The SME is very important not only in assisting with the application's development and implementation but also in maintaining the long-term use and effectiveness of the application within the department.

The responsibilities outlined below are those that require ownership by the Application Owner and by the SME. They are specific to financial management, vendor relationship management or non-technical maintenance and support requirements that can be most effectively handled by the owner or SME. To clarify the use of the term “non-technical”: there is no requirement or expectation that the owner or SME will possess either knowledge or understanding about the workstation or server operating system, hardware or configuration or the programming, support or configuration of the software application.

A. Solution / Application Financial Management
* Budgeting and procurement of funds associated with the ongoing management and maintenance of the Solution/Application throughout its lifecycle, which may include but may not be limited to: hardware, software, licensing, maintenance and support contracts, and other equipment and/or services/agreements that may be required.
* Budgeting and procurement of funds that may be required to secure the services of a third-party vendor(s) for any solution component (hardware or software) that is not supported by the IS Department. The sponsor/ owner is responsible for all expenses and liability associated with any agreement(s) or service(s) contracted between them and the vendor.

B. Vendor Relationship Management
* Manage vendor relationship and maintain current vendor-related information; i.e., Account Manager and contact information
* Maintain product line awareness through routine communications with the vendor or your Sales Account Manager.
* Manage application licensing requirements (identify/forecast needs) and communicate expenses or requirements to your department’s financial resource.

C. Subject Matter Expert (SME) and Use Management
* Possess an understanding of the department’s business or clinical workflow requirements.
* Serve as the contact person to whom any HIPAA or compliance issues may be directed, specifically related to the data being entered into the application.
* Serve as the department’s liaison to the IS Help Desk to facilitate IS involvement in basic troubleshooting, problem identification and escalation paths to technical experts or the vendor.
* Maintain availability as the primary contact to whom the IS Department can communicate outages or technology issues that may impact the area’s productivity or ability to provide critical services.
* Identify, establish and maintain downtime procedures to deal with IT outages that may negatively impact the application’s availability and potentially the department’s productivity or ability to meet its mission or objectives.
* Facilitate departmental user training to ensure efficiency and optimal productivity by providing on-the-job training or working with the vendor to arrange for third-party training.
* Participate with IT staff and vendor representatives in discussions that involve the implementation of application (version) changes that may have an impact on the department’s workflow, productivity or the application’s (end-user’s) functionality.
* Maintain (working) knowledge of the application user interface; i.e., the ways in which the product is designed to interact with the user in terms of text menus, checkboxes, text or graphical information and keystrokes, mouse movements required to control the application; as well as, report creation/generation and other information about the use of the application that is essential to the day-to-day operation and productivity of the department.
* Perform any routine vendor-recommended or -required user-related performance or integrity checks.
* Create user-specific policies and/or procedures that may be required to address appropriate departmental use and functionality.
* Describe how critical the application is in meeting the mission and objectives of the department in providing services or in maintaining productivity.
* Maintain awareness of vendor version updates that may impact the department’s workflow, productivity or application’s (end-user’s) functionality.
* Serve as the primary contact for the vendor and the recipient of media related to application upgrades, updates, patches or other application-related materials or information.
* Work with IS to coordinate/facilitate required software upgrades/updates that may impact departmental/user productivity.
* Provide application-level account management responsibilities that may include: Defining user access rights or authorizing user accounts.
* Provide on-site first response for end-user issues related to end-user application performance or functionality issues.

The responsibilities listed above are discussed in detail by IT throughout the application’s implementation and will be reviewed with the SME when the Support Level Agreement (SLA) is finalized. If at any time during these discussions there are items that are unclear, ambiguous or that cannot be adequately managed or maintained by the SME or a resource within the application owner’s department, they should be immediately identified and discussed with the IS Manager.
If, during the application implementation or subsequent support discussions, it is determined that the solution/application is critical to the provision of patient care or to a core or enterprise-wide critical function, the application owner or SME will be asked to identify a backup Subject Matter Expert.

I hope this is helpful to your application implementations. It works for us!

Wednesday, March 10, 2010

Introducing NHIN Direct

Over the past 5 years, I've worked with many talented standards developers, implementation guide writers, and software vendor engineers. We've crafted use cases, selected standards, harmonized gaps/redundancies and written interoperability specifications.

I'm very happy with our achievements in content and vocabulary standards. We have excellent momentum and accelerating adoption.

Transmission is still an area requiring work. FHA Connect is a good start, but is challenging for small providers who have different use cases - the push of healthcare data from provider to provider, provider to payer, and provider to public health in support of Meaningful Use. Interoperability specifications and profiles for transmission have been written using combinations of existing standards from many SDOs. The resulting documents are not simple for smaller organizations with limited resources to use.

When I ask about creating simpler approaches, I'm told that these guides were the best that could be done to address the use cases with existing standards.

Here's a very controversial point - what if the standards we are starting with as we write interoperability specifications and profiles are not appropriate for creating simple, easy to use, internet-based data exchange that works for small organizations with limited resources?

The answer - we need a new, simpler approach that leverages REST, simple SOAP, and SMTP for data exchange. I believe NHIN Direct is that approach.
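To show the kind of simplicity I have in mind, here's a sketch of a pushed exchange over plain SMTP with TLS. The addresses, server, credentials, and attachment are placeholders - NHIN Direct has not settled on addressing, trust, or security details, so treat this as an illustration of the idea, not a specification:

# Illustrative "simple push" of a clinical document over SMTP/TLS.
# Server, credentials, and addresses are placeholders.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "drjones@direct.examplepractice.org"
msg["To"] = "intake@direct.examplehospital.org"
msg["Subject"] = "Patient summary (referral)"
msg.set_content("Attached: CCD patient summary for the referred patient.")

with open("patient_summary.xml", "rb") as f:   # placeholder CCD file
    msg.add_attachment(f.read(), maintype="application", subtype="xml",
                       filename="patient_summary.xml")

with smtplib.SMTP("smtp.direct.examplepractice.org", 587) as server:
    server.starttls()                          # encrypt the session
    server.login("drjones", "app-password")    # placeholder credentials
    server.send_message(msg)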

Here are a few highlights from the NHIN Direct FAQ page:

What is NHIN Direct?
NHIN Direct is the set of standards, policies and services that enable simple, secure transport of health information between authorized care providers. NHIN Direct enables standards-based health information exchange in support of core Stage 1 Meaningful Use measures, including communication of summary care records, referrals, discharge summaries and other clinical documents in support of continuity of care and medication reconciliation, and communication of laboratory results to providers.

Why NHIN Direct?
There is a need to extend the NHIN to support a broader set of participants and providers through a simple, standards-based, widely deployed and well-supported method for providers to securely transport health information using the Internet in support of the core Meaningful Use outcomes and measures.

What is the relationship between NHIN Direct and the currently described NHIN Architecture?
The currently described NHIN Architecture describes a method for universal patient lookup and document discovery and exchange between National Health Information Organizations, including Federal providers such as the Veterans Health Administration, Department of Defense Military Health System, RHIOs, and large IDNs. NHIN Direct supports cases of pushed communication between providers, hospitals, laboratories, and other health settings of care.
The current members of the NHIN Collaborative will be able to support the NHIN Direct model, and providers and enabling organizations for NHIN Direct will scale to support the discovery and exchange use cases. Both models are required and will be in use at the same time for the same participants, depending on the information exchange needs.

Does NHIN Direct replace the current NHIN model? Or is NHIN Direct the current NHIN model on “training wheels”?
No. NHIN Direct and the current NHIN model support different use cases and are coequal in a system of robust nationwide health information exchange.

How will the specifications and standards for NHIN Direct be developed?
The specifications and standards will be developed in a rapid, open process intended to draw from a varied set of stakeholders representing both public and private providers and technology enablers.

What NHIN Direct doesn’t solve
In order to create rapid innovation, we are deliberately constraining the scope of NHIN Direct to a spare set of specifications and standards that solve a well-defined pain point. Unless a particular capability is essential to support the core use cases, we will leave it out or defer it to a later date. In doing so, we do not intend to devalue any particular health information exchange area or need, but merely to define a scope that both advances the state of nationwide health information exchange and is achievable in the short term.

How can I or my organization participate?
There are three basic ways to participate:
1. A core group of NHIN Direct stakeholders will come together frequently from March through the end of the year to develop iteratively the core enabling specifications and service descriptions, and test those specifications with working code in both demonstration and real-world implementation contexts. To enable close collaboration, the core group is expected to include 5-8 stakeholders who commit to active participation, code development and contribution, and, most importantly, to implement the resulting specifications and services in a real-world setting that demonstrates the core use cases.
2. The NHIN Direct work will be conducted in an open manner, with ample opportunities for participation. We welcome comment and feedback, working code, code contribution to the open source reference implementation, and implementations of the specifications in different technologies.
3. Technology enablers may passively participate in the standards development work, by monitoring the work and resulting specifications, implementation guides and reference technology implementations, and then actively participate in late 2010 and in 2011 by building the core NHIN Direct services into EHRs, HIEs, and other healthcare technology implementations.


The NHIN Direct effort's philosophy is expressed in its design rules:

The golden rule of standards development - "rough consensus, working code" - will be applied to this effort.

Discuss disagreements in terms of goals and outcomes, not in terms of specific technical implementations.

The NHIN Direct project will adhere to the following design principles, agreed to by the HIT Standards Committee based on the feedback provided to the Implementation Workgroup:

Keep it simple; think big, but start small; recommend standards as minimal as possible to support the business goal and then build as you go.

Don’t let “perfect” be the enemy of “good enough”; go for the 80% that everyone can agree on; get everyone to send the basics (medications, problem list, allergies, labs) before focusing on the more obscure.

Keep the implementation cost as low as possible; eliminate any royalties or other expenses associated with the use of standards.

Design for the little guy so that all participants can adopt the standard and not just the best resourced.

Do not try to create a one-size-fits-all standard; it will be too heavy for the simple use cases.

Separate content standards from transmission standards; i.e., if CCD is the HTML, what is the HTTPS? (A small sketch of this separation follows this list.)

Create publicly available controlled vocabularies & code sets that are easily accessible and downloadable.

Leverage the web for transport whenever possible to decrease complexity & the implementers’ learning curve (“health internet”).

Create Implementation Guides that are human readable, have working examples, and include testing tools.
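
As promised above, here's a small sketch of the content/transport separation: the CCD (or CCR) payload is just bytes, and the transport is a pluggable function. The URL and endpoint below are hypothetical, chosen only for illustration.

# The content standard never changes; only the transport is swapped.
from typing import Callable
import urllib.request

Transport = Callable[[bytes], None]

def post_over_https(url: str) -> Transport:
    """Return a transport that POSTs a document to a (hypothetical) receiving endpoint."""
    def send(document: bytes) -> None:
        req = urllib.request.Request(
            url, data=document, method="POST",
            headers={"Content-Type": "application/xml"})
        with urllib.request.urlopen(req) as resp:
            resp.read()   # a real client would check the status and response
    return send

def exchange(document: bytes, transport: Transport) -> None:
    """Hand the same document bytes to whichever transport the partners agree on."""
    transport(document)

# The same summary.xml could be handed to the SMTP-style transport sketched
# earlier or to this HTTPS transport - the document itself is untouched.
# exchange(open("summary.xml", "rb").read(),
#          post_over_https("https://receiver.example/documents"))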

I look forward to the efforts of NHIN Direct. As always with emerging technologies, I'm eager to be an early adopter, beta tester, and active contributor.

Tuesday, March 9, 2010

The HIT Standards Committee Comments on the IFR

On Friday, the HIT Standards Committee submitted its comments on the Interim Final Rule to ONC. Below is a summary of the documents we submitted, which will soon be posted on the publicly available comments site.

Clinical Operations
1. Recognizing that standards evolve and regulations are hard to change, we recommended that the IFR specify broad families of standards, stating the major version of each standard, accompanied by a detailed implementation guide that serves as a floor. For example, HL7 Version 2 should be used for laboratory result reporting, and the HL7 2.5.1 Implementation Guide is the recommended floor. HL7 2.5.1 will evolve, which is fine, since any new implementation guidance will still be in the HL7 Version 2 family, although it could be HL7 2.6, 2.7, etc. Such an approach future-proofs the regulation. Here are the families of standards we recommended:

Patient Summary - HL7 Version 3 Clinical Document Architecture (of which CCD is one example) and the ASTM E2369 CCR
Medications - NCPDP Script
Administrative Transactions - X12 4010 and 5010
Quality Reporting - XML
Population Health including labs, biosurveillance and immunizations - HL7 Version 2

2. The implementation guides we recommended as floors are:

Patient Summary - HITSP C32 2.5

Medications - NCPDP 8.1 and 10.6 (realizing that 10.6 is the emerging standard but 8.1 is the Medicare Part D requirement)

Administrative Transactions - CAQH Core 1 Operating rules applied to 4010 and to 5010 as that guidance becomes available

Quality Reporting - PQRI XML 2008 Registry

Public Health Labs (a minimal ORU^R01 parsing sketch appears at the end of this section)
HL7 US Realm Interoperability Specification - Lab Result Message to EHR ORU^R01, HL7 Version 2.5.1, July 2007
HL7 2.5.1 Laboratory Result Reporting to Public Health (Release 1)

Public Health Surveillance
Public Health Information Network HL7 Version 2.5 Message Structure Specifications for national condition reporting, Final Version 1.0, August 18, 2007, CDC
Message Structure Specification v 1.0 Errata and Clarification 05/23/2008, CDC

Public Health Immunizations
Implementation Guide for Immunization Data Transactions using Version 2.3.1 of the HL7 Standard Protocol (published 6/2006), CDC

3. We also recommended several vocabulary starter sets and mappings to accelerate adoption of controlled terminologies (examples are the NLM SNOMED CT CORE Problem List Subset, RxNorm, and the LOINC most frequently ordered subset), and that cross maps be made generally available, such as:

SNOMED CT to ICD9/10 mapping
SNOMED CT to CPT4/HCPCS mapping
SNOMED CT to LOINC mapping

4. We recommended that Vital Signs be encoded in SNOMED or LOINC

5. We asked for clarification that the scope of required standards is limited to external exchanges between organizations.
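
As promised in the Public Health Labs item, here's a minimal sketch of what consuming an HL7 Version 2 ORU^R01 lab result looks like. The message below is deliberately simplified and is not a conformant example of the cited implementation guides; production systems should use a proper HL7 parsing library rather than naive string splitting.

# A deliberately simplified ORU^R01 lab result (illustrative only).
SAMPLE_ORU = "\r".join([
    "MSH|^~\\&|LAB|EXAMPLE_LAB|EHR|EXAMPLE_CLINIC|201003081200||ORU^R01|MSG0001|P|2.5.1",
    "PID|1||12345^^^EXAMPLE_MRN||DOE^JANE",
    "OBR|1|||2345-7^Glucose^LN",
    "OBX|1|NM|2345-7^Glucose^LN||95|mg/dL|70-105|N|||F",
])

def parse_results(message: str):
    """Yield (loinc_code, name, value, units) for each OBX segment."""
    for segment in message.split("\r"):
        fields = segment.split("|")
        if fields[0] != "OBX":
            continue
        identifier = fields[3].split("^")        # e.g. 2345-7^Glucose^LN
        code = identifier[0]
        name = identifier[1] if len(identifier) > 1 else ""
        yield code, name, fields[5], fields[6]   # observation value and units

for code, name, value, units in parse_results(SAMPLE_ORU):
    print(f"LOINC {code} {name}: {value} {units}")   # LOINC 2345-7 Glucose: 95 mg/dL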

Clinical Quality
1. We recommended that 2011 include a controlled vocabulary for medication allergies at the drug level as is needed for drug/allergy interaction checking and reporting. UNII (part of Stage II recommendations) is at the ingredient level, which is more than is needed for 2011 quality measure reporting.

2. We recommended that 2011 include a controlled vocabulary for Vital Signs - SNOMED and LOINC with LOINC preferred. Vital signs are needed in 2011 for hypertension control and body mass index reporting.

3. We recommended that 2011 include standardized Units of Measure - UCUM. Standardized units are required to calculate measures that use lab results, medication dosages, and vital signs. (A small worked example appears at the end of this section.)

4. We noted that CCR is a fine patient summary standard, but for other uses such as reporting the actors/actions/events in clinical workflow needed for quality measurement, CCD is preferred over CCR.

5. We noted that changing from paper-based attestation to PQRI XML to a CDA-based reporting standard - 3 changes in 3 years - would be burdensome. We recommended instead that professionals and hospitals follow existing CMS requirements at the time of reporting. For example, hospitals currently have well specified requirements and supporting processes for reporting to the CMS Hospital Compare website.
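
On the units point above, here's a tiny worked example of why a measure like body mass index needs explicitly standardized units. The UCUM codes shown ("kg", "[lb_av]", "m", "[in_i]") and the conversion factors are my illustration, not a quality measure specification.

# BMI = weight (kg) / height (m)^2. If units arrive unlabeled or inconsistent,
# the same calculation silently produces nonsense.
TO_KG = {"kg": 1.0, "[lb_av]": 0.45359237}          # avoirdupois pound
TO_M = {"m": 1.0, "cm": 0.01, "[in_i]": 0.0254}     # international inch

def bmi(weight: float, weight_unit: str, height: float, height_unit: str) -> float:
    """Body mass index in kg/m2, computed from explicitly labeled units."""
    kilograms = weight * TO_KG[weight_unit]
    meters = height * TO_M[height_unit]
    return kilograms / (meters * meters)

print(round(bmi(70, "kg", 1.75, "m"), 1))            # 22.9
print(round(bmi(154, "[lb_av]", 69, "[in_i]"), 1))   # 22.7 - roughly the same patient in customary units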

Privacy and Security
1. Recognizing that EHRs and EHR Modules may be used to achieve meaningful use, we recommended that an organization consider the security implications of using a collection of modules. Some modules may not involve data exchange and thus do not need certain security features such as encryption. Others may need to comply with the same data exchange protections as a complete EHR. Using the same terminology embraced by HIPAA, we suggested that the security of EHR Modules should be "addressable" - analyzed and secured appropriately by the implementing organization.

2. The IFR preamble contains lists of example standards. We considered including these examples in the regulation itself but ultimately recommended that they not be specified, since security standards evolve rapidly. Instead we recommended that a list of acceptable technology standards be included in the certification process.

3. The IFR requires providers to implement online access to records for their patients. We felt that the definition of "online" was ambiguous. We recommended that as a floor, the provider should make an electronic copy available to the patient such as via a simple download.

4. There are two popular forms of encryption - symmetric and asymmetric (public key). We recommended that both be acceptable and that AES be required when using symmetric key approaches. (A brief sketch appears at the end of this section.)

5. I frequently blog about the need for standardized, simple approaches to data transmission. Tomorrow's blog will be about the exciting NHIN Direct effort to accelerate this work. The IFR provides very vague transmission standards guidance - REST and SOAP. There are really two choices - no guidance or very specific guidance. Vague guidance is not helpful. We recommended that the provision listing REST and SOAP without additional detail be removed.

6. We noted that Accounting of Disclosures is a 2015 Meaningful Use criterion, yet ARRA requires accounting for disclosure in 2009 (or the date of acquisition) for entities which acquire EHRs after January 1, 2009. We recommended a joint meeting of the Policy and Standards Committee Workgroups on Privacy and Security to align the ARRA, Meaningful Use, and Standards timelines. We also recommended that the Office for Civil Rights consider the ASTM E2147 standard as a simple list of required data elements for audit and disclosure.
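
On the encryption comment (item 4), here's a minimal sketch of the symmetric case - AES in GCM mode via the third-party Python "cryptography" package. It illustrates the AES recommendation only; key distribution, which is where asymmetric (public key) approaches come in, is deliberately out of scope.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit shared secret
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # never reuse a nonce with the same key

document = b"<ClinicalDocument>...</ClinicalDocument>"
ciphertext = aesgcm.encrypt(nonce, document, None)   # confidentiality plus integrity
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
assert plaintext == document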


The comment period closes in one week and I look forward to the next revision of the Interim Final Rule as it evolves into regulation guiding all our interoperability efforts.

Monday, March 8, 2010

If It's Tuesday, This Must be Tokyo

A quick but eventful trip to Japan from March 3-7. My trip was funded by the University of Tokyo as part of an academic visit and not related to any company or product.

Two hours after landing in Narita, I had dinner with my hosts in Tokyo at La Rochelle, a French-Japanese fusion restaurant run by Iron Chef Hiroyuki Sakai. Chef Sakai prepared several novel vegan dishes for me using fresh Japanese mushrooms and vegetables.

The joy of the 14 hour time difference between the US East Coast and Japan is that I can work the Japanese day and the Boston day in the same 24 hour period. After the welcome dinner, the Boston business day began and I worked on several projects related to Meaningful Use - Federal, State and Local.

The morning brought 12 hours of lecturing, meeting, and greeting with Japanese healthcare policy and technology experts, discussing the Japanese version of the healthcare stimulus plan. (250 billion Yen, which is approximately 2.8 billion dollars). Critical issues for the Japanese are data security, data sharing consent, standards, reducing competitive disincentives to healthcare information exchange, and lack of EHR adoption among ambulatory clinicians.

After a great conference day, I said goodbye to my hosts and had 36 hours on my own before my flight left. My evening was filled with collaboration among members of the HIT Standards Committee to finish the comments on the Interim Final Rule (will be my blog tomorrow) and putting the finishing touches on a Health Affairs article (it seems to be a yearly tradition that I go to Japan and spend my nights writing a Health Affairs article for the annual Healthcare IT issue). At 7am I left the hotel and began the adventure I described in Friday's blog - a traverse of the Takao ridges. A truly remarkable experience.

In case you find yourself in Tokyo, here's my brief description of the hike.

From anywhere in Tokyo, take the Yamanote line to Shinjuku station. From there, transfer to the Keio line, and take a limited express bound for Kitano. Once there, take a local bound for Takao-san-guchi. Exit the station to the right and you'll find yourself at the trailhead. There are 6 possible trails up Mt. Takao. I recommend trail #1 - the longest, most scenic, and least traveled. However, Mt. Takao is a very popular destination, so "less crowded" is relative. The good news is that few people travel the ridge beyond Mt. Takao. After reaching the top, go to the East end of the summit and take the trail to Shiro-yama, the next summit. At that point, the crowds disappear. The trail to the next peak, Kagenobo-yama, is an amazing ridge filled with cedar trees (sugi). The trail to Meio-toge is isolated, wild, and follows the track of the Japanese version of the Appalachian Trail (Kanto Fureai-no-michi). The final peak is Jimba-san with its famous war horse statue. A descent via Wada pass to the Jimba Kogen-shita bus stop, a bus ride on the #32 and #5 buses to Hachioji, and a train from either Keio or JR Hachioji station to Shinjuku completes the day. Book time for the hike is 5-7 hours (plus 3 hours for train/bus travel) and the distance covered is 12 miles, with a few thousand feet of elevation gain.

This morning, I checked out at 6am, took the Chuo line to Shinjuku, stashed my bags in a locker (fully electronic - you are given a PIN number to open the locker instead of a key), took the Yamanote line to Shinagawa station, then the Keihin Kyuko Rapid Limited Express to Miura-kaigan on the Miura Peninsula, an area of wide sandy beaches, rocky coastline, and a great old lighthouse. After walking the Peninsula, I took the Zushi line train to Jimmu station and hiked the temple route (the photo above), passing plum blossoms in bloom, Jizo statues, and the peak of Mt. Takatori. I was completely alone the entire trip.

From there, I took the Keikyu line back to Shinagawa, the Yamanote line back to Shinjuku, picked up my bags, and took the Narita Express to the airport, changing from my hiking clothes into a suit while on the train. I just landed from my 18 hour commute back to Boston.

In all my travels above, a rudimentary knowledge of Japanese really helped, since most signs in the wilderness are only in Japanese, such as this one which indicates the way to the train station (good luck figuring it out, since both West and South seem to be the right answers). How did I get by? The following expression:

Sumimasen (fill in the name of where you want to go) wa doko desu ka?

which translates into

Excuse me, (placename) it where is?

This saved the day many times, since maps, signs, and guidebooks were often wrong.

The Japanese people are gracious, helpful, and eager to provide directions to foreign travelers.

I succeeded in my quest to find the road less traveled. Of course, the fact that it was 38 degrees and raining, and the trails were ankle deep with mud, may have been a disincentive to other travelers.

A great trip! Thanks so much to my friends in Japan who made it happen.

Friday, March 5, 2010

The Pleasures and Risks of Solo Hiking

My trip to Japan this week is a 15 hour commute, two days in Tokyo speaking/meeting with colleagues, one day hiking along mountain ridges 2 hours west of Tokyo, and then a flight back to Boston.

In my travels around the world, I'm always looking for the road less traveled. In the past few years, that's included walking the Seven Hills of Rome, exploring archeological sites in the Middle East, climbing mountains in Austria, and kayaking across the Baltic Sea.

Because of the logistics, physical conditions, and specialized gear needed to do these activities, I've often traveled alone, going Into the Wild.

When I travel alone I take extra precautions, packing a bit of extra food, a spare layer of warm clothing, and posting my itinerary with someone who can call for a rescue party if needed.

I will not do solo unroped climbing, solo travel in avalanche areas or solo kayaking in water that is colder than 50 degrees.

There are a few pleasures to traveling alone - a pace that you define yourself based on your personal energy level, no time commitments, and simpler logistics.

There are risks - an injury while on a cold mountain ridge can lead to hypothermia, frostbite, or death. The margin of safety while hiking alone in wilderness areas, especially in winter, can be thin.

A hiking partner enables you to share the memories and relive the experiences. This winter, I've hiked alone many weekends, but also hiked with one of my colleagues from BIDMC who is an experienced alpinist.

Tomorrow's hike in Japan will cover 20 kilometers of the North Takao Ridge, from Mt. Takao to Jimba-san, traversing 3 peaks and stretching my knowledge of the Japanese rail and bus systems to get to the trailhead. (Addendum - I added a photo above of the Ridge and cedars in the mist that I captured on my Blackberry while hiking)

So, I'm off to experience the Japanese wilderness alone, with minimal risk, and play my Japanese flute from the peaks.

Buckaroo Banzai would be proud.

Thursday, March 4, 2010

The Certification NPRM arrives

The last of the three HITECH-related regulations was released by HHS on March 2 - the NPRM on Certification. It's 184 pages and will soon be published in the Federal Register.

Its major feature is that it creates two certification processes - a temporary one to ensure there is a path to certification in time for Stage 1 of Meaningful Use and a more permanent one.

My sense is that ONC consulted NIST and realized that setting up a comprehensive multi-organizational certification process would take until late 2010, leaving no time for products to be certified before the Stage 1 Meaningful Use funding milestone (January 2011).

Here's a brief overview of the Certification NPRM from the FAQ Section of the HHS IT Website.

As I've posted for the other regulations, here's a bookmarked version of the NPRM on Certification. Thanks to Michelle Wood for this one.

Robin Raiford completed a bookmarked version of the NPRM on Certification as published in the Federal Register. Thanks for all your effort!

Tuesday, March 2, 2010

Dispatch from HIMSS

I've just finished my day in Atlanta and am beginning a commute to Tokyo.

Every year, I describe my top 10 impressions from HIMSS. Here's my summary of the event for 2010:

1. Meaningful Use is everywhere. Vendors are promising EHRs, modules, appliances, and services to help clinicians achieve it. I had dinner on Monday night in a small Indian vegetarian restaurant. Sitting next to me were 3 engineers from Bangalore who were arguing about the details of Meaningful Use in between bites of vegetable curry. I could not escape Meaningful Use anywhere!

2. Certification is everywhere. It's particularly ironic that many vendors claimed their systems were certified, even though the certification NPRM was just released today, making compliance with the new certification process in time for HIMSS impossible.

3. Cloud computing, Software as a Service and ASP models are popular tactics to accelerate EHR rollouts. There are still lingering concerns about how to ensure privacy in a cloud environment.

4. Several firms such as InterSystems, Axolotl, and Medicity are offering HIE platforms that include many of the standards noted in the IFR. The marketplace for HIE products is just emerging and it's hard to predict who will become the market leader.

5. The Continuity of Care Document is gaining traction. I found many vendors supporting CCD exports from their EHRs. A company called M*Modal has developed natural language processing technology that captures dictated content in its original context (ontology-driven rules) as a CDA document.

6. Consultants abound. It's clear that Regional Extension Centers and Health Information Exchanges will require expertise and staffing from professional firms. They all had large booths at HIMSS.

7. 30,000 people attended, including 10,000 I did not recognize (just kidding). It's clear to me that many IT professionals, even those with limited healthcare domain expertise, attended HIMSS to better understand how they could participate in the euphoria of HITECH stimulus dollars.

8. Self service kiosks for patient identification and self-registration are now mainstream. Just as we print our airline boarding passes, we can now use credit cards or biometrics to check into ambulatory care appointments and automatically settle all co-pay balances.

9. Image exchange in the cloud is being offered by several vendors. As I mentioned in Monday's blog, Symantec announced an appliance for small clinician offices that cloud-enables all imaging modalities using a Facebook-like social networking invitation to share/view images.

10. PHRs and patient engagement are becoming more mainstream. Google and Microsoft continue to innovate in the non-tethered PHR marketplace.

I left HIMSS with a feeling of hope. Our industry is vibrant, clinicians are engaged, our goals are clear, and resources are becoming available.

I'll be commuting over the next 24 hours, but when I land, I'll publish my analysis of the Certification NPRM.