I was recently asked what computers I personally run in the office and at home.
I exclusively use a Dell Latitude D420 1.06 GHz Core Solo with 1 GB of RAM running Ubuntu Gutsy Gibbon Linux. The total weight of this 12" laptop with a 3 hour battery is 3 lbs 4 oz. Boot times average 40 seconds from power on to network connected. I run only open source software for all work that I do - OpenOffice 2.3, Firefox 2.0.0.8, and Evolution 2.12 email connected to our corporate Exchange Server.
My wife (a graphic artist) uses a MacBook 13" 2 GHz Core Duo with 2 GB of RAM running OS X Leopard. The total weight of this laptop with a 2 hour battery is 5 lbs 2 oz. Boot times average 35 seconds from power on to network connected. She runs a full suite of Adobe CS3 products.
My daughter (a high school student using Microsoft Office, instant messaging and multi-player role playing games) uses a Lenovo T61 2 GHz Core 2 Duo with 2 GB of RAM running Vista. The total weight of this laptop with a 2 hour battery is 5 lbs 4 oz. Boot times average 60 seconds from power on to the Ctrl-Alt-Delete sign-on. Power on to network connected averages 2 minutes total with a clean Vista install. Although it's not a Gibbon or a Leopard, Vista is certainly a resource-consumptive beast.
Early this week, I installed Leopard on our MacBook. The upgrade from Tiger was automatic and effortless. One great feature is that Macs with Leopard can now search for files on Windows and Linux machines. From my wife's MacBook, I was able to search files on Ubuntu and Vista seamlessly using Spotlight. Our entire household is now sharing and searching files over our 802.11a wireless network. To enable Windows and Linux to search the Mac, I had to enable SMB file sharing in System Preferences > Sharing.
So what did I think of the new Apple Leopard OS X operating system?
As a CIO, I am most concerned about reliability, security, and interoperability. As a technology user, I care about simplicity, intuitiveness, and getting my work done efficiently. I evaluated Leopard wearing both hats.
Most users do not have time for training, so I explored Leopard's new features without the benefit of any instructional materials to evaluate their intuitiveness. My first exercise was to use the Finder to explore PDFs, documents, spreadsheets and presentations via Cover Flow and Quick Look, two related document preview features that I think of as "sip" and "drink". Cover Flow shows the first page of any document or multimedia file. Quick Look enables a detailed review of the entire file without having to launch an application. These worked well for all my file types.
I then moved on to the Spotlight feature to search files on all computers in the household. Mac, Windows and Linux were instantly searchable.
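Spotlight handles this natively from the Mac. For the Ubuntu and Vista machines, a rough command-line equivalent can be scripted once the household shares are mounted. A minimal sketch in Python, assuming the SMB shares are already mounted (the mount points and search term below are hypothetical):

```python
import os

# Hypothetical mount points for the household SMB shares - adjust to your setup.
SHARES = ["/Volumes/ubuntu-home", "/Volumes/vista-docs"]
QUERY = "budget"  # simple filename-substring search, not full-text like Spotlight

for share in SHARES:
    for dirpath, dirnames, filenames in os.walk(share):
        for name in filenames:
            if QUERY.lower() in name.lower():
                print(os.path.join(dirpath, name))
```

Unlike Spotlight, this only matches file names rather than file contents, but it illustrates how little glue is needed once every machine's files are visible on the network.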
Generally, I keep my Mac desktop uncluttered, using the Dock to manage applications and the Finder to locate needed files. The new Stacks feature enables me to create a "fan" or "grid" of my most commonly used files and automatically organizes my internet downloads. I found Stacks helpful in that fewer keystrokes are needed to retrieve commonly used files.
As part of a project to assess flexible work arrangements (see my blog entry on flexible work arrangements), I've recently been testing video teleconferencing, instant messaging, and groupware presentation tools. Since the new iChat features promised enhanced collaboration tools, I was eager to test drive the new remote desktop and document sharing capabilities. I found incredible clarity with the Mac-to-Mac video teleconferencing tools, which incorporate the H.264 video and AAC-LD audio codecs. Shared PowerPoint presentations and shared desktops worked well over Google Talk and AOL's AIM.
Finally, I tested Time Machine, a backup and restore system similar to snapshot features I've used on IBM/Lenovo laptops. Having the ability to restore any deleted file or reverse any accidental edit worked perfectly for me and will be very popular among users.
So, how would I rate these new features as a CIO? During all of my testing, Leopard booted, performed, suspended, restored, and shut down without a hitch. Security of OS X is strong because of its Unix foundation. Interoperability of iChat and Spotlight across operating systems was excellent.
As a technology user, I found the new GUI features easy to navigate without training. As with previous releases of Mac OS X, the operating system hides the complexity of Unix in a way that enhances my ability to focus on work rather than the OS. My only complaint is one that I've voiced before: I want to install Leopard on non-Apple hardware, so that I can have OS X support on my 12" road warrior Dell subnotebook. In the meantime, I'll continue running Ubuntu Gutsy Gibbon, as it's the closest thing to Leopard that runs on a Dell. Steve Jobs - when's the subnotebook coming?
Wednesday, October 31, 2007
Tuesday, October 30, 2007
The End of Polio
A medical student today will go through their entire career without ever seeing a case of polio. The polio virus was feared throughout the early 20th century, leaving millions paralyzed or dead. During the summer and autumn months, epidemics of this highly contagious disease spread from human to human. In the 1940s and '50s, negative pressure ventilators called the "iron lung" were used to support
Personal Health Records
The exact definition of a personal health record (PHR) is still evolving, but personal health records hold the promise to make patients the stewards of their own medical data. PHRs may contain data from payer claims databases, clinician electronic health records, pharmacy dispensing records, commercial laboratory results and patient self-entered data. They may include decision support features, convenience functions such as appointment making/requesting referrals/medication refill workflow, and bill paying. However, most PHRs are not standards-based and few support an easy way to transport records among different PHR products.
The current landscape of PHRs includes four basic models:
Provider Hosted Patient Portal to the clinician's Electronic Health Record - in this model, patients have access to provider record data from hospitals and clinics via a secure web portal connected to existing clinical information systems. Examples of this approach include MyChart at the Palo Alto Medical Clinic, PatientSite at Beth Israel Deaconess Medical Center, and MyChart at the Cleveland Clinic. The funding for provider-based PHRs generally comes from the marketing department, since PHRs are a powerful way to recruit and retain patients. The Healthcare Quality Department may also fund them to enhance patient safety, since PHRs can support medication reconciliation workflows.
Payer Hosted Patient Portal to the payer claims database - in this model, patients have access to administrative claims data such as discharge diagnoses, reimbursed medications, and lab tests ordered. Few payer hosted systems contain actual lab result data, but many payers are now working with labs to obtain it. Additionally, the members of America's Health Insurance Plans (AHIP) are working together to enable the transport of electronic claims data between payers when patients move between plans, enhancing continuity of care. The funding for payer-based PHRs is based on reducing total claims to the payer through enrollment of patients in disease management programs and enhanced coordination of care.
Employer Sponsored - in this model, employees can access their claims data and benefit information via a portal hosted by an independent outsourcing partner. An example of this is the collaborative effort of Pitney Bowes, Wal-Mart, Intel and others to offer Dossia, an open source application which enables patients to retrieve their own data. The funding for employer-based personal health records is based on reducing total healthcare costs to the employer through wellness and coordination of care. A healthy employee is a more productive employee.
Vendor hosted - several vendors are releasing products in 2007-2008 to serve as a secure container for patients to retrieve, store and manipulate their own health records. Microsoft's HealthVault includes uploading and storage of records as well as a health search engine. In a recent New York Times article, Google is reported to be offering similar features in late 2007. The business model for these PHRs is generally based on attracting more users to advertising-based websites, although the PHR itself may be ad free.
All of these models will be empowered by data standards for demographics, problem lists, medications, allergies, family history, the genome, labs, and text narrative. The Healthcare Information Technology Standards Panel (HITSP) completed an initial set of interoperability specifications for demographics, medications, allergies and advance directives in 2006. In 2007, it completed problem lists, labs, and text narrative exchanged over networks and on physical media such as thumb drives and DVDs. In 2008, it will complete family history and the standards required to securely transmit genomic information. These standards will be used as part of the Certification Commission for Healthcare Information Technology (CCHIT) certification criteria for electronic health records and personal health records over the next 3 years.
Another aspect of interoperability is interfacing home monitoring devices such as glucometers, scales, blood pressure cuffs and spirometers to personal health records. At present, most patients using these devices must manually type results into PHRs or call them into a provider because of the lack of uniform data standards in devices, EHRs, and PHRs. In 2008, HITSP will identify standards to ensure vital signs and glucose monitoring devices are interoperable. Continua is building a great foundation for this process by working with IEEE, HL7 and other SDOs to identify the most appropriate standards to support device interoperability and to identify gaps in current standards. HITSP and Continua will work collaboratively on standards selection in 2008 and beyond.
Privacy and Security are critical to health data exchanges between PHRs and EHRs. Privacy is the policy which protects confidentiality. Security is the technical means to ensure patient data is released to the right person, for the right reason, at the right time to protect confidentiality. The US currently lacks a uniform privacy policy for clinical data exchange. Local implementations are highly variable; some organizations use opt-in consent, others use opt-out. The personal health record can help address this lack of policy. By placing the patient at the center of healthcare data exchange and empowering the patient to become the steward of their own data, patient confidentiality becomes the personal responsibility of every participating patient. Patients could retrieve their records, apply privacy controls, and then share their data as needed with just those who need to know. Since policies are local, the security standards built into PHRs need to be flexible enough to support significant heterogeneity.
HITSP has selected the security standards for the country, which include audit trails, consent management, role-based access control, federated trust, and authentication. Personal health record vendors and device manufacturers will be empowered by these standards, which outline best practices for securing patient-identified data transmitted between systems. Once patients trust the security of the network used to exchange data, adoption of personal health records and data exchange among payers and providers will markedly increase. Eventually, PHRs could also hold consent information, recorded via the HITSP Consent Management Interoperability Specification, that can provide an easily queryable source for patient health information exchange preferences.
The evolution of today's paper-based, non-standardized, unstructured text medical record into a fully electronic, vocabulary-controlled, structured, interoperable document shared among patients and providers will be a journey. Standards are key, and recent work in this area now provides the foundation for personal health records. Security technology that is good enough exists today. Early experiences with PHRs demonstrate high patient satisfaction, reduced phone volume to provider offices, and less litigation by patients who share medical decision making with their clinicians. The time to implement PHRs is now, and the only barriers are organizational and political, not technological.
A Green Approach to Storage
Recently, several folks from the press have asked me about "green" approaches to data storage, since we have 200 terabytes (that's 200,000,000,000,000 bytes) of medical records online. Over the past year, we've begun the journey to reduce the power consumption of storing data.
1. Higher capacity drives. Just a few years ago, our drives were a few dozen gigabytes each and we had to keep a large number of drives powered up in our data center. Now, higher capacity drives (750 gigabytes each) mean fewer devices consuming less power. We also use Serial Advanced Technology Attachment (SATA) drives, which rotate more slowly and consume less power. Many folks were worried about the performance of these slower rotating drives. We did a pilot and put half our users on the SATA drives and half on Fibre Channel. Then we moved everyone to SATA. No one noticed the change.
2. Reducing the amount of data stored. My need for storage is growing at 25% per year and despite all my attempts to reduce demand (i.e. deleting all MP3 files on the network every day), demand is hard to control. Recently, we've used de-duplication techniques to help with this. Our email system keeps only one copy of an attachment sent to our 5000 employees, not 5000 copies. Our backup systems de-duplicate files, so only one copy is stored. We've seen a 50% reduction in the space needed for archiving because of this, reducing the number of storage devices and the electricity needed to power them. (A rough sketch of how content-hash de-duplication works follows this list.)
3. Spin-down and slow-down technologies - turning off or slowing unused drives. Spin-down is controversial. Some believe the benefits are overstated because periodic wake-ups of the disk for integrity checks and the like may consume more energy and shorten disk life cycles. Many vendors seem to favor slow-down technologies, especially for backup media such as Virtual Tape Libraries (disk emulating tape). Long term, the prediction is that solid state (flash-based) drive costs will drop enough to permit more frequent use. These are more energy efficient and have no moving parts, making them easier to manage from an energy perspective.
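For readers curious about the mechanics, here is a minimal sketch of file-level de-duplication by content hash. It is only an illustration of the principle; commercial email and backup systems typically de-duplicate at the attachment or block level, and the archive path below is hypothetical.

```python
import hashlib
import os

def file_hash(path, chunk_size=1 << 20):
    """Return the SHA-256 digest of a file, read in 1 MB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def find_duplicates(root):
    """Map each content hash to the list of files that share it."""
    seen = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            seen.setdefault(file_hash(path), []).append(path)
    return {h: paths for h, paths in seen.items() if len(paths) > 1}

if __name__ == "__main__":
    # "/archive" is a hypothetical root; keeping one copy per digest and
    # replacing the rest with references is where the space savings come from.
    for digest, paths in find_duplicates("/archive").items():
        print(len(paths), "copies of", digest[:12], "->", paths)
```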
Sunday, October 28, 2007
It's Post Secret Day
Every Sunday the new Post Secrets are online. They only last a week. Don't miss out on this sad, amusing, wonderful look at mankind.
Kill a Watt!
As a follow-up to my Friday post, here are a few more details of our "kill-a-watt" project. This is a multi-year staff effort to "green up" the data center and the IT electrical footprint throughout the medical center.
1. Eliminate old monitors - conversion from cathode ray tube (CRT) to LCD flat panel computer monitors. The typical power consumption for a CRT is 0.41 kWh per day. A year ago, we had 7,500 CRTs. Today, we have reduced the number to 2,400, having replaced the others with 15" or 17" LCD flat panel monitors. The latter consume an average of 0.29 kWh per day (17") or 0.11 kWh per day (15"). Based on the effort to date, we have lowered the annual electrical power consumption for computer monitors from 1,122,375 kWh per year to approximately 459,189 kWh per year, and that includes the impact of 500 additional monitors that were new to the environment. (The arithmetic is sketched after this list.)
2. Server Consolidation - many of our computer servers in the data center support only one application, often at the request of the application vendor. Virtualization software such as VMware is now available that allows multiple instances of a computer operating system to exist on the same server. Each instance can be configured in a way that assures the needed server resources (memory, CPU, etc.) are available. We have retired over 20 servers in the past year through this technique.
3. Computer Center Utilities - there have been several initiatives related to the data center. Prior to 2007, we were tracking to exceed 200 kW of peak power consumption in the data center by 2010, based on the growth of electrical demand from new storage, servers and switches. We were also seeing a 1:1 ratio between the data center load (servers, storage, switches) and the mechanical systems required to cool and light the data center. In other words, for every watt used in the data center, we needed another watt to cool the heat produced.
Our goal was to stay well below a 200 kW peak load in the data center and reduce the 1:1 ratio. While we are not done, we have essentially achieved our goal. Today, we run at about 160 kW at peak load, below the roughly 170 kW peaks we saw in the prior year. This is despite the increased number of terabytes of storage and applications the medical center has demanded we support. This has come about by consolidating servers, replacing older, less energy efficient devices, and reconfiguring equipment.
The ratio of data center load to mechanical (HVAC) load has also improved. Instead of 1:1, we are now running at about 1 watt of data center use for every 0.7 watts of mechanical load. This has been the result of better thermal management: reducing underfloor cables, using perforated inserts, shutting down an unneeded A/C unit, adjusting humidistats to increase efficiency, using blanking panels and other methods to better direct airflow (i.e. hot aisle/cold aisle enforcement), and other techniques.
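To make the savings concrete, here is a back-of-the-envelope sketch of the numbers above. The CRT baseline matches the figures in the post; the data center inputs are the approximate peak loads quoted, so treat this as illustrative arithmetic rather than metered results.

```python
DAYS_PER_YEAR = 365

# Monitor power draw per device, from the post (kWh per day)
CRT_KWH_DAY, LCD17_KWH_DAY, LCD15_KWH_DAY = 0.41, 0.29, 0.11

# Baseline: 7,500 CRTs a year ago
baseline_kwh = 7500 * CRT_KWH_DAY * DAYS_PER_YEAR
print("CRT baseline:", round(baseline_kwh), "kWh/year")  # 1,122,375 kWh/year, as stated

# Data center: IT load plus the mechanical (cooling/lighting) overhead ratio
def total_facility_kw(it_load_kw, mechanical_ratio):
    return it_load_kw * (1 + mechanical_ratio)

before = total_facility_kw(170, 1.0)   # ~170 kW IT load at a 1:1 mechanical ratio
after = total_facility_kw(160, 0.7)    # ~160 kW IT load at a 1:0.7 ratio
print("Peak facility draw:", before, "kW before vs", after, "kW after")
```

The second calculation shows why the ratio matters as much as the raw IT load: the same 10 kW reduction in server and storage draw turns into roughly 68 kW less total facility demand once the cooling overhead improvement is counted.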
MRSA- it's tiny microbes
With all of the continued interest in MRSA (methicillin-resistant Staph aureus), it is a good time to remember just how bacteria work. According to author Bill Bryson, "if you are in good health and averagely diligent about hygiene, you will have a herd of about one trillion bacteria grazing on your fleshy plains - about a hundred thousand of them on every square centimeter of skin. You are for
Thursday, October 25, 2007
Some Like it Hot
As a Prius-driving vegan, I'm doing everything I can to reduce my carbon impact on the planet. This also includes an effort to build "green" data centers. My next few posts will be about the power consumed by the technology we use in healthcare. It's estimated that between 1.5 and 3.5% of all power generated in the US is now used by computers.
I recently began a project to consolidate two data centers. We had enough rack space, enough network drops, and enough power connections, so the consolidation looked like a great way to reduce operating costs. All looked good until we examined the power and cooling requirements of our computing clusters and new racks of blade servers. For a mere $400,000 we could run new power wiring from the electrical crypts to the data center. However, the backup generators would not be able to sustain the consolidated data center in the event of a total power loss. So, we could install a new $1 million backup generator. Problem solved? The heat generated by all this power consumption would rapidly exhaust the cooling system, driving temperatures up 10 degrees. We investigated floor tile mounted cooling, portable cooling units, and even rack mounted cooling systems. All of these take space, consume power and add weight. At the end of the planning exercise, we found that the resulting new data center cost per square foot would exceed the cost of operating two less densely packed data centers. We looked at commercial data hosting options and ran into the same issue: power limits per rack meant half full racks and twice as much square footage to lease, increasing our operating costs.
At my CareGroup data center, we recently completed a long term planning exercise for our unused square footage. Over the past few years, we've met increasing customer demand by adding new servers and power has not been a rate limiting step. However, as we retire mainframe, mini and RISC computing technologies and replace them with Intel/AMD-based blades, the heat generated will exceed our cooling capacity long before real estate and power are exhausted.
The recent rise in the cost of energy has also highlighted that unchecked growth in the number of servers is not economically sustainable. In general, IT organizations have a tendency to add more capacity rather than take on the more difficult task of controlling demand, contributing to growth in power consumption.
Power consumption and heat are increasing to the point that data centers cannot sustain the number of servers that the real estate can accommodate. The solution is to deploy servers much more strategically. We've started a new "Kill-a-watt" program and are now balancing our efforts between supply and demand. We are more conservative about adding dedicated servers for every new application, challenging vendor requirements when dedicated servers are requested, examining the efficiency of power supplies, and performing energy efficiency checks on the mechanical/electrical systems supporting the data center.
We have also begun the extensive use of VMware, Xen and other virtualization techniques. This means that we can host farms of Intel/AMD blades running Windows or Linux, deploying CPU capacity on demand without adding new hardware. We're connecting two geographically distant data centers together using low cost dark fiber and building "clouds" of server capacity. We create, move and load balance virtual servers without interrupting applications.
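The live-migration step is what lets a virtual server move between hosts, or between data centers over that dark fiber, without interrupting the application. A minimal sketch using the libvirt Python bindings is below; the host URIs and VM name are hypothetical, and our production tooling (VMware, Xen) differs in detail.

```python
import libvirt  # Python bindings for the libvirt virtualization API

SOURCE_URI = "qemu+ssh://dc-east.example.org/system"  # hypothetical source host
DEST_URI = "qemu+ssh://dc-west.example.org/system"    # hypothetical destination host
VM_NAME = "clinical-web-01"                           # hypothetical virtual server

source = libvirt.open(SOURCE_URI)
destination = libvirt.open(DEST_URI)

domain = source.lookupByName(VM_NAME)

# VIR_MIGRATE_LIVE copies memory pages while the guest keeps running,
# so users see only a brief pause at the final cut-over.
domain.migrate(destination, libvirt.VIR_MIGRATE_LIVE, None, None, 0)
print(VM_NAME + " is now running at " + DEST_URI)
```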
Managing a data center is no longer simply a facilities or real estate task. We've hired a full time power engineer to manage the life cycle of our data center, network closets and disaster recovery facilities. New blade technologies, Linux clusters, and virtualization are great for on demand computing, but power and cooling are the new infrastructure challenge of the CIO.
Rate Your Doctor
I think the internet will have a huge impact on patients' satisfaction with how they are treated by doctors. The idea of internet sites that rate doctors and hospitals has been around for about 5 years. In the past the sites have been difficult to view, some cost money, and they were not user friendly. But like anything new, it may now have reached a tipping point. I logged onto Ratemd.com and was
Wednesday, October 24, 2007
The Open Source desktop has finally arrived
Over the past year, I've run Windows Vista, Mac OS X (Tiger), and several flavors of Linux (Ubuntu, Debian, SUSE, Fedora, Red Hat Enterprise). Although it is true that Mac OS X is my favorite operating system for multimedia support, user experience and stability, it only runs on Apple hardware. As a road warrior CIO, I need a 12" subnotebook weighing 2 pounds, which does not yet exist in Apple's product line.
For the past 6 months, I've been running Ubuntu Feisty Fawn Linux with Open Office, Evolution email, and Firefox on a Dell D420 subnotebook. It has been good enough for all my computing needs, providing secure, reliable, easy to use software with a low total cost of ownership. However, there have been a few annoyances. Suspend/resume works about 80% of the time. When it does not work, my matchstick mouse locks up and I have to close the lid, resume again and all is well. Occasionally, upon waking from suspend, my laptop does not reconnect to my home wireless network. The Evolution email client can be slow to synchronize with Microsoft Exchange because it does not cache email locally and thus downloads all email headers from scratch whenever the email application is started. However, all office productivity, browser, and multimedia applications work flawlessly.
Tonight, as part of the normal Ubuntu updating process, I clicked on Update Manager to automatically replace the entire Linux operating system on my laptop with the next release of Ubuntu, Gutsy Gibbon. The new operating system downloaded, automatically installed itself and resolved every problem I have ever had with Linux. Congrats to the folks at Canonical who maintain Ubuntu and to the folks at Novell who have significantly upgraded their Evolution email client to meet the needs of Microsoft Exchange users.
I am completely confident in saying that "Linux your grandmother could use" has now arrived. With Ubuntu Gutsy Gibbon, everyone can have a free desktop/laptop operating system with all the productivity tools necessary to get your work done.
Two years ago, when I had dinner with Steve Ballmer, I explained that healthcare needs highly reliable, lower cost, more secure desktop software. He countered that new features are the highest priority of Microsoft customers and that it would be impossible to create a lightweight, reliable, low cost, secure version of Microsoft operating systems and applications given the demand for ever increasing features.
I am a realist and recognize that Microsoft provides many enterprise software products that will continue to have a strong presence in healthcare. However, now that Ubuntu is good enough, I expect that more people will try it and experience the advantages of running software with just the right balance of features, speed, and stability. And it's always free.
To try it yourself, go to http://www.ubuntu.com/getubuntu
Let me know how it goes!
Tuesday, October 23, 2007
Patient Safety- Run the OR Like a Jet Plane
We all know about the poor patient who goes in for a right kidney operation and ends up having the "good" left one removed instead. Or the patient with the allergy who is given the wrong anesthesia and has a reaction. Or the patient who is given the wrong blood type. These things happen despite the fact that doctors and nurses are doing their very best to heal under really tough circumstances. I
Monday, October 22, 2007
Exploring Instant Messaging
Per yesterday's post, over the next few months I'll be piloting the policies, technology and governance of flexible work arrangements. I live by BlackBerry email, cell phone, web, and remote data access via SSL VPN. To expand my communications horizons, I'll be testing instant messaging, various forms of video teleconferencing, blogs, wikis, collaboration tools and group authoring tools.
Here's my summary of the instant messaging experience to date. As an email guy, it has taken some getting used to. I've done IM via AOL Instant Messenger (AIM), MSN Messenger, Yahoo, Google Talk, a local Jabber server at BIDMC, and Skype's chat features.
To me, effective chat needs to work across all platforms, so I tested all of these services with my Ubuntu Feisty Fawn Linux laptop, my MacBook, and my Dell Optiplex 745 desktop running Windows XP. For Linux, I used the Pidgin and Gajim open source instant messaging programs. For the Mac I used iChat (AIM and Jabber) plus downloaded clients from MSN, Yahoo and Skype, and for the Windows machine I downloaded clients from AOL, MSN, Yahoo, and Skype. For Google Talk, I also tried the Google web client that's part of Gmail by using a Firefox browser on all three computers.
My first impression is that IM can be an effective communication tool for realtime emergent situations, for quick questions (when is the meeting?), and for brainstorming as a group. One frustration is that my collaborators have accounts on different IM platforms. With email, I simply send to the address of each server used by my collaborators. With IM, I must login to the same service each is using. My Linux clients enable me to login to multiple IM services simultaneously, but I still need to create and remember the credentials to accounts on all these services.
Some of these services support video and voice chat. Here's what I found:
AIM - the Windows AIM client supports chat/voice/video, Mac iChat supports chat/voice/video, Linux Pidgin/Gaim supports chat only. All use the proprietary OSCAR protocol.
MSN - the Windows MSN Messenger client supports chat/voice/video, the Mac aMSN open source application supports chat/video, the Linux aMSN open source application supports chat/video. All use the proprietary MSNP protocol.
Google Talk - a hosted implementation of the industry-standard Extensible Messaging and Presence Protocol (XMPP) plus the Jingle extensions. Works with any standards-based chat client such as Trillian for Windows, iChat for the Mac, or Pidgin/Gajim for Linux. Supports audio via a downloadable Google Talk client for Windows and iChat for the Mac. No audio support for Linux.
Yahoo - the Windows Yahoo client supports chat/voice/video, the Mac Yahoo client supports chat/video, Linux Pidgin/Gaim supports chat only. All use the proprietary Yahoo! Messenger protocol.
Skype - the Windows Skype client supports chat/voice/video, the Mac Skype client supports chat/voice/video, and the Linux client supports chat and voice only. All use the proprietary Skype protocol.
Bottom line - All provide text chat on Linux, Macs and Windows. Skype provides voice on all these platforms. None of these services provide video and voice on all platforms.
My conclusion is that text chat works very well as long as your collaborators are on the same IM service. Voice is problematic across platforms and has very uneven quality that's a function of many bandwidth bottlenecks from desktop to desktop via the internet and the IM service provider. Combined voice and video across platforms is not yet possible with IM.
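One nice side effect of the standards-based XMPP services is that a text message can be sent from a few lines of code, which is handy for alerts and automation. A minimal sketch using xmpppy, one of several Python XMPP libraries; the accounts, password and message below are placeholders, and the Google Talk server address is the commonly documented one.

```python
import xmpp  # xmpppy XMPP library

# Placeholder credentials and recipient - substitute real Jabber/Google Talk accounts.
jid = xmpp.protocol.JID("my.account@gmail.com")
password = "password-here"

client = xmpp.Client(jid.getDomain(), debug=[])
client.connect(server=("talk.google.com", 5222))  # commonly documented Google Talk server
client.auth(jid.getNode(), password, "cio-laptop")
client.sendInitPresence()

# Send a quick "when is the meeting?" style message to a colleague.
client.send(xmpp.protocol.Message("colleague@gmail.com", "When is the 3 pm meeting?"))
client.disconnect()
```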
Has anyone had a different experience?
More to follow as the exploration continues.
An About Face on Flexible Work Arrangements
In my 10 years as a CIO, I've strongly believed that productivity is best when everyone works in close physical proximity, so that you get the benefit of the "over the cubicle" effect of being able to brainstorm with colleagues ad hoc, respond to urgent issues as a group and build trust among team members.
But the world has changed, and new factors need to be considered. First, the commuting needed to bring everyone together has become burdensome and expensive. Commutes can now take two hours or more, and gas prices are causing hardship. At the same time, environmental consciousness about the carbon impact of those long commutes is on the rise. Second, Internet connections are becoming faster, more reliable and cheaper. I have a 20Mbit/sec. fiber connection in my basement for $40 a month.
We also have many more means of communication: e-mail, instant messaging, blogs, wikis, WebEx, videoconferencing. Face-to-face meetings that take weeks to schedule are no longer sufficient for the pace of IT change and the level of service demands.
How should a CIO react to this changing landscape? I believe we have to explore the entire spectrum of flexible work arrangements.
Are in-person meetings really necessary? I find that a kick-off meeting to initiate a project works best if the team assembles in person. Collaborators can introduce themselves and build a common framework for working together. Thereafter, conference calls, online collaboration tools and e-mail are sufficient.
Is 8 a.m. to 5 p.m. the best way to staff an office? Not if it implies hours on the road each way. If working from 10 a.m. to 7 p.m. reduces the commute by an hour each way, it's likely that productivity and staff satisfaction will rise.
Is being in the office even necessary? For some jobs, the interruptions of the office may actually reduce productivity. Some structured time in a home office may be preferable.
Of course, there are issues.
A home office needs infrastructure support - networks, desktops and connection to the corporate phone system. Figuring out the best way to service hundreds of remote locations will require planning and pilots. The technology may not need to be complicated, though. Videoconferencing isn’t always necessary, for example, since phone calls and Web-based remote presentation tools are very efficient.
Accountability for employees with flexible work arrangements is key, so you may need management tools to monitor specific project milestones and productivity goals. But you may be pleasantly surprised. In a recent pilot in Massachusetts, a major health insurer found that productivity for 200 staffers working from home rose 20%; only two participants had performance issues.
Equity is another problem. Some staffers, such as those doing direct desktop service or training, need to be on-site. But you can still offer some flexibility, letting them put in four 10-hour days, say, or giving them every other Friday off.
Security and privacy are other concerns, and they loom especially large for me, since my IT organization is part of a large health care provider. If protected health data is to be accessible in employees’ homes, we will need to investigate biometric devices, re-examine application time-outs, strengthen surveillance of audit logs and ensure end-to-end security from data center to the home.
Over the next year, I'll be piloting the technologies, policies and business processes needed to manage technology professionals in flexible work arrangements. I expect that retention, productivity and employee satisfaction will rise as time spent commuting falls. I'll keep you updated on the progress - from my home office.
A few of my posts will be based on a monthly column I do for Computerworld. For those posts, the legal folks require that I add the following:
Copyright 2007 by Computerworld Inc., One Speen Street, Framingham, Mass. 01701. Reprinted by permission of Computerworld. All rights reserved.
Security Standards for the Country
Lack of nationwide security standards has been a major challenge to creating interoperable health records which protect the confidentiality of patients.
On October 15, the national Healthcare Information Technology Standards Panel (HITSP) approved by consensus all the standards needed to record patient consent for sharing data, enable secure communication, restrict records via appropriate access control, and document consistent audit trails of every lookup.
These national standards are described in a series of documents:
HITSP_v1.1_2007_TN900 - Security and Privacy.pdf
HITSP_v1.1_2007_C19 - Entity Identity Assertion.pdf
HITSP_v1.1_2007_C26 - Nonrepudiation of Origin.pdf
HITSP_v1.1_2007_T15 - Collect and Communicate Security Audit Trail.pdf
HITSP_v1.1_2007_T16 - Consistent Time.pdf
HITSP_v1.1_2007_T17 - Secured Communication Channel.pdf
HITSP_v1.1_2007_TP20 - Access Control.pdf
HITSP_v1.1_2007_TP30 - Manage Consent Directives.pdf
HITSP_v2.0.2_2007_TP13 - Manage Sharing of Documents.pdf
Security standards are the foundation for all current and future HITSP work. They will be thoroughly tested over the next year via the upcoming Nationwide Healthcare Information Network contracts awarded to nine groups:
CareSpark
Delaware Health Information Network
Indiana University
Long Beach Network for Health
Lovelace Clinic Foundation
MedVirginia
New York eHealth Collaborative
North Carolina Healthcare Information and Communications Alliance, Inc.
West Virginia Health Information Network
and will be incorporated into Certification Commission for Healthcare Information Technology (CCHIT) certification criteria over the next few years.
The standards selected will be very familiar to CIOs, since many are commonly used internet standards such as X.509 certificate exchange, Web Services (WS-Trust, WS-Federation, WS-Security) and SAML. Some standards may be new to CIOs such as the OASIS Extensible Access Control Markup Language (XACML) and the HL7 Consent standards, but these are truly the most appropriate standards based on HITSP harmonization readiness criteria:
Suitability – The standard is named at a proper level of specificity and meets the technical and business criteria of the use case
Compatibility – The standard shares common context, information exchange structures, content or data elements, security and processes with other HITSP harmonized standards or adopted frameworks as appropriate
Preferred Standards Characteristics – Standards that are approved, widely used, readily available, technology neutral, supportive of uniformity, flexible and in international use are preferred
Standards Development Organization and Process – Meet selected criteria including balance, transparency, developer due process, stewardship and others.
With these national standards, payers, providers, employers, labs, pharmacies, and patients have a framework which can support the diversity of regional, state and federal privacy policies. HITSP does not make policy, but provides the security interoperability specifications to support whatever data sharing decisions are made locally.
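To make the division of labor concrete, here is a minimal sketch of the kind of local policy decision these specifications are meant to support. It is hypothetical Python for illustration only; the actual HITSP constructs (TP30 Manage Consent Directives, TP20 Access Control) define standards-based exchanges built on XACML and HL7 consent documents, not this API.

```python
from dataclasses import dataclass

# Hypothetical, simplified local policy check, not the HITSP wire format.
@dataclass
class ConsentDirective:
    patient_id: str
    allow_sharing: bool       # the patient's recorded choice
    permitted_roles: set      # roles the patient permits, e.g. {"treating_physician"}

def may_disclose(directive, requester_role):
    """Local decision: honor both the consent directive and the requester's role."""
    return directive.allow_sharing and requester_role in directive.permitted_roles

directive = ConsentDirective("patient-42", True, {"treating_physician"})
print(may_disclose(directive, "treating_physician"))  # True
print(may_disclose(directive, "billing_clerk"))       # False
```

The point is that the standards carry the consent and identity information interoperably, while the allow/deny decision itself remains a local policy choice.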
This is a very exciting development on the journey toward interoperability in healthcare.
Sunday, October 21, 2007
The Top 10 Things a CIO Can Do to Enhance Security
Harvard Medical School networks are attacked every few seconds, 24 hours a day, 7 days a week. These attacks come from locations as diverse as Eastern Europe and Eastern Cambridge (MIT students). In general, protecting the privacy of 3 million patient records is a Cold War: hackers innovate, Information Technology departments protect, hackers innovate again, and the process continues. Providing security is a journey, and we have been on the path to security best practices for many years. The following are my top 10 recommendations to guide this journey.
1. Policies/Governance - Without policies and governance, enforcing security best practices is impossible. Do you allow IM or not? Do you allow modems to be attached to computers without IT approval? Can data be copied onto a thumb drive and transported off site? Such major policy questions must have definitive answers and sanctions for violating these policies must be enforced. BIDMC's current technology policy is found here
2. Risk assessment and stratification - Do you consider the HIV status of patients to be the same security priority as protecting the data integrity of the library catalog? Probably not. We have established 4 classifications of risk:
**** Internet-connected clinical data which is patient identified. Compromise of a password could lead to access to thousands of patients' records
*** Internet-connected clinical data which is patient identified. Compromise of a password could lead to access to one patient's record
** Internet-connected clinical data which is not patient identified. Compromise of a password could lead to access to aggregate data without patient identifiers
* No patient records available
Our journey to enhance security focuses on **** and *** data first. By ensuring our latest technologies and techniques protect our most sensitive data, we apply our people and budgets to the areas of greatest risk.
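As an illustration, here is a minimal sketch of how the tiers above might be assigned programmatically. The parameters are hypothetical simplifications; in practice the classification is a policy judgment, not a function call.

```python
def risk_tier(patient_identified, records_exposed_per_breach):
    """Map an Internet-connected system to the four tiers above.
    Parameters are hypothetical simplifications of the criteria described in the text."""
    if records_exposed_per_breach == 0:
        return "*"      # no patient records available
    if not patient_identified:
        return "**"     # aggregate data without patient identifiers
    if records_exposed_per_breach == 1:
        return "***"    # one identified patient record per compromised password
    return "****"       # thousands of identified records at risk

# A public results portal holding identified data for thousands of patients lands in the top tier.
print(risk_tier(patient_identified=True, records_exposed_per_breach=5000))   # ****
print(risk_tier(patient_identified=False, records_exposed_per_breach=5000))  # **
```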
3. Firewalls - Many years ago, we used the "Blanche DuBois" approach to security - a firewall that empowered academic collaboration but relied on the "kindness of strangers". One of our first security enhancements in the 1990's was to replace our permissive firewall (allow anything except where prohibited) with a restrictive firewall (deny everything except where permitted). During this process we eliminated 99% of our publicly available IP addresses, eliminated peer-to-peer traffic, and created a demilitarized zone (DMZ) for our web servers.
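For illustration, a minimal sketch of the "deny everything except where permitted" idea; the zones, hosts and ports below are hypothetical, and production firewalls express these rules in vendor-specific syntax rather than Python.

```python
# Hypothetical rule set: only explicitly listed (source zone, destination, port) tuples are allowed.
PERMITTED = {
    ("any", "dmz-web-server", 443),           # public HTTPS to the DMZ web servers
    ("clinical-vlan", "ehr-server", 1433),    # clinical workstations to the EHR database
}

def allowed(source_zone, destination, port):
    """Restrictive policy: permit a connection only if an explicit rule allows it."""
    return (source_zone, destination, port) in PERMITTED or ("any", destination, port) in PERMITTED

print(allowed("internet", "dmz-web-server", 443))   # True - explicitly permitted
print(allowed("internet", "ehr-server", 1433))      # False - denied by default
```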
4. Intrusion detection and prevention/Host intrusion protection - Operating systems are patched continuously and applications have vulnerabilities, so attackers exploit the window between a patch being released and a patch being applied. We've employed software that provides "zero day" protection - eliminating the kinds of traffic between servers that are suggestive of attacks or questionable behavior. We do this network-wide and on individual servers, especially our web servers.
5. Remote access methods - The security of the network is only as good as its weakest point. Remote access technologies such as SSL VPN, Citrix MetaFrame, and Remote Desktop via thin-client computing devices minimize the threat of viruses from remote access points. Ideally, all computers accessing protected healthcare information should have up-to-date operating system patches, up-to-date antivirus software and no software which could compromise the security of the device (e.g. peer-to-peer file sharing).
6. Network Access Controls - In most institutions, hackers wanting to access a hospital network can walk in the front door, unplug an existing computer and access the network with whatever nefarious devices they choose. Less malevolent is the traveling vendor who plugs a laptop into the network to do a demo, giving viruses and spyware on that laptop full access to the hospital network. Technologies such as Cisco's Network Admission Control and Microsoft's Network Access Protection restrict network access to known machines containing the right versions of the right software needed to ensure end-to-end security.
7. Vulnerability Assessment - Many healthcare applications have vulnerabilities which can lead to inappropriate disclosure of patient data. Typical vulnerabilities include buffer overflows, SQL and JavaScript injection attacks, and cross-site scripting attacks. Hiring "white hat" hackers to perform penetration testing of mission-critical applications, networks, and operating systems helps identify potential problems before security is compromised. Even if vendors do not repair these deficiencies, host intrusion protection software can mitigate risks by surrounding systems with an extra layer of vigilance, stopping attacks before they start.
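As a concrete illustration of one of these vulnerability classes, here is a minimal sketch of a SQL injection flaw and its standard remediation. It uses Python's sqlite3 module and a hypothetical patients table purely for illustration; the principle applies to any database-backed clinical application.

```python
import sqlite3

# Hypothetical patients table in an in-memory database, for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE patients (mrn TEXT, name TEXT)")
conn.execute("INSERT INTO patients VALUES ('12345', 'Jane Doe')")

def find_patient_unsafe(mrn):
    # Vulnerable: attacker-controlled input is concatenated into the SQL string,
    # so input such as "' OR '1'='1" returns every row in the table.
    query = "SELECT name FROM patients WHERE mrn = '" + mrn + "'"
    return conn.execute(query).fetchall()

def find_patient_safe(mrn):
    # Parameterized query: the driver treats the input strictly as data.
    return conn.execute("SELECT name FROM patients WHERE mrn = ?", (mrn,)).fetchall()

print(find_patient_unsafe("' OR '1'='1"))  # leaks every patient row
print(find_patient_safe("' OR '1'='1"))    # returns nothing
```

Penetration testers look for exactly this pattern; parameterized queries are the fix that vendors should apply, and host intrusion protection is the compensating control when they don't.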
8. Provisioning/Authentication/Authorization - Having robust processes to grant passwords only to qualified users, terminate accounts when staff leave the organization and enable only "minimum need to know" access to clinical data is foundational to good security. When passwords are issued, they should be strong (non-English words, mixed case, numbers and letters, greater than 8 characters long, etc.) and expire at a reasonable interval (at least yearly), and access should be role-based. Registration clerks should not be able to access medication lists or psychiatric notes, only those demographic data elements needed to perform their duties.
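A minimal sketch of how such a password policy might be checked at issuance follows; the thresholds mirror the guidance above but are hypothetical, since real limits come from local policy.

```python
import re
from datetime import date, timedelta

# Hypothetical thresholds based on the guidance above; real limits come from local policy.
MIN_LENGTH = 9                  # "greater than 8 characters long"
MAX_AGE = timedelta(days=365)   # expire at least yearly

def password_meets_policy(password):
    """Require minimum length, mixed case, and at least one digit."""
    return (
        len(password) >= MIN_LENGTH
        and re.search(r"[a-z]", password) is not None
        and re.search(r"[A-Z]", password) is not None
        and re.search(r"[0-9]", password) is not None
    )

def password_expired(last_changed, today=None):
    """Flag passwords that have not been changed within the allowed interval."""
    today = today or date.today()
    return today - last_changed > MAX_AGE

print(password_meets_policy("Chart7Room2"))                     # True
print(password_expired(date(2006, 10, 1), date(2007, 10, 21)))  # True - over a year old
```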
9. Anti-virus/Anti-Spyware - The design of Windows operating systems, in which many internal "services" run with administrative privileges, creates a vulnerable environment that necessitates anti-virus and anti-spyware software.
10. Audit trails - Authorized internal users can be even more of a threat than external hackers. Collecting audit trails and implementing a program to monitor accesses is essential. Has one account accessed more than 20 patients in a day? Has one patient's record been examined by more than 20 accounts? Who is accessing employee healthcare records? Who is accessing the record of a famous athlete or actress? Audit trails and tools to mine audit data help answer these questions.
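A minimal sketch of the kind of audit-trail mining these questions imply, assuming a hypothetical list of one day's lookup records; production monitoring would run this against the clinical applications' real audit logs.

```python
from collections import defaultdict

# Hypothetical records for a single day; real data would come from the systems' audit logs.
audit_log = [
    {"account": "jsmith", "patient": "MRN001"},
    {"account": "jsmith", "patient": "MRN002"},
    {"account": "rjones", "patient": "MRN001"},
]

def flag_unusual_access(log, max_patients_per_account=20, max_accounts_per_patient=20):
    """Apply the two screening questions above to one day of lookups."""
    patients_by_account = defaultdict(set)
    accounts_by_patient = defaultdict(set)
    for entry in log:
        patients_by_account[entry["account"]].add(entry["patient"])
        accounts_by_patient[entry["patient"]].add(entry["account"])
    busy_accounts = [a for a, p in patients_by_account.items() if len(p) > max_patients_per_account]
    popular_records = [p for p, a in accounts_by_patient.items() if len(a) > max_accounts_per_patient]
    return busy_accounts, popular_records

accounts, records = flag_unusual_access(audit_log)
print("Accounts that touched an unusual number of patients:", accounts)
print("Patient records touched by an unusual number of accounts:", records)
```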
These ten areas are a starter kit for appropriate security in a healthcare organization. Security cannot be an afterthought; it is a project that must be resourced. A well trained and staffed security team is essential to success. To keep our organizations secure, I have a full-time Security Officer and a team of security professionals maintaining our firewall rules, intrusion detection/prevention software, and our auditing systems. Compliance with HIPAA is a key motivator to implement good security, but most important is retaining the trust of our patients. We are the stewards of their data, and our security systems are the last defense against breaches of confidentiality.
Saturday, October 20, 2007
Transparent Pricing for Patients
There is a big push toward having patients be smarter consumers of health care as a way to control costs. Employers are pushing for medical savings accounts (where the patient has a pot of money they spend on health care or just save) and more and more insurance products have high deductibles and more cost sharing by the patient. The simple way of explaining it is that if someone else is paying
Wednesday, October 17, 2007
Surgical Model
In my medical training, oh so many years ago, we learned from cadavers. While this was a good way to learn basic anatomy, the physiology of how the body worked was a slower process. Thanks to Unbounded Medicine for this look at the way students can learn now. This reproduction of a patient was crafted with animal organs that really give the student a much better idea of how the body functions.
Tuesday, October 16, 2007
MRSA - It's preventable
Everywhere I turned today, I was engaged in discussions about methicillin-resistant Staphylococcus aureus (MRSA), and tonight I read a new article in JAMA that says it is twice as prevalent as we thought. MRSA is a common skin bacterium - Staphylococcus aureus - that has become a "bug on steroids" and is resistant to penicillin, methicillin and other drugs that used to kill it flat. It has developed over time
Monday, October 15, 2007
Healing Environments
While the housing market has bombed, new hospital buildings are the rage in California due to a law that says they need to be seismically (earthquake) safe. As hospitals are planning the hospital of the future, many are using architectural design to reduce stress and promote safety and healing. What type of building promotes safety and healing? We know what doesn't work. I've practiced in
Friday, October 12, 2007
Plastics and Chemicals That Can Harm
The chemical, bisphenol-A (BPA), is used to produce polycarbonate plastic and epoxy resins and is found (get this!) in water bottles, baby bottles, food containers, compact discs and dental sealants. The chemical can leach into foods, be inhaled or enter by other routes and the US Centers for Disease Control and Prevention found this chemical in the urine of 95% of people they sampled. BPA is
Thursday, October 11, 2007
Withdrawal of Kids' Cough Medicines
McNeil Pharmaceutical has stepped up to the plate and done the right thing by voluntarily withdrawing a number of cough and cold preparations that may be harmful to kids under the age of 2. Recent findings show these medications can be overused by parents, and there are no safe guidelines for tiny tykes and these preparations. McNeil has begun by informing physicians of the following: We have
Wednesday, October 10, 2007
Tuesday, October 9, 2007
Sport Concussions
One of my most "googled" blogs was "A bump on the head". Falls and head injuries are common, and an estimated 300,000 sports-related concussions (also known as mild traumatic brain injury) occur annually in the United States. Researchers estimate that 63,000 of those occur in high schoolers playing football. The tough thing about concussions is that there is no marker or test to know if a person
Sunday, October 7, 2007
Medicare Drug Plan - Profitable to Insurers
The New York Times has an article that is no surprise to physicians and pharmacists who care for patients. Guess what? The wonderful Medicare Drug Plan for seniors has turned into a cash cow for insurance companies that administer it. Duh! Since when do Insurers ever do anything with the government that doesn't fatten their pockets? Audits, conducted by the Department of Health and Human
Friday, October 5, 2007
Doctors and Email
I tried to phone a patient last night with the results of her bone density test. She wasn't home, and the thought of playing phone tag for the next few days was not appealing. I asked her husband for her email address and emailed her the results with my recommendations. Mission accomplished! Everyone emails. My son chats online with all of his friends together every night. You can order shoes
Thursday, October 4, 2007
Thunderstorms and iPods
Letters have been circulating in the New England Journal of Medicine about the potential dangers of iPods (and MP3 players) and their ear wires attracting lightning to strike during a thunderstorm. An initial report of a jogger wearing his iPod and being struck by lightning came from Vancouver, BC. The patient did not lose consciousness, but he had amnesia, perforated eardrums and a fractured
Tuesday, October 2, 2007
Give an Hour - mental health for vets
Just when I feel so discouraged about the direction our country is headed, I get a lift by learning about a new organization that is developing a national network of mental health professionals that will provide free care for returning Iraq and Afghanistan vets and their families. "Give an Hour" is a non-profit of volunteer mental health professionals across the United States that will donate an
Monday, October 1, 2007
How to Interpret Medical Studies
We are bombarded with news of medical breakthroughs every day. How can you know what studies are valid and important, and which ones are just fluff? Here are some ways to tell the difference: How many people were in the study? The more the better. Who were the subjects, researchers and sponsors? The funding source of the study is important and might change the motives. Do the researchers have