When I was 18 years old, Sybex published my first book, "The Best of CP/M Software". That was followed by "Real World Unix", a roadmap to implementing Unix servers in small to medium-sized businesses, and "Espionage in the Silicon Valley", a collection of true stories about my experience working in high tech in the early 1980s.
Each of these was published in the traditional way. I submitted proposals to publishers, received a small advance, wrote manically for months, reviewed galleys, and then thousands of first editions rolled off the presses. The process was time-consuming, expensive, and involved many stakeholders.
The Cool Technology of the Week is LULU.COM, which revolutionizes the publishing process by supporting on-demand publishing for any author at low cost. At CareGroup, we recently used Lulu to publish a book about the retirement of one of our longstanding employees.
Here's how it works:
1. Publish - You upload your manuscript and photos, then use their online formatting tools to specify the layout, size and binding. There are no setup fees - you just pay the cost of the book when you buy it, as you would in a bookstore. Lulu does not inventory any books; they are digitally produced on demand when ordered.
2. Sell - Lulu books are listed in the Lulu marketplace, but you can also get an ISBN so the book will be available on Amazon.com and in retail stores, libraries and schools.
3. Connect - Lulu includes social networking tools to link together a community of digital media creators.
When I compare the process of on-demand book publishing to the workflow I used 30 years ago, I'm amazed at how democratic publishing has become. Like blogging, anyone can become a published author without a formal publisher. In fact, if I wanted to bind together my first 100 blog posts in book form, I could do so without a setup fee on Lulu, call the result something like "True Confessions of a CIO" and make it available on Amazon. This is my 80th blog post; maybe I'll do that after my 100th.
Wednesday, January 30, 2008
United Health hit with $3.5 million fine
United Health has been hit with a $3.5 million fine by the Department of Managed Health and it couldn't happen to a more deserving company! The Department of Insurance investigated the company and found it had mishandled more than 130,000 claims and improperly denied 30% of the claims reviewed during the investigation. That is not an accident. Underpaying or denying 130,000 claims
Medication Reconciliation
One of the most challenging Joint Commission requirements for hospitals is supporting medication reconciliation workflow. This means that at every transition of care, providers must verify the medications that each patient is taking to ensure they are getting just the right dose of just the right medication. Why is this a challenge? Imagine that an 87-year-old taking 14 cardiac medications visits an orthopedist for knee pain. The orthopedist must carefully record her current doses of Amiodarone, Lasix, Lipitor, and Zestril, examine her knee, prescribe medications, and document the visit, all within 12 minutes. For specialists, this may require knowledge of medications they do not often prescribe and may take substantial effort if the patient has been referred from an outside provider and thus no medication history is available in the orthopedist's electronic health record.
To date, most hospitals and clinician offices have addressed this Joint Commission requirement by implementing paper processes. Very few vendors offer electronic solutions for medication reconciliation.
In July of 2007, BIDMC was visited by the Joint Commission. We had been working on medication reconciliation workflow and software solutions for a year prior to this visit. During their visit, we were live with automated outpatient medication reconciliation, paper-based emergency department medication reconciliation, and paper-based inpatient medication reconciliation. The Joint Commission visitors were impressed with our software engineering but noted that we did not have 100% utilization of the software in our specialty clinics. We had 45 days to ensure 100% compliance, verified with an audit.
Over those 45 days, the Medical Executive Committee modified our clinical documentation policies to require medication reconciliation as part of retaining staff privileges. We assembled a multi-disciplinary committee to understand the workflow implications of medication reconciliation. We cleaned up historical medications by deleting all medications that had not been acted upon in three years. We temporarily hired 5 extra RNs to call patients at home and enter medication lists prior to their visits. When not calling patients, these RNs, who were located in our busiest clinics, cleaned up historical medication lists to ensure they were formatted properly for e-Prescribing renewals.
Our committee recommended hundreds of software changes to make the outpatient reconciliation process easier, and we implemented all of them. Enhancements included the ability for any clinician in any clinic to quickly enter, update, or annotate (“patient is not taking as prescribed”) any medication, even if they were not the original prescriber. Of course, all medication changes were documented in an audit trail. We enabled providers to enter patient self-reported medications, and we displayed an alert if the exact dose was unknown.
We modified our e-Prescribing systems to reflect community-wide medication history. This means that when a clinician writes for any medication, a history of every medication that has been dispensed at any US pharmacy appears and can be used as an aid to reconciliation.
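To make the reconciliation step concrete, here is a minimal sketch of the kind of merge logic such a feature needs; the record structure and field names are assumptions for illustration, not our production code:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Medication:
    name: str    # generic name, e.g. "furosemide"
    dose: str    # e.g. "40 mg"
    source: str  # "EHR" or "pharmacy_history"

def flag_discrepancies(active_list, dispensed_history):
    """Return medications dispensed by a pharmacy that do not appear on the
    patient's active list, so a clinician can review them with the patient."""
    active_names = {m.name.lower() for m in active_list}
    return [m for m in dispensed_history if m.name.lower() not in active_names]

# Example: furosemide (Lasix) appears in the community dispensing feed
# but not on the EHR's active medication list.
ehr = [Medication("amiodarone", "200 mg", "EHR"),
       Medication("atorvastatin", "10 mg", "EHR")]
pharmacy = [Medication("furosemide", "40 mg", "pharmacy_history"),
            Medication("amiodarone", "200 mg", "pharmacy_history")]
for med in flag_discrepancies(ehr, pharmacy):
    print(f"Review with patient: {med.name} {med.dose}")
```

In practice the matching must be smarter than a name comparison (brand versus generic names, doses, and discontinued medications all complicate it), which is why the workflow still ends with a clinician verifying the list.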
We modified our emergency department clinical documentation system to support reconciliation of patient medications during ED evaluations and ensured all medications dispensed in the ED became part of the patient's active medication list.
On the inpatient side, we enabled discharge medications to be automatically converted to outpatient medications and made all discharge medication summaries available to every clinician. We are just finishing a completely automated inpatient medication reconciliation system which will enable physicians to verify all medications electronically while preparing the initial inpatient history and physical.
Our paper-based medication reconciliation processes will soon be retired.
The greatest challenge to implementing all these solutions was documenting the logical workflow required by clinicians, then getting the buy-in of all the users of the system, given that it added substantial work to their day.
At this point, 5 months after our final Joint Commission audit, physicians are realizing the benefit of an up-to-date medication list: they are saving time in the medication renewal process, receiving more accurate decision support when prescribing new medications, and gaining better documentation of patient compliance with their therapies. Patients are more satisfied as well, since they know all clinicians will use the same accurate medication list across our hospital and practices, making patient/provider communications about medications much easier.
It may have been one of our most challenging experiences, but in the end, all involved agree it was worth the pain.
Tuesday, January 29, 2008
Grand Rounds - don't miss it
If you want to keep up with the medical blogosphere, check out this week's Grand Rounds, hosted by Kim at Emergiblog.
Measuring Programmer Productivity
In my role as CIO at CareGroup and Harvard Medical School, I oversee nearly 100 software developers. Many organizations purchase software or outsource development to avoid managing developers, but we Build AND Buy and will continue to do so. I was recently asked to share the metrics we use to evaluate our developers. Here is our framework:
1. How many applications does the developer maintain? Applications can vary in size and complexity, so that must also be documented. This helps us understand how many staff are needed as our application portfolio expands.
2. How many new applications can a developer create annually? Some programmers develop version 1.0 of new applications, while others maintain existing applications or modules. This helps us understand our development skill set and our ability to take on new development projects.
3. What is the lifecycle stage of each application assigned to a developer? (New, mature, need to replace within 2 years). This is a reasonable way to measure work completed and forecast upcoming work.
4. How much bug-free code is developed per year? This is an imperfect measure, since it captures quantity rather than quality, but it is useful as a proxy for coding productivity.
5. Are the application stakeholders satisfied with the quantity and quality of application features developed?
We piloted these measures at Harvard Medical School by:
1. Inventorying all the major web applications by developer, including the technologies used.
2. Categorizing this inventory into applications which are new and those which are in maintenance. We also rated each application for intensity of support (High, Medium, Low) based on the frequency of code change that has been historically required to maintain it.
3. Rating each application's lifecycle stage.
4. Documenting the number of application releases by each developer over the past year.
5. Creating a survey to measure user satisfaction with each application.
In the end, we published a detailed scorecard for each developer and a summary for all developers. This data alone doesn't tell the whole story, but it does help us plan and better manage our development staff.
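As an aside, a scorecard like this is easy to tally once the inventory exists. Here is a minimal sketch of how the rollup might work; the field names and rating scales are assumptions for illustration, not our actual scorecard format:

```python
from dataclasses import dataclass, field

@dataclass
class AppRecord:
    name: str
    lifecycle: str                   # "new", "mature", or "replace_within_2_years"
    support_intensity: str           # "high", "medium", or "low"
    releases_last_year: int
    stakeholder_satisfaction: float  # survey average on a 1-5 scale

@dataclass
class DeveloperScorecard:
    developer: str
    apps: list = field(default_factory=list)

    def summary(self) -> dict:
        """Roll the per-application records up into the summary measures."""
        return {
            "apps_maintained": len(self.apps),
            "new_apps": sum(1 for a in self.apps if a.lifecycle == "new"),
            "releases": sum(a.releases_last_year for a in self.apps),
            "avg_satisfaction": round(
                sum(a.stakeholder_satisfaction for a in self.apps) / len(self.apps), 2
            ) if self.apps else None,
        }

card = DeveloperScorecard("jsmith", [
    AppRecord("room-scheduling", "mature", "low", 4, 4.2),
    AppRecord("lab-portal", "new", "high", 11, 3.8),
])
print(card.summary())
```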
Monday, January 28, 2008
Data Center Costs
One of the great challenges of being a CIO is that I am asked to anticipate demand and deploy the infrastructure and applications needed by stakeholders, while tightly controlling costs. This means that I am rewarded for meeting demands just in time, but penalized for under provisioning (not meeting demand) or over provisioning (spending too much).
Even in a world of virtualization and on demand computing, many IT products and services have a significant lead time to deploy. Creating a new data center, adding significant new power feeds and expanding cooling capacity are examples. To help understand the relationship of real estate, power, cooling, storage, network bandwidth and costs, we've created a web-based model which we're now sharing with the IT community. Feel free to access it here.
You'll see that this dashboard enables you to change costs per square foot, costs per kilowatt, total server/storage capacity and other variables to create a customized cost dashboard. At Harvard Medical School, we've used this dashboard during the budget process to achieve predictable budgeting. CFOs do not like surprises, so agreeing on a predictable cost model that directly relates demand, supply and cost makes life for the CIO much easier.
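The web model itself lives at the link above, but the underlying arithmetic is straightforward. Here is a minimal sketch of the kind of calculation such a model performs; the parameters and rates are illustrative assumptions, not HMS figures:

```python
def annual_datacenter_cost(
    square_feet: float,
    cost_per_sqft: float,      # annual fully loaded real estate cost
    it_load_kw: float,         # power drawn by servers and storage
    cost_per_kwh: float,
    pue: float = 2.0,          # power usage effectiveness: total power / IT power
    storage_tb: float = 0.0,
    cost_per_tb: float = 0.0,  # annual managed storage cost
) -> float:
    """Rough annual cost: real estate + power (cooling overhead approximated
    via PUE) + managed storage."""
    hours_per_year = 24 * 365
    power_cost = it_load_kw * pue * hours_per_year * cost_per_kwh
    return square_feet * cost_per_sqft + power_cost + storage_tb * cost_per_tb

# Example with illustrative figures: 2,000 sq ft at $250/sq ft, 150 kW of IT
# load at $0.12/kWh with a PUE of 1.8, and 500 TB at $400/TB per year.
print(f"${annual_datacenter_cost(2000, 250, 150, 0.12, 1.8, 500, 400):,.0f} per year")
```

Varying the inputs one at a time is exactly how a dashboard like this shows a CFO what a new power feed or another 100 TB of storage will do to next year's budget.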
I hope you find this useful!
Sunday, January 27, 2008
Electronic Records for Non-Owned Doctors
In my previous post The top 10 things that will keep me up at night in 2008, the number one project is providing electronic health records (EHR) to non-owned physicians.
This will be the first of a series of posts about this very complex project. My posts will detail the cost challenges, the partnership we've built to execute the project, and the technical approach we've taken. Since this is truly a work in progress, I'll publish these posts each week over the next few months.
Since 2002, my IT teams have provided electronic health records to every owned/closely affiliated clinician of Beth Israel Deaconess, using our home-built webOMR software. We even have a Medical Executive Committee policy mandating the use of electronic health records for owned clinicians by July 30, 2008. 'Use' is carefully defined, since we want all our clinicians to update the problem list, create an electronic note for each encounter, and perform all medication management (e-Prescribing, medication reconciliation, allergy documentation) electronically. Of interest, recent data collected by David Blumenthal of Massachusetts General Hospital concludes that only 4% of clinicians in the US have this level of use of fully functional electronic health records.
Since Stark safe harbors now enable hospitals to fund up to 85% of the non-hardware implementation costs of private practice electronic health records, my teams are now expected to provide EHR solutions for all the non-owned BIDMC-affiliated physicians in Eastern Massachusetts. This is a very different project from providing applications and infrastructure we manage, on a network we manage, to owned clinicians.
The planning for the project includes the following major issues:
1. Designing Governance – My typical steering committees are drawn from hospital senior management, employed clinicians and hospital staff. The governance of a community-wide electronic health record system must include members of the physicians' organization, private physicians, community hospital executives, and legal experts.
2. Modeling costs – The 'hydraulics' of the project budget are quite complex. The hospital wants to support implementation for as many physicians as possible but has limited capital. The physicians want as much implementation subsidy as possible, since by Stark regulation they have to fund all hardware and ongoing support themselves. The number of doctors implemented, the level of subsidy, and the total costs for all stakeholders are interrelated, but each party has different goals (see the sketch after this list).
3. Planning for distributed users – These non-owned clinicians are widely dispersed throughout Massachusetts and New England in urban and rural settings. Bandwidth varies from 20 Megabit Verizon FiOS connections to 56K dialup. Technology sophistication varies from fully 'wired' clinicians to offices run on 3x5 index cards.
4. Managing the project – CIOs traditionally serve hospital-based customers. They may not have the bandwidth or expertise to serve non-owned geographically dispersed customers.
5. Building a scalable infrastructure - The architecture must be designed to minimize costs, maximize reliability and support a project scope that is continually evolving.
6. Deploying staff - The existing hospital IT staff is not optimized for supporting networks, telecom, desktops and applications at hundreds of remote locations. The physicians' organization and clinician offices do not have the staff or expertise to execute this project.
7. Creating the Model Office – A clinically integrated network of providers in a community will want to adopt a standard EHR configuration with common dictionaries to support healthcare information exchange and continuity of care. Standardizing software configuration means standardizing workflow, which requires business process re-engineering. Practice consulting is needed to balance standardization with specialty specific processes, ensuring that providers buy into the new workflow and staff are appropriately trained.
8. Obtaining all the funding - Once the scope, architecture, staffing and cost modeling are completed, the funding must be obtained from all of the stakeholders. State and Federal governments are not likely to contribute anything. Payers may fund the outcomes of EHR use via pay for performance, but are unlikely to pay for implementation.
9. Specifying the order of implementation - How do we choose the most appropriate offices for pilots, and once those pilots are completed, how do we place hundreds of clinicians in a well-ordered queue for rollout?
10. Supporting the practices – Hospital support models depend upon standardized networks and desktops with end-to-end control over the quality of service. Supporting heterogeneous practices requires on-site, high-touch, higher-cost service.
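To illustrate the budget 'hydraulics' of item 2, here is a minimal sketch of the interlocking quantities; the dollar figures are invented for illustration, and the 85% subsidy cap is the only number taken from the Stark discussion above:

```python
def physicians_fundable(hospital_capital: float,
                        implementation_cost_per_md: float,
                        subsidy_rate: float = 0.85) -> int:
    """How many physicians a fixed pool of hospital capital can subsidize."""
    return int(hospital_capital // (implementation_cost_per_md * subsidy_rate))

def physician_outlay(implementation_cost_per_md: float,
                     hardware_cost_per_md: float,
                     annual_support_per_md: float,
                     subsidy_rate: float = 0.85) -> float:
    """First-year cost borne by each physician: the unsubsidized slice of
    implementation plus all hardware and ongoing support (per Stark rules)."""
    return (implementation_cost_per_md * (1 - subsidy_rate)
            + hardware_cost_per_md + annual_support_per_md)

# Illustrative figures only: $25k implementation, $5k hardware, $4k/yr support.
print(physicians_fundable(2_000_000, 25_000))   # physicians fundable with $2M
print(physician_outlay(25_000, 5_000, 4_000))   # each physician's first-year cost
```

Raising the subsidy rate helps each physician but shrinks the number of physicians the same capital can reach, which is precisely why each party's goals pull the model in different directions.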
My goal is to create a blog entry for each of these issues. Next week, I'll publish the governance model. The following week, I'll post the detailed considerations we're using to develop the cost model (the finished models will be published under 'obtaining all the funding', since they are still evolving). By the time all 10 posts are done, we'll be live in pilots with 4 practices and I hope to be able to sleep again.
The Law of Unintended Consequences
Health care is filled with the unintended consequences of laws, regulations and poor planning that plague our ability to care for patients. Here are a few, in no particular order. You may have more to add. Lab laws: I used to be able to contract with labs to do tests on my patients. I negotiated a low fee, kept a few dollars for lab draw and my services and everyone won. Enter California
Saturday, January 26, 2008
IQ - Why do Men Think They are Smarter?
A number of studies have shown that men think they are better looking than they really are, while women often underestimate their good looks, thinking they are less attractive. It should come as no surprise that the same holds true when it comes to intelligence. Adrian Furnham, a professor of psychology at University College London, reported that men and women have fairly equal IQs but men
Friday, January 25, 2008
Handwashing Reduces Diarrhea
Here we are with the fantastic technical advances of the 21st century and, surprise! A new study shows that hand washing reduces episodes of diarrhea by 30%. Since diarrhea is caused by pathogens (read: germs) that get passed from human to human or human to food to human, hand washing interrupts this cycle. What was interesting about this report is that hand washing was as effective in preventing
Answer to Quiz #4
The answer is #5, mandibular fracture. This patient was struck on his left lower jaw and suffered a comminuted fracture. The open fracture caused the left half of his mandible to be pushed upward. Surgical wiring of the jaw puts it back in alignment, but he'll be drinking through a straw for a good period of time.
Thursday, January 24, 2008
Medical Quiz 2008 #4
How smart are you? What is your diagnosis? Click on the image to get a better view.
#1 Dental abscess
#2 Neurofibromatosis
#3 Cleft jaw
#4 Hemiatrophy syndrome
#5 Mandibular fracture
Check back tomorrow for the answer!
Wednesday, January 23, 2008
Medical Journal Update
Ultrasounds for aortic aneurysms: Abdominal aortic aneurysm ruptures occur suddenly and are almost always fatal. They are about 4 times more common in smokers than in people who have never smoked. The researchers studied 67,770 men ages 65-74 and found that performing an ultrasound to screen for an enlarged aorta significantly reduced deaths. Aortas enlarge slowly until they rupture but do
Cool Technology of the Week
In my previous posts, I've described the web applications of CareGroup and Harvard Medical School that run anywhere on anything. At some point, as more and more software is offered via Software-as-a-Service providers, Web 2.0 applications, and operating-system-neutral thin clients, the notion of a desktop becomes a basic operating system running a browser. Such a device can be a very thin client without much local storage.
We've investigated several thin client computers in the past, but the Cool Technology of the Week is the Jack PC from Chip PC Technologies. It's slightly larger than a pager and fits inside a wall-mounted network port. It's powered over Ethernet, offers VMware support and connectivity to Microsoft Terminal Server or Citrix servers. Features include:
- Installable in wall, furniture or floor
- Network integrated
- 100% theft-proof
- 100% virus-immune
- 100% data-secured
- 100% remotely managed
- Energy saver (3.5W)
On the subject of small PCs, one follow-up from my post about the MacBook Air. In a recent article about the Air in AppleInsider, I found a fascinating comparison of major subnotebook manufacturers that aligns with my own experience:
Sony targets high end consumers; it leverages its physical media engineering prowess to build DVD burners into most of its models, something that few other light notebook makers even attempt to do. Sony's Vaio line is splashy and feature rich, but isn't commonly regarded as well built or durable.
Panasonic is known for its ruggedized Toughbook line, designed to operate in rough environments. Its models commonly trade off high end performance and features for extremely light weight and compact size. That relegates Panasonic's fans to mobile business users, and makes it less appealing to mainstream consumers.
Lenovo, which bought up IBM's PC division, continues the venerable ThinkPad line as a highly regarded workhorse that delivers top performance in a thin but well constructed case -- all work and no play. ThinkPads are also known for their long usable life and their fingertip controllers rather than trackpads, something that polarizes users for or against based on their personal preferences.
Fujitsu is another leader in light and thin notebooks, but also makes more general purpose machines that borrow from its leading edge thin designs. Its larger sized lines are powerful and economical while still remaining thin and fairly light. Fujitsu also makes Tablet PC convertible machines with the flip-around monitors that have yet to prove popular because they are expensive.
Dell was not discussed in the article, but my experience is that Latitude models such as the wide-screen D420 are light, durable, and economical.
At CareGroup and Harvard Medical School, we deploy Lenovo and Dell Latitude laptops.
Here's the complete article.
Central and Local Information Technology
Recently, I was asked to write an overview about day-to-day IT support at Harvard Medical School, including ideas about expanding services over the next year. One of the most savvy folks in the research community responded with a great dialog that illustrates the balance between central and local IT. His comments appear after each numbered item below:
1. Networks
Continue to provide high bandwidth wired and wireless networks to all HMS departments. Provide 100 Megabit connections to all desktops and 1 Gigabit connections as needed, assuming that appropriate Cat 6 wiring is available in the department. In parallel, replace all legacy wiring at HMS over the next 5 years to ensure all stakeholders have the potential for gigabit connections.
This level of infrastructure is naturally a central IT (and physical infrastructure) function. However, the requirements are not entirely generic; e.g. different sites have specific network architecture needs. Support for the software side of networking (firewalls) with adequate attention to the particular needs of departments is also essential.
2. Servers
Continue to provide Unix/Linux and Windows server hosting with 24x7 management, operating system patching and security services for all HMS stakeholders. The offsite data center should be expanded to meet growing demand for server and high performance computing hosting.
This is one of the areas in which there is the most variation in the benefits of centralization to different research units. Some departments or labs have well-developed IT support staffs and server infrastructure, others have local servers but are poorly staffed, and some have little local support and are reliant on central services. For the first group, having local support staff responsible to a local group and dedicated to the particular needs of the site is a highly effective model. Whether this is consistent with centralized physical location of server hardware remains to be determined. For the other groups, some further investigation is needed to determine whether a centralized or dedicated server model is more appropriate.
3. Storage
Provide 50 Gigabytes per user and 500 Gigabytes per lab at no charge. Provide storage and archiving services to high volume users for $1/Gigabyte per year. To ensure that storage needs are met for all new faculty, include funding for IT infrastructure such as storage in their recruiting packages. Storage includes 24x7 support for high-speed Fibre Channel or SATA Network Attached Storage, including appropriate means to attach to this storage for Mac OS X, Linux, Unix and Windows as well as web-based Online Storage. Archiving/backup is also provided as a service.
The proper organization of storage is closely linked to the organization of servers; the physical link between storage and server is still important. Backups might be efficiently provided as a school-level commodity service if competitively priced with an adequate service level for restores, although there are issues of confidentiality and data use agreements affecting some datasets.
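For concreteness, the quota-plus-chargeback arithmetic in item 3 looks like this; a minimal sketch, assuming charges apply only beyond the free allocations:

```python
def annual_storage_charge(user_gb: float, lab_gb: float,
                          free_user_gb: float = 50.0,
                          free_lab_gb: float = 500.0,
                          rate_per_gb: float = 1.0) -> float:
    """Charge $1/GB/year for usage beyond the free per-user and per-lab quotas."""
    billable = max(user_gb - free_user_gb, 0.0) + max(lab_gb - free_lab_gb, 0.0)
    return billable * rate_per_gb

# A lab storing 2 TB pays for the 1.5 TB beyond its 500 GB allocation.
print(annual_storage_charge(user_gb=40, lab_gb=2000))  # -> 1500.0
```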
4. Desktops
Continue to maintain help desk services, jointly managed with the departments, for support of Windows, Macs, and Linux machines. Provide anti-virus software and operating system updates.
Joint management with departments has been a good model. Some desktop-oriented applications and services are standardized commodities, but assignment of support staff to departments allows them to gain familiarity with the particular needs and configurations of departments and individual users.
5. Applications
Today, the application suite at HMS includes web portals such as Mycourses/eCommons; administrative applications such as FIRST, Room Scheduling, MARS (Medical Area Reporting System), MAES (Medical Area Equipment System), PeopleSoft, GMAS, and MADRIS; research applications that run on Orchestra such as Twiki, Matlab, LaserGene, and Endnote; and educational applications such as virtual microscopy, streaming video/podcasting, and simulations.
E-mail is another crucial service that is best supported at the school level due to the high cost of effectively dealing with spam and malware.
Expand this suite of applications through the enhancement of Mycourses/eCommons to include collaboration services such as instant messaging, Webex meetings, CONNECTS (a match making service for equipment, techniques, and scientists), SHRINE (a means of data mining across all Harvard affiliates), and web content management for easy hosting of internal and external websites. Expand this suite of applications to support research administrative and CTSA needs such as IACUC and animal ordering. Expand application support to include informatics services per the BRIC business plan such as database consulting and web application design.
Some attention should be given to the cost allocation model for these services since some of them are of general use, some are sometimes provided by internally supported staff at departments, and some are of use only to particular groups. However for the most part the items listed here are of general utility.
6. Disaster Recovery
Currently, HMS maintains two data centers and has significant redundancy in storage, server clusters, and networks. However, it does not have a true disaster recovery center and plan. Hire staff and develop a plan to ensure business continuity in the case of a major data center failure or natural disaster.
7. Media Services
Provide media services support for the entire school including presentation services, digital photography/videography, and teleconferencing services. The Media Services infrastructure is currently under review and school wide enhancements will be proposed in 2008.
As you can see from this dialog, neither completely central nor completely local is the right model for a research-focused medical school.
At HMS, central organizations provide "heat, power, light, networking and terabytes" - the utilities needed to empower all the core businesses of HMS - education, research and administration. Centralizing those utilities which can achieve economies of scale and reliability, while leaving local those areas of science and application expertise unique to each department, has worked well to support all our stakeholders. One of the major central/local collaborations has been the joint hiring of customer service representatives for each department and locating them within the departments they serve. This enables each department to have centrally trained, managed, and budgeted staff, but with specific skills to meet each department's academic needs. Similarly, the HMS data center hosts centrally managed servers and storage but also hosts department-specific infrastructure managed by scientists.
This division of labor between the IT department and the scientific community leverages the skills of each, ensuring a positive working relationship, based on a transparent governance model, for all the services we deliver.
Monday, January 21, 2008
The Shape of a Mother
I just wanted to share this link about a woman who gave birth to triplets (at home, no less). She bravely photographed her growing belly and it is awesome to see. For those who are dismayed at the flabby tummy afterward... keep in mind that it takes a year for the post-pregnancy abdomen to resolve, although I am sure the loose skin will remain. Who cares? The result was worth it. Check it out.
Personalized Medicine
Folks who read my blog know that I am a strong believer in personal health records and personalized medicine. What is personalized medicine?
It's a lifetime transportable medical record that follows the patient everywhere to ensure they receive the safest and most effective care based on their history.
Additionally, personalized medicine is coordination of care among providers, payers, pharmacies and labs, respecting patient privacy preferences.
However, the most personalized medicine will occur when each of us has a transportable version of our fully sequenced genomes available to optimize care for the diseases we are likely to develop. The following is a description of my experience to date.
In 2007, I contributed blood, skin, and oral mucosa cells to the Personal Genome Project. Perpetual cell lines of my white blood cells were created to serve as a source of my DNA for analysis. By February 2008, the project will fully sequence the 1% of my genome that is uniquely me, about 60 million base pairs.
As a first step, my DNA was analyzed using Affymetrix technology to hybridize my genes to 12 million probes, identifying Single Nucleotide Polymorphisms (SNPs) and relating them to probabilities of disease through Genome Wide Association Studies. For those with a genetics background, my entire 20 megabyte SNP mapping is here and here.
An analysis based on the Wellcome Trust Genome Wide Association Study (GWAS) indicates that my risks for disease are:
Coronary Artery Disease - increased risk
Diabetes Type 2 - average risk
Rheumatoid Arthritis - average risk
Thus for me, becoming a vegan 7 years ago was truly a great idea. By reducing all my cardiac risk factors, I've likely negated my genetic risks.
This analysis used regions of my genome that are associated with diseases, not a full sequence. The full 60 million base pairs will be completed using the next generation technology described in this presentation.
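To give a flavor of how SNP calls translate into the risk statements above, here is a toy sketch; the rsIDs, risk alleles, and genotypes are invented stand-ins, since real GWAS interpretation relies on curated catalogs and proper statistics:

```python
# Hypothetical mapping from SNP to (risk allele, associated condition).
RISK_TABLE = {
    "rs1111111": ("C", "coronary artery disease"),
    "rs2222222": ("T", "type 2 diabetes"),
    "rs3333333": ("T", "rheumatoid arthritis"),
}

def assess(genotypes: dict) -> dict:
    """Count copies of each risk allele from a {rsid: "AB"} genotype map."""
    findings = {}
    for rsid, (risk_allele, condition) in RISK_TABLE.items():
        calls = genotypes.get(rsid, "")
        findings[condition] = calls.count(risk_allele)  # 0, 1, or 2 copies
    return findings

my_snps = {"rs1111111": "CC", "rs2222222": "CT", "rs3333333": "GG"}
for condition, copies in assess(my_snps).items():
    print(f"{condition}: {copies} risk allele(s)")
```

Millions of such lookups, weighted by published odds ratios, roll up into the increased/average risk summary reported above.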
In 2008, the US Healthcare Information Technology Standards Panel (HITSP) has been chartered with harmonizing the standards to securely transmit genome sequences. The work will be completed by October 2008.
The folks at the Personal Genome Project hope to expand their effort from the 10 pilot volunteers they have today to 100,000 volunteers. If they are successful, they may even win the X-prize.
In 2008 we'll have the technology to sequence humans in a fast, affordable way, then transmit that data securely with patient consent to those who need to know it. The future of personalized medicine is around the corner, not a decade from now!
Respecting patient privacy preferences
One of the greatest challenges for healthcare information exchanges is to ensure continuity of care for patients while also respecting patient privacy preferences. In Massachusetts, our model has been Opt-In, which means we will not exchange patient information among healthcare organizations unless a patient specifically consents to the exchange. The educational materials from the Massachusetts eHealth Collaborative for this Opt-In consent are found here.
A patient signs a consent at each institution, and then the data from that institution is shared on a need-to-know basis with clinicians directly caring for the patient. The data is never sold or data mined.
Over time, the types of consent that we'll be asked to support will be much more granular. You can imagine patient preferences such as:
If I'm unconscious in an emergency room, share everything including mental health, substance abuse and HIV status data.
If I'm visiting a Minuteclinic do not include my mental health and substance abuse history.
If I'm sharing my data for a population-based research study, do not include my HIV status.
Ideally, each patient would be able to declare their preferences for sharing data and have these preferences universally accessible to all healthcare information exchanges or institutions which need consents. To solve this problem, I recently proposed a technology called the Consent Assertion Markup Language (CAML) which is described in detail here.
The basic idea is that a Consent Wizard could be created on the web to record and transmit all patient privacy preferences. Such an electronic consent document could be stored in the patient's personal health record, at their insurer, or at a third-party secure website.
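To make this concrete, here is a toy sketch of what a consent document produced by such a wizard might look like; the element and attribute names are invented for illustration and are not taken from the actual CAML proposal:

```python
import xml.etree.ElementTree as ET

def build_consent(patient_id: str, rules: list) -> str:
    """Build a hypothetical consent assertion: per-context rules listing
    the data categories the patient wants withheld in that context."""
    root = ET.Element("ConsentAssertion", patient=patient_id, version="0.1")
    for context, excluded in rules:
        rule = ET.SubElement(root, "Rule", context=context)
        for category in excluded:
            ET.SubElement(rule, "Exclude", category=category)
    return ET.tostring(root, encoding="unicode")

# Mirrors the preferences above: share everything in an emergency, withhold
# mental health and substance abuse history at a retail clinic, and withhold
# HIV status from population research.
print(build_consent("patient-123", [
    ("emergency", []),
    ("retail_clinic", ["mental_health", "substance_abuse"]),
    ("research", ["hiv_status"]),
]))
```

Because the document is structured data rather than a scanned signature, any exchange or institution holding it could evaluate the applicable rule automatically before releasing records.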
The alternative to CAML is making the patient the steward of their own data. Some patients would welcome the opportunity to manage their own records by gathering source data from clinics, hospitals, labs and pharmacies, then applying their privacy preferences and sharing the resulting data with caregivers as needed. The next generation of Personal Health Records, such as Microsoft HealthVault and Google's rumored healthcare offering, are likely to support this kind of data exchange.
In the meantime, Massachusetts is working on a consent wizard prototype as part of the Mass Health Data Consortium's participation in the HISPC project. We'll experiment with CAML and report back how well it works.
Remember Martin Luther King, Jr.
Martin Luther King, Jr. (Jan 15, 1929 - April 4, 1968)
"I have a dream that one day this nation will rise up and live out the true meaning of its creed: 'We hold these truths to be self-evident, that all men are created equal.'"
"I have a dream that my four little children will one day live in a nation where they will not be judged by the color of their skin but by the content of their character."
Saturday, January 19, 2008
Desktops and Laptops in an Enterprise
One of the interesting aspects of being a CIO at two institutions, Harvard Medical School and CareGroup, is that I can experience complex IT issues from multiple perspectives.
A tension that always exists in enterprises is how much IT is centralized and standardized versus local and variable. Desktops and laptops provide a good example of the issue.
At CareGroup, we must protect the confidentiality of 3 million records (HIPAA mandate and patient expectation), ensure nearly 100% uptime, and prevent all viruses/trojans/worms/keystroke loggers from entering our network to ensure the integrity of patient data as mandated by numerous compliance requirements. Security is an end-to-end design requirement from the servers to the network to the desktop/laptop used to access the data. Because of these patient care and regulatory requirements, hospitals tend to function more like corporate entities, mandating standards for each device. We have 7500 Windows XP computers at Beth Israel Deaconess, standardized on the Dell Optiplex product line and managed with an IT-provided image that ensures stability and security. The total cost of ownership of this managed infrastructure is low for the following reasons:
- I have 8 staff servicing these 7500 Windows machines - nearly 1000 machines per person. Because the hardware and software are standardized, replacing parts, maintaining desktop images, and managing the lifecycle of the products (5 years on average) is very efficient.
- We leverage our purchasing power with the vendor to get the best price for these machines
- We do not have to test our applications on a large array of different hardware
- The Optiplex line from Dell is designed for enterprises to require very little service (i.e. better power supplies and more reliable parts) and to have little variability over the product life cycle
- Our training effort can be very focused to create high expertise regarding the machines we support
Thus, we have official policies mandating that only IT provided desktops can be connected to the network. I rarely get pushback on this policy because a desktop is seen as a commodity with little personalization. We spend a bit more up front for the Dell Optiplex line than we would for the consumer grade Dell Dimension line, but the total cost of ownership over the lifecycle of the device is much lower than if we purchased consumer grade machines.
Laptops are harder because they are seen by many as highly personal. Some folks want "Executive Jewelry" like the Sony VAIO. Others shop for price over enterprise manageability. The challenge is that laptops are generally made from highly proprietary hardware, which is optimized for small size/light weight. This means that they are hard to service and support. In our case, we have standardized on Lenovo/Dell laptops and require that each is purchased with a 3 year warranty, ensuring complete service coverage for the life cycle of the device. We do not make repairs ourselves and require the manufacturer to replace any defective parts. As with the desktop, Dell makes two product lines - the Latitude optimized for enterprises and the Inspiron optimized for consumers. Occasionally, CareGroup employees will look at the Inspiron line and note they are cheaper and more full featured than the Latitude line. What they do not realize is that the Latitude laptop, like the Optiplex desktop, uses higher quality parts that are very consistent for the lifetime of the product. If a Toshiba optical drive was part of the original design, the same Toshiba part will be in every Latitude shipped. By requiring the purchase of standardized laptops, we optimize the total cost of ownership of these devices over 3 years, as well as their reliability, security, and supportability. Responding to a security incident or attempting to provide best effort service to a non-standard laptop rapidly costs more than purchasing a standard laptop to begin with.
Apple Macintosh computers are analogous to the Dell Latitude/Optiplex line. Macs use consistent, high quality parts and their standard configurations have been able to support the needs of the BIDMC research community. The majority of the systems are MacBook Pros, followed by MacBooks and then a small number of iMacs and Mac desktops. All have OS X and a 3 year AppleCare warranty.
Sometimes employees try to purchase these on their own and submit a receipt for reimbursement, bypassing the institutional mandate to standardize devices connected to the network. In early 2007, CareGroup adopted a policy that no such purchases would be reimbursed. In truly unusual cases where very specialized "high power" hardware is required, the CIO can sign off on an exception before the purchase is made, but IT must apply patch management and anti-virus protection to these devices to ensure appropriate security. The cost of trying to manage the operating system on non-standard devices is very high, so we try to limit exceptions to under 10 devices per year.
At Harvard Medical School, there are no HIPAA constraints, no patient care issues, and fewer regulatory requirements applying to desktops/laptops. The education and administrative locations at Harvard are centrally managed similarly to CareGroup. However, the very large research enterprise at Harvard is given the recommendation to purchase Lenovo and Dell products at attractive prices but not mandated to standardize. The cost of supporting this research enterprise is 5 times higher than at CareGroup - 1 person per 200 devices instead of 1 person per 1000 devices. Over time, as more and more applications become software as a service (SaaS), web-based, and operating system neutral, it may be possible to use thin client devices, or more managed devices at Harvard, but for now, the school has accepted higher support costs in research environments by acknowledging that experimentation sometimes requires non-standard approaches. A researcher can implement cutting edge software or hardware if it will aid their research inquiry and at Harvard there are no patient care issues to worry about.
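Those staffing ratios translate directly into dollars. Here's a back-of-the-envelope sketch; the loaded salary figure is a hypothetical placeholder, not a real budget number.

```python
# Rough support cost comparison using the staffing ratios cited above.
LOADED_SALARY = 80_000  # hypothetical fully loaded annual cost per support FTE

def annual_support_cost(devices, devices_per_fte):
    """Annual support cost for a fleet at a given devices-per-person ratio."""
    return devices / devices_per_fte * LOADED_SALARY

standardized = annual_support_cost(7500, 1000)  # 1 person per 1000 devices
researchy = annual_support_cost(7500, 200)      # 1 person per 200 devices
print(f"standardized: ${standardized:,.0f}, non-standard: ${researchy:,.0f}")
# The same fleet costs five times more to support without standardization.
```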
The CIO in many organizations is seen as the corporate guy who says "no" to requests for adopting heterogeneous personalized technology. I hope this brief explanation of the total cost of ownership, the need for security and the need for managing service levels illustrates that, honestly, I'm not such a bad guy after all.
Oatmeal for Cholesterol
The recent news about Vytorin and Zetia not being effective for lowering cholesterol made front-page headlines across America and in the medical blogosphere. What didn't get any attention was the new research showing that "oatmeal lowers cholesterol even more than we thought 10 years ago". Dr. James Anderson looked at new studies and compared them with the original conclusion reached by the FDA 10 years ago
Thursday, January 17, 2008
Medical Quiz #3- Answer (see prior post)
This was a toughie. The answer is #4, Furuncular myiasis. This is a human botfly larva (Dermatobia hominis) emerging from a nodule on the patient's upper arm. The female botfly catches a mosquito and attaches her eggs to its body; the eggs hatch and the larvae enter the human body while the mosquito feeds on it. The larva then develops in the subcutaneous tissue.
Cool Technology of the Week
On Tuesday, I briefly mentioned the MacBook Air as the Cool Technology of the Week. Although several authors have debated the decision to leave out the ethernet connection and optical drive, here are the specific reasons I believe this laptop addresses many of my top design criteria.
1. The device consumes 45 Watts. My experience is that airline onboard power jacks offer 15V at 75 Watts maximum. My Apple MacBook Pro consumes 85W which makes it very problematic on airplanes. I've experienced issues with powering my Macbook to perform at full processor speed as well as issues with charging the battery while flying. My Dell D420 works with a 65W Juice iGo but does not charge. A 45 Watt device will work on all car/boat/plane power systems and is greener than a typical laptop. (A quick power-budget check appears at the end of this post.)
2. The device does not have an internal DVD or CD. Although some may consider this controversial, I've not used a DVD or CD in years, since everything I need is on a network. Just about every workplace and airport I traverse worldwide has 802.11 available, so I've not needed the extra weight of an optical drive. If I need to physically exchange data, I'll use my 4 Gig USB Thumb Drive, which weighs under 1 ounce. I'm a big fan of Software as a Service, Web 2.0 applications, and downloadable open source, so there is no CD-based software on my current computers.
3. The device has LED backlighting. All my existing laptops use fluorescent backlighting, which consumes more battery power, has a latency starting up from sleep/hibernation, and is more delicate than LEDs.
4. The size and weight are:
Height: 0.16-0.76 inch (0.4-1.94 cm)
Width: 12.8 inches (32.5 cm)
Depth: 8.94 inches (22.7 cm)
Weight: 3.0 pounds (1.36 kg)
which means that I can slide it into my Tumi subnotebook briefcase.
My torture test for laptops occurs when the person in the economy seat in front of me reclines completely. Yes, I'll have a closeup view of their hair implants, but at least the Macbook Air will be small enough to work in the remaining tray table space.
5. The device supports native digital video output, which works with either the micro-DVI to DVI adapter (included) or micro-DVI to VGA adapter (included). This means I can plug directly into HDTV displays or LCD projectors. My existing laptops cannot do that. Of course, if I forget the adapters at home, I'm out of luck.
6. The battery life is 5 hours compared to 2 hours for my existing Macbook and Dell. That's advertised life and your mileage may vary.
7. The price is competitive at $1799.
8. It runs Mac OS X Leopard, which I believe is currently the most stable, most secure, and most usable operating system.
9. It's eco-friendly: it contains no lead or mercury, and the aluminum case is recyclable.
10. The device has a widescreen 1280x800 format, a full keyboard, and a trackpad which supports gestural navigation, just like the iPhone and iPod touch. My experience with laptops is that 1280x800 format in a 13.3 inch screen maximizes productivity by providing enough real estate to use multiple applications simultaneously.
What are the downsides? Although a 5 hour battery life and a 45 Watt power supply that plugs into airline seat power meet my use cases, some folks want to carry extra batteries, and since the Air battery is not user replaceable, this may create a power problem for them. The device does not have a built-in RJ45 ethernet port and needs a USB to Ethernet dongle. Also, it does not have a built-in wide area network modem (GSM/GPRS/EDGE or CDMA) to bypass the costs of airport 802.11 connections. Finally, the 64 Gig Solid State Disk option is pricey at $1000.
As with any new technology, I will wait a few months so that initial production and configuration issues are resolved (i.e. the bugs are worked out of the first generation). When it arrives, my home will include an iMac 20", an Air, and a MacBook Pro all connected wirelessly. This means that my 14 year old daughter, my wife and I can finally iChat together!
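As promised in point 1, here's the power-budget arithmetic as a tiny sketch. The 75 watt seat limit and the laptop draws are the figures cited above; the charging overhead is my rough assumption.

```python
# Does a laptop fit within an airline seat's power budget?
SEAT_POWER_WATTS = 75  # typical onboard power jack, per the figures above

def fits_seat_power(laptop_watts, charging_watts=0):
    """True if running (plus optional battery charging) fits the budget."""
    return laptop_watts + charging_watts <= SEAT_POWER_WATTS

print(fits_seat_power(85))      # MacBook Pro: False - too hungry for the seat
print(fits_seat_power(45))      # MacBook Air: True
print(fits_seat_power(45, 25))  # Air while charging (25W is a guess): True
```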
Web 2.0 for the CIO
You may have heard the term Web 2.0 and been unclear what it means. Then again, you're reading this blog, so chances are you're part of the enlightened who have already embraced blogs, which are part of Web 2.0.
Web 1.0 was all about web pages maintained centrally. Content was published by corporate communications/public affairs.
Web 2.0 is all about collaboration, everyone as publisher, and complete interactivity with networks of associates. Here's a brief primer on Web 2.0:
A blog is a specialized form of web-based content management specifically designed for creating and maintaining short articles published by anyone who wants to be an author. Although blogging is not a real time collaboration tool, it is a remarkable way to spread information. For example, my blog had 7,429 views by 4,611 visitors over the past week. As an external relations tool for communicating information, proposing an idea, or marketing a concept, blogs work extremely well. Blogger, WordPress and TypePad are leading blogging sites.
A wiki is software that enables users to create, edit, and link web pages easily. Wikis are often used to create collaborative websites and to power community websites. They are being installed by businesses to provide knowledge management and are extremely useful for a community of authors to create shared documentation. At Harvard Medical School, we use open source wiki software called TWiki as an enterprise application.
A forum is a threaded discussion with multiple participants that is not real time. Entries can be read and responded to at any time. At Harvard, we created our own threaded discussion forums and these are used for strategic planning activities by a diverse group of geographically dispersed participants. There are variations on forums such as Dell's IdeaStorm (a Software as a Service application hosted by salesforce.com), which Dell implemented to get input from customers and employees about new strategic priorities. Users post ideas and anyone can vote to support the idea, raising its score/relevance in the forum.
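To make the threading concrete, here's a minimal sketch of the data model behind a forum; the class and field names are my own illustration, not any particular forum package.

```python
# Each forum entry points at its replies, so a discussion can be read and
# answered at any time - nobody has to be online simultaneously.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Post:
    author: str
    body: str
    replies: list = field(default_factory=list)
    posted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def reply(self, author, body):
        """Append an asynchronous reply and return it for further nesting."""
        child = Post(author, body)
        self.replies.append(child)
        return child

thread = Post("planner", "Which priorities should the strategic plan include?")
idea = thread.reply("researcher", "Better collaboration tools across campuses.")
idea.reply("librarian", "And shared document repositories.")
```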
Chat is a real time, synchronous discussion group with many participants. Old fashioned Internet Relay Chat has been largely replaced by group Instant Messaging chat. Although Instant Messaging is often a 1 on 1 conversation, most IM clients support inviting multiple participants to message as a group. For example, Google's Gmail includes Instant Messaging chat and Apple's iChat works well with AOL Instant Messaging to support group Instant Messaging. Chat requires that all participants be online at the same time. Forums, as described above, can be more convenient because not all parties involved in the discussion have to be online together.
A unique collaboration tool called Gobby enables multiple authors to edit text together in realtime. Document repositories such as Microsoft SharePoint, Documentum and many content management systems (CMS) support sharing of documents. I've used SharePoint as a document repository to coordinate the nation's healthcare data standardization process at HITSP.
Social Networking tools include sites such as Facebook, LinkedIn, and MySpace. Most social networking services provide a collection of ways for users to interact, such as chat, messaging, email, video, voice chat, file sharing, blogging, and discussion groups. To test the value of these services for collaboration, I established a Facebook, LinkedIn, and Second Life account. I currently have 30 friends on Facebook, 78 contacts on LinkedIn and a Second Life avatar named Geek Destiny.
Facebook is great for building collaborative groups. LinkedIn is great if you're looking for a job. Second Life is a fine virtual reality environment, although its business applications are limited. Here's a great article from the UK Guardian about Social Networking tools.
Everyone who knows me understands that I'm very transparent about my successes and failures. I want to admit publicly that I did not embrace Web 2.0 fast enough. At Harvard, we do provide easy to use content management for departmental websites (not individuals), online document sharing, calendars, news and forums. We also host dozens of Wiki sites. However, we do not provide web-based IM and we're just starting to deploy social networking. As part of the new Dean's strategic planning process, I have recommended an immediate, wholesale adoption of Web 2.0 throughout Harvard Medical School. My report to the Dean notes:
"Immediately expand our enteprise intranet sites (eCommons.med.harvard.edu and mycourses.med.harvard.edu) to include collaboration services, instant messaging, Webex meetings, CONNECTS (a match making service for equipment, techniques, and scientists), SHRINE (a means of data mining across all Harvard affiliates), and web content management that enables anyone in the Harvard community to be a publisher."
At BIDMC, I'm relaunching our intranet using a commercial Content Management System called Sitecore, which includes numerous Web 2.0 collaboration features such as Wikis, Blogs, Forums, Collaborative Project Management, Quick Polls, and Project Rooms with Whiteboards.
I spent the last year focusing on integrating our web-based applications and achieved a number of important benefits, including single sign on for all clinical applications. My mistake was in not also focusing on the individual as content publisher. 2008 will include a major push to catch up and broadly deploy Web 2.0 collaboration and publishing tools throughout all my organizations.
Wednesday, January 16, 2008
Medical Quiz 2008 #3
This one is really, really hard. I missed it myself but I guarantee, when I see it in a patient...I will know what it is. Click on the image for a better view.
What is the diagnosis?
1. Cutaneous leishmaniasis
2. Cutaneous larva migrans
3. Epidermoid cyst
4. Furuncular myiasis
5. Tungiasis
Check back tomorrow for the answer.
Grand Rounds is Here
Grand rounds this week is at Sharp Brains and is posted as a clever message to our next Mr. or Ms. President! Check it out for good medical blogging. Everything Health is mentioned again!
An Engineering Eye for the Average Guy
This is one of my more unusual blog entries, so I'll start with a disclaimer that I do have a life and I'm not obsessive compulsive. What follows is an engineering version of 'clothes make the man'.
When I lost 60 pounds through a combination of vegan diet and exercise, I began to look for new clothes from an engineering standpoint, not a fashion standpoint. To work 20 hours a day and fly 400,000 miles a year, I need very practical clothing that fits well, survives abuse, and works in a multitude of settings. Here are my lessons learned about men's clothes, from toe to head.
First, a comment about the color black. Black is formal and informal. I've worn it to the stuffiest supper clubs on Beacon Hill and to rock concerts. It's the choice of savvy business travelers and the always hip Steve Jobs. It does not show dirt or wrinkles. At 3am half way around the world, you do not have to think about color matching. It never goes out of style.
Shoes - Most shoes are made in a single average width. This means that men with wide feet (I'm 9 EE) buy shoes that are too long so that the width feels right in the shoe store. However, just about all shoes stretch in one direction - the width expands with wear. If wide shoes are not available, buy shoes that are the right length and the width will expand by stretching in a few days. I used to wear size 10D shoes that were too long but the right width. I now wear size 9 shoes (European 42/UK size 8) and after a day or two of wear, they are perfect. Since I'm a vegan, I only buy non-leather shoes, which is not as hard as it sounds. Take a look at Mooshoes, Vegetarian Shoes, and Novacas. My choice is a simple black office shoe for day to day, and a simple dress boot for New England winters.
Socks - Why do we wear socks? Since mankind evolved to walk on two feet without the encumbrance of shoes, our feet are not optimized to manage temperature/moisture/friction encased in a shoe. A sock should keep feet dry all year long, warm in the winter, and cool in the summer. Hence, a few thoughts about socks. Cotton as a fabric is like a sponge. Once wet, it takes days to dry. Just dunk Levis in water and see how long they stay wet. Does it make sense to put your feet in a sponge and then encase them in a shoe? Cotton socks are not a good choice. A wicking combination of polyester/nylon/wool is ideal. Polyester wicks, nylon provides strength and wool provides temperature regulation. I wear black Smartwool liner socks. From a vegan standpoint, this is an interesting moral dilemma. Animals are not intentionally killed in the wool making process (unlike leather), but the use of any animal products is not consistent with veganism. Since veganism is a journey toward a goal in the modern industrial world, I have accepted the fact that until I find a better performing material for temperature regulation, a small percentage of wool in my long lasting socks is acceptable.
Pants - Have you ever looked at a group of men walking down the street? Half have pants that are too long, creating large clumps of fabric falling over their shoes. Half have pants that are too short, exposing socks and looking like capris. Why is this? Typically, men's pants off the rack are sized in even-numbered lengths, i.e. 32, 34, 36, which means that a person with a 33 inch inseam cannot get a good fit. Also, pants are cut from templates based on an average sized thigh, seat, hip etc. Buying pants turns out to be very challenging because they are a 3 dimensional object sold with 2 dimensional measurements. The way that pants are constructed is that they have an inseam, a rise (from crotch to navel), and an outseam (from shoe to waist). Here's the engineering - the inseam is not a straight line, and pants measurements are related by this equation: outseam = rise + inseam - 1
Purely by luck, 32 waist/34 length Levi's Button Fly 501's (in black of course) have an outseam of 44, a rise of 11 and an inseam of 34 which are exactly my measurements. However, no such luck with dress pants. A typical 32x34 dress pant has an outseam of 44, a rise of 12.5 and an inseam of 32.5, making me wear the pants about 2 inches above my navel because of the 12.5 rise, but that makes the pants too short. Wearing the pants below the navel makes them long enough, but then the crotch sits 2 inches below my body, creating a "gangsta" look. The bottom line of all of this is that men need to determine the rise (where on the waist they wear their pants) then find a pair with the right outseam to give them the right inseam. Unfortunately, pants do not have published rises or outseams. In my case, finding a pair of dress pants that fits was not possible and I had to find a company that creates pants based on body measurements. The cost of these is not much higher than typical off the rack dress pants. My belt is not leather - it's microfiber polyester from Vegetarian Shoes.
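If you want to check a pair against your own measurements, the equation above solves easily for the inseam:

```python
# The pants geometry from above: outseam = rise + inseam - 1.
def inseam(outseam, rise):
    """Solve the outseam equation for the inseam."""
    return outseam - rise + 1

print(inseam(44, 11))    # Levi's 501 in 32x34: a true 34 inch inseam
print(inseam(44, 12.5))  # a typical "32x34" dress pant: really 32.5 inches
```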
Shirts - Finding a shirt that fits well is very challenging. Collars are sized by inches i.e. 15 or 16, sleeves are often sized by multiple inches 34/35, and just about all shirts follow a typical template that creates a waist with way too much extra fabric. A tailor's template for a 16 collar includes a 40 inch waist measurement - you have no choice. For me to find a shirt that fits a 32 inch waist, I'd have to buy a 13 inch collar, which would asphyxiate me. How can I find a shirt with a 15.75 inch collar, a 34 inch sleeve, and a 32 inch waist? Only two choices - a polyester mock turtleneck that stretches, or a company that creates shirts based on your measurements. I have black mock turtlenecks off the rack and black linen shirts made to my measurements.
Jackets - Many suit jackets and sport coats have the same problem as shirts. I wear a 40 Long, which automatically comes with sleeves about 35 or 36 inches long. I buy black Nehru jackets made of rayon/polyester that fit my chest and body length, then have the sleeves altered.
Thus, my office wardrobe is black, mostly vegan, and fitted to my measurements by ordering custom shirts/pants, wearing Levis 501s, and alteration.
My outdoor wardrobe for ice climbing, rock climbing and mountaineering is made of form fitting synthetic materials by Arc'teryx, which by random chance has outdoor clothing that matches my every measurement. Although I wear all black in the office, I wear red shirts (no Star Trek jokes allowed) outdoors for visibility.
Clearly this is more information than anyone wants to know about clothing, but it works for me!
Tuesday, January 15, 2008
Cool Technology of the Week - Special Edition
A few minutes ago, Steve Jobs announced the Apple Macbook Air
1.6 GHz Core 2 Duo, 2 Gig RAM, 13.3 inch widescreen, 5 hour battery life, 0.16-0.76 inch thick, 3 pounds, 802.11n + Bluetooth 2.1/EDR, 45 Watt MagSafe, 1 USB 2.0 port, Micro-DVI, Audio Out, aluminum case, LED backlighting, multi-touch trackpad, no mercury or lead, $1799
This is the machine I've been waiting for. I'll be retiring my other machines soon.
Clorox Goes Green
I love the green movement because it helps people think about the environment. The first step to any change is conscious thought. We can't change what we don't know. Now The Clorox Co., the Oakland firm that introduced bleach to America a century ago, is marketing natural, biodegradable household cleaners called "Green Works". More than 99% of the ingredients in "Green Works" are natural and
Monday, January 14, 2008
Managing IT Projects
I'm often asked how we manage IT projects. Do we have an Enterprise Project Management Office? A tracking Gantt chart for every project?
The answer is that we do not use a one size fits all approach to project management; instead, we use a suite of tools and common principles that we apply as appropriate depending on the scope, risk, and complexity of the project.
Here's my guide to project management:
1. Every project starts with a charter. A project charter clarifies the purpose and urgency of the project, which is important for change management. The key leaders, stakeholders, milestones, and risks are clearly stated. (A minimal charter sketch appears at the end of this post.)
2. Every project gets a single accountable leader. That leader may be the project manager or may have project management staff reporting to him/her. Having a project manager is key to the success of a project. That project manager may use tools such as Gantt charts, workplans, and issue tracking logs. Although some projects are managed through sophisticated Microsoft Project diagrams, most are done on a far less sophisticated basis. Several years ago, we tried to introduce a program management software package and found it to be so burdensome that we dropped it. The main ingredients of our success, despite the lack of a PMO, are good managers and good communications.
Our managers are good at triaging and adapting when unforeseen demands interrupt work plans. They are also good at keeping others informed. There are a few things that I believe have promoted such good communications:
a. The tenure of our staff, especially at the managerial layer, has been excellent. They are a well-oiled team and know when the left hand needs to learn what the right hand is doing.
b. Our office layout promotes communications. We have more conference room space per FTE than anyone in the medical center. This has proven not to be a luxury, but a necessity. It makes it easier to hold ad hoc sessions and there are few delays because rooms are available for meetings.
c. We liberally use conference and bridge calls. Bringing people together periodically to make sure things are moving along is of great value.
d. We reach out to our vendor partners and other experts when we need assistance. These folks have helped crystallize our plans when complex problems have arisen.
e. We have a very active change management process that also keeps the right and left hands communicating. The meetings and email announcements assure others get the word and are able to weigh in with their advice.
We also tend to select a small number of vendor partners, making technology life cycle planning predictable. We are not constantly shifting vendors which would drag on our efficiency.
3. Every project has a steering committee with minutes of each committee meeting. Each steering committee is composed of key decision makers and stakeholders who build a guiding coalition for the project, a key ingredient for change management. We complete each meeting with a summary of who will be accountable for what, and when it will be done. This assures that the give and take, the digressions, and the range of topics discussed have not confused what has been agreed upon. Every meeting has an element of negotiation, and we repeat back what was agreed upon to get all parties to acknowledge it.
4. Every project has success metrics which are reviewed frequently. I believe that "the troops do well what their commander reviews". If my top level IT managers are skilled at asking the right questions about high visibility projects, but also pay attention to the basics of operations that keep the systems working, it carries down to the staff. They know they need to pay attention to the details, keep the project moving, and adhere to agreed upon deadlines.
Over the past decade we've had a few projects that were over budget or overtime. In every case, it was because one of the above steps was not followed. By using these general principles, project risk is minimized and all stakeholders are likely to have a better project experience.
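As promised in step 1, here's what a minimal charter skeleton might look like in code. The field names are my own shorthand for the elements listed above, not a formal template from any PMO methodology.

```python
# A minimal project charter: purpose, a single accountable leader,
# stakeholders, milestones, risks, and the metrics reviewed frequently.

from dataclasses import dataclass, field

@dataclass
class ProjectCharter:
    purpose: str                # why the project exists, and its urgency
    accountable_leader: str     # the single leader from step 2
    stakeholders: list = field(default_factory=list)
    milestones: list = field(default_factory=list)      # (name, target date)
    risks: list = field(default_factory=list)
    success_metrics: list = field(default_factory=list) # reviewed per step 4

charter = ProjectCharter(
    purpose="Relaunch the intranet on a new content management system",
    accountable_leader="Intranet project manager",
    stakeholders=["corporate communications", "IS", "department editors"],
    milestones=[("pilot site live", "2008-06-01")],
    risks=["content migration delays"],
    success_metrics=["departments publishing their own content"],
)
```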
Sunday, January 13, 2008
Emergency Room Call-Another Doctor Shortage
Across the United States, and certainly in Northern California...hospitals are facing increasing difficulty with the availability of certain specialties to cover emergency rooms. In the past, when a patient was seen in a community hospital by the emergency physician, she could call on a panel of specialists to come in when needed to see the patient. Physicians in Neurosurgery, Orthopedics,
Saturday, January 12, 2008
Selecting New Applications
Every week, I'm asked by customers to collaborate with them to choose new applications.
Here's how we do it.
1. First, we take an enterprise view of each application request, since we would much rather consolidate applications and reduce the number of vendors than provide a new niche application for every evolving departmental need. If an existing enterprise application cannot meet the user's requirements, we then survey the marketplace.
2. We do not believe in Requests for Proposals (RFPs). RFP means Request for Prevarication, since most vendors do their best to answer RFPs with as many positive responses as possible. Instead, we review the marketplace, often via the web and by reviewing summary evaluations from KLAS reports. We pick the three or four applications which seem to best meet our stakeholders' functionality requirements.
3. Once we have a small number of applications identified, we evaluate them for their suitability in our infrastructure environments using a formal process. In 2003, we created a Change Control Board to orient the infrastructure team to new applications that are being proposed. The forum meets weekly and has broad representation, including help desk, desktop, server, storage, networking, and security staff. The checklist of issues we review is here. Note that this screening sheet evolves as technologies evolve: two years ago, we would not have asked about compatibility with VMware/virtualization technologies. The list also expands to address issues arising from bad experiences, so we do not repeat them. (A sketch of how such a checklist might be represented appears after this list.)
4. Once an application is approved for suitability in our infrastructure environment, application teams then work collaboratively with our customers to:
a. Manage all vendor relationships including scripted demonstrations, contract negotiation, installation/training and life cycle maintenance of the application.
b. Manage integration with our existing applications. Typically this is done via our eGate messaging engine or via web services, since we have widely deployed a service oriented architecture.
c. Define service levels, a disaster recovery strategy, an escalation plan for issues, and division of labor for support of the application. Typically, IS manages the infrastructure and keeps the application running smoothly. Power users in the department document workflow and ensure the application's functionality is optimized to support that workflow.
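To make the screening step concrete, here is a minimal sketch of how an infrastructure checklist might be represented and scored. The questions, the mandatory flags, and the scoring logic are illustrative assumptions on my part, not our actual Change Control Board screening sheet:

```python
# Illustrative sketch of an infrastructure screening checklist.
# The questions and pass/fail logic are hypothetical examples,
# not the actual Change Control Board screening sheet.

CHECKLIST = [
    # (question, mandatory)
    ("Runs on a supported server OS?", True),
    ("Compatible with VMware/virtualization?", False),
    ("Integrates via HL7 or web services?", True),
    ("Vendor provides a disaster recovery plan?", True),
    ("Supports directory-based authentication?", False),
]

def screen_application(answers):
    """answers: dict mapping question -> bool (True = yes)."""
    failures = [q for q, mandatory in CHECKLIST
                if mandatory and not answers.get(q, False)]
    score = sum(1 for q, _ in CHECKLIST if answers.get(q, False))
    return (not failures), score, failures

if __name__ == "__main__":
    answers = {q: True for q, _ in CHECKLIST}
    answers["Compatible with VMware/virtualization?"] = False
    approved, score, failures = screen_application(answers)
    print(f"Approved: {approved}, score {score}/{len(CHECKLIST)}")
```

The value of such a structure is that any mandatory "no" stops the application cold, while the optional items simply inform the discussion.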
Over the past ten years, we've found that this collaborative approach with IT, rather than having each department select its own applications, ensures stability, maintainability and scalability. At this point, most departmental IT systems and staff at BIDMC have been replaced by services from the central IT organization. It's a win/win for everyone, since costs decrease, quality increases, and the frustration of choosing an application which does not work in our infrastructure has been eliminated.
Thursday, January 10, 2008
Do You Wonder Why Americans Are Fat?
I am happy to link you to the most amazing array of bad food choices you will ever see. Reading this is enough to scare anyone into healthy, low-calorie eating (I hope). Can you believe those calories? (hat tip to kevinmd)
Answer 2008 Quiz #1
The answer is #1 - Leukemia. Gingival (gum) infiltration in a patient with fever, fatigue, and weight loss is consistent with acute leukemia. This condition resolved after treatment.
Cool Technology of the Week
Over the past 3 months, I've been evaluating all the technologies needed for flexible work arrangements, including video teleconferencing. I've used numerous H.320/H.323 ISDN/IP videoteleconferencing applications and appliances. I've used audio/video chat through IM. I've used Apple's iChat with H.264 video conferencing.
iChat was really the only usable desktop technology and Polycom was the only usable appliance technology.
Tonight I had the opportunity to 'meet' with Marthin De Beer, Senior Vice President, Emerging Markets Technology Group at Cisco via their Telepresence Technology. It's the Cool Technology of the Week.
Telepresence creates an easy-to-use environment for teleconferencing with no latency, no pixelation, and perfect spatial audio quality. What does all that mean?
H.320/H.323 teleconferencing can require teams of engineers to get working. Every call is an adventure. With telepresence, it's just a Session Initiation Protocol (SIP) phone call. Just pick up your IP phone and dial. It's no harder than making a cell phone call. This makes it so easy to use that even a CEO can do teleconferencing without assistance.
In all my other teleconferencing experiences, there is a palpable delay between the time the speaker talks and the recipient hears their voice, making the conversation feel a bit like a walkie-talkie exchange. One person talks, then the other person talks. It's not a real-time, full duplex, interruptible conversation. Cisco Telepresence uses high performance dedicated hardware codecs, eliminating any perception of latency.
The picture is true 1080p - better than any HDTV broadcast. The codecs provide such efficient compression that only 3 megabits/second are required for this high resolution image. If the available bandwidth drops below 3 megabits/second, the image automatically shifts to 720p (HDTV resolution) without any visible degradation of image quality. All images are life-size, so your eye perceives the conference as truly in person.
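As a back-of-the-envelope check on that compression claim, assume 30 frames per second and 24-bit color (my assumptions, not Cisco's published specification):

```python
# Rough arithmetic on 1080p video compression (assumed 30 fps, 24-bit color).
width, height = 1920, 1080
fps = 30
bits_per_pixel = 24

raw_bps = width * height * bits_per_pixel * fps      # uncompressed bitrate
compressed_bps = 3_000_000                           # ~3 megabits/second

print(f"Raw 1080p stream:   {raw_bps / 1e9:.2f} Gbps")   # ~1.49 Gbps
print(f"Compressed stream:  {compressed_bps / 1e6:.0f} Mbps")
print(f"Compression ratio:  ~{raw_bps / compressed_bps:.0f}:1")  # ~500:1
```

Squeezing roughly 1.5 gigabits of raw video into 3 megabits, in real time, is why dedicated hardware codecs matter.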
The sound system is similar to a home theater, and all the sounds on the transmitting side are perfectly replicated on the receiving side. A person speaking on the right side of the room sounds like a person speaking on the right side of your room.
What about the price? The Cisco Boxborough location that I used for our conversation was outfitted with a $299,000 unit that creates a room-size telepresence experience. Marthin was using a $50,000 corporate unit at Cisco Headquarters in Santa Clara.
Although not formally announced, Cisco is working on a $15,000 home office unit that will function over standard broadband. They are also planning a consumer-level version that will cost less than $5,000. If I can achieve real-time, easy-to-use, perfectly clear teleconferencing on my home Sharp Aquos HDTV for under $5,000, I'd be able to meet with anyone, anytime, around the world.
Marthin and I discussed the technology, which is H.264, but we also discussed the policies and use cases for deploying it. Currently it's designed for meetings which bring people together without having to travel - across the world, country, or town. That makes everyone more productive.
We also discussed its potential use for large group presentations. This YouTube video shows the technology projected on a transparent film, giving a truly holographic experience. If a room were set up with this technology, I could avoid traveling to give speeches.
Finally, we discussed its use for flexible work arrangements and remote workers. Marthin's executive assistant works in Texas, but the desk outside Marthin's office is outfitted with an always-on unit showing her virtual presence. This means that Marthin and any Cisco staff member can walk to her desk and speak with her anytime. Marthin will soon have two offices, and she'll be in both simultaneously. I can imagine a virtual company, with virtual cubicles, staffed with virtually present employees.
As this technology becomes more affordable and more widely installed, it will empower telecommuting and eliminate a lot of discretionary travel. Sign me up!
Wednesday, January 9, 2008
Medical Quiz 2008 - #2
Here is your 2nd medical diagnosis challenge of 2008. This patient has fatigue, fever, anorexia, and weight loss, along with this appearance of her gums. What is the diagnosis? Click on the image for a better view:
1. Leukemia
2. Scurvy (Vitamin C deficiency)
3. Acquired Immunodeficiency Syndrome (HIV)
4. Sarcoidosis
5. Pellagra (Vitamin B3/niacin deficiency)
Check back tomorrow to see if you are correct.
When to Hire Consultants
I'm sure you've heard the consulting stereotypes.
"For a large sum, they will ask for your watch and tell you what time it is."
"They gather an immense amount of knowledge from the organization, create a splashy presentation summarizing what you already know, then leave the organization taking that knowledge with them to apply to other consulting engagements."
However, there are 3 specific circumstances in which I hire consultants.
1. As part of change management by having an external group validate the path chosen
Change is hard and sometimes politics in an organization are such that no internal stakeholder can champion the new idea. Bringing in consultants to publicly validate the idea can build transparency and break down barriers. It may sound strange to pay an external party to explain to an organization what it already knows, but sometimes it is necessary. Also, I've seen stakeholders in politically charged situations be more honest and open with external consultants than with their peers. Many staff seem to be happy to tell all to an external party, which can accelerate information gathering.
2. To extend the capacity of the organization for short term urgent work
I've recently been asked to significantly expand the services offered by IT. All of my existing staff are working at 120% on existing projects. Bringing in consultants for a very focused, short term engagement will enable my staff to focus on their deadlines while extra work gets done by consultants in parallel. A few caveats about doing this: consultants need to be managed carefully to ensure travel expenses are minimized and the time spent is tightly scoped toward a specific deliverable. Consultants take management time and staff time, so adding 1 consultant FTE comes at a cost of 0.5 internal FTE to manage and support the consultant. Also, the organization must buy into the consulting engagement. I've seen passive-aggressive behavior toward consultants, so stakeholders should ask for the consulting engagement rather than have one forced upon them.
3. As contractors that add new knowledge to the organization
In 2002, I had a serious network outage because there were aspects of network management "that I did not know that I did not know". We brought in experts in network infrastructure and applications (DNS/DHCP) design. These folks were more educators and contractors than consultants. We now have one of the most resilient networks in healthcare thanks to their education about best practices.
There are also reasons not to hire consultants:
1. Do not outsource your strategy to consultants
Although many talented consulting firms offer strategic planning, I've not seen business changing strategic plans come out of outsourcing strategy. Consultants can be helpful facilitators of strategic planning, organizing all the ideas of employees, customers and senior management, but the strategy should belong to the organization, not external consultants.
2. Do not hire consultants as operational line managers
Sometimes positions are hard to fill and consultants are brought in as temporary staff. This can work. However, hiring a consultant to manage permanent employees does not work. It generates a great deal of resentment from the existing employees and it's hard to sustain because everyone knows their manager is temporary. It's a bit like having a substitute teacher in school.
3. Do not allow consultants to hire consultants
Sometimes consultants are self-propagating. A tightly scoped engagement grows as consultants discover new work for other consultants to do. Keep the consulting engagement focused and move the work to permanently employed staff in the organization as soon as possible.
On rare occasions, I make myself available for one-day consulting engagements doing comprehensive IT audits of healthcare organizations. When I do this, I donate all fees to BIDMC or Harvard, not accepting any payment for my time. I create an overview analysis of the strategy, structure, and staffing of the IT organization as a guide for the existing management and staff. I hope these efforts follow my guidelines above - bringing external validation, extending capacity, and offering a new perspective.
"For a large sum, they will ask for your watch and tell you what time it is."
"They gather an immense amount of knowledge from the organization, create a splashy presentation summarizing what you already know, then leave the organization taking that knowledge with them to apply to other consulting engagements."
However, there are 3 specific circumstances in which I hire consultants.
1. As part of change management by having an external group validate the path chosen
Change is hard and sometimes politics in an organization are such that no internal stakeholder can champion the new idea. Bringing in consultants to publicly validate the idea can build transparency and break down barriers. It may sound strange to pay an external party to explain to an organization what it already knows, but sometimes it is necessary. Also, I've seen stakeholders in politically charged situations be more honest and open with external consultants than with their peers. Many staff seem to be happy to tell all to an external party, which can accelerate information gathering.
2. To extend the capacity of the organization for short term urgent work
I've recently been asked to significantly expand the services offered by IT. All of my existing staff are working at 120% on existing projects. Bringing in consultants for a very focused, short term engagement will enable my staff to focus on their deadlines while getting extra work done by consultants in parallel. A few caveats about doing this. Consultants need to be managed carefully to ensure travel expenses are minimized and the time spent is tightly scoped toward a specific deliverable. This means that consultants will take management time and staff time, so adding 1 consultant FTE comes at a cost of .5 FTE to manage and provide support for the consultant. Also, the organization must buy into the consulting engagement. I've seen passive/aggressive behavior toward consultants, so stakeholders should ask for the consulting engagement rather than have one forced upon them.
3. As contractors that add new knowledge to the organization
In 2002, I had a serious network outage because there aspects of network management "that I did not know that I did not know". We brought in experts in network infrastructure and applications (DNS/DHCP) design. These folks were more educators and contractors than consultants. We now have one of the most resilient networks in healthcare due to their education about best practices.
There are also reasons not to hire consultants
1. Do not outsource your strategy to consultants
Although many talented consulting firms offer strategic planning, I've not seen business changing strategic plans come out of outsourcing strategy. Consultants can be helpful facilitators of strategic planning, organizing all the ideas of employees, customers and senior management, but the strategy should belong to the organization, not external consultants.
2. Do not hire consultants as operational line managers
Sometimes positions are hard to fill and consultants are brought in as temporary staff. This can work. However, hiring a consultant to manage permanent employees does not work. It generates a great deal of resentment from the existing employees and it's hard to sustain because everyone knows their manager is temporary. It's a bit like having a substitute teacher in school.
3. Do not allow consultants to hire consultants
Sometimes consultants are self-propagating. A tightly scope engagement grows as consultants discover new work for other consultants to do. Keep the consulting engagement focused and move the work to the permanently employed staff in the organization as soon as possible.
On rare occasions, I make myself available for one day consulting engagements doing comprehensive IT audits of healthcare organizations. When I do this, I donate all fees to BIDMC or Harvard, not accepting any payment for my time. I create an overview analysis of the strategy, structure and staffing of the IT organization as a guide for the existing management and staff. I hope these efforts follow my guidelines above - bringing external validation, extending capacity, and offering new perspective.
Tuesday, January 8, 2008
National Healthcare Identifiers
I was recently asked to comment on the likelihood that a national healthcare identifier will be created for the United States, such as those already used in Canada, Norway, and the UK. Many people do not know that Congress has imposed a hold since 1998 on any funding to plan or implement a national health identifier, so the US Department of Health and Human Services cannot even discuss the issue. Here's the background:
August 1996
HIPAA enacted, "Sec. 1173(b) ... The Secretary [of HHS] shall adopt standards providing for a standard unique health identifier for each individual, employer, health plan, and health care provider for use in the health care system." Health Insurance Portability and Accountability Act of 1996 (HIPAA) (42 U.S.C. 1320d-2(b)).
February 1998
Due to controversy and a lack of consensus, the National Committee on Vital and Health Statistics (NCVHS) issues a recommendation that the Secretary delay selection and implementation of the unique health identifier, and recommends publication of a Notice of Intent (NOI) in the Federal Register with a 60-day comment period to solicit input from the public.
July 2, 1998
NCVHS publishes background paper, “Unique Health Identifier for Individuals, A White Paper” discussing options for identifying individuals and associated implications.
July 20-21,1998
NCVHS Subcommittee on Standards and Security holds in Chicago what was to be the first in a series of regional public hearings on the Unique Health Identifier for Individuals. Due to public reaction, further hearings, as well as the planned publication of an NOI by HHS, are canceled.
July 31, 1998
Vice President Gore announces that the Clinton administration will block implementation of unique health identifiers until comprehensive privacy protections are in place.
October 1998
Congressional hold on provision in FY1999 HHS budget: "Sec. 514. None of the funds made available in this Act [the HHS appropriation act for the next fiscal year] may be used to promulgate or adopt any final standard under section 1173(b) of the Social Security Act (42 U.S.C. 1320d-2(b)) providing for, or providing for the assignment of, a unique health identifier for an individual (except in an individual's capacity as an employer or a health care provider), until legislation is enacted specifically approving the standard." - Public Law 106-554 (114 Stat. 2763).
2000 – 2007
Annual Congressional hold on the provision in the HHS budget, such as Sec. 511 of H.R. 3010.ENR, 109th Congress (2006), the Departments of Labor, Health and Human Services, and Education, and Related Agencies Appropriations Act, 2006.
My opinion is that a compulsory national health identifier is unlikely but that personal health records may catalyze the development of a voluntary health identifier used to facilitate continuity of care.
Currently, each healthcare provider uses a different medical record numbering scheme, making unification of records from inpatient, ambulatory, lab, pharmacy, and payer systems a true informatics challenge. At CareGroup we use a statistical, probabilistic algorithm from Initiate.com that incorporates name, gender, date of birth, zip code, and other demographics to link multiple medical records together into a virtual patient record. This works great for John D. Halamka, male, 05/23/1962, 02481, but not perfectly for John Smith, Boston. The Markle Foundation's Connecting for Health Record Linking report is a great backgrounder on this approach.
A voluntary patient identifier, assigned purely with patient consent, would add another element to the matching algorithm and significantly increase confidence when linking records for patients with common names.
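To illustrate the general idea (a toy sketch, not Initiate's actual algorithm), imagine scoring agreement between two records with per-field weights, where a consented voluntary identifier carries far more weight than any single demographic. The fields, weights, and threshold below are invented for illustration:

```python
# Toy illustration of probabilistic record linkage on demographics.
# The fields, weights, and threshold are invented for illustration;
# this is not Initiate's actual algorithm, just the general idea.

WEIGHTS = {
    "name": 3.0,
    "gender": 1.0,
    "dob": 4.0,
    "zip": 2.0,
    "voluntary_id": 10.0,  # a consented identifier would dominate the score
}
THRESHOLD = 7.0  # assumed cutoff for declaring two records linked

def match_score(rec_a, rec_b):
    """Sum the weights of fields that are present in both records and agree."""
    return sum(w for f, w in WEIGHTS.items()
               if rec_a.get(f) and rec_a.get(f) == rec_b.get(f))

a = {"name": "john d. halamka", "gender": "M", "dob": "1962-05-23", "zip": "02481"}
b = {"name": "john d. halamka", "gender": "M", "dob": "1962-05-23", "zip": "02481"}
c = {"name": "john smith", "gender": "M"}

print(match_score(a, b) >= THRESHOLD)  # True  - strong demographic agreement
print(match_score(a, c) >= THRESHOLD)  # False - common name, sparse data
```

The point of the sketch is that a sparse record for a common name never clears the bar on demographics alone, while one shared voluntary identifier would.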
The benefit to the patient is clear. With new personal health record products like Microsoft HealthVault, and the rumored Google Health offerings, patients would be able to link more accurately to their data at all sites of care, and then act as stewards of their lifetime records. Since the voluntary identifier would be completely patient-consented and controlled, only those patients wanting one would opt in, ensuring personal privacy preferences are respected.
How long will this take? In 2008, many doctors will start using electronic health records, which will provide enough clinical data to make personal health records more valuable. In 2009, personal health records will become much more popular but will require manual linking of patient identifiers, with patients establishing accounts to access their data at each healthcare provider. I predict that by 2010 a voluntary health identifier will be considered and implemented by some vendors and institutions. Over the next decade, if patients gain confidence in the security of a healthcare information exchange system they control, it is conceivable that Congress will revisit its ban on a secure national identifier for healthcare. Until then, a national identifier is not a prerequisite to getting started with personal health records, and I will fully enable any patient to retrieve their health records from BIDMC, with their consent, via the new generation of standards-based vendor, employer-sponsored, and payer-based personal health records using manual linking methods.
Monday, January 7, 2008
Doctors and Sexual Misconduct
The Ventura County Star reports some alarming statistics from the California Medical Board about prosecution of doctors who engage in sexual misconduct with patients. They looked at reports over the past 6 years and found that only 1 in 7 cases had any action taken at all. Of 680 reports of bad sexual behavior, only 123 doctors received any sanction. As a physician, I pay $805 as a licensing fee to
Knowledge Navigators Combat Information Overload
One of my greatest challenges in 2008 is information overload. 700 emails a day on my Blackberry, RSS feeds, Facebook, Instant Messaging, LinkedIn, MySpace, and Second Life all create a pummeling amount of data.
With email and other information sources escalating year to year, we all need a knowledge navigator to help us sort through all this data and ensure that we triage our incoming information flows into that which is important and that which is just FYI. I've started to build a team of navigators beginning with my medical librarians.
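As a toy illustration of what such triage might look like if reduced to software rules (the senders and keywords below are invented examples, and real triage by a knowledge navigator is far richer):

```python
# Minimal sketch of rule-based message triage (illustrative only).
# The priority senders and urgent keywords are invented examples.

PRIORITY_SENDERS = {"ceo@example.org", "cio-oncall@example.org"}  # assumed
URGENT_KEYWORDS = ("outage", "down", "security", "deadline")      # assumed

def triage(sender, subject):
    """Classify a message as 'important' or 'fyi'."""
    if sender.lower() in PRIORITY_SENDERS:
        return "important"
    if any(word in subject.lower() for word in URGENT_KEYWORDS):
        return "important"
    return "fyi"

print(triage("ceo@example.org", "Budget review"))         # important
print(triage("newsletter@example.org", "Weekly digest"))  # fyi
```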
In my CIO role at Beth Israel Deaconess, I oversee the medical libraries. In the past, libraries were "clean, well-lighted places for books". With the advent of Web 2.0 collaboration tools, blogging, content management portals, lulu.com on-demand publishing, and digital journals, it is clear that libraries of paper books are becoming less relevant. By the time a book is printed, the knowledge inside may be outdated. Instead, libraries need to become an information commons, a clean, well-lighted lounge for digital media staffed by expert knowledge navigators. In my institution, the librarians have thinned the book collection, migrated paper journals to digital media, and spent their time indexing digital knowledge resources to support our search engine optimization efforts.
The end result is that the Medical Library has been renamed the Information Commons and the Department of Medical Libraries has been retitled the Department of Knowledge Services. Librarians are now called Information Specialists.
Here are a few examples of how they turn data into knowledge:
Every day, Harvard faculty generate numerous new presentations for students, residents, and the medical community. Since all Harvard courseware is web-enabled, all these presentations are placed online. It's not enough to free-text index these materials, because search engines, even Google, are only helpful for exact matches of text, not for searching concepts. Our Knowledge Services staff apply metadata tags using the National Library of Medicine Medical Subject Heading (MeSH) concept hierarchy to these presentations. For example, a presentation may be about neurons, which are part of the brain, but the word brain may not appear anywhere in the text. When a user searches for articles about the brain, any presentation tagged with a part of the brain is automatically included.
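The general technique can be sketched in a few lines: tag documents with hierarchical concept codes, then match a query against entire subtrees rather than literal words. The tree codes and file names below are made up for illustration; real MeSH tree numbers differ:

```python
# Sketch of concept-hierarchy search in the style of MeSH tree numbers.
# The codes and documents below are hypothetical examples.

DOCS = {
    "neuron-lecture.ppt": ["A01.100.200"],  # tagged "Neurons" (hypothetical code)
    "knee-pain.ppt":      ["B02.300"],      # tagged with an orthopedic concept
}

def matches_concept(doc_codes, query_code):
    """In a tree-numbered hierarchy, a child code begins with its parent's code."""
    return any(code.startswith(query_code) for code in doc_codes)

brain = "A01.100"  # hypothetical tree code for "Brain"
hits = [doc for doc, codes in DOCS.items() if matches_concept(codes, brain)]
print(hits)  # ['neuron-lecture.ppt'] - found without the word "brain" in the text
```

A prefix match against the tree number is all it takes to retrieve every document tagged anywhere under the queried concept.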
Our web portals contain hundreds of links to journals, books, databases, and collections of medical references. The challenge with any search engine is that it ranks pages based on popularity, not necessarily authoritativeness or value to the patient. Just because a certain diet is popular does not mean it is medically sound. Our librarians ensure our links and resources are dynamically updated and refer to the most credible resources, not the most popular.
Every day I receive advertisements for new web-based and mobile knowledge resources. Our Department of Knowledge Services is a laboratory for testing these products, and we deploy those which are most relevant to our stakeholders. One of their recent projects involved acronym-resolving tools and the development of quantifiable standards for abbreviations.
Although we keep 5000 journals online, we also have access to many pre-digital resources. Our Knowledge Services folks respond to requests for complex historical literature searches with desktop PDF delivery of scanned articles.
Finally, our information commons, formerly the medical library, is now an array of desktop computers, printers, wireless access points, scanners, and staff to assist users with the technology.
Ideally, we'll all have software agents in the future that turn data into information, information into knowledge, and knowledge into wisdom, but the first step has been building a Department of Knowledge Services within my institution, staffed with knowledge navigators. Because of them, I'm optimistic that in 2008 I'll receive more wisdom, not just more data.
Saturday, January 5, 2008
Childbirth Classes - Not Too Popular
I remember the "60's" and one of the off shoots from the women's movement was women getting in touch with their bodies and reproduction. It is hard to imagine now, when Brittany is flashing her bare crotch to photographers and soft porn is on family time TV, but back then women actually came together in "feminist" groups to learn about their bodies. An entire industry of prenatal "
Friday, January 4, 2008
Clinical Systems Midyear Update
Every year, I work with my governance committees to create the Information Systems Operating Plan. Each January, I give an update on our progress. I thought my blog readers would enjoy insight into the details of our clinical systems work.
Inpatient
- We will go live with statewide clinical summary exchange through the Massachusetts Regional Health Information Organization, called MA-Share, in early February 2008. Inpatient discharge documents and ED discharge documents will be electronically sent to providers using the HITSP Continuity of Care Document format. (A minimal sketch of a CCD shell appears after this list.)
- New influenza vaccine prompting and tracking features went live in Provider Order Entry
- Chemotherapy ordering for the inpatient setting will integrate with inpatient Provider Order Entry and our outpatient Oncology Management system. It will go live in late March 2008, completing our effort to eliminate handwriting in chemotherapy treatment at every site of care.
- Provider Order Entry for the NICU - Planning meetings held with the NICU to discuss resources needed for the work. Formal project kick-off planned for early March 2008, after the iMDsoft MetaVision ICU documentation pilot is completed. Once completed, we will eliminate the last handwritten orders in any site of care in the institution.
- Inpatient History and Physical with medication reconciliation - Initial meeting of the Clinician Advisory Group held in December, with much interest expressed in this project. Formal kick-off planned for late January/early February 2008. This project will eliminate paper histories and physicals while also supporting medication reconciliation at the point of patient arrival.
- Scanning of inpatient paper records with a web viewer on track for go live in early 2008. This will enable our medical record coders to work anywhere in the world, expanding the pool of talented people we can hire. It also gives clinicians easy access to any historical paper records.
- Completed all requested order sets for clinical pathways in Provider Order Entry.
- Completed dashboard to support inpatient physician rounding.
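For the curious, here is a minimal sketch of the outermost shell of a Continuity of Care Document, built with Python's standard library. The identifiers and values are placeholders of my own invention, and a conformant HITSP document requires far more content than shown here:

```python
# Minimal sketch of a Continuity of Care Document (CCD) shell.
# Identifiers and content are placeholders; a conformant HITSP
# document requires many more templates and sections.
import xml.etree.ElementTree as ET

NS = "urn:hl7-org:v3"
ET.register_namespace("", NS)

doc = ET.Element(f"{{{NS}}}ClinicalDocument")
tmpl = ET.SubElement(doc, f"{{{NS}}}templateId")
tmpl.set("root", "2.16.840.1.113883.10.20.1")  # the CCD templateId
ET.SubElement(doc, f"{{{NS}}}id").set("root", "1.2.3.4")  # placeholder OID
title = ET.SubElement(doc, f"{{{NS}}}title")
title.text = "Discharge Summary (Continuity of Care Document)"
ET.SubElement(doc, f"{{{NS}}}effectiveTime").set("value", "20080201")

print(ET.tostring(doc, encoding="unicode"))
```

The appeal of the format is exactly this: a single XML vocabulary that any receiving provider's system can parse, regardless of which vendor produced it.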
Outpatient
- Completed numerous medication list enhancements to make outpatient medication reconciliation more efficient, and to foster better communication/workflow among the multiple providers who often care for patients.
- Completed a one-time "autoretire" of old medications to help remove inactive drugs from medication lists.
- Numerous ePrescribing enhancements have gone live: eligibility, formulary, alternate drug recommendations, mail order, and community medication history. Further piloting and roll out in coming months.
- Automated Results notifications and sign off – Additional specialty divisions have begun using the system (Pulmonary, Orthopedics, GI, Dermatology, Rheumatology). New physician coverage features have been developed and are being piloted. The pathology orders interface was enhanced to improve the accuracy of results routing.
- Ordering for BID Needham (our community hospital) studies is now live in webOMR. Areas that are included are lab, radiology, cardiology, EEG, pulmonary and sleep.
- Work on outpatient scanning with viewing in webOMR will begin after inpatient scanning goes live in early 2008. Projects will be prioritized by the webOMR Users Group with a goal of facilitating standardization of webOMR documentation at BIDMC. First project will be Dermatology outpatient progress notes.
Radiology Information System
- RIS enhancements – An initial set of requirements for a web-based portal that focuses on integration with other clinical systems is currently being defined.
- Preliminary reports and wet reads - The Medical Executive Committee mandated changes to remove preliminary (unsigned) reports from clinical viewers and provide a workflow to request wet reads. Work is in progress and will be completed in February 2008.
Work in progress to create Pre-Anesthesia Testing documentation (medication reconciliation, nursing assessment form) and web-based versions of the CCC PACU log that are integrated with OR Scheduling. On track for late winter/early spring go live.
Decision Support
A subgroup of the Decision Support Steering Committee has been formed to oversee development of ambulatory quality reporting, with the SVP of Healthcare Quality as chair. First priority is to develop diabetes management reports based on the diabetes registry.
Electronic Health Records for non-owned physicians
Work continues on implementation of a hosted ASP model for BIDPO eClinical Works. Final design completed. Beginning work on Model Office setup.
Positive Patient ID
Rollout of bar-coded wristbands to the remaining (23) inpatient units is scheduled for completion by February 2008. Once done, all inpatient units will have the ability to print new or replacement bands on demand at the time the patient presents.
Critical Care Documentation System
Testing complete; planning in progress for a late January / early February Pilot in one adult ICU and the NICU; expected roll-out to all ICUs to follow.
OB-TV Fetal Monitoring Surveillance System
Emergency Department – Rev E. All testing completed in December 07; go-live planned in February 08 (the ED pushed go-live out due to other commitments).
OB - Upgrade to Rev E. Funding approved in November 07; contract discussions beginning.
Anesthesia Information System (AIMS)
Major upgrade to Rev E is in early planning stages; timeline TBD. This will allow the AIMS to begin receiving lab data.
NTRACS Trauma Database
Successful December go-live of the NTRACS Performance Manager upload. 2008 work will include a version upgrade, implementation of a report module, and a modified National Trauma Data Standard (NTDS) dictionary.
New Laboratory Information System Implementation
- Soft Computer module build nearing completion
- Work with the lab managers in progress to develop approach for managing security, patient & management reporting, specimen tracking, etc.
- A solution for handling organizational and workflow issues around creation of fiscal numbers in the ambulatory, outreach, and community sites is under discussion
- Key lab vacancy (LIS Manager) filled; two additional lab FTEs for the project just approved
- A software upgrade (Lab/Path) which includes contracted customizations is scheduled for installation in early January
- Work is progressing on the clinical viewer with the expectation that all requested changes will be completed by spring 08
- Integration work with POE and ADT is moving forward
- Validation test scripts in development
- Validation testing is targeted to begin in late spring/early summer 08
Participating in discussions to explore the potential to leverage Radiology PACS to meet growing imaging and storage needs in other departments (e.g. Cardiology, GI).
Rad Onc Infrastructure Improvements
Upgrades to Oracle databases completed; move of RadOnc and Cyberknife databases to a more secure environment recently completed. Rollout of Portal Vision workstations nearing completion.
Cardiology
Participating in planning for upcoming consulting engagements to assess and develop a strategy for CardioVascular Institute (CVI) related needs at both BIDMC and CVI's growing number of remote locations. Clinical Systems staff are participating in the assessment of EMR and imaging requirements at CVI remote locations.
Continue to support various Apollo efforts including:
- Implementation of new Vascular Surgery module
- Development and testing of new Cardiac Cath Registry export
- Planning and implementation of a Physical Inventory module
- Assisting with development of STS reports
Participating in planning discussions for MacLab 6.5.3 upgrade
Other
- Adverse Events Tracking – Numerous enhancements based on user feedback were included in a major version release.
- Transfer Log – New features to allow MD comments and to allow log data to be viewed even after patients have been admitted.
- New ADT feed to support the rollout of new lab and coding systems went live.
- Billing system interface for ED visits enhanced to improve accuracy and reduce the need for manual billing interventions.