Does hospital IT need airline-style certification?

Earlier this month the pilots of a Boeing 787 “Dreamliner” carrying 249 passengers aborted a landing at Okayama airport in Japan when the wheels failed to deploy automatically. The pilots circled and deployed the landing gear manually.

About a year ago pilots of an Airbus A380, the world’s largest passenger plane, made an emergency landing at Singapore on landing gear that they deployed using gravity, the so-called “gravity drop emergency extension system”. In both emergencies the contingencies worked. The planes landed safely and nobody was hurt.

Five years earlier, during tests, part of the landing gear on a pre-operational A380 initially failed to drop down using gravity.

The Teflon solution

The problem was solved, thanks in part to the use of Teflon paint. Eventually the A380 was certified to carry 853 passengers.

Those who fly sometimes owe their lives to the proven and certified backup arrangements on civil aircraft. Compare this safety culture to the improvised and sometimes improvident way some health IT systems are tested and deployed.

Hospital boards routinely order the installation of information systems without proven backup arrangements or certification. Absent in health IT are the mandatory standards that underpin air safety.

When an airliner crashes investigators launch a formal inquiry and publish their findings. Improvements usually follow, if they haven’t already, which is one reason flying is so safe today.

Shutters come down when health IT fails 

When health IT implementations go wrong, the effect on patients is unknown. Barts and The London NHS Trust, the Royal Free Hampstead, the Nuffield Orthopaedic Centre, Barnet and Chase Farm Hospitals NHS Trust and other trusts have had failed go-lives of NPfIT patient administration systems. They have not published reports on the consequences of the incidents, and have no statutory duty to do so.

Instead of improvements triggered by a public report there may, in health IT, be an instinctive and systemic cover-up, which is within the law. Why would hospitals own up to the seriousness of any incidents brought about by IT-related confusion or chaos? And, on the advice of their lawyers, suppliers are unlikely to own up to weaknesses in their systems after pervasive problems.

Supplier “hold harmless” clauses

Indeed a “hold harmless” clause is said to be common in contracts between electronic health record companies and healthcare provider organisations. The clause shifts liability to the users of EHRs: it is the users who are liable for adverse patient consequences that result from the use of clinical software, even if the software contains errors.

That said, the supplier’s software will have been configured locally; and it’s those modifications that might have caused or contributed to incidents.

Done well, health IT implementations can improve the care and safety of patients. But after the go-live of a patient administration system Barts and The London NHS Trust lost track of thousands of patient appointments and had no idea how many were in breach of the 18-week limit for treatment after being referred by a GP. There were also delays in appointments for cancer checks.

At the Royal Free Hampstead staff found they had to cope with system crashes, delays in booking patient appointments, data missing in records and extra costs.

And an independent study of the Summary Care Records scheme by Trisha Greenhalgh and her team found that electronic records can omit allergies and potentially dangerous reactions to certain combinations of drugs. Her report also found that the SCR database:

-  Omitted some medications

-  Listed ‘current’ medication the patient was not taking

-  Included allergies or adverse reactions which the patient probably did not have

Electronic health records can also record wrong dosages of drugs, or the wrong drugs, or fail to provide an alert when clinical staff have come to wrongly rely on such an alert.

A study in 2005 found that Computerized Physician Order Entry systems, which were widely viewed as a way of reducing prescribing errors, could lead to double the correct doses being prescribed.

One problem of health IT in hospitals is that computer systems are introduced alongside paper, where neither one nor the other is a single source of truth. This could cause mistakes analogous to the ones made in early air crashes, which were caused not by technology alone but by pilots not fully understanding how the systems worked and not recognising the signs and effects of systems failing to work as intended.

In air crashes the lessons are learned the hard way. In health IT the lessons from failed implementations will be learned by committed professionals. But what happens when a hospital boss is overly ambitious, is bowled over by unproven technology and is cajoled into a premature go-live?

In 2011, indeed in the past few months, headlines in the trade press have continued to flow when a hospital’s patient information system goes live, or when a trust reaches a critical mass of Summary Care Record uploads of patient records (although some of the SCR records may or may not be accurate, and may or may not be correctly updated).

What we won’t see are headlines on any serious or even tragic consequences of the implementations. A BBC File on 4 documentary this month explained how hospital mistakes are unlikely to be exposed by coroners or inquests.

So hospital board chief executives can order new and large-scale IT systems without the fear of any tragic failure of those implementations being exposed, investigated and publicly reported on. The risk lies with the patient. Certification and regulation of health IT systems would reduce that risk.

Should health IT systems be tested as well as the A380’s landing gear? - the A380 undercarriage tests in detail (see near the end of the article)