Public Impact Fundamentals

The Electronic Health Records System in the UK

In 2002, the UK government launched the development of the NHS Care Records Service, part of the National Programme for Information Technology (NPfIT), which was intended to deliver an electronic health records system containing patient records from across the UK. The programme suffered from poor user requirements analysis, a failure to address patient confidentiality, overambitious timescales, and enormous cost overruns. It was eventually closed down in 2011 as part of the dismantling of the NPfIT programme.

The initiative

The UK government chose a top-down, government-driven approach: a “nationwide implementation of EHRs, known as the NHS Care Records Service, is the cornerstone of the £12.7 billion National Programme for Information Technology (NPfIT)”.[2] This very ambitious approach required enormous resources. “In order to begin the process of creating a system that supports health IT for the entire nation, the UK with the help of four companies including the US-based company, the Computer Sciences Corporation (CSC), began one of the largest and ambitious health IT projects that the world had ever seen in early 2002… NPfIT attempted to create a national EHR system for the entire UK. It was a project that would eliminate the challenges of interoperability between various competitive EHR systems around the UK.”[3]

The initiative came to an end in 2011. “Launched in 2002 and officially dismantled in 2011, NPfIT included the first sustained national attempt to introduce centrally-procured EHR systems across the NHS’s hospitals, including mental health settings.”[4]

The challenge

At the turn of the century, the health records of UK citizens were often held locally, with the result that there was no coordinated system. “Although electronic health records (EHRs) are widely viewed as central to modernising the organisation and delivery of sustainable, high quality healthcare, the uptake of such records in hospital has tended to be slow. Approaches to deployment of EHRs vary from home-grown systems in single organisations with the necessary technical and managerial capacity; to interoperability standards for linking multiple information technology (IT) systems; to top-down, government driven, national implementations of standardised, commercial software applications.”[1]

The public impact

The public impact has been largely negative. “What is true is that many hospitals lack comprehensive electronic patient record (EPR) systems. In England, this can be blamed on the disastrous £12 billion NPfIT. In The blunders of our governments, a survey of government disasters including numerous IT projects, authors Anthony King and Ivor Crewe describe NPfIT as ‘the veritable RMS Titanic of IT disasters’ and ‘doomed from the beginning’.”[5]

It also dented the reputations of the project suppliers. “The UK lost £12.7 billion trying to complete the project. They sued CSC and the company was forced to pay the NHS USD97.5 million.”[6]

MPs expressed frustration that “the funding scheme that supported other British hospitals’ investment in EHRs, the controversial £9.8 billion NPfIT, cancelled in 2011, continues to cost the country millions, with contracts in some cases not set to expire for 12 more years”.[7]


Stakeholder engagement

The main stakeholders were the UK government, principally the Department of Health, the NHS (its employees and patients), and Connecting for Health (CFH), the agency responsible for the design, development and implementation of NPfIT. During the design phase there were also private sector stakeholders such as CSC and Accenture. There has been criticism that the views of the programme’s prospective end users were not addressed (see Measurement below). “A criticism of the programme has been that it has not reflected the needs of the NHS. The number of stakeholders involved is vast and contains many categories, including clinician, managerial, technical, informatics, and professional bodies. The management of and engagement with stakeholders appears to have been less than systematic and rigorous during the life of the programme.”[8]

There were other stakeholders in EHR planning and development in England. “In England, EHR planning is managed by NHS England, the National Information Board (NIB, which develops priorities for data and technology for the Department of Health) and the Health and Social Care Information Centre (HSCIC, a non-departmental public body which manages information, data and IT systems for health and care).”[9]

Once implementation was under way, there was little transparency or accountability among the stakeholders working to deliver the programme. “The computer software was secret and proprietary. There was no accountability to the public, and the vendors did not provide enough technical support to clinicians having trouble using the records.”[10]

Political commitment

There was initially very strong commitment from the Blair government, reflecting its support for NHS reform: “most unusually for an IT project, the prime minister of the day was centrally involved, at least at the outset”.[11] In February 2002, Tony Blair chaired a Downing Street seminar at which “the Department of Health announced a massive overhaul and expansion of the health service’s IT infrastructure”.[12] He was supported in his enthusiasm by his government: “ministers were committed to the programme and publicly lauded it”.[13]

This commitment was publicly maintained, despite the media’s criticism of NPfIT’s progress. “Nevertheless, against all the accumulating evidence, most of the five secretaries of state for health who served under Tony Blair and then Gordon Brown during the lifetime of the project… continued to insist that the programme as a whole was worthwhile and that, if it was not already proving a success, it soon would be.”[14]

The coalition government that came to power in 2010 was not committed to the NPfIT NHS Care Records Service. By 2011, it had become clear that the system was not going to be implemented throughout the NHS. A second report by the House of Commons Public Accounts Committee was no more complimentary than the first, which had been published in 2007. “The Rt Hon Margaret Hodge MP, Chair of the Committee of Public Accounts, today said: ‘The Department of Health is not going to achieve its original aim of a fully integrated care records system across the NHS. Trying to create a one-size-fits-all system in the NHS was a massive risk and has proven to be unworkable. The Department has been unable to demonstrate what benefits have been delivered from the £2.7 billion spent on the project so far.’”[15] By this time the political commitment had evaporated, and NPfIT and its EHR system no longer had government support.

Public confidence

There is general public support for prospective increases in efficiency and information-sharing that EHRs can offer. “Despite previous difficulties with NHS technology projects, patients and the public generally support the development of integrated EHRs for healthcare provision, planning and policy, and health research. This support, however, varies between social groups and is not unqualified; relevant safeguards must be in place and patients should be guided in their decision-making process, including increased awareness about the benefits of EHRs for secondary uses.”[16]

There are, however, concerns among both NHS doctors and patients about patient privacy and other issues relating to EHRs:

  • “Surveys have shown that patients are concerned about the security of their EHRs, but recognise the value of sharing data, both for their own care and for research. In a 2015 survey of 2,761 patients in London, 79% reported that they worry about the security of an EHR, but 55% of those nonetheless supported their development…
  • “A recent survey of 502 doctors found that half believed that the use of healthcare IT had decreased time spent with patients. Staff to transcribe patients’ medical records can be used to free doctors’ time, but this may impede the development of electronic records, as clinicians disengage with using and improving them…
  • “There are conflicting views about how much information patients should have access to. Surveys show that while three-quarters of adults think they should have full access to their health records, only one-third of doctors share this view.”[17]

Clarity of objectives

The main objective of the policy was clearly defined at the outset: a nationwide implementation of EHRs (see The initiative above). The subsidiary objectives were those set out by the Department of Health in 1998. “The roots of NPfIT lie in the Department of Health’s (1998) Information for Health strategy. This committed the NHS to goals which remain central to NPfIT:

  • “Lifelong electronic patient records (EPRs), also known as electronic health records (EHRs), that bring together birth-to-death data on NHS patients throughout England;
  • “Round-the-clock online access by all NHS healthcare professionals to patient records and information about best clinical practice;
  • “Seamless care for patients through GPs, hospitals and community services sharing information across the NHS information highway.”[18]

Strength of evidence

A number of EHR systems had already been developed in other countries. “EHRs are being introduced in Europe, North America, Australasia, the Middle East, and elsewhere.”[19] However, none was on the scale envisaged for NPfIT, which was on a par with the largest US military IT projects rather than any civilian project.

Pilot projects were used to test the principles of NPfIT, through the “implementation and adoption of EHR systems in NHS ‘early adopter’ hospitals”.[20] However, these pilots were often unsuccessful. “Where the systems were installed, they frequently crashed… In September 2006 ‘Computer Weekly’ reported 110 ‘major incidents’ across the English NHS during the previous four months alone.”[21]

Feasibility

CFH claimed that NPfIT would be “the world’s biggest civil IT programme”, which in itself indicated the need for exhaustive feasibility analysis. However, NPfIT’s technical feasibility was not properly investigated before the programme began, nor was there a proper cost-benefit analysis. “It was wildly overambitious. It was far from being essential. No one ever seems to have subjected it to a serious – or even a back-of-the-envelope – cost-benefit analysis. The programme’s alleged benefits, even if they accrued, were going to be outweighed many times over by its exorbitant costs.”[22] Little account was taken of the complexity of the NHS itself or of the needs of the software’s end users.

It had become clear by 2006 that NPfIT was out of control. In his evidence to the House of Commons Public Accounts Committee, Simon Bowers of The Guardian said that his research had revealed a number of problems, as his subsequent article indicated. “Leading healthcare IT experts have warned that the NHS’s troubled £6.2 billion system upgrade is costing taxpayers substantially more than it should. They claim the same functions could be delivered for considerably less outside of the national programme for IT, dogged by delays and software setbacks.”[23] There were similar criticisms in publications such as ‘Computer Weekly’, ‘The Daily Telegraph’ and ‘Private Eye’. The Committee’s report, based on substantial evidence, was damning. Even so, the government took too long to react and to accept that the programme was undeliverable.


Management

The initial strategy was that “national standards for NHS IT would be laid down but that local NHS trusts would be left free to commission their own suppliers and choose their own software”.[24] However, the programme became a centralised one, managed by CFH. Based in Harrogate in Yorkshire, CFH was responsible for estimating and procurement and for selecting contractors such as Accenture and CSC, with little apparent oversight from the government.

The estimates of time and budget were unrealistic. “The timescale proposed for the system was ludicrously short… no one seems to have addressed the crucial, and predictable, issues of patient confidentiality… large parts of the enterprise were hopelessly mismanaged… The rate of turnover of officials at the Health Department was [high]… the project had no fewer than six ‘senior responsible owners’ during its first two years.”[25]

The requirements gathering also indicated a failure of project management. “Because non-clinicians developed the system, the electronic forms they designed have little to do with how doctors treat patients – making it unworkable for many physicians. As the Chair of the British House of Commons Public Accounts Committee recently stated, ‘This is the biggest IT project in the world and it is turning into the biggest disaster’.”[26] The head of the programme, Richard Granger, an IT consultant from Deloitte, resigned in 2007 once it became apparent beyond the confines of CFH that NPfIT was in crisis.

Measurement

Evaluating the success of an IT project depends on the accurate definition of the software requirements and on a realistic development schedule and budget. The requirements of the NPfIT NHS Care Records system were not specified in a professional manner, as was evident in the questioning of the House of Commons Public Accounts Committee:

  • “Q188 Mr Bacon (B): If the clinicians were not really controlling the creation of the specification for the healthcare record, who was? Dr Nowlan (N): A design authority was established.
  • “Q189 B: Was this within the NPfIT? N: Yes; at the end of 2002.
  • “Q190 B: What experience did the design authority have of healthcare? N: In terms of the people who took charge of it, none to speak of.
  • “Q191 B: None? No experience of healthcare at all? N: No, not that I can recall. We worked within that team to produce the specification but it was done at breakneck speed and largely by putting together information from a whole raft of previous specifications and then it had to be reduced. I must say it was not exactly the ideal process to commit this sort of resource.”[27]

The costs and timescales of NPfIT were allowed to spiral out of control. “Estimates of the total cost of the programme over a decade edged upwards from £2.3 billion at one point to £6.2 billion and then to £12.4 billion… The system was supposed to be up and running in about three years. While 155 administration systems were supposed to be delivered by the spring of 2007, in the event only sixteen were.”[28]

The performance of the software did not meet its requirements, as shown by the experience of the Royal Berkshire Foundation Trust, an NHS trust that took delivery of NPfIT software and implemented the Cerner Millennium EHR system. “The software originally was envisioned as the ideal way to match information on patients to the right surgeons, beds and treatment appointment times, as well as helping staff retrieve and share patient details efficiently. But problems and delays with the system, aggravated by disputes over the hosting of the software with services firm CSC, meant that even after a delayed go-live of June last year, staff and patients alike are disenchanted with it. The implementation is ‘plagued with problems’, with it taking up to 15 minutes for staff to navigate their way through multiple screens to book routine appointments, causing severe patient backlogs.”[29]

Alignment

There was a lack of alignment between the Department of Health and CFH, the agency responsible for the national delivery of NPfIT. “Criticisms included weak management and oversight of the programme and contracts that were poor value for money. The government, NHS England and NIB argue that the programme was too centralised, and insufficiently sensitive to local circumstances.”[30] There was also a lack of alignment between the prospective end users of the software, such as clinicians, and the software designers and developers (see, for example, Measurement above).
