Analysis

Walk into any hospital and ask the same question – “Which security system should we invest in?” – of both a doctor and a board member, and you may get different answers. The doctor chooses the system that leads to the most positive patient outcomes, while the board member chooses whichever solution is best for their increasingly stretched budget.
That’s the opinion of NHS leaders right now, and therein lies one of the fundamental issues in building cyber resilience in the UK’s National Health Service (NHS). It’s a difficult fact to swallow, but the key industry decision-makers aren’t making the most cyber-secure choices, and there are few incentives for them to do so.
The Register recently attended a roundtable discussion with senior NHS IT and security folk, and, contrary to popular belief, the healthcare security crisis can't be solved simply by throwing more money at the problem.
It might help some hospitals to an extent, but the very idea that a fat cash injection could be a silver bullet for healthcare security was immediately met with unanimous headshakes from those in attendance. It’s just not that simple.
The message was clear, however: the country’s NHS – beloved by all, revered around the world, and one of the few shining jewels in the UK’s otherwise splotchy crown – has a severe culture problem, at least when it comes to security. Go back to that first question and you get a sense of what we mean.
When clinical decisions aren’t made with the cybersecurity implications in mind, the same attitude trickles down through the wider organization, and insiders believe the NHS should take a leaf out of the finance industry’s playbook. It would be doing so 30 years late, but there’s no time like the present.
Does the answer wear a quarter-zip and gilet?
Cyber resilience is difficult to achieve even for organizations that don’t have the financial difficulties of the NHS, or its disparate, often aging systems. The issue threatens the safety of all hospitals, especially if a ransomware payment ban – like the one being considered in the UK – is passed into law.
The view among techies working in the UK’s healthcare industries is that if board members were held personally liable for cybersecurity failures, a practice that’s slowly increasing across the private sector, then the number of serious incidents would fall.
Attacks are on the rise, but the NHS continues to deal with the same security problems it was wrestling with a decade ago. Insiders doubt whether the organization has even learned anything from WannaCry, let alone the various major attacks since.
The consultation period for the UK’s proposed public sector ransom payment ban ends next month. Of the three proposals on the table, two would outlaw payments made in the public sector, including the NHS.
Hospitals would be left without a legal route for paying off a ransomware gang. They generally don’t pay in the UK anyway, which is partially why events such as the attack on Synnovis tend to be so disruptive. Hospitals would have to rely on their backups, but what if those were corrupted during the attack? What if they weren’t made recently enough?
Higher-ups within NHS Digital think the time has come to introduce personal liability for board members as a next step toward ensuring money is spent on achieving cyber resilience across the organization.
The idea is that such a measure would bring accountability to healthcare board members in much the same way the Financial Services and Markets Act (FSMA) 2000 helped transform the finance industry into the highly regulated, secure behemoth it is today.
The FSMA ushered in the Financial Services Authority, which later became the Financial Conduct Authority (FCA), whose broad work has in part led to the wider industry deploying systems robust enough to mitigate serious attacks.
Now we have rules introduced by the Securities and Exchange Commission (SEC), plus regulations such as NIS2 and DORA, all working to ramp up the pressure on executives to prioritize security or face legal consequences.
The UK’s Cyber Security and Resilience Bill (CSRB), announced in the King’s Speech last year, aims to strengthen its existing NIS regulations and may also introduce personal liability provisions, although the consultation made no mention of it. The bill is expected to start progressing through Parliament later this year.
NHS doesn’t help itself
The security folk inside the NHS would love to start building more cyber-resilient systems and future-proofing them from attacks, but one of the main inhibitors of long-term progress within the organization is the way it issues budgets to various departments.
IT consultants who move from banks, which back long-term technology transformation plans five or even ten years ahead, to the NHS, where budgets are often issued for a single year only, say the inability to forward-plan a security transformation must change.
One leader said they had been incredibly lucky to recently secure a three-year period of committed funding for a frontline digitization project – something seen as an oddity.
For most, however, it takes around the same time to even get projects off the ground. Even if the funding is made available, the number of different bodies and trusts required to sign off on major plans leads to long delays in progress. If whatever solution is being procured hasn’t been paid for in full by then, the budget is taken away and no positive change is brought about at all.
There is also a feeling among higher-ups that the NHS is bad at managing the contracts it dishes out, not keeping an eye on service level agreements and holding suppliers to account for failures. This has led to a distrust of vendors within the organization. Many believe vendors aren’t acting in the NHS’s best interest and ultimately contribute to its issues.
Handling a ban
Fundamentally, the security issues at the NHS aren’t just monetary, although more robust finances would certainly help with its budgeting woes. It lacks a top-down security culture, one that makes the NHS’s path to cyber resilience everyone’s duty and simplifies the process of implementing security measures for those crying out for them.
It says a lot that healthcare insiders actually miss the COVID-19 pandemic, because during that time they say the NHS was, for the first time in their careers, agile enough to allow improvements to be made without the usual onerous approval stages.
Reinventing the way in which the organization does security may be the only way it could survive a ransomware payment ban should one be passed in the UK.
The topic remains a highly contentious issue with experts firmly divided about whether a ban should be introduced, or even if it’s logistically feasible for in-scope organizations such as the NHS.
Without the ability to pay a ransom – again, not that the NHS is known to do so – its choices in a crisis become even more limited.
Hospitals generally manage well in these crises. Sure, stress levels are higher, and normal operations that are automated or digitized must be completed manually, yet – bar short-term interruptions – patients continue to be treated. Usually, care isn’t affected too badly, although that’s not true in some tragic cases.
Whether a ban is introduced or not, the general consensus among incident responders and national security agencies is that organizations must be incentivized to improve their security. Whether it be cyber insurers demanding more from policyholders, more stringent regulations, or a combination of measures, more needs to be done to force organizations to become more cyber resilient.
Those measures could come with the CSRB and it wouldn’t be surprising if they did, given that all the other major regulations of recent times have included such provisions.
It’s worth noting that, as with the ransom payment debate, there might not be total consensus on introducing board-level personal liability. However, strong voices within the NHS argue it could be a useful tool for improving its security culture.
If boards became responsible for every factor influencing security outcomes, and personally liable for any failures, then improvements would very likely follow, insiders argue. Saving one’s bacon to get out of a security pigsty? Now there’s an idea.
Insiders spoke freely during a recent discussion held under the Chatham House Rule, which allows reporting of what was said but forbids identifying who said it.
The Register contacted the NHS for a response. ®