A speedy recovery: the key to good outcomes as health care’s dependence on data deepens

By WANdisco
Nov 14, 2017

It may have been slow to catch on compared with other industries, but the healthcare sector has developed a voracious appetite for data. Digital transformation topped the agenda at this year’s Healthcare Information and Management Systems Society (HIMSS) conference in Florida, and Big Data analytics in healthcare is on track to be worth more than $34 billion globally within the next five years, possibly sooner.

Electronic health records are growing in importance to enable more inter-disciplinary collaboration, speed up communication on patient cases, and drive up the quality of care. Enhanced measurement and reporting have become critical for financial management and regulatory compliance, and to protect organizations from negligence claims and fraud. More strategically, Big Data is spurring new innovation, from smart patient apps to complex diagnostics driven by machine learning. Because of their ability to crunch Big Data and build knowledge at speed, computers could soon take over from clinicians in identifying patient conditions, rather than doctors relying on their clinical experience to determine what’s wrong.

But as healthcare providers come to rely increasingly on their IT systems, their vulnerability to data outages grows exponentially. If a planned surgery can’t go ahead due to an inability to look up case information, lab results or digital images, the patient's life might be put at risk. 

Symptoms of bigger issues

Even loss of access to administrative systems can be devastating. The chaos inflicted across the UK National Health Service in May following an international cyber-attack – which took down 48 of the 248 NHS trusts in England – gave a glimpse into healthcare’s susceptibility to paralysis if key systems become inaccessible, even for a short time. In the NHS’s case, out-of-date security settings were to blame for leaving systems at risk. But no one is immune to system downtime, as was highlighted recently by the outage at British Airways, which grounded much of its fleet for days, at great cost, not to mention severe disruption for passengers.

Although disastrous events like these instill fear in CIOs, they can – and should – also serve as a catalyst for positive action. The sensible approach is to design data systems for failure – for times when, like patients, they are not firing on all cylinders. Even with the best intentions, the biggest budgets and the most robust data center facilities in the world, something will go wrong at some point, according to the law of averages. So it’s far better to plan for that than to assume an indefinitely healthy prognosis.

If the worst happens, and critical systems go down, recovery is rarely a matter of switching over to backup infrastructure and data – particularly if we’re talking about live records and information, which are currently in use and being continuously updated. Just think of the real-time monitoring of the vital signs of patients in intensive care units.

If a contingency data-set exists (as it should) in another location, the chances are that the original and the backup copy will be out of sync for much of the time, because of ongoing activity involving those records. In the event of an outage, the degree to which data is out of step will have a direct bearing on the organization’s speed of recovery.

To ensure continuous care and patient safety, healthcare organizations need the fastest possible recovery time. But how many organizations have identified and catered for this near-zero tolerance for downtime in their contingency provisions? 
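The relationship between out-of-sync data and recovery time can be made concrete with a back-of-the-envelope model. The sketch below is purely illustrative – the function name, replay rates and failover overhead are assumptions, not figures from any real system – but it shows why a backup that lags the live system by hours of updates yields a far longer recovery than one kept continuously in sync.

```python
def estimated_recovery_minutes(pending_updates, replay_rate_per_min,
                               failover_overhead_min=5):
    """Rough estimate of time to bring a backup current and switch over.

    pending_updates      -- updates made to the live system but not yet
                            replicated to the backup at the moment of failure
    replay_rate_per_min  -- updates the backup can apply per minute
    failover_overhead_min -- fixed cost of redirecting systems to the backup
    """
    replay_time = pending_updates / replay_rate_per_min
    return failover_overhead_min + replay_time

# Continuously replicated backup: nothing to replay, only failover overhead.
print(estimated_recovery_minutes(pending_updates=0,
                                 replay_rate_per_min=1000))        # 5.0

# Nightly batch backup: hours of accumulated updates must be replayed first.
print(estimated_recovery_minutes(pending_updates=120_000,
                                 replay_rate_per_min=1000))        # 125.0
```

The gap between the two results is the point: the contingency provision that matters is not whether a second copy exists, but how far behind it is allowed to fall.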

Emergency protocol

The issue must be addressed as data becomes an integral part of medical progress. Already, data is not just a key to better operational and clinical decisions, but also an intrinsic part of treatments – for example in processing the data that allows real-time control and movement in paralyzed patients. Eventually, these computer-assisted treatments will also come to rely on external servers, because local devices are unlikely to have the computing power to process all the data. They too will need live data backups to ensure the continuity and safety of treatment. 

On a broader scale, data looks set to become pivotal to new business models (for example determining private healthcare charges based on patient outcomes, otherwise known as ‘value-based medicine’).

While technology companies will be pulling out all the stops to keep up with these grander plans, maintaining data continuity in live environments is already possible. So that’s one potential barrier to progress that can be checked off the list.
