Data-Driven Cloud Migration for Public Sector Healthcare Organizations

In the wake of COVID-19, it should be obvious to everyone that public sector health organizations have operated in extremis over the past two years. It is also now clear that technological capacity in general – and the power of the cloud in particular – has an outsized impact on how effectively these critical institutions can respond to changing demands.

As Deloitte noted last month, “the flexibility and scalability of the cloud has enabled governments to meet pressing pandemic challenges, such as massive increases in demand for services or the sudden shift to remote working.” Estimates indicate that federal cloud spending in healthcare grew by 46% between 2020 and 2021, and a FedRAMP survey last year found that about half of all US state, local, and federal governments have some or most of their systems and solutions in the cloud.

But that still leaves much of the public sector — and its health-related services, agencies, and institutions — working without the power of the cloud.

A multitude of non-trivial factors have contributed to this shortfall, not least uncompromising operating models tied to existing on-premises infrastructure. But there are now migration approaches that can both significantly facilitate and accelerate the path to public sector IT modernization, to powerful effect.

Do what matters most first

When transitioning to a new infrastructure, traditional migrations have focused on applications, because that’s how most organizations view their workloads: “We log into this system to do this, and we log into that system to do that.”

In information technology, day-to-day work revolves around the applications that people interact with; their tasks and deliverables are really centered around this application space. So it’s natural to think in terms of application migration because that’s what we work with directly. We tend to think of data as something that “accompanies” applications.

Data-driven migration is about recognizing that it’s actually the other way around: data is what really matters most.

Applications are inherently transient; they change over time as real-world needs change. The apps we use today may not be the apps we use tomorrow. Either they are deprecated in favor of newer applications or they are modified over time with new features and functionality. The one thing that remains constant in this evolution is the need for data. The real value is in the datasets.

Data-driven migration is a “do what matters first” approach to cloud adoption. The main focus is on sending data to the cloud. It then becomes much easier to manage applications driven by this data and quickly and efficiently develop new applications around this inherently scalable cloud data repository.

Reasonable path to radical power

A lot of the messaging about moving to the cloud touts radical digital transformation, which is great! But it also implies that data migration necessarily involves drastic changes to existing workflows and operations, a huge red flag in the often overburdened and underfunded public sector.

It doesn’t have to be that way. The truth is, data-driven cloud migrations can be — and often are — incremental. Migration can be essentially transparent to the application layer initially, then iterated and optimized over time. The first phase may involve using the same set of existing tools: you can simply switch from an on-premises SQL Server implementation to a cloud-based SQL Server implementation, for example. It’s just that now the data for this toolset is in the cloud rather than an on-premises database. The cloud offers the opportunity for radical digital transformation, which can be incredibly creative and productive for the mission. But no one should boil the ocean on day one.
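A minimal sketch of what “transparent to the application layer” can mean in practice: if the application builds its database connection from configuration, moving the data to a cloud-hosted SQL Server may amount to changing the host it points at. The hostnames and database name here are hypothetical, and real migrations also involve credentials, networking, and cutover planning.

```python
# Sketch (hypothetical hostnames): an incremental, data-first migration can
# leave application code untouched except for its connection settings.

def connection_string(host: str, database: str) -> str:
    """Build an ODBC-style SQL Server connection string."""
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={host};Database={database};"
        "Encrypt=yes;"
    )

# Phase 1: the existing on-premises database.
on_prem = connection_string("sqlserver01.agency.local", "patient_records")

# Phase 2: the same database, now hosted in the cloud. Only the host changes;
# every query, report, and workflow built on this toolset stays the same.
cloud = connection_string("agency-db.example.cloud", "patient_records")

print(on_prem)
print(cloud)
```

The point of the sketch is the diff: the two strings differ only in the `Server=` value, which is why this first phase can be nearly invisible to users of the applications.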

The only drastic change is the immediately wider range of options available in terms of data constructs. Once data is in the cloud, there are new and different ways to use it by leveraging previously inaccessible resources: is a relational database still the optimal format for your needs? Might you consider migrating to a data lake, or separating operational data stores from analytics data stores? A host of new capabilities immediately becomes viable.

Understanding the cloud economy

Considering these options becomes viable because the economics of provisioning and deprovisioning compute are very different in the cloud. For example, a public health organization might need to purchase $1 million worth of servers to equip an existing on-premises data center to run machine learning experiments. But with data in the cloud, you can spin up a supercomputer in the sky in minutes, run an experiment for half an hour, then shut it all down, and it can cost a few hundred dollars.
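The arithmetic behind that comparison can be made concrete. The numbers below are illustrative assumptions, not quoted prices: an assumed hourly rate per node and an assumed cluster size, set against the $1 million capital purchase from the example above.

```python
# Back-of-the-envelope sketch of cloud vs. on-premises economics.
# All figures are illustrative assumptions, not vendor pricing.

ON_PREM_CAPEX = 1_000_000        # dollars: servers purchased up front

RATE_PER_NODE_HOUR = 3.00        # assumed hourly price per cloud node
NODES = 200                      # assumed size of the "supercomputer in the sky"
EXPERIMENT_HOURS = 0.5           # run the experiment for half an hour

# Pay only for the compute actually used, then deprovision everything.
cloud_cost = RATE_PER_NODE_HOUR * NODES * EXPERIMENT_HOURS

print(f"Cloud experiment cost: ${cloud_cost:,.0f}")
print(f"Ratio of on-prem capex to one cloud run: {ON_PREM_CAPEX / cloud_cost:,.0f}x")
```

Under these assumptions a single experiment costs a few hundred dollars, several orders of magnitude below the up-front capital outlay, and nothing is left running (or depreciating) afterward.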

By building on one or more public clouds, an organization can quickly create new environments and selectively use as-a-service options while avoiding the expense of acquiring, storing, configuring, and managing physical infrastructure internally.

A prototypical example of this polarity can be drawn from the evolution of computing at NASA. Consider the Voyager 1 and 2 spacecraft, which were launched in the late 1970s. When they sent amazing images back to Earth, NASA had to commission new supercomputers, at a cost of several million dollars in capital expenditures, to work with the data, and it took months to perform image analyses.

Fast forward to the Cassini and Mars missions, after the cloud transformed NASA’s capabilities. As explained at the SpaceOps 2010 conference, “Instead of procuring a machine, a project simply rents a machine and pays for it by the hour. In fact, sometimes a mission can rent a machine for as little as 3 cents an hour. If an application requires a machine, it can be provisioned in 5 minutes instead of 5 weeks.”

In 2012, using the cloud, NASA was able to “process nearly 200,000 Cassini images in a few hours for less than $200 on AWS,” according to NASA JPL Principal Solutions Architect Khawaja Shams, whereas previously inelastic internal resources devoted “15 days to the same task.” When the Curiosity rover transmitted the first images of Mars to Earth in August of that year, “all of the raw images that came in went straight to AWS. People everywhere could see them on their smart devices. NASA’s JPL was able to release 150TB of data in just a few hours.”

That was ten years ago; these days, the power of the cloud is helping NASA fly small helicopters on the Red Planet.

What is gained

NASA’s journey to the cloud illustrates orders-of-magnitude reductions in public sector computing costs combined with mind-boggling advances in speed, scale, and capacity. By all traditional measures of return on investment (ROI), cloud adoption clearly demonstrates excellent stewardship of public funds. Data-driven migration is an accessible way to capture this value.

Transposed to the public health mission of our time, there are even more compelling incentives for transitioning to the cloud. The ultimate return on investment for the organization is agility. Public sector agility greatly affects the day-to-day lives of human beings right here on Earth: the ability to quickly analyze and disseminate images of Mars is also the ability to stand up a regional COVID response in days instead of months.

In a field like healthcare, this agility literally saves lives.

Gerry Miller is founder and CEO of Cloudticity.
