2022 Predictions: Enterprises will Embrace a Data-led Approach to Mainframe Infrastructure Modernization
Changes in technology and business conditions are driving fresh thinking about data and computing strategies.
What’s coming is not a revolution but a triad of evolutions that will keep data relevant and deliver more value to organizations from their IT.
Clothe Your Enterprise in a Data Fabric
The concept of a fabric has been applied in the past primarily to computing infrastructure. Fabric architectures were typically a way to link many compute nodes via high-bandwidth interconnects so that they could coordinate effectively in solving large or complex problems. Data played a part, but infrastructure was the focus.
A data fabric builds on that architectural idea by aggregating and connecting data, but as a practical matter it can be thought of simply as the opposite of an architecture built around data silos. Today, data siloing is far more common than most organizations realize. Differing means of storage, different data formats, and different ties between data and specific applications tend to separate collections of data and limit their use by other functions or applications.
This can have functional benefits for specific tasks, but from a strategic standpoint it limits the ways data can be used and reduces the insights that can be derived from it. It also often means supporting a complex and expensive architecture of physical storage devices.
Rethinking data strategy in terms of a data fabric means aiming for maximum access to data for the most possible applications. It also means releasing data from its reliance on traditional, expensive, siloed technology such as virtual tape libraries (VTLs), favoring instead flexible, fabric-like cloud storage options.
Most critically, a data fabric includes the idea of smarter and more automated data management. Gartner has gone as far as comparing the approach to a self-driving automobile, where a human can operate the vehicle or let it take over the task, depending on the level of comfort and confidence placed in the automation. In an optimal implementation, data can be assessed, analyzed, and connected nearly continuously, delivering more, better, and faster insight.
The tools and technologies are now readily available. Choosing to implement a data fabric delivers more value and can significantly cut costs as well. For example, a fabric approach can enhance existing data with better integration and connections, using an agile, adaptable cloud-based architecture that frees data and delivers more and better insights.
Lead the Way on Migration
A related concept also puts data at center stage: migrating data first to the cloud, before considering any other migration projects. When organizations recognize that data is their most fundamental IT asset, they realize that its location and accessibility matter. Rather than basing their IT strategy primarily on existing applications and processes, they see that the data itself needs to be the focus.
With this approach, data is freed from silos and made more accessible through techniques such as automated tiering, which adds the cloud as a low-cost storage tier. This best practice enables change, sparking new analytic initiatives using powerful cloud-based tools and encouraging more efficient and relevant ways to conduct business.
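The automated tiering described above amounts to a policy: data that has not been accessed recently migrates to cheaper storage, with the cloud as the coldest tier. The following is a minimal sketch of such a policy, not any vendor's actual API; the tier names, thresholds, and record fields are all hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Dataset:
    name: str
    tier: str            # current tier, e.g. "primary_disk"
    last_access: datetime

def tier_for(last_access: datetime, now: datetime) -> str:
    """Pick a target tier from the age of the last access (illustrative thresholds)."""
    age = now - last_access
    if age < timedelta(days=30):
        return "primary_disk"        # hot data stays on fast storage
    if age < timedelta(days=180):
        return "on_prem_archive"     # warm data moves to cheaper local storage
    return "cloud_object_storage"    # cold data moves to the low-cost cloud tier

def rebalance(datasets: list[Dataset], now: datetime) -> list[tuple[str, str, str]]:
    """Return the (name, from_tier, to_tier) moves an automated tiering job would make."""
    moves = []
    for ds in datasets:
        target = tier_for(ds.last_access, now)
        if target != ds.tier:
            moves.append((ds.name, ds.tier, target))
    return moves
```

In practice the same decision is usually expressed declaratively (for example, as lifecycle rules on object storage) rather than in application code, but the logic is the same: access recency drives placement, and no application needs to know which tier the data landed on.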
By enabling data movement first, organizations can show interim progress and are then able to better prioritize and strategize regarding how they move, rehost, or replace their applications. Moving the data and making it more accessible clarifies the entire focus of IT, yields immediate value from more ubiquitous availability, and enables other changes in an orderly and efficient manner.
Superpower: Mainframe + Cloud
The third forthcoming development is repositioning mainframe computing to maximize its strengths and optimize its position in the broader IT ecosystem.
For concentrated, reliable processing power the mainframe remains unmatched. Today's enterprises must handle many "bursty" processing tasks and ad hoc requests to remain responsive in a rapidly changing marketplace, and they must be able to take on new, compute-intensive analytic tasks. Provisioning for maximum requirements is an expensive and wasteful proposition, since much of the excess capacity could sit idle for long periods of time.
To succeed, organizations need access to the additional, on-demand computing power available through large cloud providers, paying for extra capacity only when it is needed. They must find an effective way to use on-premises and cloud computing together.
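The pay-only-when-needed model can be sketched as a simple capacity split: fixed on-premises (mainframe) capacity absorbs the steady load, and only the overflow is dispatched to metered cloud capacity. The function names, units, and rate below are hypothetical, chosen purely to illustrate the economics:

```python
def split_workload(jobs: int, on_prem_capacity: int) -> tuple[int, int]:
    """Return (jobs run on premises, jobs burst to the cloud)."""
    on_prem = min(jobs, on_prem_capacity)
    burst = jobs - on_prem
    return on_prem, burst

def cloud_spend(jobs: int, on_prem_capacity: int, cloud_rate_per_job: float) -> float:
    """Cloud cost is incurred only for overflow jobs (pay per use);
    a quiet period costs nothing, unlike idle provisioned capacity."""
    _, burst = split_workload(jobs, on_prem_capacity)
    return burst * cloud_rate_per_job
```

The design point is that sizing on-premises capacity for the steady state, rather than the peak, converts rarely used capital expense into occasional, metered operating expense.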
This third element completes the triad. A data fabric architecture and data-first modernization and migration enhance an organization's ability to grow and adapt. The equally important embrace of mainframe and cloud together ensures organizations have the means to achieve insights and implement the right changes faster than their competitors, with lower risk and immediate benefits.