A year ago, few businesses could have anticipated the dramatic changes that 2020 had in store for them. By the end of March, the COVID-19 pandemic had disrupted operations worldwide, forcing businesses to quickly adapt their technology infrastructures to support all or most of their workforces remotely and to cope with unprecedented long-term uncertainty. For many businesses, revamping IT infrastructure has been key to survival.
For many organizations, that meant accelerating plans to move additional workloads to the cloud by adopting a hybrid cloud environment. As companies pursue their digital transformations, they are realizing the value of hybrid cloud environments and learning how to get the best out of each cloud solution.
Although businesses are still dealing with high levels of unpredictability heading into 2021, several trends that emerged this year enable us to make some predictions about what to expect in the coming year. 2020 ushered in new needs for technology, with the pandemic prompting businesses to consider a number of challenges related to growing hybrid cloud use, including:
- Wider adoption of, and experimentation with, new security technologies, including Confidential Computing, quantum-safe cryptography and fully homomorphic encryption.
- AI automation making the shift to hybrid cloud faster and easier.
- The integration of many clouds and on-premises systems into a single hybrid platform.
- The ability to leverage hybrid cloud to push more workloads onto intelligent edge devices.
Next year, we expect businesses to address these challenges in ways that will apply new resources and strategies to drive business outcomes, in a world that will continue to require new advances in cloud and AI research. We detail our predictions below.
In 2021, security technologies such as Confidential Computing, quantum-safe cryptography and fully homomorphic encryption will bring even the most regulated industries to the hybrid cloud.
It’s already apparent that companies will continue to decentralize IT operations to hybrid cloud environments in the coming year — and companies in even the most tightly regulated industries will be among them. To do that successfully, organizations need to take security measures that improve isolation, ensure system and data integrity and implement zero trust strategies — while remaining compliant with tougher data privacy regulations worldwide — all as complex security threats evolve. Hardware systems that provide these security capabilities will be widely adopted to protect both on-premises and public cloud workloads. Systems such as IBM Z and LinuxONE provide a higher level of security for open-source and traditional workloads alike.
Industry-specific clouds, such as IBM Cloud for Financial Services and IBM Cloud for Telecommunications, are designed to address the unique challenges and security requirements of highly regulated industries. Across the board, we’ll see providers continue to invest in security innovation as businesses look to embrace technologies like Confidential Computing in their hybrid cloud environments as a way to protect data both during processing and at rest. Confidential Computing, combined with encryption of data at rest and in transit and exclusive control of keys, protects sensitive and highly regulated data sets and application workloads.
Additionally, technology companies including IBM are pioneering quantum computers, which are poised to solve some of our most challenging problems — ones the world’s most powerful supercomputers cannot. Quantum computing also poses potential risks, such as the ability to quickly break encryption algorithms and access sensitive data. We expect companies to start deploying quantum-safe cryptography now, preparing for the day when large-scale quantum computers arrive — not only to secure the data they hold today but to help protect it against future threats.
Similarly, more and more enterprises will begin experimenting with fully homomorphic encryption (FHE) to protect their data. Today, conventional encryption adds a layer of protection by hiding data within a mathematical transformation that can only be reversed by those holding a secret key. However, it has its limits: data must be decrypted before it can be used. FHE, on the other hand, allows data to remain encrypted even during computation. For example, an insurance company could run analysis on patient healthcare data without any personally identifiable information being visible to the insurer.
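To make the idea of computing on encrypted data concrete, here is a minimal toy sketch. It uses textbook (unpadded) RSA, which happens to be multiplicatively homomorphic: multiplying two ciphertexts yields a ciphertext of the product of the plaintexts. This is an illustration only — it is not FHE and not secure (real FHE schemes are lattice-based and support arbitrary computation); the tiny key parameters are chosen purely for readability.

```python
# Toy demonstration of a homomorphic property (NOT real FHE, NOT secure).
# Tiny, insecure RSA parameters chosen for readability only.
p, q = 61, 53
n = p * q                            # public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (Python 3.8+)

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

a, b = 7, 6
ca, cb = encrypt(a), encrypt(b)

# Compute on ciphertexts without ever decrypting the inputs:
c_product = (ca * cb) % n

assert decrypt(c_product) == a * b   # 42 — the product was computed "blind"
```

The party doing the multiplication never sees 7 or 6, only their ciphertexts — the same principle, generalized to arbitrary computation, is what lets FHE keep data encrypted while it is being processed.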
AI will automate the shift to hybrid cloud by teaching machines to “reason.”
AI technologies like graph-based techniques, natural language processing (NLP) and explainable AI are already being applied to human language – think voice recognition and language translation apps. Now, applying the same AI to machines’ code will significantly accelerate moving applications to the cloud and managing them afterward. These techniques reason about an application’s behavior and structure to identify microservice candidates, then recommend and automate their generation.
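As a toy illustration of the graph-based idea described above, the sketch below treats a class-level call graph as an undirected graph and proposes its connected components as microservice candidates. Production tools reason over far richer signals (runtime traces, data access patterns, NLP over code); the call graph and class names here are entirely hypothetical.

```python
# Minimal sketch: propose microservice candidates by clustering a
# (hypothetical) monolith's call graph into connected components.
from collections import defaultdict

# Hypothetical monolith: which class calls which.
calls = {
    "OrderController": ["OrderService"],
    "OrderService": ["OrderRepo"],
    "UserController": ["UserService"],
    "UserService": ["UserRepo"],
    "OrderRepo": [], "UserRepo": [],
}

def service_candidates(calls):
    """Group classes into connected components of the call graph."""
    adj = defaultdict(set)           # undirected adjacency list
    for src, dsts in calls.items():
        adj[src]                     # ensure isolated nodes appear
        for dst in dsts:
            adj[src].add(dst)
            adj[dst].add(src)
    seen, groups = set(), []
    for node in list(adj):
        if node in seen:
            continue
        stack, group = [node], set() # iterative DFS over one component
        while stack:
            cur = stack.pop()
            if cur in seen:
                continue
            seen.add(cur)
            group.add(cur)
            stack.extend(adj[cur])
        groups.append(group)
    return groups

print(service_candidates(calls))
# Two clusters emerge: an order-handling group and a user-handling group.
```

Each cluster is a rough boundary suggestion a human architect would then refine — which is why the article speaks of AI *recommending* candidates rather than deciding outright.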
This approach moves beyond the “traditional” process of containerization. Automation is necessary when migrating mission-critical workloads to cloud environments, due in part to the complexity of those on-premises workloads. Often, companies first need to determine exactly where their mission-critical applications are running. Once that’s done, a great deal of work remains to move applications and data that have run on-premises for years into hybrid cloud environments, parts of which those companies may not directly control.
AI will also improve the experience for cloud developers and reliability engineers, from automating the modernization and deployment of applications onto new environments to assisting with day-to-day application management. In fact, the role of site reliability engineers is poised to grow as enterprises accelerate the use of AI-based techniques and strategies like ChatOps to manage their applications and environments. Reliability engineers will anticipate and address risks proactively, and draw insights from more complex unstructured data — a critical function as applications operate across hybrid cloud ecosystems.
Open-source tools will help unify clouds, simplifying the skills developers need to program and use a hybrid cloud.
Today, crunching a large dataset from your laptop might require spinning up 100,000 containers — and doing so means knowing how to re-code applications for the hybrid cloud. Developers need access not only to a hybrid cloud platform but also to tools and frameworks that enable them to solve problems and be productive. Developers and data scientists without years of experience in containerization, parallelization and container orchestration, however, find it very difficult to program in hybrid cloud environments.
In the coming year, open-source tools will help integrate many clouds and on-premises systems into a single, seamless hybrid platform by shortening the learning curve for programmers and non-programmers alike. Companies will adopt an application deployment model that is easier for those without deep hybrid cloud expertise to program and use, freeing up programmers who have that expertise to work on higher-value projects. Subject matter experts should be able to focus on the actual problem they are trying to solve rather than on how to run their software efficiently across multiple clouds. Offerings like IBM Cloud Satellite, which leverages Red Hat OpenShift, allow users to build, deploy and manage cloud services across any environment from a single dashboard.
We’ll see some of the most advanced and powerful hybrid cloud hardware innovations extend to edge devices, thanks to breakthroughs in computing hardware efficiency.
At its core, the power of hybrid cloud is about bringing powerful computing to your fingertips from any infrastructure. But the most powerful computing has been historically confined to the data center.
AI model training is notoriously resource-hungry, consuming money, time and energy. For example, the largest industrial-scale model currently deployed, GPT-3 from OpenAI, has 175 billion parameters – more than 100 times bigger than models from just a couple of years ago. It costs several million dollars to train, and training it generates a carbon footprint higher than the lifetime emissions of 20 cars.
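A quick back-of-envelope calculation makes that scale concrete. Using the published 175-billion-parameter figure and standard numeric byte widths (fp32 = 4 bytes, fp16 = 2 bytes per parameter — the only assumptions here):

```python
# Back-of-envelope memory footprint for a GPT-3-scale model's weights.
params = 175e9                      # 175 billion parameters

fp32_bytes = params * 4             # full precision
fp16_bytes = params * 2             # half precision, common for inference

print(f"fp32 weights: {fp32_bytes / 1e9:.0f} GB")   # 700 GB
print(f"fp16 weights: {fp16_bytes / 1e9:.0f} GB")   # 350 GB

# Even before activations, gradients, or optimizer state, the weights
# alone exceed any single accelerator's memory -- hence the data-center
# scale (and cost) of training such models.
```

This is why an order-of-magnitude gain in training efficiency, as predicted below, matters so much for pushing AI workloads out of the data center.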
In 2021, we will see major breakthroughs in the AI hardware used to build and deploy AI models. The efficiency of AI training systems will increase by almost an order of magnitude over the best commercially available systems today. Coupled with 5G advances, sustainable AI computing at the edge could erase the border between cloud and edge – offering a key technological upgrade for hybrid cloud infrastructures, and a major advance for the privacy and security of AI models by keeping more data at the edge. 5G cellular architecture, in particular, is expected to catalyze widespread adoption of edge computing.
AI hardware accelerators on hybrid cloud infrastructures could support large AI training jobs in data centers. And the same core AI hardware technology could also be deployed at a smaller scale or embedded in other processors at the edge. Expansion of OpenShift compatible hardware accelerators will further support flexible deployment of our AI hardware computing advances all the way to the edge.
As you can see, 2021 will focus on improving both the power and the efficiency of AI, so that it can help businesses meet the challenges of securely deploying, simplifying and managing hybrid cloud environments for a larger number of users.
Although crystal balls are in short supply during these disruptive times, it’s clear that hybrid cloud technologies will continue to break down barriers for the enterprise and remain a forward-looking strategy.