By James A. Martin
The 2020 CES conference offered a peek at what the near future might look like, provided you’re on board with robotic pets, smart deadbolts and air-conditioned baseball caps.
In the business world, emerging technologies are more practical, helping organizations create, collect, and make sense of a growing volume of data; be more efficient; drive better business outcomes; and bolster resilience against natural and human-made disruptions.
In this post, Sungard AS thought leaders share their forecasts for the emerging technologies of 2020 relevant to enterprises.
(See their 2019 tech predictions: Top 5 Technologies Most Likely to Impact Your Business in 2019 and Beyond.)
At CES, wireless network carriers insisted that 2020 will be a turning point for 5G, the long-gestating, fifth-generation cellular data network. AT&T and Verizon, for instance, said at CES that their 5G networks will be available throughout the U.S. this year, and that at least 15 smartphones released in 2020 will be capable of fully supporting the new network.
“With 5G, we’re going to witness a huge step forward in the density of data individuals can create and share,” says James Valdez, Senior Project Manager. “We may reach a point where 5G starts making land-based telecom and data facilities obsolete.”
5G will help make other technologies even more valuable to businesses, such as IoT devices. “IoT has been around in various forms for some time now,” notes John Young, Vice President of Solutions Engineering Architecture, EMEA. “But its true potential will be unleashed with the higher capacity and speeds available on 5G networks and the ubiquity of high-speed data connections. This will open up more use case scenarios from areas as diverse as healthcare to manufacturing to vehicle telemetry to smart everything.”
IoT — in the form of smart sensors that track assets (such as the locations of a hospital’s EKG machines) and perform other tasks — will have a big impact on business operations in 2020, predicts Haim Glickman, Senior Vice President of Global Solutions Engineering. “Just about every organization is starting to use IoT in one way or another, and they aren’t slowing down,” he says. It helps that IoT security has been improving as well, he adds.
There are some challenges to widespread adoption, however. The data collected from, and the code built into, IoT sensors needs to be stored somewhere, and it must be redundant, Glickman notes. “The systems that collect the data and the automation around the IoT services still have a single point of failure if they’re not backed up,” he says.
In addition, the IoT market — especially the Industrial IoT (IIoT) market for businesses — is still new. And most cloud infrastructures such as Microsoft Azure and Amazon Web Services (AWS) have proprietary coding and approved devices that will work within their frameworks, Glickman says. “Developing an open programming platform for IoT devices will be the trick for wider adoption,” he adds.
In 2020 and beyond, the success of organizations will depend heavily on how well they manage and analyze data to make better business decisions, which AI and ML will enable them to do, says Alex Ough, Senior CTO Architect.
“If a business can find potential issues and prevent them from happening, it can not only save costs but provide better service to customers,” Ough explains. “But it’s only possible when you can analyze the right data in the right way and make decisions based on the right analysis. New tools and applications are coming that will make data analysis and prediction easier and more convenient, but they’ll be of little use if the business doesn’t have the necessary data and appropriate skills.”
AI and ML will help organizations with Disaster Recovery/Business Continuity (DR/BC) as well, adds Joseph George, Vice President of Product Management. For example, an organization’s backup vendor might apply ML algorithms to analyze the customer’s backup statistics and metrics for improved operations and to more quickly identify ransomware or malware attacks. Recovery execution can become more intelligent and automated, taking into account considerations in the recovery process that are unique to a particular organization.
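The kind of analysis George describes can be sketched in a few lines. The example below, a simplified illustration rather than any vendor's actual method, flags days when the volume of changed data in backups spikes far above the historical norm, since mass file encryption by ransomware often shows up as exactly that kind of anomaly. The metric and threshold are assumptions for illustration.

```python
from statistics import mean, stdev

def flag_backup_anomalies(daily_changed_gb, z_threshold=3.0):
    """Flag days whose changed-data volume deviates sharply from the
    historical norm. A sudden spike can indicate mass file encryption
    by ransomware. The metric and threshold are illustrative choices."""
    mu = mean(daily_changed_gb)
    sigma = stdev(daily_changed_gb)
    if sigma == 0:
        return []  # no variation at all, nothing stands out
    return [
        (day, gb)
        for day, gb in enumerate(daily_changed_gb)
        if abs(gb - mu) / sigma > z_threshold
    ]

# A quiet baseline followed by a suspicious spike in changed data:
history = [4.1, 3.9, 4.3, 4.0, 4.2, 3.8, 4.1, 52.0]
print(flag_backup_anomalies(history, z_threshold=2.0))  # [(7, 52.0)]
```

A production system would use richer features (deduplication ratios, file-type mixes, entropy of changed blocks) and a trained model rather than a fixed z-score, but the principle is the same: the backup stream itself becomes a detection signal.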
To leverage AI and ML, companies will need skilled talent in such areas as security (especially related to containers and serverless architectures) and automated testing for resilience as part of a Continuous Integration/Continuous Delivery (CI/CD) pipeline, says Todd Loeppke, Lead CTO Architect. At the same time, traditional data architect and business analyst roles will evolve into data engineers, data scientists, and machine learning architects.
But organizations can face obstacles in hiring for these skills, given the tight job market and strong competition. As a result, organizations must “learn how to efficiently propagate the knowledge and skills they have to cover the AI and ML skills shortage,” says Loeppke. “If done correctly, this can help companies better define technical career paths, which is often a secondary priority for them.”
AI, ML, and supporting technologies are helping to move digital twin technology into the mainstream, notes Kiran Chitturi, CTO Architect. A digital twin is a digital replication of a physical object or system. NASA pioneered the concept years ago with full-scale mockups of early space capsules used on the ground to diagnose potential problems in space.
“Digital twins can greatly enable businesses and their operations to become more resilient,” Chitturi says. “With digital twins, you can now have a virtual replica of a DR/BC product or service that can be used for analysis, tested against various failure events, and examined for how the system recovers from those events. I see almost every industry leveraging digital twin technology to generate insights about their environments. Applying AI and ML makes it even more promising. This directly applies to improving DR/BC posture and operations and to enhancing and automating processes.”
To succeed with digital twins, an organization’s IT team members must have an “automation mindset” to map a current product or service as a digital twin, along with an understanding of cloud computing, IoT, and scripting skills and the ability to apply analytics to business processes, Chitturi adds.
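Testing failure events against a replica, as Chitturi describes, can be made concrete with a toy model. The sketch below is a deliberately minimal stand-in, with hypothetical names throughout: it models a replicated service as a "twin" and injects outages to see whether availability survives, without touching production.

```python
class ServiceTwin:
    """A toy digital twin of a replicated service: a set of nodes,
    of which a minimum number must be healthy for the service to run."""

    def __init__(self, nodes, min_healthy):
        self.healthy = set(nodes)
        self.min_healthy = min_healthy

    def fail(self, node):
        # Inject a failure event against the replica, not production.
        self.healthy.discard(node)

    def recover(self, node):
        self.healthy.add(node)

    def is_available(self):
        return len(self.healthy) >= self.min_healthy

# Exercise a failure scenario against the twin before it happens for real:
twin = ServiceTwin(["us-east", "us-west", "eu-central"], min_healthy=2)
twin.fail("us-east")            # single-region outage
print(twin.is_available())      # True: two healthy nodes remain
twin.fail("us-west")            # second concurrent outage
print(twin.is_available())      # False: below quorum
```

A real digital twin would mirror telemetry from the live system and model dependencies, latency, and data flows, but even this simple version shows the value: you learn that two concurrent regional outages break the service before it happens in production.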
Gartner defines edge computing as “an emerging topology-based computing model that enables and optimizes extreme decentralization, placing nodes as close as possible to the sources and sinks of data and content.”
Edge computing arose from the huge growth of IoT devices, many of which generate vast amounts of data during operations, and it will likely become more widespread as 5G connectivity becomes more widely available.
With edge computing, 5G, IoT and other advancements, companies will have more leverage to create workloads that can be more easily shifted around, which can help with business continuity and resilience. “The idea is to have a workload run in the location that works best for that workload,” says Bob Peterson, CTO Architect. “In steady state, it may be running in your own data center or hosted by a local provider. If needed, you can migrate that workload to another location prior to an event.”
For example, if a hurricane is predicted for your area, you might push a workload from your on-premises system in the hurricane’s projected path to a cloud provider’s infrastructure or even to a mobile data center truck located beyond the potential storm area. “You can also have a DR/BC environment that mirrors the steady state, so you can shift to that in an unplanned emergency,” Peterson says.
There will be opportunities for local governments and municipalities to take advantage of fluid workloads enabled by 5G, IoT, and edge computing, Peterson adds. During non-emergency periods, their steady state may be running in an Emergency Operations Center (EOC), cloud infrastructure, or a hybrid location. “This may be where development, upgrades, and even training exercises could be run,” he explains. In an emergency, the government or municipal agency could shift its systems to a localized version in an EOC or to a mobile platform, making its operations less vulnerable.
“I see lots of opportunities to find better ways of creating ‘fluid’ workloads that can flow between hosting platforms,” Peterson says. “The movement may not be instantaneous. The idea is to make it more seamless.”
Every workload is unique, Peterson points out, which can present challenges to organizations. “Some environments may have specific needs that make portability difficult,” he explains. For example, a hospital’s on-premises patient care system may have all the access it needs for patient monitoring, labs, and imaging systems. But quickly shifting that system to another location may not be a simple task.
To take advantage of workload portability, an enterprise needs IT people with experience building such solutions. “Organizations that don’t have the capabilities to do this on their own will ultimately look to partners who know how to do this,” Peterson says. “Partners need to have the right knowledge and skills to be able to apply new technologies to solve a variety of different business needs.”
5G, IoT, AI and ML, digital twins, and edge computing aren’t new technologies per se. But each is coming into its own in 2020. And we’re just beginning to see the business benefits they bring — from super-fast access to information wherever we are to fluid workloads that can be moved around easily. Whether used together or in isolation, these technologies can play a role in bolstering your organization’s resilience against unexpected events. If nothing else, the unexpected is the one thing you can reliably expect in 2020.
James A. Martin has written about security and other technology topics for CSO, CIO, Computerworld, PC World, and others.