Amazon Web Services’ annual gathering in Las Vegas, re:Invent, continues to grow both in terms of attendance and the (almost overwhelming) number of product announcements.
AWS is the dominant player in cloud computing by a wide margin—as shown by G2’s own data—yet the move to the cloud is only just building up steam. Estimates vary, but IT spend in the cloud is likely a single-digit percentage of overall spend; clearly there is substantial opportunity for growth, as ever-larger pieces of existing workloads transition and new ones are developed.
"One size fits all" is not a sufficient answer to the trillions of dollars of on-premises IT that could move to the cloud, and some of AWS’ key announcements this year are a clear reflection of that. AWS has added new options for where and how workloads are run and data is stored and managed; new capabilities for accessing and analyzing data to enable deeper data-driven insights; and more features that empower management of how cloud computing is used by enterprises.
The cloud is a vehicle for digital transformation
Cloud computing has often been used as a flexible supplement to enterprise IT, with benefits such as pay-as-you-go pricing, scalability, and time to value all high on the agenda. Cloud computing is evolving at a rapid pace, and will continue to, adding ever more enterprise-friendly functionality that has made it not just a supplement to enterprise IT, nor merely a viable alternative. Instead, cloud computing is a desirable destination that helps transform how organizations operate.
AWS CEO Andy Jassy used part of his keynote to talk about the role of cloud computing in digital transformation, showcasing speakers from Goldman Sachs, Cerner, and Verizon to discuss their work with AWS. Transformation is no small undertaking; it requires resources and, critically, sponsorship from executives across the organization. Jassy stressed the need for C-suite sponsorship and for the business to set aggressive, top-down goals for its program. He also called out the need to train the developers and builders working with cloud capabilities to realize those goals.
Jassy also discussed some of the key trends AWS is seeing among its customer base. Mainframe replacement, on-premises relational databases moving to cloud options, and transitions from Windows to Linux were all highlighted. Each of these examples underscored the evolution of the cloud's capabilities, and they are representative of bigger, transformative changes in the industry, drawing on the constantly growing range of capabilities offered by public cloud vendors such as AWS.
This year's re:Invent felt like a confirmation of the cloud's transformative capabilities, with product announcements and enhancements that provide further options, and therefore stronger support, to enterprises evolving their adoption of cloud computing: from supplementing and experimenting to bigger, transformative commitments.
With those bigger changes comes the need to ask bigger questions about governance and vendor dependence. Multi-cloud (the use of more than one public cloud vendor to meet IT requirements) has become a popular discussion point, and it covers a multitude of topics: from lessening dependence on a single vendor, to meeting geographic requirements, to taking advantage of the most appropriate tool for a particular use case. This is an area of great interest for us at G2. My colleague Zack Busch even published some of his thoughts about the case for multicloud infrastructure adoption.
AWS Outposts brings the cloud closer to the customer
Serving a use case in the location where it is best served has not always been possible with the public cloud. Three announcements at re:Invent that extend where cloud compute can run stood out as answers to this expanding range of enterprise requirements.
AWS Outposts were announced at last year’s re:Invent; this year’s event marked the solution's availability. AWS Outposts are AWS cloud computing for the customer’s own data center—bringing the same AWS infrastructure and capabilities, managed and updated by AWS, to on-premises use cases.
Two particular enterprise requirements define the need for AWS Outposts: latency and data location. Where an application requires very low latency to serve its use case effectively (for example, in financial services), the public cloud may have the right tools, but distance from the cloud data center may introduce unacceptable delay. Data storage and processing may also be subject to regulatory or other requirements that rule out holding the data in a cloud environment. AWS Outposts is aimed firmly at both of these requirements, delivering the cloud computing capabilities of AWS on-premises, with unified management to serve the hybrid environment created.
From on-premises to the edge of the network, AWS also announced AWS Local Zones: small-footprint extensions of AWS Regions that place AWS capabilities physically close to users. Again serving low-latency requirements, the first AWS Local Zone will be in Los Angeles, CA, to serve film and media use cases. AWS Local Zones will deliver the compute, storage, and database services (among others) familiar to AWS users, with standard pricing, and each is physically connected to one of the AWS Regions.
The edge of the network connects many things, and the rollout of 5G is undeniably driving a raft of new use cases powered by higher-bandwidth, lower-latency connectivity. At re:Invent, AWS announced AWS Wavelength, which embeds AWS compute and storage capabilities directly within telcos' 5G networks. Designed to offer single-digit-millisecond latencies, AWS Wavelength will enable new smart device-delivered experiences, most likely to be exercised first in mobile gaming. AWS is currently working with Verizon, Vodafone, KDDI, and SK Telecom to deliver AWS Wavelength.
Extending choice, functionality, and accessibility for data, analytics, and machine learning
The cloud is a natural home for data, with elastic scaling, faster spin-up and spin-down, and multiple storage options to accommodate different data and use case types. AWS already offers a significant array of storage and database options, covering operational, analytical, graph, and time series workloads, to name a few. Building on this foundation, AWS announced several new capabilities designed to enhance the accessibility of data for applications and analytics. It also offered some major new products for machine learning (ML), which represents, in this author's opinion, the practical and realistic use of AI for the majority of enterprises; in particular, the announcements make ML more accessible to a broader audience.
Amazon S3 Access Points provides applications with customizable access to shared data sets held in S3: each access point has a unique name and its own access permissions. Access points can scale to hundreds for the same S3 bucket, delivering a simplified, governed way to make frequently used data more accessible.
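As a minimal sketch of how this might look with boto3 (the account ID, bucket, and access point names below are illustrative placeholders, not taken from the announcement), an access point is created through the S3 Control API, and applications then address objects through the access point's ARN rather than the bucket name:

```python
# Sketch of creating and using an S3 Access Point with boto3.
# Account ID, bucket, and access point name are made-up placeholders.

def access_point_arn(region: str, account_id: str, name: str) -> str:
    """Build the ARN an application uses in place of a bucket name."""
    return f"arn:aws:s3:{region}:{account_id}:accesspoint/{name}"

def create_access_point(region: str, account_id: str, bucket: str, name: str) -> str:
    """Create an access point over a shared bucket and return its ARN."""
    import boto3  # imported here so the ARN helper above stays dependency-free
    s3control = boto3.client("s3control", region_name=region)
    s3control.create_access_point(AccountId=account_id, Name=name, Bucket=bucket)
    return access_point_arn(region, account_id, name)

# An application granted that access point could then read shared data with:
#   boto3.client("s3").get_object(
#       Bucket=access_point_arn("us-east-1", "123456789012", "analytics-ro"),
#       Key="reports/latest.csv")
```

The point of the pattern is that each consuming application gets its own named entry point with its own policy, rather than all applications sharing one bucket policy.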
Amazon Redshift Federated Query delivers an important capability: the ability to analyze data across multiple data sources from queries built within Redshift. This allows data to be combined from, for example, the data warehouse and a transactional system. (G2 Research called out the necessity of this type of capability in our recent forecast of 2020 trends in technology and ethics management.)
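The flavor of the feature, sketched below as SQL held in Python strings, is that a live transactional source is registered once as an external schema and can then be queried alongside warehouse tables; the endpoint, IAM role, secret ARN, and table names are illustrative assumptions, not from the announcement:

```python
# Illustrative Redshift Federated Query SQL, held as Python strings.
# Endpoint, role, secret, and table names are made-up placeholders.

REGISTER_SOURCE = """
CREATE EXTERNAL SCHEMA orders_live
FROM POSTGRES
DATABASE 'orders' SCHEMA 'public'
URI 'orders-db.example.internal' PORT 5432
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-federated'
SECRET_ARN 'arn:aws:secretsmanager:us-east-1:123456789012:secret:orders-db';
"""

# A single query can then join warehouse history with live transactional rows:
COMBINED_QUERY = """
SELECT h.customer_id, h.lifetime_spend, o.order_total
FROM warehouse.customer_history AS h
JOIN orders_live.orders AS o ON o.customer_id = h.customer_id
WHERE o.placed_at > current_date - 1;
"""
```

The combination shown here, historical warehouse data joined with same-day transactions, is exactly the warehouse-plus-transactional pairing the capability is aimed at.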
Amazon Aurora Machine Learning brings machine learning-powered predictions on relational data to databases and applications using familiar SQL, lowering the skills barrier to adopting ML.
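As a sketch of the idea, a prediction surfaces as an ordinary SQL function call inside a query. The function signature and the table and column names below are assumptions for illustration, not the documented API, which varies by Aurora engine:

```python
# Illustrative SQL in the spirit of Aurora Machine Learning, where ML
# predictions appear as SQL functions. The function name, table, and
# columns here are assumptions, not the documented interface.

SENTIMENT_QUERY = """
SELECT c.comment_id,
       aws_comprehend.detect_sentiment(c.body, 'en') AS sentiment
FROM customer_comments AS c
WHERE c.created_at > current_date - 7;
"""
```

No model training, hosting, or application-side inference code is involved; the developer writes SQL and the database brokers the call to the ML service.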
Amazon SageMaker Studio delivers a single environment for tracking and managing ML projects: it includes a model repository, data lineage to improve reproducibility, and integrated debugging and profiling. SageMaker Studio is, in our view, core to helping manage and govern the growing number of ML projects found in enterprises.
Amazon SageMaker Autopilot, a major announcement from re:Invent, offers users automatic model training and is another important step toward eroding the skills barrier to using ML. Integrated with SageMaker Studio, SageMaker Autopilot allows users to input data and specify a prediction target; the solution then trains multiple models to find the best fit in terms of both accuracy and latency. Nor are the results a black box: notebooks can be generated from the solution for audit and control purposes.
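The "input data and specify a target" workflow can be sketched with boto3's create_auto_ml_job call; the S3 paths, role ARN, job name, and target column below are made-up placeholders, and real jobs accept further tuning options omitted here:

```python
# Sketch of launching a SageMaker Autopilot job via boto3's
# create_auto_ml_job. Paths, role ARN, and names are placeholders.

def autopilot_job_spec(job_name: str, input_s3: str, output_s3: str,
                       target: str, role_arn: str) -> dict:
    """Build the request: point Autopilot at data and name the prediction target."""
    return {
        "AutoMLJobName": job_name,
        "InputDataConfig": [{
            "DataSource": {"S3DataSource": {
                "S3DataType": "S3Prefix", "S3Uri": input_s3}},
            "TargetAttributeName": target,  # the column Autopilot will predict
        }],
        "OutputDataConfig": {"S3OutputPath": output_s3},
        "RoleArn": role_arn,
    }

# spec = autopilot_job_spec("churn-v1", "s3://my-bucket/train/",
#                           "s3://my-bucket/output/", "churned",
#                           "arn:aws:iam::123456789012:role/sagemaker")
# boto3.client("sagemaker").create_auto_ml_job(**spec)
```

Everything beyond the data location and target column, including candidate model selection and tuning, is what Autopilot automates.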
An enterprise-ready tool kit for cloud computing
The addressable market for cloud computing (both existing on-premises workloads and net-new ones) remains a huge opportunity: for enterprises to transform the way they operate, and for vendors to gain market share. AWS took the opportunity of re:Invent to position its portfolio as one of the broadest and deepest ranges of cloud services available. It has extended the range of use cases the cloud can serve, delivering new compute, storage, analytical, ML, location, and developer products.
This note has covered a handful of the announcements that we believe are aimed at making cloud computing more digestible for enterprise IT. My colleague Adam Crivello recently wrote on the release of CodeGuru, and G2 will provide further analysis of other products over the coming weeks.
Tom is vice president of technology research at G2 leading three key topic areas: AI & analytics, cloud & IT, and security & privacy. Tom's entire professional experience has been in information technology where he has worked in both consulting and research roles. His personal research has focused on data and analytics technologies; more recently, this has led to a practical and philosophical interest in artificial intelligence and automation. Prior to G2, Tom held research, consulting, and management roles at Datamonitor, Deloitte, BCG, and Ovum. Tom received a BSc. from the London School of Economics.