Everything we learned at AWS re:Invent 2018

Every year AWS announces a raft of small upgrades to its cloud products and services portfolio during its re:Invent conference in Las Vegas, often making it tricky to nail down a prevailing theme or a 'big' announcement.

This year was no different, with announcements sprinkled across every day, from a new service to ease data transfer between satellites and ground stations, to more incremental updates to existing services, such as three new EC2 instance types and updates to the SageMaker machine learning platform.

That doesn't mean the vendor is losing its lustre, however: major step changes in blockchain, hybrid cloud, serverless computing and machine learning were all announced from the stage in Las Vegas last week. Here's what we learned.

AWS gets serious about hybrid cloud

This was the announcement that got plenty of people talking, as AWS made a more concerted push into the hybrid cloud space through its VMware partnership and a new product called AWS Outposts, a fully managed service in which customers get pre-configured hardware and software delivered to their on-premises data centre or colocation space, letting them run applications in a cloud-native manner.

AWS CEO Andy Jassy said customers wanted "a way to provide AWS services, like compute, like storage, on-premises but in a way that really seamlessly and constantly interacts with the rest of my applications and services in AWS".

Read next: AWS gets serious about hybrid cloud with Outposts, so who is it for?

Nick McQuire, head of enterprise and artificial intelligence research at CCS Insight, said: "The headline of the keynote was the fact that AWS is clearly doubling down on hybrid cloud and its partnership with VMware. AWS Outposts was arguably the show's key announcement and shows the firm is taking necessary and practical steps in delivering on critical customer needs for consistent hybrid cloud services. This is a major move in the firm's evolution in bringing AWS into the data centre."

Two types of user

Jassy used his keynote to earmark two distinct classes of AWS user today: the traditional 'builder' class of developers, and a growing class of enterprise users that value simple solutions over depth of product.

He described this new set of customers as less "interested in getting into the details of all of the services and stitching them together, they are willing to trade some of that flexibility in exchange for more prescriptive guidance that allows them to get started faster".

In the past this has included products like Elastic Beanstalk for deploying and scaling web apps, or SageMaker for simplifying the design and deployment of machine learning algorithms. This year AWS added Control Tower, Security Hub and Lake Formation.

Control Tower is billed as "the easiest way to set up, govern and secure a compliant, multi-account environment or landing zone on AWS", complete with policy guardrails and analytics for visibility into that environment.

Security Hub is a "central hub to view and manage security and compliance across an entire AWS environment," which integrates with a number of best-of-breed vendors, including Splunk, Alert Logic and IBM Security.

Lastly, there is Lake Formation, a tool for simplifying the creation of an enterprise data lake using a range of AWS tools and services. It promises customers the ability to set up a data lake in "days not months": a point-and-click interface identifies data sources, the service then automatically crawls schemas and sets metadata tags, and a list of prescriptive security policies can be put in place from day one.

Lake Formation is available today, while Control Tower and Security Hub are in preview.

Machine Learning

On the AI front, AWS unveiled Amazon Personalise, Amazon Forecast, a machine learning marketplace, a new inference chip, and some more granular updates to existing tools like SageMaker.

Personalise and Forecast are external versions of tools Amazon uses to build its own personalisation algorithms and procurement forecasting models. Both are available in preview.

With Personalise, you provide AWS with an activity stream from your application and an inventory of the items you want to recommend, and choose any additional data streams you want to include. The service then builds a recommendation engine that can be accessed with an API call. It is priced at $0.05 per GB of data ingested.

Forecast is a managed deep learning service for time series forecasting, aimed primarily at supply chain and retail use cases. "Using historical data and related causal data, Amazon Forecast will automatically train, tune, and deploy custom, private machine learning forecasting models, so that customers can be more confident that they’ll provide the right customer experience while optimising their spend," the vendor said.

For SageMaker there is a new automated labelling tool for building accurate training data sets called Ground Truth, and a set of toolkits for reinforcement learning (RL).

Another significant announcement was for a standalone AWS Marketplace for machine learning, where customers can browse for pre-built algorithms and solutions built by other customers or partners.

Lastly, the vendor also announced a new machine learning inference chip called Inferentia.

Storage

On the storage front, AWS announced Glacier Deep Archive, a new version of its Glacier archive storage option at roughly a quarter of the cost. Jassy hailed the service, coming in 2019 and priced at $0.00099 per gigabyte per month, as the death of the tape archive.
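
At that headline rate, the arithmetic behind the tape comparison is easy to check. A minimal sketch, covering storage cost only (retrieval, request and transfer charges are deliberately ignored):

```python
# Back-of-the-envelope cost check for Glacier Deep Archive at the
# announced $0.00099 per gigabyte per month. Storage cost only:
# retrieval, request and transfer charges are not modelled.

PRICE_PER_GB_MONTH = 0.00099  # USD, as announced at re:Invent 2018

def deep_archive_storage_cost(gigabytes: float, months: int = 12) -> float:
    """Return the storage-only cost in USD for the given size and duration."""
    return gigabytes * PRICE_PER_GB_MONTH * months

# Keeping 100 TB (102,400 GB) archived for a full year:
print(round(deep_archive_storage_cost(102_400), 2))  # → 1216.51
```

At well under $1 per terabyte per month, the comparison with maintaining a tape library is the point Jassy was making.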

The vendor also courted Windows users with the announcement of Amazon FSx for Windows File Server. "Amazon FSx provides shared file storage with the compatibility and features that your Windows-based applications rely on, including full support for the SMB protocol and Windows NTFS, Active Directory (AD) integration, and Distributed File System (DFS)," the vendor explained. It also announced FSx for the open source Lustre file system.

The vendor also announced a new type of database called Timestream for time series data. Jassy hailed this product as a "step change in the performance of your time series DB by orders of magnitude because we built it from the ground up to be a time series database, not a general store retrofitted to emerging needs."
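
The access pattern Jassy is describing, append-ordered writes and time-range reads per series, can be sketched in a few lines. This is purely a conceptual illustration of the time-series data model, not Timestream's storage engine or API, and all names below are invented:

```python
from bisect import bisect_left, bisect_right

# Conceptual sketch of the workload a purpose-built time series
# database optimises for: measurements appended in time order,
# queried by series name and time range.

class TimeSeriesStore:
    def __init__(self):
        self.series = {}  # series name -> list of (timestamp, value), time-ordered

    def write(self, name: str, timestamp: int, value: float) -> None:
        # Assumes appends arrive in timestamp order, the common IoT/ops case.
        self.series.setdefault(name, []).append((timestamp, value))

    def query_range(self, name: str, start: int, end: int):
        """Return all (timestamp, value) points with start <= timestamp <= end."""
        points = self.series.get(name, [])
        lo = bisect_left(points, (start,))
        hi = bisect_right(points, (end, float("inf")))
        return points[lo:hi]

store = TimeSeriesStore()
for t, v in [(100, 21.0), (160, 21.4), (220, 22.1), (280, 21.9)]:
    store.write("cpu_temp", t, v)

print(store.query_range("cpu_temp", 150, 250))  # → [(160, 21.4), (220, 22.1)]
```

A general-purpose relational store has to be coaxed into this shape with indexes and partitioning; a time series engine makes it the native layout.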

Blockchain

AWS also announced its first two dedicated blockchain services this year: Amazon Quantum Ledger Database (QLDB) and Amazon Managed Blockchain.

The first is a fully managed ledger database, with a central trusted authority and built-in cryptography so that all entries are immutable and transparent to everyone with permissions.

"We had an epiphany," AWS CEO Andy Jassy explained. "We had to build something like this ourselves a few years ago to have a transactional log for every data plane change to make operations and billing easier, so we didn't build that in a relational database and built what we call QLDB, an immutable, transparent ledger that we thought we could externalise."
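
The idea behind an immutable, transparent ledger can be sketched with a simple hash chain, where each entry embeds the hash of the one before it. This is a conceptual illustration only, not QLDB's actual implementation or API:

```python
import hashlib
import json

# Conceptual sketch of an append-only, verifiable ledger: each entry
# stores the hash of the previous entry, so tampering with any past
# record invalidates every hash that follows it.

class Ledger:
    GENESIS = "0" * 64

    def __init__(self):
        self.entries = []

    def append(self, record: dict) -> str:
        prev_hash = self.entries[-1]["hash"] if self.entries else self.GENESIS
        payload = json.dumps(record, sort_keys=True)
        entry_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
        self.entries.append({"record": record, "prev": prev_hash, "hash": entry_hash})
        return entry_hash

    def verify(self) -> bool:
        """Recompute every hash; any rewritten record breaks the chain."""
        prev_hash = self.GENESIS
        for e in self.entries:
            payload = json.dumps(e["record"], sort_keys=True)
            expected = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
            if e["prev"] != prev_hash or e["hash"] != expected:
                return False
            prev_hash = e["hash"]
        return True

ledger = Ledger()
ledger.append({"op": "create_instance", "id": "i-123"})
ledger.append({"op": "terminate_instance", "id": "i-123"})
print(ledger.verify())  # → True

# Any edit to an earlier record is detectable:
ledger.entries[0]["record"]["id"] = "i-999"
print(ledger.verify())  # → False
```

Because every hash depends on all prior entries, silently rewriting history is detectable by anyone with permission to re-verify the chain, which is the property Jassy's transactional log needed.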

The second product has been launched to help AWS customers run the two most popular blockchain frameworks - Hyperledger Fabric and Ethereum - with less of the heavy lifting. AWS creates the network and automatically scales to meet demand, complete with a central place to manage and maintain that blockchain network, track certificates, invite new members to join and see operational metrics.

Read next: AWS finally wades into blockchain with two new services

The managed blockchain service is available immediately for Hyperledger Fabric and coming for Ethereum in a couple of months.

Internet of Things

There was also a raft of IoT enhancements:

- IoT SiteWise, a managed service which collects, structures, and searches IoT data from industrial facility devices.

- AWS IoT Events promises to make it easy to detect and respond to changes indicated by sensors, such as a malfunctioning conveyor belt, to trigger alerts or actions.

- AWS IoT Things Graph is a drag-and-drop development environment for building low-code IoT applications, such as linking humidity sensors to sprinklers and weather data to create a smart agricultural application.

- Lastly, AWS IoT Greengrass Connectors promises developers the ability to connect third-party applications such as ServiceNow or Splunk via a set of common cloud Application Programming Interfaces (APIs).
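
The sensor-to-action pattern behind services like IoT Events and Things Graph can be sketched as a simple rules function. The sensor names, zones and thresholds below are illustrative inventions, not any AWS API:

```python
# Toy sketch of the detect-and-respond logic that AWS IoT Events and
# IoT Things Graph let customers express without writing code:
# sensor readings in, triggered actions out.

def evaluate_readings(readings, humidity_threshold=30.0):
    """Return the list of actions triggered by a batch of sensor readings."""
    actions = []
    for r in readings:
        # Dry field: start the sprinkler for that zone.
        if r["sensor"] == "humidity" and r["value"] < humidity_threshold:
            actions.append({"action": "start_sprinkler", "zone": r["zone"]})
        # Stopped conveyor belt: raise a malfunction alert.
        elif r["sensor"] == "belt_speed" and r["value"] == 0:
            actions.append({"action": "raise_alert", "zone": r["zone"]})
    return actions

readings = [
    {"sensor": "humidity", "zone": "field-1", "value": 22.5},
    {"sensor": "belt_speed", "zone": "line-3", "value": 0},
    {"sensor": "humidity", "zone": "field-2", "value": 55.0},
]
print(evaluate_readings(readings))
```

The pitch of the managed services is that rules like these are assembled in a drag-and-drop interface and run against live device streams, rather than being hand-coded and hosted by the customer.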

Serverless

For his Thursday keynote, CTO Werner Vogels naturally focused on serverless computing, a model in which customers run code, typically as Lambda functions, without having to provision or manage any underlying infrastructure.

Vogels is bullish on the model, and made a raft of granular announcements to help developers go serverless, such as Firecracker, an open source virtual machine monitor for spinning up microVMs, Ruby support for Lambda, and an AWS toolkit for popular integrated development environments (IDEs).
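
For context, the contract a Lambda function implements is small: the platform invokes a handler with an event payload and a context object, and the customer pays only while it runs. A minimal Python handler is shown below; the `name` field is a hypothetical part of the event payload, not anything AWS-defined:

```python
import json

# Minimal shape of an AWS Lambda function in Python. Ruby was added to
# the supported runtimes at re:Invent 2018, but the handler contract is
# the same across languages: an event payload and a context object in,
# a response out, with no servers for the customer to manage.

def lambda_handler(event, context):
    """Echo a greeting; 'name' is a hypothetical field in the event payload."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Invoking the handler locally the way Lambda would (context unused here):
print(lambda_handler({"name": "re:Invent"}, None))
```

Announcements like the IDE toolkits are about making this local write-invoke-deploy loop feel closer to conventional development.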

He also showed a slide of customers going serverless, and brought guitar maker Fender on stage to tell its serverless story.

Earlier in the week Computerworld UK reported how Danish web company Trustpilot had successfully gone almost completely serverless.

Read next: How Trustpilot takes a 'serverless first' approach to engineering with AWS

Jassy vs Ellison

Naturally Jassy couldn't resist aiming some digs at his old rival, Oracle, and its cofounder and CTO Larry Ellison, during his keynote.

"The world of old guard commercial-grade databases has been a miserable world for the last couple of decades for enterprises," he said. "That's because these old guard databases like Oracle and SQL Server are expensive, they have high lock in, they are proprietary and not customer focused. Forget the fact that they are constantly auditing you and fining you for some licence violation, but also they make decisions overnight that are good for them and not for you."

Amazon is in the highly public process of moving away from Oracle databases itself, something CTO Werner Vogels referenced in his keynote. Jassy tweeted earlier this month: "In latest episode of "uh huh, keep talkin' Larry," Amazon’s Consumer business turned off its Oracle data warehouse Nov 1 and moved to Redshift. By end of 2018, they'll have 88% of their Oracle DBs (and 97% of critical system DBs) moved to Aurora and DynamoDB. #DBFreedom."

For Ellison, it had been a source of some pride that even Amazon had to rely on Oracle's database technology, and reducing that reliance as quickly as possible has clearly been a major project for Jassy. By drawing attention to the migration, though, AWS has turned it into something of a PR hot potato. What's clear is that neither party can resist taking jabs at the other.

Ground station-as-a-service

One of the more left-field announcements came on the Tuesday, as Jassy addressed the assembled press to announce a new offering aimed at the space sector: AWS Ground Station.

Read next: AWS looks to the sky with new satellite 'Ground Station-as-a-service' offering

"What they tell us is that it's not so simple dealing with satellites if you want to upload and download data. You need a number of antennas and ground stations across the world," Jassy said. "Then if you uplink and downlink that data you need to write business objects and scripts and workflows to take and analyse that data and use it in applications. That also means if you want to take that data and use it, you need infrastructure to store, process and do analytics, which is all difficult and expensive."

The Ground Station-as-a-service offering works by receiving satellite data at one of a network of 12 AWS ground stations, processing it in an Amazon Elastic Compute Cloud (EC2) instance and storing it in Amazon Simple Storage Service (S3), where it can be mined for insights using various AWS analytics and machine learning services.

Copyright © 2018 IDG Communications, Inc.