Amazon re:MARS: a glimpse into a managed machine learning future?
Thursday, August 1, 2019
In early June 2019, I traveled to Las Vegas alongside some of my Liberty IT colleagues for a debutant on the AI conference circuit: Amazon re:MARS. As Amazon exec Dave Limp explained during the opening keynote, re:MARS is the first public version of Amazon’s MARS (machine learning, automation, robotics, and space) conference. MARS is usually a private event where a few hundred scientists, creatives, and business types are hosted by Jeff Bezos.
Compared to the gigantic scale of Amazon re:Invent, re:MARS had an almost intimate feel, though it was nevertheless a huge conference, with thousands of attendees and a selection of keynotes, talks, and panel discussions.
Builders and Dreamers
The target audience for re:MARS was ‘Builders and Dreamers’: people at the cutting-edge intersection of technology, inspiration, and creativity. The keynotes certainly reflected this, ranging from Walt Disney Imagineering talking about developing animatronic stunt performers, to Kate Darling of the MIT Media Lab talking about human–robot interaction and what it means for ethical AI, to Robert Downey Jr. mostly being Tony Stark and talking about the creativity behind the Marvel MCU.
While the keynotes themselves were fun and interesting, the core focus of the rest of re:MARS was absolutely on how to solve problems and deliver business value in an increasingly machine-learning-driven world. These sessions covered topics ranging from conversational interfaces, mitigating customer risk from potentially hazardous products, energy usage prediction, using neural interfaces in everyday life, and recommendation systems, to optimising how Amazon should make Lockers available for upcoming deliveries.
You will not be terribly surprised to learn that, this being an Amazon conference, all of these sessions had Amazon products and solutions at their heart. What was really interesting, however, was how Amazon is positioning itself to be instrumental in how businesses of varying scales approach and crack these problems.
A managed data science universe
As mentioned above, Amazon services, and AWS in particular, were at the crux of all of these talks, with the machine learning stack within AWS taking centre stage.
A common theme through the machine learning talks at re:MARS was the relentless pitch for the Amazon machine learning stack, and certainly the services available are impressive. Amazon have clearly identified that most businesses would rather buy than build a machine learning framework of their own, particularly one that offers enormous flexibility in terms of potential services, at a scale and with an ease that an in-house provider couldn’t possibly match, especially when it comes to getting a data science model out as a usable product. Of particular note is the push on Amazon SageMaker: a managed, end-to-end data science environment with various levels of abstraction that Amazon encourages data scientists to adopt over more traditional non-cloud approaches to data science work.
Beware The Citizen Data Scientist
Given the advent of such a managed data science framework, one of the major pitches Amazon is making for it is the concept of “the citizen data scientist”: the idea that, with SageMaker available, anyone can build machine learning models and push them into production with far less effort than is currently the case. Does this sound familiar? It echoes the push in recent years to “democratise” analytics, but may have far more unwelcome consequences.
While providing access to such powerful tools is laudable, there are inherent risks in making them available to those with limited experience of creating models, interpreting their output, and evaluating their performance and potential impact. If models created this way are developed and deployed in a cavalier manner, without significant safeguards, they may cause serious reputational damage to the businesses and other organisations that embrace this path. In my view, we should approach such solutions with extreme caution, and instead take a more measured approach to how data science and machine learning are made accessible within organisations. These are powerful tools and can offer enormous rewards, but only if used appropriately. Education and training will be key, but how many companies and organisations will jump on the bandwagon too quickly (particularly if they lack adequate data science functions), and what will be the outcome?
Of course, Amazon will not be the only tech giant to offer such a service going forward: it will become an increasingly important part of the machine learning arsenal in companies and elsewhere, but care needs to be taken in how it is used, and by whom. I will watch with interest to see how such services develop in the short to medium term.
Wrapping It Up
Was the re:MARS conference worth it? I think I’d have to answer yes: while it was certainly a marketing spot for AWS and SageMaker, there were some fascinating insights into how companies and academics are using machine learning (advanced and more traditional) on cutting-edge projects: how they tackle the problems, and how they make their ML/AI solutions usable as products. One way or another, we as data scientists have to ensure that what we make doesn’t languish as an arcane bit of R&D on a shelf, but becomes central to what the business provides. Amazon re:MARS certainly offers a vision of how that might be achieved, through the Amazon ecosystem at the very least.
Plus, I did get to sit in the Blue Origin capsule — which may be the closest this frustrated astronaut ever gets to space!