When I woke up this morning, I asked my assistant a simple question: “Siri, is it going to rain today?”
Siri understood my intent, pulled local weather data via an API, and answered me in less than two seconds: “There’s no rain in the forecast for today.”
In the not-too-distant past, this kind of human-computer interaction would have blown away technologists and delighted consumers — but in 2016, it’s nothing special. Conversations with Siri are commonplace, just like they are with Microsoft’s Cortana and Amazon’s Alexa.
Machine learning (ML) and narrow forms of artificial intelligence (AI) have officially reached the mainstream. The explosion of innovation we’re seeing in AI/ML stems from a series of rapid technological advances over the last few decades: widespread internet connectivity and the proliferation of online data, faster and cheaper computers (per Moore’s Law), variable-cost cloud computing, R&D investments from large technology companies, and a vibrant open-source software community.
We haven’t yet built HAL 9000, but we’re getting closer.