The machines haven’t taken over. Not yet, at least. But they are seeping into our lives, affecting how we live, work and entertain ourselves. From voice-powered personal assistants like Siri and Alexa to more fundamental, under-the-hood technologies such as behavioral algorithms, suggestive searches and self-driving vehicles with powerful predictive capabilities, there are plenty of examples and applications of artificial intelligence in use today.
However, the technology is still in its infancy. Much of what companies call A.I. today isn’t truly A.I. As a software engineer, I could claim that almost any piece of software “has A.I.” because it contains an algorithm that responds to pre-defined, multi-faceted input or user behavior. That isn’t A.I. in any meaningful sense.
A truly artificially-intelligent system is one that can learn on its own. We’re talking about neural networks from the likes of Google’s DeepMind, which can make connections and derive meaning without relying on pre-defined behavioral algorithms. True A.I. improves on its past iterations, getting smarter and more aware, enhancing both its capabilities and its knowledge.
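To make the distinction concrete, here is a minimal sketch in Python (with entirely made-up example data) contrasting a hard-coded, rule-based filter, the kind of “A.I.” many products actually ship, with a toy perceptron that adjusts its own weights from labeled examples:

```python
# Pseudo-A.I.: behavior is fixed by hand-written rules.
def rule_based_spam_filter(message):
    # The programmer pre-defines every response; nothing is learned.
    text = message.lower()
    return "free money" in text or "act now" in text

# A toy learning system: a perceptron that adjusts its own weights
# from labeled examples instead of relying on pre-defined rules.
VOCAB = ["free", "money", "act", "now", "meeting", "report"]

def featurize(message):
    words = message.lower().split()
    return [words.count(w) for w in VOCAB]

def train_perceptron(examples, epochs=10, lr=1.0):
    weights = [0.0] * len(VOCAB)
    bias = 0.0
    for _ in range(epochs):
        for message, is_spam in examples:
            x = featurize(message)
            predicted = (sum(w * xi for w, xi in zip(weights, x)) + bias) > 0
            error = (1 if is_spam else 0) - (1 if predicted else 0)
            # Update weights only when the prediction is wrong:
            # the model improves from experience, not from new rules.
            weights = [w + lr * error * xi for w, xi in zip(weights, x)]
            bias += lr * error
    return weights, bias

examples = [  # made-up training data
    ("free money act now", True),
    ("act now free money now", True),
    ("quarterly report meeting", False),
    ("meeting notes and report", False),
]
weights, bias = train_perceptron(examples)
x = featurize("free money now")
print((sum(w * xi for w, xi in zip(weights, x)) + bias) > 0)  # True: learned, not hard-coded
```

The rule-based filter only ever does what its author anticipated; the perceptron’s behavior comes from the data it was trained on, which is, in miniature, the machine learning discussed below.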
That type of A.I., the kind we see in wonderful stories depicted on television through the likes of HBO’s powerful and moving series Westworld, or Alex Garland’s Ex Machina, is still a long way off. We’re not talking about that. At least not yet. Today, we’re talking about the pseudo-A.I. technologies driving much of our voice and non-voice interactions with machines: the machine-learning phase of the Digital Age.
While companies like Apple, Facebook and Tesla roll out ground-breaking updates and revolutionary changes to how we interact with machine-learning technology, many of us are still clueless about just how A.I. is being used today by businesses big and small. How much of an effect will this technology have on our future lives, and in what other ways will it seep into day-to-day life? When A.I. truly blossoms, how much of an improvement will it be over the current iterations of this so-called technology?
Read the source article at Forbes.com.