Editor's note: AI Trends welcomes our new AI Insider, Dr. Lance Eliot. An old colleague of mine, Dr. Eliot has been covering AI for 25+ years. In fact, he used to write for a previous version of AI Trends too long ago to mention. Dr. Lance B. Eliot, MBA, PhD, is a serial entrepreneur, having launched, run, and sold three high-tech businesses, and currently serves as CEO of Techbrium Inc. He is especially known for his expertise in Artificial Intelligence and Machine Learning. He serves as a mentor for several incubators in Silicon Beach and Silicon Valley, is an angel investor in high-tech startups, and used to host the popular Technotrends radio show. His most recent book, a groundbreaking look at AI Guardian Angel Bots for AI Trustworthiness: Advances in AI and Machine Learning, details the next evolution of chatbots and AI and is available on Amazon. (Executive editor, Eliot Weinman)
By Dr. Lance B. Eliot
At the Consumer Electronics Show (CES), held January 5–8, 2017, the Self-Driving Technology Marketplace in the Las Vegas Convention Center (LVCC) had grown 42% over 2016, expanding to encompass the now-myriad car companies, tech companies, and ride-sharing companies all seeking the holy grail of producing a self-driving car. Major automakers, including BMW, Mercedes-Benz, Nissan, Ford, Chrysler, Hyundai, Honda, and Toyota, were there in force, offering the latest glimpse at where they believe self-driving cars are heading.
Besides showcasing futuristic cars at their oversized, crowd-luring booths, some companies even provided a “ride and drive” experience outside the LVCC (a sometimes-alarming drive by a self-driven car on the streets of Las Vegas). Though the chance to be a passenger in a self-driving car was an exciting prospect for attendees (there were often lengthy lines and waits to try one), there were also many reports of self-driving cars whose “emergency” human driver had to take over to avoid mishaps and roadway accidents.
Despite the loud hype that self-driving cars are nearly ready for prime time, the reality is that we are still years away from seeing true self-driving cars. A true self-driving car is often defined as a Level 5 on the scale of self-driving cars established by the Society of Automotive Engineers (SAE) and sanctioned by the federal government: a car that can be driven entirely without any human intervention. In short, anything a human driver can do, a self-driving car rated at Level 5 must be able to do. Currently, most of the self-driving cars under development are rated at Level 3 or Level 4, which allows for some amount of human intervention in the driving of the car.
One key design difference among the many self-driving car initiatives is the fundamental question of what the AI/human division of roles should be. Google's Waymo self-driving car aims at an all-AI, no-human role in driving the car. This means there are no driving controls within the car for the human occupant: no steering wheel, no brake pedal, no accelerator pedal. The human is along for the ride and plays no part in driving the car.
Nissan is also ultimately aiming at no internal controls for human occupants, but takes a different slant on the role of humans in the driving of a self-driving car. In particular, Nissan envisions a remote-control capability in which humans in a far-flung operations center can take over your self-driving car when the AI cannot figure out what to do. Borrowing from NASA's Mars rover operations, Nissan showed that it is adopting a NASA-created system that allows remotely situated human operators to be invoked when the self-driven car gets stymied.
Called the Seamless Autonomous Mobility (SAM) system, it is Nissan's answer to the open question of whether, and when, AI will be strong enough to drive a car autonomously in every situation. Nissan even has a catchy slogan for its futuristic cars: a goal of zero emissions (being all-electric vehicles) and zero fatalities (due to the self-driving and remote-operations capabilities).
Ford showcased a self-driving design that still allows an occupant to take over the controls of the car. Presumably, the human would do so when alerted by the AI system, or might also do so if, for whatever reason, the human decided they either had to take the wheel or simply wanted to drive. Hyundai is likewise providing for in-car human operation, and is also adding a special new function to its cars. The newest trend is a driver mood detection system that tries to discern the emotional state of the driver. Using biometric sensors, the car's AI will adjust the driver's seat to a different posture, alter the interior temperature, sounds, and lighting, and even generate a scent through the car vents, all to try to lift the driver's mood.
The mood adjustment features will likely begin to appear on many other car brands as a relatively easy gimmick that can be implemented readily, without having to ensure that a car is an actual self-driving car. Will human drivers want their car to detect their personal mood and then react by altering the vehicle's internal environment? Only time will tell whether car buyers and drivers perceive this mood-bending capability as worthwhile. As I drove back from Vegas, having lost quite a bit at the blackjack tables, I was not sure that a mood adjuster, had my car been equipped with one, would have made me happy. Meanwhile, contrary to the adage that what happens in Vegas stays in Vegas, the CES conference showed that what happens in self-driving cars will actually be spreading outside Vegas as we begin to see the latest in AI-based cars hit the roads.