By Dr. Lance Eliot, the AI Trends Insider
Do you care about the privacy of your data?
Most of us do, at least when we’re asked the question that bluntly, but the reality of our behavior frequently belies how much we really care about privacy. It’s amazing how much data some people are willing to give up about themselves in order to get a prize or some other perceived token of value.
For example, the other day I was at a mall that had a Tesla Model 3 on display, and people were waiting in line to be able to sit in the car. As each person came up in line, the Tesla staffer asked to see their driver’s license, even though none of them were going to actually drive the car, and then scanned the info from the license.
In exchange for giving up that data, along with answering some survey questions, which the staffer entered into a mobile tablet, each person got two minutes to sit inside the parked Tesla Model 3. That’s right, a timed maximum of two minutes. Not five minutes, not twenty minutes, not a test drive. Two minutes of sitting in a Tesla Model 3, in exchange for your driver’s license info and answers to survey questions. Is that a worthwhile exchange? Apparently so.
Admittedly it’s pretty good bragging rights currently to be able to tell your friends that you sat inside a Tesla Model 3, but is it worth giving up your private data to do so?
None of the people in line seemed to quibble over this; they were more than willing to give up their data. As far as I could tell, none of them asked what Tesla intends to do with the data. They had no idea whether Tesla alone would use the data, or perhaps provide it to third parties. I suppose they were so excited about the opportunity to sit in the Tesla that these questions did not occur to them. I also suspect that Tesla’s overall positive brand image is such that the general public assumes Tesla will make responsible use of the data.
I’ll widen this discussion by emphasizing that I’ve seen many circumstances of people giving out data for all sorts of seemingly trivial matters – tokens or prizes of far less “value” than the vaunted bragging rights of at least saying you sat in a Tesla Model 3. People seem willing to give their private info to just about anyone, especially if the circumstances of the requests are seemingly innocuous. How often have you wanted to download a free song on the Internet and had to provide your name, email address, phone number, date of birth, and maybe your street address? Or, get a nifty new emoji for your mobile phone and provide that kind of private data. And so on.
In those cases, it is often done on a web site that could be run by just about anybody. A nefarious hacker in a faraway land might be running the web site. Or, it might be a web site that seems reputable, but they intend to sell your info to anyone that wants to buy it, no questions asked. Your privacy is being poked at all the time, and if you give up even seemingly small morsels, the data collectors are often able to piece together different aspects about you into one cohesive whole. Remember that data you gave up a month ago? Well, the data you give up today can potentially be pieced together with it, and all of a sudden an astute data collector can garner a much greater sense of your life, such as what you like, where you live, and what you do.
People are nowadays voluntarily giving up private data about themselves and their cars.
You might have seen the ads by Progressive Insurance and other insurers that if you are willing to have a device put into your car (it’s a dongle, which connects into the OBD-II port inside of your car and usually under your dashboard, and the device can transmit data into the cloud), they will give you a potential discount on your insurance rates. This certainly seems logical in that if a car insurer knows more about your driving habits, they can tailor how much they charge you for insurance. In a sense, it’s good for the insurance company and presumably good for the driver, since the driver will get lower insurance rates (of course, you could also say that you might get higher rates in that if you are a “bad” driver then the insurance company would know as such and charge you more accordingly).
Progressive Insurance was one of the first car insurers to offer this approach, doing so starting in 1998. They have had millions of drivers that have made use of the service. Those millions of humans made a decision that they were willing to give up their private data in order to get hopefully discounted car insurance. What kind of private data? In theory, the data could include how much you drive, how fast you drive, how often you brake, how hard you brake, and so on. It could also potentially include where you drive, how long you stay there (in terms of the car being parked), and other kinds of location related data.
A Pew Research study done in 2016 analyzed what people had to say about their willingness to give up private data for discounted car insurance. Some of the survey respondents took a fatalistic perspective and said that there is no such thing as privacy anymore, so it didn’t really matter that they were giving up their driving info. Some said that they were perfectly fine with giving up the data about how fast they drive and how often they brake, but that the location-related data was a bit chilling to them. Nonetheless, they were still willing to give up the location data to get the discounted insurance rates. We don’t even really know how much money they saved for giving up the data, since you’d need to figure out what they would normally have paid and compare it to what they ended up actually paying. It could be that they saved only pennies, or maybe even paid more, rather than truly having saved big dollars for giving up their driving data.
What does this have to do with AI self-driving cars?
At the Cybernetic Self-Driving Car Institute, we are developing systems to help us humans be aware of how much data is being given up by our use of AI self-driving cars.
When you use a conventional car, by and large there’s not that much data that you are giving up. Unless you go out of your way to give up data, such as by agreeing to place an insurance company’s dongle onto your OBD-II port, the amount of private data you give up is going to be relatively modest.
You might enter your preferred radio channels into the on-board infotainment system. You might enter your friends’ addresses into an on-board navigation system. This data is usually kept within the confines of the car, unless you take some kind of overt action to share it. Indeed, some people sell their cars and forget to clear out the memory of the on-board infotainment system and the on-board navigation system, unknowingly giving up that private data to whomever next gets the car. I bought a used car that had a hundred or more contacts in the on-board navigation system, which I doubt the former owner realized would still be in there once the car came into my possession.
There’s another place in your car that has some private data you probably don’t even realize exists, namely the EDR (Event Data Recorder) or black box that’s in your car. Not all cars have an EDR, but nowadays most do. The EDR is similar to the black box that gets recovered when a plane crashes and investigators want to find out what the pilot and the plane were doing prior to the crash. In cars, the EDR tends to keep just a limited number of minutes of recording just prior to a crash, and it does the recording in a loop so that prior data is overwritten. Still, whether you know it or not, there is likely an EDR in your car, and once you get into a bad accident the authorities can try to get the EDR and see what it says about your driving activities.
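To make the loop-recording behavior concrete, here's a minimal sketch in Python of how an EDR-style ring buffer works. The class and field names are my own invention for illustration, not any actual EDR firmware:

```python
from collections import deque

class EventDataRecorder:
    """Illustrative sketch of an EDR-style loop recorder.

    Only the most recent samples are kept; anything older is
    silently overwritten, mirroring the few-minutes-before-a-crash
    window described in the article.
    """
    def __init__(self, max_samples: int):
        # deque with maxlen drops the oldest entry as new ones arrive
        self._buffer = deque(maxlen=max_samples)

    def record(self, sample: dict) -> None:
        self._buffer.append(sample)

    def snapshot_on_crash(self) -> list:
        # What investigators would retrieve: only the retained window.
        return list(self._buffer)

# With a 3-sample window, the two earliest samples are already gone:
edr = EventDataRecorder(max_samples=3)
for t, speed in enumerate([30, 35, 40, 45, 50]):
    edr.record({"t": t, "speed_mph": speed})
window = edr.snapshot_on_crash()
```

With a small window like this, anything older than the retained span is gone for good, which is why investigators get only the moments just before a crash.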
So, yes, there is some amount of private data in a conventional car, and though it generally isn’t being spread around, it’s possible for it to get spread. As conventional cars get more and more computers on them, the data collection increases, and so does the potential for data sharing.
AI self-driving cars are a bonanza of data collecting and data sharing. We are entering an era of big-time data collection by your self-driving car. Likewise, it will be big-time data sharing. Your self-driving car is going to be a tell-all.
Let’s consider why this is the case.
AI self-driving cars are being designed and fielded in a manner that has the AI system communicating with some cloud-based system setup by the auto maker or tech firm that developed the system. While the self-driving car is driving around, it is simultaneously transmitting data that indicates what it is doing. The cloud-based system uses that data to presumably provide added guidance to the AI self-driving car. By getting hundreds, thousands, and someday millions of self-driving cars contributing to a massive database, the database can be used to the benefit of each of those individual cars. Patterns can be found in the large database that are then fed to the individual cars.
Suppose your self-driving car is driving along a road and an aardvark leaps out into the street. Let’s assume that this is the first time that an aardvark has done this for any and all of the self-driving cars fielded by the auto maker. Your self-driving car feeds this to the cloud, and now all other of the self-driving cars can be fed the data about the aardvark. This provides a collective benefit to all other of the AI self-driving cars of that auto maker or tech firm.
Under this logic, all other self-driving cars of the auto maker benefit, which presumably will reduce the chances of auto accidents and increase driving safety, and so for that bona fide reason your AI self-driving car is going to be a tattletale. Are you OK with this? Are you even aware that it’s happening? I realize that you might say that sharing the picture of an aardvark seems pretty harmless from a privacy perspective. Why care about it? But the aardvark is just an example, and there’s a lot more data to be considered.
Your AI self-driving car will have all kinds of sensors, including cameras, radar, LIDAR, ultrasonic, along with GPS and other location identifying capabilities. In theory, everything that your car detects, all images and video that it captures during your driving journey, all radar data, and so on, can be readily shared to the cloud of the auto maker or tech firm. Your everyday use of the self-driving car can provide a moment-by-moment reenactment of where you’ve been, where you drove, how you drove, what you saw along the way, and a myriad of other facets.
An auto maker or tech firm might say that it is oversimplifying to suggest that all of that data will be shared into the cloud of the auto maker or tech firm. And, it is true that it is somewhat unlikely that all of that raw data would be uploaded. Instead, it is more likely to be summarized data or transformed data, rather than all of the pure raw stuff.
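To illustrate what "summarized or transformed" data might mean in practice, here's a hypothetical sketch that reduces raw trip telemetry to a handful of aggregates before anything leaves the car. The field names and the hard-braking threshold are assumptions for illustration, not any automaker's actual format:

```python
from statistics import mean

def summarize_trip(speed_samples, brake_events):
    """Reduce raw per-second telemetry to coarse trip aggregates.

    A sketch of the 'summarized or transformed data' idea: the cloud
    receives these few numbers rather than every raw sensor sample.
    The 0.5g hard-braking threshold is an assumed value.
    """
    return {
        "sample_count": len(speed_samples),
        "avg_speed_mph": round(mean(speed_samples), 1),
        "max_speed_mph": max(speed_samples),
        "hard_brake_count": sum(1 for g in brake_events if g > 0.5),
    }

# One short trip's worth of (made-up) raw data:
summary = summarize_trip([30, 35, 40, 35], brake_events=[0.2, 0.7, 0.3])
```

The privacy point is that even aggregates like these still reveal driving behavior, just at a coarser grain than the raw samples would.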
An auto maker or tech firm might also say that they are going to make the data into being anonymous or nearly so. When your self-driving car uploads the data, it might provide some kind of unique identifier assigned by the auto maker or tech firm, and nothing else about you per se. Presumably, this makes the data somewhat anonymous. Of course, it is likely that if the auto maker was pressed to do so, they could ultimately say who has that unique identifier and trace it back to your other personal data that they might have.
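Here's a sketch of the kind of pseudonymization being described, using a keyed hash so that the same vehicle always maps to the same identifier without the identifier itself revealing anything. The key and the scheme are purely illustrative:

```python
import hashlib
import hmac

# Hypothetical secret held by the auto maker -- whoever has this
# key can still link pseudonyms back to specific vehicles.
SECRET_KEY = b"automaker-held-secret"

def pseudonymous_id(vin: str) -> str:
    """Map a VIN to a stable pseudonym before data is uploaded.

    Illustrative only: HMAC-SHA256 keyed with the automaker's secret,
    truncated for readability. The upload carries this identifier
    rather than the VIN or the owner's name.
    """
    return hmac.new(SECRET_KEY, vin.encode(), hashlib.sha256).hexdigest()[:16]

# Same vehicle always yields the same pseudonym; different vehicles diverge:
a = pseudonymous_id("5YJ3E1EA7KF000001")
b = pseudonymous_id("5YJ3E1EA7KF000001")
c = pseudonymous_id("5YJ3E1EA7KF000002")
```

The caveat in the text applies here too: anyone holding the key can reverse the mapping simply by recomputing pseudonyms for known VINs, so the data is only "somewhat" anonymous.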
I’ll also point out that even if an auto maker or tech firm says that they will protect the data, you never really know whether they will follow through on that promise. There have been cases of firms outside the auto industry that said they would never reveal your private data, and then later on those firms got bought up by another company that decided to no longer honor that pledge. Or, there are cases where the pledge was made early on, but later the firm changed its mind and said it would start making use of the data. Even if the firm tells you it is changing the terms of use of the data, you as a consumer might not readily be able to switch away from its product or service, having gotten invested in it under one kind of promise.
Let’s add more fuel to the AI self-driving car privacy aspects.
We are gradually seeing more inclusion of in-car command and in-cabin data collection systems. For self-driving cars at Level 5, which can drive as a human can and need no human driver, the human occupants will presumably be able to instruct the AI about what they want it to do. Take me to Disneyland, and after that take me over to In-N-Out Burger, and then take me home, you might tell the AI. All of this is data that the self-driving car now has in its possession, including your actual voice instructions. The self-driving car might also have cameras pointed inward and be doing facial recognition, which is handy for you as the owner of the self-driving car because the AI automatically knows that you are in the car. But this is also more video and images that can be shared up to the cloud.
At levels less than Level 5, a human driver is required for the self-driving car, and the AI will hand over control to the human driver as needed. You’ve likely seen that, to try to ensure that human drivers maintain concentration on the driving task and do not drift from it, the auto makers and tech firms are putting devices into the self-driving car to monitor the driver. This includes eye scanners to make sure the human driver’s eyes are kept straight ahead and aimed at the roadway, steering wheel sensors to make sure the driver’s hands are on the wheel, and even facial recognition to try to determine your mood so as to set the climate control of the car accordingly. All of this generates data. All of it could be uploaded. I’m not saying it will be; I’m just pointing out that it could be.
The amount of data that could be collected is enormous. There’s data from the sensors of the self-driving car. There’s the sensor fusion that brings together the data from the sensors. There’s the AI-maintained virtual world model of the car and its surroundings. There are the AI action plans of what the system is going to do. There are the control-activation commands that the AI sends to the car. The on-board computer processors and memory that make all of this work also hold tons of potential data that can be shared outside of your self-driving car.
Keep in mind that this data sharing can apply to not just the owner of an AI self-driving car, but also to any human occupants. Your friend buys an AI self-driving car and takes you to the beach with him. The in-cabin data collection system now has possibly video of you, audio of you, and so on. Whether you realize it or not, by getting into that car you have essentially agreed to have your private aspects collected by the auto maker or tech firm.
Consider too that self-driving cars are going to be used for ride-sharing. Anyone that uses someone’s AI self-driving car will now be giving up private info about themselves, audio and video, and it could be paired to, say, the credit card you are using to pay for the ride. There you are, drunk from being at the bars late at night, and the self-driving car that gave you a lift has you in all your glory, drunk as a skunk and tossing that shrimp plate you had, all happening while heading home in the self-driving car. You might say that using Uber or Lyft today is the same, but it isn’t, in the sense that few conventional cars are set up to capture video and audio of the occupants of the car. It can be done, certainly, but it’s not the norm. With AI self-driving cars, it will pretty much be the norm. It will be a standard aspect of most AI self-driving cars.
We’re also going to have V2V (vehicle-to-vehicle) communications taking place with AI self-driving cars. This means that one car can share data with another car. This will be handy when you are in traffic and a self-driving car ahead of you informs your self-driving car that traffic is snarled. But notice that this also means that the other self-driving car knows generally where your self-driving car is, and can possibly collect other data about your self-driving car too. I realize that limits can be imposed on this, so I’m just pointing out that it’s more data about you and your self-driving car.
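As a rough illustration, a V2V traffic alert might carry a payload along these lines. This is a hypothetical sketch, not the actual SAE J2735/DSRC message format that real V2V systems use; the point is that even a minimal alert reveals the sender's position to every receiver in range:

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class V2VTrafficAlert:
    # Hypothetical broadcast payload -- illustrative only, not the
    # real SAE J2735 / DSRC message set.
    sender_pseudonym: str  # a rotating pseudonym rather than a fixed identity
    lat: float             # even a rough position discloses where the car is
    lon: float
    event: str             # e.g., "traffic_snarled"
    timestamp: float

alert = V2VTrafficAlert("veh-3f9a", 34.0522, -118.2437,
                        "traffic_snarled", time.time())
wire_payload = json.dumps(asdict(alert))  # what every receiver in range gets
```

Real deployments try to blunt the tracking risk by rotating the sender pseudonym frequently, but the location fields are inherent to the alert being useful at all.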
The same can be said of V2I (vehicle-to-infrastructure) communications, in which your AI self-driving car will communicate with roadways, bridges, and other parts of the infrastructure. It could allow for a much easier way to trace your movements. We already have a lot of video cameras out there, watching us, but this V2I is going to be everywhere. Almost as in a science fiction movie, the movements of your self-driving car are going to be easy to trace.
We are seeing various regulations emerging to try to deal with this torrent of private data from AI self-driving cars, but it’s still in its infancy. There will be rules and regulations at the local level, state level, and federal level, differing too by country. It remains to be seen whether the regulations can be established at the pace of the advent of AI self-driving cars. Even if there are regulations, there will need to be some means to determine whether the regulations are being abided by. How will we know that data privacy laws are not being subverted?
One aspect about much of this is that you might not even be aware of what data is being shared about you. You’ll get into an AI self-driving car and have no ready means to realize what data is being recorded, what data is being kept, and what data is being shared. When you visit a web site, it might say that there are cookies there to improve your experience, which also means that you are being tracked. Some web sites warn you, some do not. Will our AI self-driving cars warn us about what data is being collected and what data is being shared? And, for shared data, will it be able to indicate in what manner and to whom it will be shared?
There are also the data exhaust aspects. Data exhaust refers to the byproduct data generated by your actions and preferences: log files, temporary files, and the data generated for each transaction or process you invoke. If you use your AI self-driving car to take a toll road, the record of that trip is a kind of data exhaust, a byproduct of your using the car (when you entered the toll road, when you exited, etc.). When you park your self-driving car at a parking meter, there’s data about when you parked, how long you parked, and so on.
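The toll road case is a good concrete example. Here's a sketch of the kind of byproduct record a toll system might retain from a single trip; all of the field names are hypothetical:

```python
from datetime import datetime

def toll_transaction_log(plate_pseudonym, entry_time, exit_time,
                         plaza_in, plaza_out):
    """Build the 'data exhaust' record left behind by one toll-road trip.

    Illustrative only -- real toll systems have their own schemas.
    Note how much is inferable from a billing byproduct: where you
    entered, where you exited, and how long you were on the road.
    """
    return {
        "vehicle": plate_pseudonym,
        "entered": entry_time.isoformat(),
        "exited": exit_time.isoformat(),
        "entry_plaza": plaza_in,
        "exit_plaza": plaza_out,
        "minutes_on_road": round(
            (exit_time - entry_time).total_seconds() / 60, 1),
    }

# One made-up trip:
record = toll_transaction_log(
    "veh-3f9a",
    entry_time=datetime(2018, 5, 1, 10, 0),
    exit_time=datetime(2018, 5, 1, 10, 24),
    plaza_in="Plaza 4",
    plaza_out="Plaza 9",
)
```

Nobody set out to track you here; the record exists because the transaction had to be billed, which is exactly what makes data exhaust easy to overlook.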
Another twist is the computer security aspects. So far, we’ve assumed that the legitimate auto maker or tech firm is collecting the data from your AI self-driving car. Suppose a hacker has cyberattacked your self-driving car and is able to grab the data. Or, a hacker has gotten into the cloud of the auto maker or tech firm and grabbed the data from there. Maybe the hacker gets malware downloaded into your self-driving car and can pretty much continuously monitor everything that the AI self-driving car can see and is doing. Proper computer security practices for AI self-driving cars are essential not only for the safety of the self-driving car and its human occupants, but also for their privacy.
I’ve referred here to the auto maker or tech firm that developed the AI self-driving car, but there’s a slew of other companies that might be involved in getting at your data. The infotainment system of your self-driving car is likely connected with entertainment firms that are getting data about you. I’ve already mentioned that insurance companies are likely to want this data. Your car dealer will want this data. When you take your car into an auto repair or maintenance shop, they could want the data. Other third parties might want the data. In essence, there is an entire army of various firms that will want to tap into the data of the self-driving car.
The AI self-driving car should be designed to try to prevent intruders from getting data, and also to allow for legitimate uses of the data, when properly required. If a car mechanic is trying to fix the engine of the car, there presumably should be no valid reason for them to also be able to take a look at, say, video or audio that the self-driving car recorded during your most recent trip. In that sense, access to the data should not be wide open; it should be structured into protective coves.
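The "protective coves" idea amounts to role-based access over categories of vehicle data. Here's a minimal sketch, with entirely hypothetical roles and data categories:

```python
# Hypothetical vehicle data, partitioned into categories ("coves"):
VEHICLE_DATA = {
    "diagnostic_codes": ["P0301"],          # what a mechanic needs
    "cabin_video": "<video blob>",          # what nobody else should see
    "trip_locations": [(34.05, -118.24)],   # movement history
}

# Which categories each role may touch -- illustrative policy only:
ROLE_PERMISSIONS = {
    "mechanic": {"diagnostic_codes"},
    "owner": {"diagnostic_codes", "cabin_video", "trip_locations"},
}

def data_for_role(role: str) -> dict:
    """Return only the data categories the given role is permitted to see.

    Unknown roles get nothing, making deny-by-default the baseline.
    """
    allowed = ROLE_PERMISSIONS.get(role, set())
    return {k: v for k, v in VEHICLE_DATA.items() if k in allowed}
```

A mechanic role gets diagnostics and nothing else, which is the protective-cove notion in miniature: access follows the legitimate need, rather than the data being one undifferentiated pool.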
Suppose you want to give up your data, such as if you’d recently driven to the Grand Canyon and you want to share the video captured by the AI self-driving car cameras with your friends. Or, maybe you have your own YouTube channel and want to show your latest driving trip along Route 66. Most of the auto makers and tech firms are currently saying that you won’t be able to get that data. Instead, they are indicating that the data is solely for the purposes of the AI self-driving car and the auto maker or tech firm. We’ll see how long that position lasts; once self-driving cars become more prevalent, there might be a push by consumers (and regulators) to alter that stance.
Your self-driving car can be a handy “spy” on others too. Suppose we have several AI self-driving cars driving around a neighborhood. These are handily simultaneously taking video, radar, LIDAR, and the like. With this, you could figure out when your neighbor Joe walks his dog. You could piece together the data by time and by location and assemble an impressive indication of what’s happening in your neighborhood. You might argue that’s not what the self-driving cars are intended to do, but it is something they could be made to do. And, if we’re going to have self-driving cars driving around without any human occupants inside, you’d assume it’s just a self-driving car making its way home (and be none the wiser about its spying mission).
If you think this notion is farfetched, take a look at Google Maps and switch over to Street View; this gives you an inkling of what it will be like, though that picture was taken by one Google vehicle maybe a month ago as a snapshot in time, while in the future it will be video taken by the thirty self-driving cars that drove down that same street today.
We have millions of people carrying smartphones that take video. We have cameras on buildings that take pictures. When an incident happens, we all seek out those sources to figure out what occurred. Imagine, then, hundreds or thousands of AI self-driving cars, all driving around throughout the day and night: we will have lots and lots of that kind of data, available from all kinds of locations at all times of day.
You could say that we should restrict AI self-driving cars into immediately discarding any video or images, or other such data, and that it can only be used for the immediate task of driving the car. This has its downsides. We might want the data for purposes of sharing to benefit others, as described earlier, and it might be needed for purposes of dealing with any car accidents as to what happened, and so on. It also could be hacked to possibly violate this. I don’t think there’s an easy answer.
As an example of the kinds of disclaimers that the existing auto makers are using, let’s take a look at the Tesla Model S. Their approach is generally similar to the other auto makers right now and a handy exemplar. By the way, the word “telematics” is commonly used by the auto makers to refer to the system that collects data from your car and uploads/downloads it.
Here’s this from the owner’s manual and it informs you about the range of data that can be collected:
“Model S is equipped with electronic modules that monitor and record data from various vehicle systems, including the motor, driver assistance components, battery, braking and electrical systems. The electronic modules record information about various driving and vehicle conditions, including braking, acceleration, trip and other related information regarding your vehicle. These modules also record information about the vehicle’s features such as charging events and status, the enabling/disabling of various systems, diagnostic trouble codes, VIN, speed, direction and location.”
This then describes where the data might go and why:
“The data is stored by the vehicle and may be accessed, used and stored by Tesla service technicians during vehicle servicing or periodically transmitted to Tesla wirelessly through the vehicle’s telematics system. This data may be used by Tesla for various purposes, including, but not limited to: providing you with Tesla telematics services; troubleshooting; evaluation of your vehicle’s quality, functionality and performance; analysis and research by Tesla and its partners for the improvement and design of our vehicles and systems; and as otherwise may be required by law. In servicing your vehicle, we can potentially resolve issues remotely simply by reviewing your vehicle’s data log.”
“Tesla’s telematics system wirelessly transmits vehicle information to Tesla on a periodic basis. The data is used as described above and helps ensure the proper maintenance of your vehicle. Additional Model S features may use your vehicle’s telematics system and the information provided, including features such as charging reminders, software updates, and remote access to, and control of, various systems of your vehicle.”
Notice that the data can be provided to the partners of the auto maker. Though this is typical, and the same is the case with say credit card companies that share data with “partners,” it is something that tends to leave consumers in the dark as to specifically who the partners are, what data is being shared, how often it is being shared, etc.
In terms of disclosure of the data, here’s what the owner’s manual says:
“Tesla does not disclose the data recorded in your vehicle to any third party except when:
- An agreement or consent from the vehicle’s owner (or the leasing company for a leased vehicle) is obtained.
- Officially requested by the police or other authorities.
- Used as a defense for Tesla in a lawsuit.
- Ordered by a court of law.
- Used for research purposes without disclosing details of the vehicle owner or
- Disclosed to a Tesla affiliated company, including their successors or assigns, or our information systems and data management providers.
In addition, Tesla does not disclose the data recorded to an owner unless it pertains to a non-warranty repair service and in this case, will disclose only the data that is related to the repair.”
In terms of being able to opt-in or opt-out, here’s what the owner’s manual says:
“For quality assurance and to support the continuous improvement of advanced features such as Autopilot, Tesla measures road segment data of all participating vehicles. All Tesla vehicles can learn from the experience of the billions of miles that Tesla vehicles have driven. Although Tesla shares this data with partners that contribute similar data, the data does not include any personally identifiable information about you or your vehicle. To allow data sharing, touch Controls > Settings > Safety & Security > Data Sharing, then touch the I agree checkbox to confirm that you agree to allowing Tesla to collect this data. Note: Although Model S uses GPS in connection with driving and operation, as discussed in this owner’s manual, Tesla does not record or store vehicle-specific GPS information. Consequently, Tesla is unable to provide historical information about a vehicle’s location (for example, Tesla is unable to tell you where Model S was parked/traveling at a particular date/time).”
If you are interested in a particular AI self-driving car, you’d be wise to take a close look at the owner’s manual to see what the auto maker or tech firm states about the data aspects. In some cases, the indications can be found via the online system within the vehicle, and in some cases the AI might even be able to explain to you what the self-driving car does with the data.
For those of you that say you don’t care about giving up your data, or those that are fatalistic and say that data is already now impossible to keep private, I suppose none of the foregoing is important to you. I’d guess that for the rest of us, we are mindful of the data about us. As you can see, it’s going to be a long road ahead for trying to deal with the private data that our AI self-driving cars are going to have. Let’s be on our guard.
This content is originally posted on AI Trends.