By Dr. Lance B. Eliot, the AI Trends Insider
I recently served on a panel of speakers at a self-driving car event in Silicon Valley and met a fellow panelist, Assemblyman Marc Berman, known for his savvy awareness about self-driving cars. (In photo above, Marc Berman is on left; Dr. Lance Eliot is second from right.) During the panel, Marc offered numerous insights that showcased his prowess in understanding the nature of where self-driving cars are today and where they are heading. His kind of expertise about self-driving cars is sorely needed by our regulators at all of the state, federal, and local levels.
This new innovation of self-driving cars will either be helped or possibly hampered by regulations, and so regulators somehow need to get up to speed, much as this Assemblyman representing the 24th District in California is doing (his district covers essentially the heart of Silicon Valley, including Palo Alto, Sunnyvale, and parts of Santa Clara County and San Mateo County). You might want to take a look at the TEDx talk that he gave on driverless cars back in February 2013. He has been thinking about self-driving cars for a while and has his finger on the pulse of the ways in which driverless cars will impact our society.
Indeed, Assemblyman Berman has launched an initiative entitled “Silicon Valley 2028” that includes enabling the safe deployment of self-driving cars. He is urging his fellow legislators to aid in the development of a comprehensive legal framework covering self-driving cars. Readers of my column have likely already seen my several pieces about the upcoming legal issues that will emerge as self-driving cars begin to actually appear on our roadways. Right now, the laws about self-driving cars are patchy at best. Self-driving car makers are in fact worried that what might be legal for a self-driving car in one state might turn out to be illegal in another state. This would be problematic for the selling of self-driving cars and for consumers that want to buy self-driving cars and drive them from one state to the next.
Imagine that you were driving your car today from, say, California to Nevada (a popular drive for those in Southern California who favor visiting Las Vegas and trying their hand at the betting tables). Suppose that California driving laws stated that you had to drive a certain way, and suppose further that Nevada laws required you to drive in a different manner in terms of the rules-of-the-road. It would be difficult for you as the driver to make sure that you switched over to the rules of Nevada when you crossed the state border. This is the same kind of problem that we might face with self-driving cars. If each state decides to go in a particular direction on what it expects a self-driving car to do, the myriad of disparate regulations could make life arduous both for the makers of self-driving cars and for those who buy one and want to travel from state to state.
Now, when I’ve mentioned this concern about the state-to-state disparity in self-driving car regulations, some techies instantly retort that the beauty of a self-driving car is that it should presumably be easily reprogrammable such that it can adapt to multiple sets of regulations. In other words, the AI should be set up to have a module that knows about the driving regulations in California. Once the car reaches Nevada, it should simply swap out the California module and plug in the Nevada module. This might be akin to having two human drivers in a conventional car, one licensed to drive in California and one licensed to drive in Nevada. Each takes the wheel when in the state for which they are licensed to drive.
Yes, I realize that this notion of per-state driving modules is certainly feasible. But what is missing from that discussion is this: suppose one state says that a self-driving car can use LIDAR (see my column on LIDAR technology for self-driving cars), and another state bans its use. If the self-driving car you are in happens to use LIDAR, and it crosses into a state that has banned it, the self-driving car might now be partially blind to the roadway (see my column on going blind when sensors of the self-driving car aren’t functioning). I am not saying that any state has currently enacted such a ban, but merely pointing out that the regulations in a given state could be so onerous that they cripple or undermine a self-driving car that is readily allowed to drive in another state.
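To make the module-swapping idea concrete, here is a minimal sketch of how a per-state regulation registry might look, including the sensor-legality wrinkle just described. Everything here is invented for illustration — the state names, speed limits, and especially the hypothetical LIDAR ban do not reflect any actual law:

```python
# Hypothetical sketch: per-state driving-rule "modules" that an AI could
# swap at the border, plus a check on whether the car's sensor suite is
# legal in a given state. All rule values are invented, not actual law.

from dataclasses import dataclass, field

@dataclass
class StateRules:
    """A per-state regulation module the AI would load when crossing a border."""
    name: str
    max_speed_mph: int
    banned_sensors: set = field(default_factory=set)

# Registry of rule modules, keyed by state abbreviation (illustrative only).
RULES = {
    "CA": StateRules("California", max_speed_mph=70),
    "NV": StateRules("Nevada", max_speed_mph=80,
                     banned_sensors={"lidar"}),  # hypothetical ban
}

def can_operate(sensors: set, state: str) -> bool:
    """True if none of the car's sensors are banned in the given state."""
    return not (sensors & RULES[state].banned_sensors)

car_sensors = {"lidar", "radar", "camera"}
print(can_operate(car_sensors, "CA"))  # True
print(can_operate(car_sensors, "NV"))  # False under the invented ban
```

The point of the sketch is that swapping a rules module is the easy part; the hard part is that a banned sensor cannot simply be "unloaded" — the car would have to drive partially blind or not enter the state at all.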
I am an adviser to the U.S. congressman who serves as Vice Chair on the congressional science and technology committee, and so I try to keep well-informed of what the feds and the states are doing in terms of technology-related regulations. Recently, the Senate Committee on Commerce, Science, and Transportation held a hearing on the topic of self-driving cars. Testimony by several industry notables kept coming back to the suggestion that a single, unified national standard for self-driving car regulations is needed.
For example, currently each state is establishing its own policies and procedures on the testing of self-driving cars. Some states don’t have any stated policies and procedures. States that do have on-the-books policies and procedures have at times crafted those regulations from scratch and so the regulations are not as robust as they could be. Besides the regulations varying from state to state, there are at times provisions in one that are not found in another. Some critics say it is the wild, wild west right now in terms of what self-driving car makers can and cannot do, within each state, and this makes it more costly and confounding to be an innovator trying to make self-driving cars.
Safety guidelines are probably the most important of the regulations to be passed. The state of Washington recently passed new regulations about self-driving cars. One aspect that registered some controversy is the indication that there does not need to be a human driver ready to take over from the AI of the self-driving cars being tested on the public roadways of Washington. Critics liken this looseness, along with the fact that Washington has formed a state-wide committee to further study the matter, to the approach Arizona established in 2015. Critics say that Arizona has been overly lax in its regulations; meanwhile, Arizona has attracted many of the self-driving car makers due to its less-is-more mindset of imposing minimal regulatory hurdles on self-driving cars.
If you look closely at the Executive Order 17-02 “Autonomous Vehicle Testing & Technology in Washington State” you’ll see that it offers two different kinds of allowed pilot programs for self-driving car testing. One pilot program involves the “safe testing and operation of autonomous vehicles with human operators present” and requires that the vehicle be monitored and operated “only by a trained employee, contractor, or other person authorized by the entity developing autonomous technology.” Furthermore, the human operator must have a valid U.S. driver’s license and the vehicle owner must provide proof of financial responsibility for the car. This is similar to the self-driving car regulations in California.
The other pilot program in the Washington law involves “autonomous vehicles without human operators present,” which is the controversial element of the Washington state law. This pilot program says that “vehicles shall be equipped with an automated driving system that performs all aspects of the driving task on a part- or full-time basis within the vehicle’s operational design limits, and it must be capable of bringing the vehicle to safe condition in the event of a system failure.” You might recognize this language as being similar to the Society of Automotive Engineers (SAE) provisions for the levels of self-driving cars (see my column on the Richter scale for self-driving cars).
The reason that there are opponents of this second type of allowed pilot program is that it presumably means that we’ll have self-driving cars being tested on the public roads of Washington and yet not need to have a human operator ready to intervene when needed. In the levels of self-driving cars, it is only the true self-driving car of Level 5 that indicates no human driver is needed and that the car must be able to drive in whatever manner a human driver can drive. We are still a long way away from having a Level 5 self-driving car. Furthermore, notice that the provision says “part- or full-time basis” which implies that the self-driving car does not need to be full-time devoted to the driving task of the car. What will happen when the self-driving car reaches a point of not knowing what to do and tries to hand the controls over to a presumed human operator?
Further criticism is aimed at the provision because it says that “developing entities shall self-certify to the DOL [Department of Licensing] that they are compliant.” Notice that it says that the self-driving car maker is able to self-certify. This implies that the state government is not going to do the certification itself to ensure that the self-driving car can safely operate without a human operator present. Those critical of this provision say that it is allowing the wolf to declare it is safe to be amongst the sheep. Governor Jay Inslee of the state of Washington has indicated that he purposely wanted to allow for a “relatively light touch” with the regulations imposed upon the self-driving car industry.
Assemblyman Marc Berman, in his remarks on the panel that he and I served on, rightly expressed that there needs to be a balance between having too few regulatory requirements and too many. He has called for self-driving car stakeholders to come together and derive consistent and workable regulations. This is a welcome outreach.
If we have too few regulations, it could regrettably lead to life-threatening or even fatal situations during this pioneering era of self-driving car emergence. The result could be a stampede toward shutting down the self-driving car parade. We could end up seeing self-driving car innovation come nearly to a halt and progress hindered for years to come. At the same time, if regulations are overly restrictive at this time, it could stunt the innovation before it gets a chance to get off the ground. We need the proverbial “Goldilocks” set of legal provisions, ones that are just right, and they need to be adopted across the board at the state, federal, and local levels.
Many regulators seem to think only in terms of one end of the spectrum or the other. Either a thousand pages of regulations are needed and every tiny bit of legal ground must be covered, or there is a one-page regulation that leaves interpretation wide open and can produce high safety risks. These extremes aren’t a good way to go on this. All it will take is one bad apple of a self-driving car maker that happens to put a car onto the roadway that kills someone in a jurisdiction that allowed for a wide-open landscape, and then all the other states and the feds will be forced to step into this like a ton of bricks. Before we get to that unfortunate and undesirable juncture, it would be better to identify and codify an appropriate balance and put regulations onto the books that we can all satisfactorily live with.
As mentioned at the start of today’s column, we need more regulators like Assemblyman Marc Berman. In my role as the Executive Director of the Cybernetic Self-Driving Car Institute, I urge regulators from other states who want an informed fellow legislator to brainstorm with on this topic to give him a call. His being based in Silicon Valley partly explains his penchant for new technology like self-driving cars. That being the case, it makes sense to leverage his expertise and passion for this topic. If Silicon Valley can be the generator of AI and innovations for making self-driving cars, it can also be the innovator and guiding light as to what self-driving car regulations should look like.
This content is original to AI Trends.