Google’s Driverless Car Technology Threatens New Wave of Privacy Issues

Google’s driverless car.
Forbes has revealed that Google may be pressing ahead with plans to manufacture and distribute driverless cars on its own without the aid of Detroit’s aging automobile industry. In an earlier story, Forbes singled out liability and regulatory issues that may hinder the production and sale of driverless cars; nonetheless, several states have now passed legislation authorizing the use of their road systems for experimental development of driverless technology.

It is an idea taken directly out of science fiction: self-driving taxicabs and personal vehicles for the wealthy and the mobility-impaired could soon be traveling America’s highway system. Google claims that driverless technology would actually make vehicles safer than human-driven cars. But skeptics ask whether that is really the case.

No one doubts that computers will never fall asleep at the wheel, or that machines can react with consistent precision when an emergency develops. Human performance behind the wheel can be impaired by illness, alcohol, drugs, physical injury, and even an idle conversation with passengers. Computers suffer from none of those weaknesses.

But could driverless cars be weaponized by terrorists? one skeptic asks. Imagining a scenario in which enemies of the nation infiltrate a civilian fleet of self-driving cars, people question the rationale behind the widespread adoption of driverless vehicles. With virtually no safeguards against tampering in place, driverless cars could be turned into zombies by malicious software injected into their systems at repair and maintenance facilities, or even over home networks.

A zombie car could be packed with explosive materials and turned into a self-guided bomb; or it could be used to crash into other vehicles, public buildings and private homes, school playgrounds, power sub-stations, and other critical facilities. Imagine a driverless car pushing its way into a hospital emergency room, a grocery store, or a library filled with people. What is to prevent such a nightmare scenario?

Drawing a little upon swarm theory, skeptics also insist that driverless cars could be managed from remote locations: special civil command and control centers with the authority to disperse driverless cars stuck in a traffic jam in order to relieve congestion. Such systems, it is supposed, could be compromised or co-opted to create swarming fleets of driverless cars that set up roadblocks, run interference for criminal operations, attack police vehicles and facilities, or bring whole cities to a halt by creating traffic jams and blocking important entryways.

Before allowing self-driving vehicles onto our highways, Congress needs to consider the safety risks entailed by robotic transportation that can travel virtually anywhere. It needs to pass legislation to protect civilians from weaponization, and it needs to provide for frequent inspection and oversight to ensure the integrity of driverless car command and control systems, both onboard and in any civil engineering operations centers.

And most importantly, law enforcement must be charged and empowered to develop effective countermeasures against weaponized driverless vehicles. Putting this technology into consumers’ hands places military-capable technology within reach of anyone, including drunken teenagers, militant racists, jealous spouses, bank robbers, and terrorists.

The future may be filled with robotic cars, but they will need to share that future with governing mechanisms that give the human population adequate protection against militant conversion of those robots.

To effect these measures, governments may have to require more rigorous and more frequent background checks, and citizens may have to agree to allow their automated vehicles to be monitored for public safety and security. An American public already worn down by the recent debates over government monitoring of private information may not be as receptive to such oversight of dangerous robotic vehicles as it needs to be.