Economist Robin Hanson recently sparked debate on social media, questioning the conventional expectation for autonomous vehicles (AVs) to strictly obey traffic laws. "Do we actually want self driving cars to obey traffic laws? Seems not. But we are in denial about this," Hanson stated, highlighting a core tension in the development and regulation of self-driving technology. This provocative statement comes amidst ongoing discussions about how AVs should navigate real-world driving scenarios, where human drivers often deviate from strict rules for perceived safety or efficiency.
Traffic legislation in most jurisdictions, including the US and Australia, defines a "driver" as a "person," creating a significant "driver dilemma" for autonomous systems. This legal framework complicates enforcement and liability for vehicles operating without direct human input. Experts note that while AVs are designed to adhere to regulations, the laws themselves were not written with machine intelligence in mind, leaving potential ambiguities.
The primary motivation for autonomous vehicle technology is to reduce traffic collisions: the National Highway Traffic Safety Administration (NHTSA) has estimated that driver-related error is the critical reason in approximately 94% of motor-vehicle crashes. Proponents argue that AVs, by eliminating human distraction and strictly following the rules, will drastically improve road safety. However, some research suggests that AVs will need to do more than obey laws; they must adapt to road conditions and anticipate human behavior, which might sometimes mean technically bending a rule in the interest of overall safety.
The debate extends to ethical considerations, particularly how risk is distributed in everyday traffic. While AVs are expected to be more orderly, the question arises whether strict adherence to every letter of the law is always the safest or most efficient course of action in dynamic environments. A human driver might, for instance, briefly exceed the speed limit to complete a merge safely; an AV programmed never to exceed the limit could be unable to make that maneuver, potentially creating new risks. The evolving legal landscape also grapples with assigning liability when an AV is involved in an incident, often holding AV manufacturers to a high standard.
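The merge example can be made concrete with a toy sketch of a bounded "rule relaxation" policy. Everything here, including the function name, the merge flag, and the 5 km/h threshold, is a hypothetical illustration for discussion, not any manufacturer's actual control logic:

```python
def allowed_speed(posted_limit_kph: float, merging: bool,
                  max_overshoot_kph: float = 5.0) -> float:
    """Return the maximum speed a hypothetical planner may command.

    Strict compliance caps speed at the posted limit; during a merge,
    a small bounded overshoot is permitted on the assumption that it
    is safer than forcing following traffic to brake sharply.
    """
    if merging:
        return posted_limit_kph + max_overshoot_kph
    return posted_limit_kph

# Cruising on a 100 km/h motorway: strict compliance applies.
print(allowed_speed(100.0, merging=False))  # 100.0
# Merging into fast traffic: a bounded deviation is tolerated.
print(allowed_speed(100.0, merging=True))   # 105.0
```

Even this trivial sketch surfaces the regulatory question Hanson raises: the overshoot branch is, strictly speaking, programmed law-breaking, and today's statutes offer no clear way to sanction or authorize it.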
As autonomous technology advances, regulatory bodies face the challenge of updating laws to accommodate these systems. This involves not only defining the legal status of an AV but also addressing the complex interplay between strict rule-following and dynamic, human-like decision-making. The ongoing discussion, spurred by figures like Robin Hanson, underscores the need for a nuanced approach to ensure both safety and practical integration of self-driving cars into existing traffic ecosystems.