U.S.
vehicle safety regulators have said the artificial intelligence system piloting
a self-driving Google car could be considered the driver under federal law, a
major step toward ultimately winning approval for autonomous vehicles on the
roads.

The National Highway Traffic Safety Administration (NHTSA) told Google, a unit of Alphabet Inc, of its decision in a previously unreported Feb. 4 letter to the company posted on the agency's website this week.

Google's self-driving car unit on Nov. 12 submitted a proposed design for a vehicle that has 'no need for a human driver,' the letter from NHTSA Chief Counsel Paul Hemmersbaugh said.

‘NHTSA
will interpret ‘driver’ in the context of Google’s described motor vehicle
design as referring to the (self-driving system), and not to any of the vehicle
occupants,’ NHTSA’s letter said.

‘We
agree with Google its (self-driving car) will not have a ‘driver’ in the
traditional sense that vehicles have had drivers during the last more than one
hundred years.’

Major
automakers and technology companies such as Google are racing to develop and
sell vehicles that can drive themselves at least part of the time.

All
participants in the autonomous driving race complain that state and federal
safety rules are impeding testing and eventual deployment of such vehicles.

California
has proposed draft rules requiring steering wheels and a licensed driver in all
self-driving cars.

Karl
Brauer, senior analyst for the Kelley Blue Book automotive research firm, said
there were still significant legal questions surrounding autonomous vehicles.

But
if ‘NHTSA is prepared to name artificial intelligence as a viable alternative
to human-controlled vehicles, it could substantially streamline the process of
putting autonomous vehicles on the road,’ he said.

If
the car’s computer is the driver for legal purposes, then it clears the way for
Google or automakers to design vehicle systems that communicate directly with the
vehicle’s artificial pilot.

In
its response to Google, the federal agency offered its most comprehensive map
yet of the legal obstacles to putting fully autonomous vehicles on the road.

It noted that existing regulations requiring some auto safety equipment, such as braking systems activated by foot control, cannot be waived immediately.

‘The
next question is whether and how Google could certify that the (self-driving
system) meets a standard developed and designed to apply to a vehicle with a human
driver,’ NHTSA said.

Google
is ‘still evaluating’ NHTSA’s lengthy response, a company spokesperson said on
Tuesday.

Google
executives have said they would likely partner with established automakers to
build self-driving cars.

Google
told NHTSA that the real danger is having auto safety features that could tempt
humans to try to take control.

Google
‘expresses concern that providing human occupants of the vehicle with
mechanisms to control things like steering, acceleration, braking… could be
detrimental to safety because the human occupants could attempt to override the
(self-driving system’s) decisions,’ the letter stated.

NHTSA’s
Hemmersbaugh said federal regulations requiring equipment like steering wheels
and brake pedals would have to be formally rewritten before Google could offer
cars without those features.

For example, current federal rules require a dashboard alert when tire pressure runs low. NHTSA said a test would need to be created to show that the vehicle's computer is informed of the problem, and it raised the question of whether human occupants should also be alerted.

In
January, NHTSA said it may waive some vehicle safety rules to allow more
driverless cars to operate on U.S. roads as part of a broader effort to speed
up development of self-driving vehicles.

NHTSA said then that it would write guidelines for self-driving cars within six months. Transportation Secretary Anthony Foxx said the administration may seek new legal authority to allow deployment of autonomous vehicles 'in large numbers' once they are deemed safe.

The
process of rewriting federal regulations governing the design, placement and
operation of vehicle controls could take months or years.

The NHTSA counsel said Google could consider applying for exemptions from certain regulations, which would require providing the agency with supporting documents.

Google, Lyft and other industry executives are urging lawmakers to create a 'regulatory fast lane' to speed the development of self-driving cars, even as experts warn the technology will eventually cost lives.

At
a Senate hearing, representatives of General Motors and Delphi touted numerous
safety and environmental benefits of autonomous vehicles.

Chris Urmson, who heads the Google self-driving car project, said a consistent regulatory framework is important for deploying self-driving technology, and that conflicting rules could limit innovation.

‘The
leadership of the federal government is critically important given the growing
patchwork of state laws and regulations on self-driving cars,’ he said.

In the past two years, 23 states have introduced legislation affecting self-driving cars, 'all of which include different approaches and concepts,' he noted.

Five
states have passed such legislation, all with different rules, Urmson said.

‘If
every state is left to go its own way without a unified approach, operating
self-driving cars across state boundaries would be an unworkable situation and
one that will significantly hinder safety, innovation, interstate commerce,
national competitiveness and the eventual deployment of autonomous vehicles,’
Urmson said in his prepared testimony.

He also cited government statistics showing 38,000 people were killed last year in U.S. road accidents and that '94 percent of those accidents involve human error.'

Joseph
Okpaku, vice president of government relations for the ridesharing group Lyft,
echoed those comments, saying consistent rules would be important for the
planned deployment of self-driving cars by Lyft and GM.

‘We
are on the doorstep of another evolutionary leap in transportation and
technology, where concepts that once could only be imagined in science fiction
are on the verge of becoming a reality,’ he said.

‘The
worst possible scenario for the growth of autonomous vehicles is an
inconsistent and conflicting patchwork of local, municipal and county laws that
will hamper efforts to bring AV (autonomous vehicle) technology to market,’
Okpaku added.

‘Regulations
are necessary, but regulatory restraint and consistency is equally as important
if we are going to allow this industry to reach its full potential.’

GM
vice president Michael Ableson said the auto giant ‘enthusiastically supports
policy initiatives to accelerate the development and adoption of safe, high-level
vehicle automation.’

Delphi
vice president Glen De Vos added that ‘uniform rules that allow for the safe
operation of driverless vehicles in all 50 states will be critical.’

But
the Senate panel was told to exercise caution by Mary Cummings, who heads the
Humans and Autonomy Laboratory at Duke University.

'There is no question that someone is going to die in this technology,' she said. 'The question is when and what can we do to minimize that.'

Cummings
said it’s not yet clear that self-driving cars can safely operate in all
situations.

‘We
know that many of the sensors on self-driving cars are not effective in bad
weather, we know people will try to hack into these systems,’ she told the
panel.

Cummings
said it is possible to ‘spoof’ a car’s GPS to send it off course, or to use
laser devices to trick a vehicle into sensing objects which are not there.

She cited a Rand Corporation study finding that self-driving cars would need to drive 275 million miles (442 million kilometers) to show they are as safe as human-operated vehicles.

Cummings said the federal government needs to ensure that testing is rigorous enough to guarantee safety.

‘I
am wholeheartedly in support of the research and development of self-driving
cars,’ she said.

'But these systems will not be ready for fielding until we move away from superficial demonstrations to principled, evidence-based tests and evaluations.'

The
activist group Consumer Watchdog warned meanwhile that the federal government
should not take shortcuts on safety by ‘rushing new technology to the roads.’

‘Federal
regulators have a process for writing rules to keep the public safe,’ Consumer
Watchdog’s John Simpson said in a statement.

‘Congress
shouldn’t skirt those rules just because tech industry giants like Google ask
them to.’


News source: Self driving car boost as regulators say Google's self driving software WILL be legally considered a driver