Wednesday, May 22, 2019

Moving Forward in the Mainstreaming of Autonomous Vehicles

A few years ago, autonomous vehicles still seemed like a far-off idea, even though automakers were promising fully autonomous vehicles (AVs) by 2020.
Here it is, 2019, and while fully autonomous vehicles may not hit the streets next year, they are not far off.

Driverless vehicle technology update

Autonomous auto developers now acknowledge that as autonomous technology takes hold, drivers will be freed up to engage in other activities behind the wheel. To that end, Tesla has announced that it is expanding its in-car gaming services. Model X, S, and 3 owners can update their car software to play Atari’s Missile Command, Asteroids, Lunar Lander and Centipede, as well as the newly added 2048 and Atari’s Super Breakout (added to vehicles last week). Mercedes-Benz has issued a challenge looking for individuals to create games for the vehicles of the future.

Earlier this year, Tesla announced that it would add full autonomous capability to its vehicles during 2019. At first, driver monitoring will still be necessary, but estimates are that by the end of 2020 the vehicles could be totally autonomous, although regulators may step in and not allow fully autonomous vehicles to operate. This presents a number of problems, the first being that the vehicle should be totally autonomous before in-cab games can be activated.

Inviting distractions

While there are many who look forward to having the vehicle drive them to work while they read a book, play games or engage in other activities, a recent survey conducted by AAA shows that 75% of people are afraid to ride in a fully autonomous vehicle. However, respondents who had experience with existing semi-autonomous features such as lane departure warnings were 70% more likely to say they were willing to ride in a fully autonomous vehicle. This is significant, because data from the National Highway Traffic Safety Administration (NHTSA) shows that 94% of serious auto accidents are due to human error, one of the main factors driving the interest in and development of AVs.

Elon Musk recently stated that Teslas could be turned into robo taxis in just over a year, and that he intends to develop battery packs that last for a million miles. He is reported as saying robo taxis will operate in some areas next year, provided regulators approve their use on the roads. Part of the idea of robo taxis is that existing Tesla owners could use their vehicles as autonomous Uber vehicles, making money while the owner sleeps or is at work. While this may seem far-fetched, the fact that semi-autonomous technology is now common indicates that this is closer than you might expect, especially given Tesla’s ability to update its cars remotely with over-the-air software updates.

Musk states that vehicles meeting Tesla’s requirements for full autonomy will be available in the second quarter of 2020, after which regulators must approve the vehicles for use on public roads. The regulators must be convinced that the vehicles are truly safe to run without a driver present. Musk believes he will have approval in at least a few jurisdictions by the end of 2020.

The greatest common good

A number of elements factor into fully autonomous vehicles and their use by the public. One reason most AV testing so far has been done in the southwestern part of the country is the good weather. Snow obstructs the cameras’ view of the road, and lidar lasers can bounce off snowflakes, which the system may mistake for obstacles. Snow can also cover the pavement lines and curbs the vehicle uses to judge lane positions, and rain, fog and sandstorms can likewise mislead the cameras.
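To get a feel for why snow is so troublesome, consider how a perception stack might try to separate snowflake returns from real obstacles in a single lidar scan. The sketch below is purely illustrative and assumes nothing about any automaker’s actual software: it applies a simple radius outlier filter, keeping only points that have enough nearby neighbors, on the theory that solid objects produce dense clusters of returns while falling snow produces isolated ones.

```python
# Illustrative sketch only (not any vendor's real pipeline): suppress snow clutter in a
# lidar scan with a radius outlier filter. Snowflake returns tend to be isolated points,
# while genuine obstacles produce dense clusters of returns.
import numpy as np
from scipy.spatial import cKDTree

def filter_snow_returns(points, radius=0.5, min_neighbors=4):
    """Keep lidar points with more than `min_neighbors` neighbors within `radius` meters.

    points: (N, 3) array of x, y, z returns from a single scan.
    """
    tree = cKDTree(points)
    # Count neighbors within the radius for every point (each point counts itself once).
    neighbor_counts = np.array([len(tree.query_ball_point(p, radius)) for p in points])
    return points[neighbor_counts > min_neighbors]

# Toy example: a dense "wall" of returns survives, scattered snowflake-like points do not.
rng = np.random.default_rng(0)
wall = np.column_stack([np.full(200, 10.0), rng.uniform(-2, 2, 200), rng.uniform(0, 2, 200)])
snow = rng.uniform([0.0, -10.0, 0.0], [20.0, 10.0, 3.0], size=(50, 3))
scan = np.vstack([wall, snow])
print(len(filter_snow_returns(scan)), "of", len(scan), "points kept")
```

Without some filter along these lines, every flake can register as a small obstacle directly in the vehicle’s path, which is exactly the failure mode described above.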

Road markings are not standardized across the country, and different line markings and the lack of curbs on some roads can confuse the vehicle. Apartment complexes, condominium complexes and some residential roads may have no markings at all, making it difficult for the vehicle to navigate these areas.

Some driving tasks are simply difficult. People often have trouble with maneuvers such as unprotected left turns in heavy traffic, and autonomous vehicles have the same issues, perhaps more so. A human driver will take a calculated but safe risk, turning left quickly through a gap in oncoming traffic, while an autonomous vehicle may wait longer, causing traffic to back up behind it.
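One rough way to see why that caution backs traffic up is to reduce the left-turn decision to a single “gap acceptance” threshold. Everything in the sketch below is hypothetical, especially the thresholds, but a planner that demands a larger gap than a typical human driver will, by construction, pass up turns the human would have taken.

```python
# Hypothetical illustration: an unprotected left turn reduced to gap acceptance.
# The thresholds are made up, but they show how extra caution turns into extra waiting.
def accepts_gap(gap_seconds: float, required_gap_seconds: float) -> bool:
    """Turn only if the gap in oncoming traffic exceeds the required margin."""
    return gap_seconds >= required_gap_seconds

oncoming_gaps = [3.2, 4.8, 5.5, 6.1, 7.4, 9.0]  # seconds between successive oncoming cars

HUMAN_GAP = 5.0  # hypothetical: a human accepts a tighter but still workable gap
AV_GAP = 7.0     # hypothetical: a cautious planner insists on a larger margin

human_turn = next(i for i, g in enumerate(oncoming_gaps) if accepts_gap(g, HUMAN_GAP))
av_turn = next(i for i, g in enumerate(oncoming_gaps) if accepts_gap(g, AV_GAP))
print(f"Human turns on gap #{human_turn + 1}; the AV waits until gap #{av_turn + 1}.")
```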

Nagging safety issues

How the vehicle perceives objects is crucial and still not perfect. A fatal nighttime crash in Arizona involving an AV and a pedestrian illustrates this point. Humans automatically register many things that the computer system has to be taught to identify in a variety of ways. A bicyclist seen from the rear is readily recognizable to a human, but the computer may register it as little more than a vertical line. This kind of identification is critical to the safe navigation and operation of a vehicle that must rely on its own sensor data.

Whether something is a stationary object, a FedEx truck or a van matters for predicting its future movement. Is the object on the corner a newspaper box or a child ready to cross the street? A person can use body-language cues to gauge whether someone is apt to step into the roadway, a judgment that is difficult for a computer to make and act upon.
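A small, entirely hypothetical sketch makes the stakes concrete: how much room the planner reserves around an object depends on what the perception system thinks the object is, and on how confident it is in that call. None of the class names, speeds or thresholds below come from a real AV stack; they are assumptions chosen for illustration.

```python
# Hypothetical sketch: class labels, speeds and thresholds are illustrative assumptions,
# not any real AV stack. The point is that the predicted "reach" of an object -- and so
# the buffer the planner leaves around it -- hinges on what the classifier says it is.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str             # e.g. "pedestrian", "newspaper_box", "delivery_truck"
    confidence: float      # classifier score in [0, 1]
    observed_speed: float  # meters per second, as currently measured

# Assumed worst-case speeds per class, used to grow a safety envelope.
MAX_SPEED_BY_CLASS = {
    "newspaper_box": 0.0,   # truly static street furniture
    "delivery_truck": 3.0,  # parked now, but could pull out
    "pedestrian": 2.5,      # standing now, but could step into the road
    "cyclist": 8.0,
    "vehicle": 20.0,
}

def predicted_reach(det: Detection, horizon_s: float = 2.0) -> float:
    """Meters the object could plausibly cover within the planning horizon.

    If classification confidence is low, fall back to the most conservative class,
    roughly what a careful human does with an ambiguous shape at the curb."""
    worst_case = max(MAX_SPEED_BY_CLASS.values())
    if det.confidence < 0.6:  # illustrative threshold
        max_speed = worst_case
    else:
        max_speed = MAX_SPEED_BY_CLASS.get(det.label, worst_case)
    return max(det.observed_speed, max_speed) * horizon_s

# A newspaper box and a child on the same corner call for very different buffers:
print(predicted_reach(Detection("newspaper_box", 0.95, 0.0)))  # 0.0 m
print(predicted_reach(Detection("pedestrian", 0.90, 0.0)))     # 5.0 m
```

A person makes that distinction instantly from context and body language; the computer has only a label and a confidence score to work with.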

While it is predicted that AVs will lessen traffic and pollution, the opposite could prove true. Instead of parking, the vehicles could cruise the streets while waiting for their owners or for someone needing a ride. A study by the World Economic Forum and the Boston Consulting Group found that AVs would increase traffic in Boston by 5%, mostly from people using AVs instead of public transportation. When ride-sharing began, it too was predicted to lessen traffic and pollution. Instead, it has increased traffic by 180% in already crowded cities, and its users aren’t taking private vehicles off the road; they are using ride-sharing vehicles instead of public transportation.

Liability questions persist

As the technology progresses and AVs come closer to market, the issue of responsibility remains. Who is liable if the vehicle has an accident, the owner of the vehicle or the manufacturer? If Uber has a fleet of autonomous vehicles, is Uber responsible for an accident if a vehicle malfunctions, or is the manufacturer liable for the damage? This has yet to be answered, and it is a major sticking point.

The personal auto policy provides liability coverage for accidents for which the insured becomes legally liable. An autonomous vehicle should obey all traffic laws, so there should be no speeding, no failure to yield, no missed stop signs or stop lights, or any of the other ordinary behaviors that cause accidents. The likelihood that a vehicle will slide on ice or snow is also reduced, because the vehicle will adjust to conditions and drive accordingly. Accidents should therefore be the fault of another party, perhaps the driver of a non-autonomous vehicle or someone trying to force the vehicle to react, or the result of a malfunction of the autonomous system.

The last possibility is the biggest issue: if the vehicle malfunctions, who is responsible? Will the vehicle systems be warrantied for five years or 100,000 miles, with any malfunction after that becoming the owner’s responsibility? Will the manufacturer assume all liability as long as the owner ensures that the vehicle is maintained and all system upgrades are installed? Most states that allow testing have specific insurance requirements while the vehicles are being tested. Will insurance requirements be different for autonomous vehicles? In the early days of the discussion, certain manufacturers stated that they would provide coverage if the vehicle malfunctioned. That is not mentioned much anymore, so it is anyone’s guess whether manufacturers will hold to those initial claims.

Another issue is the potential for hacking. Because the vehicle is so completely automated and system updates are downloaded to it automatically, it could potentially be hacked and made to behave in unintended ways, or even driven somewhere the owner does not want it to go. A hacker could steal a vehicle in the middle of the night simply by having it drive itself to them, or could deliberately cause accidents by instructing the vehicle to behave erratically or run into other vehicles. How the owner would prove this is another matter, depending on how the vehicle stores information. If the intrusion is recorded, the hacker could be found, arrested and held responsible for any damages or injuries caused by his or her actions.

Liability is the critical issue that is still unanswered, and may not be answered until the vehicles are actually on the road. It is almost impossible to account for all variables, so some accidents are bound to happen. What happens as a result of these first few accidents is going to be extremely important.
Some people may try to blame the vehicle when it was really their own actions that caused an accident, or the vehicles may drive so cautiously that they cause accidents involving vehicles other than the autonomous one. Some of these issues will simply have to be sorted out after the vehicles are actively driving on the roadways.

Source:  Christine G. Barlow of Property Casualty 360
