
The Car Wreck Lawyers of 1-800-Car-Wreck Report On Tesla Auto Accident That Could Trigger Insurer Lawsuit Against the Car Maker

Fort Worth, TX, 10/31/2016

The Texas car wreck lawyers at 1-800-Car-Wreck are reporting on a recent auto accident involving a Tesla vehicle operating on Autopilot.

The accident occurred in Kaufman, Texas on Highway 175.

According to the driver, Mark Molthan, the Tesla he was driving failed to take a turn on the highway and instead smashed into a cable guardrail several times before coming to a stop.

Although Molthan admitted that he was not fully focused on the road – most likely because he was relying on the Tesla’s Autopilot feature – he was shocked that his vehicle’s self-driving capabilities did not take over and properly navigate the turn.

Molthan said that in the moments prior to the auto accident, he had taken a cloth out of the glove box and was polishing the dashboard, confident that the Tesla could navigate the preset route.

“I used Autopilot all the time on that stretch of the highway,” Molthan stated. “But now I feel like this is extremely dangerous. It gives you a false sense of security. I’m not ready to be a test pilot. It missed the curve and drove straight into the guardrail. The car didn’t stop, it actually continued to accelerate after the first impact into the guardrail.”

The auto accident was so severe that Molthan’s insurance company declared the vehicle a total loss.

Molthan was lucky to walk away without being seriously hurt, but said that he had no plans to file a lawsuit against Tesla for the accident.

However, that may not be the case with his insurance company, Chubb Ltd.

Precursor To Insurer Lawsuit?

As more Tesla accidents occur, it is only a matter of time before an insurance company files suit against the carmaker for negligence.

Lawyers representing Molthan’s car insurer sent a notice letter to Tesla Motors Inc. after the accident, asking that the automaker perform a joint inspection of the total loss vehicle.

At issue is whether the Autopilot feature’s malfunction was the major contributing factor to the auto accident, or whether Molthan’s admitted inattention was the main cause of the crash.

It’s a legal gray area, because Tesla has repeatedly stressed to customers that Autopilot is an “assist feature” that is not intended to fully pilot a vehicle.

Prospective buyers interested in a Tesla Autopilot vehicle are provided with a manual that explains the importance of keeping their hands on the wheel at all times so they can take over driving responsibilities if Autopilot fails.

Tesla officials have maintained that because Autopilot is only semi-autonomous, drivers cannot rely on the feature to safely operate their vehicles without human vigilance.

But the fact that there has already been one fatality involving a Tesla operating on Autopilot has raised the legal stakes.

Recent Tesla Autopilot Fatal Accident

In May, the driver of a Tesla operating on Autopilot was killed in a devastating auto accident that highlighted the potential dangers of semi-autonomous vehicles.

At about 3:40 p.m., Joshua Brown, 45, a resident of Ohio, was driving a 2015 Tesla Model S with Autopilot engaged on U.S. 27 in Williston, Florida.

Unfortunately, the Tesla’s sensor system was not able to detect the presence of a tractor-trailer that was making a left turn across the highway, directly in the Tesla’s path.

Brown attempted to speed his way out of danger, but the Tesla crashed into the underside of the trailer, shearing off the car’s roof.

After the Tesla ran all the way under the tractor-trailer, it smashed into two fences and a utility pole before coming to a stop.

Brown died at the scene, but the driver of the truck, 62-year-old Frank Baressi, was not injured.

Brown was an enthusiastic supporter of Tesla, and had posted videos on social media of his vehicle on Autopilot, marveling at the technology.

Tesla quickly responded to the tragic death by informing consumers that this was the first known fatality in one of the company’s vehicles operating on Autopilot, and that Tesla drivers had logged more than 130 million miles on Autopilot without a fatal accident.

The statement also said that the driver-assistance feature is not meant to replace a driver’s hands on the wheel.

“Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert,” said part of the statement.

Tesla officials believe that Autopilot failed to engage the brakes in the Brown accident because the sensors could not distinguish the white side of the tractor-trailer from the brightly lit sky.

Although Brown’s family retained a personal injury lawyer, they have not filed a lawsuit against Tesla regarding the auto accident.

In a statement, the family said:

“In honor of Josh’s life and passion for technological advancement, the Brown family is committed to cooperating in these efforts, and hopes that information learned from this tragedy will trigger further innovation which enhances the safety of everyone on the roadways.”

NTSB Investigation Into Tesla Autopilot Death

Brown’s death triggered a full-scale investigation by the National Transportation Safety Board (NTSB).

In July, the agency released its preliminary findings and said that Brown was speeding in the moments prior to the fatal accident.

In fact, the NTSB said that the Tesla Model S was traveling at 74 miles per hour in a 65-mph zone when it smashed into the tractor-trailer.

The report also said that Brown had activated the vehicle’s Traffic-Aware Cruise Control and Autosteer (a lane-keeping assistance feature).

The NTSB said it had not yet determined the probable cause of the accident, and that speed was likely only a contributing factor in the accident.

“All aspects of the accident remain under investigation,” stated NTSB spokesman Christopher O’Neil.

A final report is not expected until July 2017.

Tesla’s Autopilot feature is also under investigation by the U.S. National Highway Traffic Safety Administration (NHTSA), which is examining whether the semi-autonomous system poses a safety risk to drivers.

False Sense of Security

Part of the issue with vehicles that have semi-autonomous features is that they can lead drivers to pay less attention to the road.

“The risk is that these features are intended to be a secondary set of ‘eyes’ on the road,” stated Amy Witherite, partner at the Texas personal injury law firm, Eberstein & Witherite, which also has an office in Atlanta. “But psychologically, drivers are aware that these automatic safety features are supposed to take over the vehicle’s controls if something goes wrong. So there’s a false sense of security that creates more risk-taking behavior, such as taking their hands off the wheel. It’s a real issue as to where the responsibility lies in these types of accidents, especially if the semi-autonomous system fails.”

And part of that false sense of security may be due to the name that Tesla has chosen to give the driver assistance feature: Autopilot.

Recently, Consumer Reports asked Tesla to change the name of the technology because it was misleading and caused drivers to mistakenly believe that the vehicle was fully autonomous.

“What Consumer Reports asked Tesla to do was to modify the name of the technology, but more importantly, to disable the feature until the company develops driver assistance technology that will only engage when a driver’s hands are on the wheel,” Witherite added. “That seems to be a reasonable request, given the risks right now to Tesla Autopilot drivers and to other motorists. And we’re about to see more of these types of vehicles on the road, which is quite scary if the technology isn’t foolproof.”

New Guidelines On Semi-Autonomous Features

The debate over semi-autonomous vehicles will only get more intense, because the federal government is soon expected to issue new guidelines related to self-driving vehicles.

And car manufacturers such as Ford Motor Co. are going all-in, preparing to release fully autonomous vehicles in the coming years.

Ford officials believe that semi-autonomous vehicles leave too much of a gray area that can create liability issues, which is why the carmaker is designing its self-driving vehicles to perform all operating functions, taking the responsibility off the person behind the wheel.

Whether that will lower the number of auto accidents involving these vehicles remains to be seen, but one thing that’s certain is that self-driving cars are the wave of the future.

What Should You Do After a Car Accident?

Car wrecks are devastating and often cause damage beyond just physical injuries. At 1-800-Car-Wreck, we understand how difficult and confusing the time after an accident can be, which is why it’s important for you to contact us as soon as possible.

We are trained to handle every aspect of an accident, from providing you with the right medical care to ensuring that no evidence is tampered with at the accident scene. If you live in Dallas, Austin, El Paso, Houston or Texarkana, call us right now at 1-800-Car-Wreck and speak to one of the lawyers at Eberstein & Witherite. We have decades of experience helping people who have been injured in every kind of accident imaginable find justice and obtain peace of mind. Call today, or reach us online, and we’ll be happy to contact you and offer a free case evaluation.

Media Contact

Lucy Tiseo

Eberstein & Witherite, LLP

Phone: 800-779-6665

Email: lucy.tiseo@ewlawyers.com

Connect with Eberstein & Witherite on Facebook, Twitter, LinkedIn, and Google+

source: http://www.1800-car-wreck.com/car-wreck-lawyers-1-800-car-wreck-report-tesla-auto-accident-trigger-insurer-lawsuit-car-maker.html
