Beck Diefenbach / Reuters
  • Canadian authorities said they charged a driver with dangerous driving after he appeared to be asleep while his Tesla Model S drove on Autopilot.

  • It’s the latest in a long string of incidents involving misuse of the assisted-driving software.

  • In some cases, there have been accidents and deaths. Tesla maintains the software is not misleading.

  • “The people who misuse Autopilot, it’s not because they’re new to it and don’t understand it,” Elon Musk told Automotive News in July in response to criticism of the feature’s name.

  • Visit Business Insider’s homepage for more stories.

A driver in Canada is the latest to be caught appearing to abuse Tesla’s Autopilot functionality.

Authorities in the Canadian province of Alberta say the driver of a Model S was charged with dangerous driving after being caught speeding with the front seats fully reclined and appearing to be asleep.

“The car appeared to be self-driving, travelling over 140 km/h [87 mph] with both front seats completely reclined & occupants appeared to be asleep,” RCMP Alberta said on Twitter.

“Nobody was looking out the windshield to see where the car was going,” an RCMP sergeant told CBC. “I’ve been in policing for over 23 years and the majority of that in traffic law enforcement, and I’m speechless.”

The incident is the latest in a string of accidents and criminal charges stemming from misuse of Tesla’s assisted-driving software, which requires drivers to remain attentive. The system alerts a driver if it detects they’ve become distracted, and will eventually pull the car over if no input is received, Tesla says.

Last month in North Carolina, a Tesla slammed into a police car during a traffic stop as the driver behind the wheel watched a movie, authorities said. And in June, a similar crash occurred in Massachusetts.

Still, drivers have found plenty of simple ways to bypass the system’s safety features. CEO Elon Musk’s own comments about the software’s current and future capabilities have also come under fire from consumer safety groups, who call them misleading.

“The people who misuse Autopilot, it’s not because they’re new to it and don’t understand it,” Musk told Automotive News in July in response to the criticism.

“The people who first use Autopilot are extremely paranoid about it. It’s not like, ‘If you just introduced a different name, I would have really treated it differently.’ If something goes wrong with Autopilot, it’s because someone is misusing it and using it directly contrary to how we’ve said it should be used.”

Read the original article on Business Insider