Like a human driver, a self-driving car needs to “see” its environment in order to navigate through it.
But instead of eyes, autonomous vehicles use an array of sensors—including cameras and radar—to detect objects and obstacles.
So what does a self-driving car’s view of the road look like?
The second of Tesla’s recent self-driving demonstration videos offers a side-by-side look at what an occupant sees and what the car is sensing.
It includes feeds from some of a fully self-driving Tesla test car’s onboard sensors, alongside video of the driver doing, well, nothing.
The car controls all steering, acceleration, and braking while the person behind the wheel simply sits there, presumably waiting to take over in case something goes wrong.
[Photo: Tesla Model S Autopilot]
Meanwhile, the car’s cameras track and classify various objects, placing color-coded boxes around them on the video feed.
Objects such as other cars appear in blue boxes if they are outside the vehicle’s path and in green boxes if they are in it. Road signs are tagged purple, and lane markers are highlighted in pink.
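The overlay amounts to a simple rendering rule. The following sketch, written in Python with OpenCV and using a hypothetical detection format (an illustration, not Tesla’s code), shows how boxes labeled by object type and path status might be drawn in those colors:

```python
import cv2
import numpy as np

# Hypothetical detection records: each has a pixel bounding box,
# an object class, and a flag for whether it lies in the car's path.
detections = [
    {"box": (400, 260, 520, 340), "kind": "car",  "in_path": True},
    {"box": (120, 250, 210, 310), "kind": "car",  "in_path": False},
    {"box": (600, 180, 640, 240), "kind": "sign", "in_path": False},
]

# BGR colors matching the scheme described in the video: green for
# objects in the car's path, blue for objects outside it, purple for signs.
COLORS = {
    "in_path":  (0, 255, 0),    # green
    "off_path": (255, 0, 0),    # blue
    "sign":     (255, 0, 255),  # purple
}

def draw_detections(frame, detections):
    """Overlay a color-coded box for each detection on the frame."""
    for det in detections:
        if det["kind"] == "sign":
            color = COLORS["sign"]
        elif det["in_path"]:
            color = COLORS["in_path"]
        else:
            color = COLORS["off_path"]
        x1, y1, x2, y2 = det["box"]
        cv2.rectangle(frame, (x1, y1), (x2, y2), color, 2)
    return frame

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a camera frame
draw_detections(frame, detections)
```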
Current Tesla autonomous prototypes rely primarily on cameras and radar, the same combination used in the Autopilot driver-assistance system that is already available to consumers in a “public beta” program.
Tesla CEO Elon Musk is adamant that production self-driving cars can operate using only those two primary sensors, although most other automakers developing autonomous vehicles use Lidar as well.
Lidar operates on the same basic principle as radar, but uses laser light instead of radio waves.
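Both sensors judge distance by timing an echo: a pulse goes out, reflects off a target, and returns, and since radio waves and light both travel at the speed of light, range is simply c × t / 2. A minimal worked example:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s, for radio waves and light alike

def echo_range(round_trip_seconds: float) -> float:
    """One-way distance to a target from the round-trip time of a pulse.

    The pulse travels out and back, so range = c * t / 2. The same
    geometry underlies ranging by both radar and lidar.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A reflection arriving 200 nanoseconds after the pulse was emitted
# indicates a target roughly 30 meters away.
print(echo_range(200e-9))  # ~29.98
```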
[Photo: Tesla Model S Autopilot system]
Musk has called Lidar “unnecessary,” and said the technology does not work well in snow, rain, or dust.
Instead, Musk believes Tesla can achieve full autonomy by fine-tuning the interplay between cameras and radar units, combined with a heavy reliance on fleet learning, in which data gathered by customer cars is used to improve the software across the whole fleet.
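As a toy illustration of what such camera-radar fusion involves (a simplified stand-in, not Tesla’s method; all names and thresholds here are invented), the sketch below attaches a radar range to each camera detection whose bearing matches:

```python
from dataclasses import dataclass

@dataclass
class CameraDetection:
    box: tuple           # (x1, y1, x2, y2) pixel bounding box
    azimuth_deg: float   # bearing of the box center, from camera calibration

@dataclass
class RadarReturn:
    azimuth_deg: float   # bearing of the radar target
    range_m: float       # measured distance in meters

def fuse(camera_dets, radar_returns, max_bearing_gap_deg=2.0):
    """Attach a radar range to each camera detection with a matching bearing.

    A toy nearest-bearing association; a real system would also use
    tracking, relative velocity, and uncertainty models.
    """
    fused = []
    for det in camera_dets:
        nearby = [r for r in radar_returns
                  if abs(r.azimuth_deg - det.azimuth_deg) <= max_bearing_gap_deg]
        best = min(nearby,
                   key=lambda r: abs(r.azimuth_deg - det.azimuth_deg),
                   default=None)
        fused.append((det, best.range_m if best else None))
    return fused
```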
Forgoing Lidar would also save the cost of purchasing and installing the units on Tesla electric cars.
Musk has said that a new hardware suite known as “Hardware 2,” which launched in the Tesla Model S and Model X last month, will allow cars to achieve “Level 5” autonomy once more-sophisticated software becomes available.
[Photo: Tesla Enhanced Autopilot]
The National Highway Traffic Safety Administration (NHTSA) defines Level 5 autonomy as full self-driving: a vehicle that needs no steering wheel or other driver controls, in which the occupant simply tells the car where to go and could, in theory, sleep while it drives itself.
The Hardware 2 suite consists of eight cameras, 12 ultrasonic sensors, and a forward-facing radar system.
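Summed up as a simple inventory (the counts come from Tesla’s announcement; this representation is purely illustrative):

```python
# Illustrative inventory of the announced "Hardware 2" sensor suite.
# The counts come from Tesla's announcement; the structure and field
# names are hypothetical, for summary only.
HARDWARE_2_SENSORS = {
    "cameras": 8,              # surround cameras
    "ultrasonic_sensors": 12,  # short-range sensors around the car
    "forward_radar": 1,        # forward-facing radar
    "lidar": 0,                # deliberately omitted
}
```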
When Hardware 2 was announced, Musk said his goal was for a self-driving Tesla to carry one or more occupants from Los Angeles to New York City, and park itself there, without any driver input, as soon as next year.
Musk believes that the technology for self-driving cars will be available relatively soon, but that regulations may take some time to catch up.