To accomplish all this, it relies on the following key technologies –
Front Camera –
A camera mounted on the windshield helps the car see objects in front of it. This camera also detects and records information about road signs and traffic lights, which the car’s software intelligently interprets. The camera is considered the eye of the vehicle, through which it sees the world around it.
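As an illustration only, the sketch below shows one way a cropped camera image of a traffic light could be turned into a light state, using simple HSV colour thresholding with OpenCV. A real system would rely on a trained neural detector to find and classify the light; the threshold values here are rough, untuned guesses.

    import cv2
    import numpy as np

    def traffic_light_state(roi_bgr: np.ndarray) -> str:
        # Return 'red', 'green' or 'unknown' for a cropped traffic-light image.
        # Assumes the light has already been located by a separate detector.
        hsv = cv2.cvtColor(roi_bgr, cv2.COLOR_BGR2HSV)

        # Hue/saturation/value ranges are illustrative, not tuned thresholds.
        red_mask = cv2.inRange(hsv, (0, 120, 120), (10, 255, 255)) | \
                   cv2.inRange(hsv, (170, 120, 120), (180, 255, 255))
        green_mask = cv2.inRange(hsv, (45, 120, 120), (90, 255, 255))

        red_pixels = cv2.countNonZero(red_mask)
        green_pixels = cv2.countNonZero(green_mask)

        if max(red_pixels, green_pixels) < 50:   # too few lit pixels to decide
            return "unknown"
        return "red" if red_pixels > green_pixels else "green"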
Ultrasonic Sensors –
An ultrasonic sensor on one or more of the rear wheels helps keep track of the car’s movements and alerts the car to obstacles behind it. It measures the distance to nearby objects from the time of flight of its echoes and can use the Doppler effect to estimate the relative velocity between itself and other objects, whether stationary or moving.
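As a rough illustration of the two quantities such a sensor can report, the sketch below computes distance from an echo’s round-trip time and relative velocity from the Doppler shift of the returned frequency. The 40 kHz carrier and the numbers in the example are illustrative assumptions, not the specification of any particular sensor.

    SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

    def distance_from_echo(round_trip_time_s: float) -> float:
        # Distance to an obstacle from the echo's round-trip time.
        return SPEED_OF_SOUND * round_trip_time_s / 2.0

    def relative_velocity_from_doppler(f_emitted_hz: float, f_received_hz: float) -> float:
        # Approximate closing speed (m/s) from the Doppler shift of the echo.
        # Positive means the obstacle is approaching. Uses the two-way Doppler
        # approximation, valid when speeds are far below the speed of sound.
        return SPEED_OF_SOUND * (f_received_hz - f_emitted_hz) / (2.0 * f_emitted_hz)

    # Example: an echo after 5.8 ms puts the obstacle about 1 m behind the car.
    print(distance_from_echo(0.0058))                      # ~0.99 m
    print(relative_velocity_from_doppler(40_000, 40_100))  # ~0.43 m/s, approaching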
LIDAR –
An autonomous car is first driven along the route to map it out, including its road conditions, poles, road markers, road signs and more. This map is fed into the car’s software, helping the car identify what is a regular part of the road. As the car moves, LIDAR generates a detailed 3D map of the environment at that moment, which is used for semantic segmentation, i.e. assigning an object class to every pixel of the image. The use of LIDAR is somewhat controversial, as some researchers believe it could be replaced with two inclined cameras that together reconstruct a perspective view of what the car sees in front. This could bring prices down, as LIDAR is quite expensive.
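A full semantic segmentation pipeline is beyond a short example, but the sketch below shows a simpler, related step: flattening a LIDAR point cloud (x, y, z coordinates in metres, in the vehicle frame) into a coarse 2D occupancy grid that downstream planning can reason about. The grid extent, resolution and height threshold are arbitrary illustrative choices.

    import numpy as np

    def occupancy_grid(points: np.ndarray, cell_size=0.5, extent=50.0,
                       min_height=0.3) -> np.ndarray:
        # Mark a cell occupied if any point above `min_height` falls inside it.
        n = int(2 * extent / cell_size)
        grid = np.zeros((n, n), dtype=bool)

        # Keep points inside the grid extent and above ground-level clutter.
        keep = (np.abs(points[:, 0]) < extent) & \
               (np.abs(points[:, 1]) < extent) & \
               (points[:, 2] > min_height)
        xs = ((points[keep, 0] + extent) / cell_size).astype(int)
        ys = ((points[keep, 1] + extent) / cell_size).astype(int)
        grid[ys, xs] = True
        return grid

    # Example with a fake cloud of 1000 random points.
    cloud = np.random.uniform(-40, 40, size=(1000, 3))
    print(occupancy_grid(cloud).sum(), "occupied cells")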
Aerial –
An aerial on the rear of the vehicle receives information about the car’s precise location. The car’s GPS and inertial navigation unit work together with the other sensors to help the car localize itself. As the vehicle moves, its internal map is updated with the new positional information reported by the sensors, which helps the car understand where it is at any instant in time.
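As a toy illustration of that localization step, the sketch below dead-reckons the car’s position from speed and heading and then blends in a GPS fix with a simple complementary filter. A real vehicle would run a Kalman filter over the full GPS/IMU/wheel-odometry state; the blending gain and flat-earth motion model used here are simplifying assumptions.

    import math

    def predict_position(xy, heading_rad, speed_mps, dt_s):
        # Dead-reckon the next position from speed and heading (flat-earth model).
        x, y = xy
        return (x + speed_mps * math.cos(heading_rad) * dt_s,
                y + speed_mps * math.sin(heading_rad) * dt_s)

    def fuse_position(predicted_xy, gps_xy, gps_weight=0.2):
        # Blend the motion-model prediction with the latest GPS fix.
        px, py = predicted_xy
        gx, gy = gps_xy
        return ((1 - gps_weight) * px + gps_weight * gx,
                (1 - gps_weight) * py + gps_weight * gy)

    # One update step: predict from odometry, then correct with the GPS fix.
    estimate = predict_position((100.0, 250.0), heading_rad=0.0, speed_mps=10.0, dt_s=0.1)
    estimate = fuse_position(estimate, gps_xy=(101.2, 250.1))
    print(estimate)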
Computer Software –
This is the heart of a self-driving car. It is responsible for processing, in real time, all of the data received as input from the various sensors and acting on it by means of various actuators. Essentially, the role of the software is to process the inputs, plot a path and send instructions to the actuators that control acceleration, braking and steering.
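The sketch below shows the shape of that sense-plan-act loop. The sensor, planner and actuator interfaces are hypothetical placeholders standing in for a real vehicle stack, and the trivial planning rule is purely illustrative.

    import time
    from dataclasses import dataclass

    @dataclass
    class Command:
        throttle: float   # 0..1
        brake: float      # 0..1
        steering: float   # -1 (full left) .. 1 (full right)

    def plan(sensor_snapshot: dict) -> Command:
        # Placeholder planner: brake hard if anything is closer than 5 m,
        # otherwise hold a gentle throttle and steer toward the lane centre.
        if sensor_snapshot["nearest_obstacle_m"] < 5.0:
            return Command(throttle=0.0, brake=1.0, steering=0.0)
        return Command(throttle=0.3, brake=0.0,
                       steering=-0.1 * sensor_snapshot["lane_offset_m"])

    def control_loop(read_sensors, send_to_actuators, period_s=0.05):
        # Run at roughly 20 Hz: read sensors, plan, command the actuators.
        while True:
            command = plan(read_sensors())
            send_to_actuators(command)
            time.sleep(period_s)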
Autonomous vehicle technology has progressed considerably in the past decade, thanks to increased processing power and improvements in the accuracy of deep learning algorithms. Sensor fusion combined with advances in computer vision is likely to change the landscape of transportation. Although there is still a lot of work to be done, researchers are convinced that it is only a matter of time before these vehicles become commonplace on the roads.