Vehicles across the continent are increasingly shifting from ‘passive’ to ‘active’ safety features with the aim of preventing crashes from happening at all – particularly since an EU mandate came into effect. One part of the shift towards active safety is the evolution of in-cabin driver and occupant monitoring systems. These systems combine critical data within one central intelligence network to create a better picture of human behavior in the car, contributing to a safer driving experience.
But the other – perhaps more crucial – part will be the connected infrastructure that supports drivers on the road. Technologies like RADAR (Radio Detection and Ranging) and LiDAR (Light Detection and Ranging) are set to detect traffic, perceive potential obstacles and understand the environment outside the vehicle. With these technologies combined, cars will be able to ‘talk’ to the environment around them – with the potential to usher in a new era of connectivity and road safety.
RADAR vs LiDAR
Let’s dive into these technologies and how they work together. Both RADAR and LiDAR enable accurate depth sensing. RADAR uses radio waves to provide long-range object detection in even the most adverse weather conditions. LiDAR emits laser light pulses to offer high precision and detail for 3D mapping. By emitting radio waves that bounce off objects and return to the sensor, RADAR systems can detect how far away an object is, the relative speed of that object, the direction of its movement and – depending on resolution and signal processing – the size and shape of that object.
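The two basic quantities described above follow from simple physics: distance comes from the echo’s round-trip time, and relative speed from the Doppler shift of the returned wave. A minimal sketch of both relations (the numeric values are illustrative, not from any specific automotive sensor):

```python
C = 299_792_458.0  # speed of light in m/s

def range_from_round_trip(t_seconds: float) -> float:
    """Distance to a target from the echo's round-trip time: R = c * t / 2."""
    return C * t_seconds / 2.0

def radial_speed_from_doppler(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Relative (radial) speed from the Doppler shift: v = f_d * c / (2 * f_c)."""
    return doppler_shift_hz * C / (2.0 * carrier_hz)

# A 1 microsecond round trip corresponds to roughly 150 m of range.
print(range_from_round_trip(1e-6))             # ≈ 149.9 m
# At a 77 GHz carrier, a 10 kHz Doppler shift is ≈ 19.5 m/s of closing speed.
print(radial_speed_from_doppler(10_000, 77e9))
```

The factor of two in both formulas reflects that the wave travels to the target and back.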
With this information, vehicles are better equipped to identify potentially dangerous situations and prevent crashes. RADAR, for example, can detect when the car in front suddenly slows down; as the gap between vehicles closes, automatic emergency braking can be triggered. On a technical level, automotive radar solutions typically use the 24 GHz or 77 GHz bands, balancing range and resolution requirements. While 24 GHz radars are used in Advanced Driver Assistance Systems (ADAS) to provide safety features such as blind-spot detection, rear cross-traffic alerts and collision avoidance, 77 GHz radars can detect obstacles like other vehicles, cyclists or pedestrians in the 30 to 250 meter range, even in low-visibility conditions like fog, rain and snow.
By comparison, LiDAR systems emit a series of short bursts of light that reflect off objects and surfaces before returning to the sensor. The “time of flight” data is used to calculate the distance to an object and build a dense collection of 3D points, mapping out a detailed 3D model. This precise object detection and classification is ideal for more complex or even semi-autonomous driving scenarios, where distinguishing between pedestrians, vehicles, and road edges is crucial. LiDAR is typically more precise than RADAR; however, it is more susceptible to distortion or degraded performance in fog and rain.
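The step from time-of-flight data to a 3D point cloud can be sketched directly: each return’s round-trip time gives a one-way distance, and the beam’s known azimuth and elevation angles place that distance in space. A minimal sketch under the assumption of a simple spherical-to-Cartesian conversion:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_to_point(tof_s: float, azimuth_deg: float, elevation_deg: float):
    """Convert one LiDAR return (time of flight + beam angles) to an (x, y, z) point."""
    r = C * tof_s / 2.0                  # round trip -> one-way distance
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = r * math.cos(el) * math.cos(az)  # forward
    y = r * math.cos(el) * math.sin(az)  # left
    z = r * math.sin(el)                 # up
    return x, y, z

# A point cloud is simply many such returns accumulated per scan.
cloud = [tof_to_point(t, az, el) for t, az, el in [
    (200e-9, 0.0, 0.0),    # ~30 m straight ahead
    (100e-9, 15.0, -1.0),  # ~15 m, slightly to the side and below
]]
```

Production sensors add timestamp, intensity and motion compensation per point, but the distance geometry is the same.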
The V2X layer
Many modern vehicles combine RADAR and LiDAR to form a more detailed and complete picture of the vehicle’s surroundings – enabling a future of Vehicle-to-Everything (V2X) communication. V2X refers to an intelligent ecosystem in which cars are just one piece of the puzzle. With V2X communication, vehicles, road infrastructure, pedestrians and cellular networks or cloud-based services can exchange information in real time.
The symbiotic relationship between these technologies is certainly exciting – with the potential to improve journeys and reduce road accidents. Environmental sensors would monitor road surface conditions: when moisture levels build up, this data can be communicated to adaptive road signs to automatically reduce speed limits. On the congestion front, traffic sensors can measure vehicle volumes and speeds, feeding into traffic signals so green lights can be extended when necessary to ease traffic jams. This technology is yet to be deployed at massive scale, but it is already being tested in cities across the United States, China and Portugal – and the momentum will only increase as the benefits are felt by road users.
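The roadside logic described above can be sketched in a few lines. The message fields, thresholds and speed limits below are illustrative assumptions, not part of any V2X standard:

```python
from dataclasses import dataclass

@dataclass
class RoadSensorReading:
    moisture_pct: float      # road-surface moisture level
    vehicles_per_min: float  # measured traffic volume at this point

def adaptive_speed_limit(reading: RoadSensorReading, base_limit_kmh: int = 110) -> int:
    """Lower the limit shown on an adaptive sign as moisture builds up."""
    if reading.moisture_pct >= 60:
        return 80
    if reading.moisture_pct >= 30:
        return 100
    return base_limit_kmh

def green_phase_seconds(reading: RoadSensorReading, base_green_s: int = 30) -> int:
    """Extend the green phase when measured volume is high to ease congestion."""
    return base_green_s + 15 if reading.vehicles_per_min > 40 else base_green_s

print(adaptive_speed_limit(RoadSensorReading(70, 10)))  # wet road -> reduced limit
print(green_phase_seconds(RoadSensorReading(0, 55)))    # heavy traffic -> longer green
```

In a deployed system these decisions would be driven by standardized V2X message sets exchanged between sensors, roadside units and vehicles, rather than local function calls.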
A connected ecosystem of cars
V2X technology transforms connected cars into mobile sensors. Each car will collect anonymized data about road conditions, hazards, and traffic patterns – from hard braking events and airbag deployments to areas of poor visibility – to benefit all road users without compromising driver privacy. In this world, authorities could apply automatic speed limitations based on real-time data from vehicle clusters, a driver could be warned about a pothole ahead, or the car could adjust automatically for those conditions.
Pedestrians would also be safer. If a vehicle’s sensors cannot “see” a child on a bike about to emerge from behind a parked car, a smart roadside unit equipped with V2X technology may catch it from another vantage point and warn nearby drivers so they can slow down – or even trigger automatic braking.
Data governance in the V2X age
Questions of data privacy and accountability are also emerging as V2X capabilities continue to scale. Who is accountable if a software update introduces a safety flaw? And should anonymized safety data – from near-miss incidents to driver behavior patterns – be shared between automotive manufacturers to improve system-wide learning?
Safety improvements need to be balanced with the diminishing role of human agency in the vehicle. Though we are some years from fully autonomous vehicles, the shift is underway to reduce the margin for human error on the roads. Yet if drivers are less engaged or less able to intervene quickly if an incident arises, it could paradoxically increase risk in situations where manual override becomes necessary. The evolution of increasingly connected and autonomous vehicles must go hand-in-hand with transparency, good data stewardship, and appropriate human oversight if the industry is to build trust in a V2X-enabled future.
Driving 2.0
With RADAR, LiDAR and V2X technology, vehicles are on track to become one node in a much larger and more intelligent ecosystem. They will be able to make sense of the world around them, detecting and interacting with the road, other vehicles and their wider external environment so evasive action can be taken early to avoid collisions.
A comprehensive 360-degree view of a vehicle’s surroundings combined with the potential to safely share anonymized data will enable a new era of road safety.
- Learn more about how ST is building towards the future of driving