FrontierNews.ai

Why Raw Radar Data Is Becoming the Secret Weapon in Autonomous Vehicle Development

Autonomous vehicle developers are increasingly demanding direct access to raw sensor data rather than relying on closed systems, and a new radar platform from South Korean company bitsensing is positioning itself to meet that need. The AIR4D Imaging Radar, launched specifically for autonomous driving applications, provides engineers with unfiltered 4D sensor information including point cloud and Doppler data, allowing them to train perception models and validate performance using real-world testing information.

Why Does Raw Radar Data Matter for Self-Driving Cars?

The distinction between open and closed sensor systems has become central to how autonomous vehicle programs move from controlled pilot zones to broader commercial deployment. Most existing 4D radar products operate as closed systems, meaning manufacturers process the sensor data internally and provide only the final results to customers. This approach limits engineers' ability to refine their own machine learning models or troubleshoot performance issues in specific conditions.

Access to raw radar outputs changes that equation. Engineers can use the underlying data to train perception models, validate how well their systems perform, and adjust software based on information gathered during real-world testing. For companies developing robotaxis, autonomous shuttles, and other self-driving fleets, this capability becomes increasingly important as they scale beyond test programs and face pressure to demonstrate consistent operation outside tightly controlled environments.

bitsensing designed AIR4D specifically for autonomous driving rather than adapting an existing radar product built for advanced driver assistance systems in passenger cars. That distinction matters because autonomous vehicle developers generally need sensor data tailored to the machine learning models used for perception, planning, and object tracking.

What Technical Advantages Does AIR4D Offer?

The radar system addresses several practical challenges that self-driving vehicles face in real-world conditions. It can detect objects up to 300 meters away, giving automated driving systems more time to identify hazards and react. The sensor operates effectively in near-total darkness and remains stable in rain, fog, and snow, conditions where camera systems alone often struggle.

One of the most significant improvements is the addition of elevation data to the 4D imaging capability. This extra dimension allows the system to build a richer picture of surroundings and better distinguish between different types of road users and objects. In urban driving scenarios, where automated systems must separate pedestrians from vehicles, identify roadside structures, and track movement across multiple lanes, this capability becomes essential.
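To make the "extra dimension" concrete, here is a minimal sketch of how a 4D radar detection with elevation might be converted to a 3D position. This is not bitsensing's actual data format; the field order, angle conventions, and axis orientation (x forward, y left, z up) are assumptions for illustration.

```python
import math

def radar_to_cartesian(range_m, azimuth_deg, elevation_deg):
    """Convert one radar detection from spherical coordinates
    (range, azimuth, elevation) to Cartesian (x forward, y left, z up).
    Conventions are illustrative; real sensors vary."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return x, y, z

# A return 100 m ahead with ~5.74 degrees of elevation resolves to roughly
# 10 m above the sensor -- e.g., an overhead gantry rather than a stopped car.
x, y, z = radar_to_cartesian(100.0, 0.0, 5.74)
```

Without the elevation angle, that overhead-sign return and a stationary vehicle at the same range and azimuth would be indistinguishable, which is why 3D-only radars are prone to false braking events.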

The radar also provides direct velocity readings for individual objects, allowing a vehicle to determine how quickly nearby cars, cyclists, and pedestrians are moving. bitsensing designed the product with power and heat efficiency in mind, factors that affect how reliably sensor suites perform when vehicles operate for extended periods in real traffic conditions.
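The Doppler measurement gives speed along the line of sight, which includes the ego vehicle's own motion. A common first processing step is to subtract that contribution; the sketch below assumes a sign convention where negative Doppler means a closing target, which varies between sensors.

```python
import math

def absolute_radial_speed(doppler_mps, ego_speed_mps, azimuth_deg, elevation_deg=0.0):
    """Remove the ego vehicle's contribution from a measured Doppler value
    to estimate the target's radial speed over the ground.
    Assumes negative Doppler = closing target; check your sensor's convention."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    # Component of ego velocity along the line of sight to the detection.
    ego_radial = ego_speed_mps * math.cos(el) * math.cos(az)
    return doppler_mps + ego_radial

# Driving at 15 m/s toward a stationary object dead ahead: the raw Doppler
# reads about -15 m/s, but the compensated speed is ~0, marking it static.
v = absolute_radial_speed(-15.0, 15.0, 0.0)
```

Separating static infrastructure from moving road users this way, per detection and without multi-frame tracking, is one of the practical reasons developers want the raw Doppler values rather than pre-filtered object lists.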

How Are Autonomous Vehicle Companies Using Sensor Data to Improve Performance?

  • Model Training: Engineers use raw sensor outputs to train machine learning perception models specific to their autonomous driving software, rather than relying on vendor-processed data that may not match their system architecture.
  • Performance Validation: Direct access to underlying test data allows companies to validate how well their systems perform in specific conditions like poor visibility, heavy traffic, or adverse weather, identifying gaps before broader deployment.
  • Software Refinement: Fleet operators can adjust and optimize their own software based on real-world information rather than being locked into fixed sensor processing from vendors, enabling faster iteration and improvement cycles.
  • Cost Optimization: The camera-and-radar approach supported by AIR4D could lower per-vehicle sensor costs compared with more complex sensor stacks that combine multiple lidar, camera, and radar systems.

In practice, radar is often used as part of a combined sensing system rather than as a standalone tool. Camera inputs provide image detail while radar contributes distance and velocity measurements. This complementary approach helps autonomous vehicles maintain situational awareness across a wider range of environmental conditions.
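A toy version of that complementary pairing can be sketched as a simple association step: each camera detection (which knows *what* an object is) is matched by bearing to the nearest radar return (which knows *how far* and *how fast*). Real fusion pipelines are far more involved; the data layout and the 2-degree gate here are purely illustrative assumptions.

```python
def attach_radar_to_camera(camera_dets, radar_dets, max_bearing_diff_deg=2.0):
    """Naive association: for each camera detection (label, bearing in degrees),
    find the radar return with the closest azimuth within a tolerance and
    attach its range and radial velocity. Illustrative only."""
    fused = []
    for label, bearing in camera_dets:
        best = min(radar_dets,
                   key=lambda d: abs(d["azimuth"] - bearing),
                   default=None)
        if best and abs(best["azimuth"] - bearing) <= max_bearing_diff_deg:
            fused.append({"label": label,
                          "range_m": best["range"],
                          "radial_velocity_mps": best["doppler"]})
        else:
            # No radar support: keep the camera label but mark kinematics unknown.
            fused.append({"label": label, "range_m": None,
                          "radial_velocity_mps": None})
    return fused

cameras = [("pedestrian", 10.1), ("car", -25.0)]
radars = [{"azimuth": 10.4, "range": 42.0, "doppler": -1.2},
          {"azimuth": -24.6, "range": 88.0, "doppler": -14.0}]
fused = attach_radar_to_camera(cameras, radars)
```

Even in this simplified form, the benefit is visible: the camera alone cannot report that the pedestrian is 42 meters away and slowly closing, and the radar alone cannot report that the return is a pedestrian rather than a pole.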

bitsensing, founded in 2018, has expanded from automotive radar into sectors including smart cities, connected living, and health technology. The company has raised $52 million from investors including AF WPartners, Korea Development Bank, and Mando.

"By delivering high-resolution 4D perception data, including, importantly, all raw data outputs, our goal at bitsensing is to empower autonomous vehicle companies to build systems that operate at speed and at scale," stated Dr. Jae-Eun Lee, chief executive officer of bitsensing.


The wider market for autonomous vehicle sensors remains crowded, with radar, lidar, and camera suppliers all arguing for different system designs as operators seek a balance between cost, redundancy, and real-world reliability. For fleet operators trying to move beyond test programs, access to raw sensor outputs may become increasingly important as they refine in-house software and compete on the basis of their perception and decision-making algorithms rather than sensor hardware alone.