Market Research Report
Product code: 1518884

Analysis on DJI Automotive's Autonomous Driving Business, 2024

Publication date: | Publisher: ResearchInChina | English, 140 Pages | Delivery time: within 1-2 business days

Product Code: LYS014

Research on DJI Automotive: leading the NOA market with a unique technology route.

In 2016, DJI Automotive's internal technicians installed a stereo sensor + vision fusion positioning system in a car and drove it successfully. The perception, positioning, decision-making, and planning technologies DJI accumulated in the drone field have been successfully transferred to the intelligent driving field.

Almost all of DJI Automotive's founders and management came from DJI's drone projects. At its founding, DJI Automotive had only about 10 members, mainly staff temporarily transferred from DJI's Flight Control Department and Vision Department.

DJI describes itself as a company specializing in intelligent robot research and regards drones and autonomous vehicles as different forms of intelligent robots. Relying on its unique technology route, DJI holds the lead in the mass production and application of NOA. By DJI Automotive's estimates, around 2 million passenger cars on the road will be equipped with its intelligent driving systems in 2025.

Continuously optimize stereo vision sensors

One of DJI Automotive's core technologies is stereo vision. Even when other sensors such as GPS fail, a drone can still hover, avoid obstacles, and measure its speed based on the visual perception of its stereo cameras.
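The principle behind stereo perception is triangulation: the same point appears at slightly different horizontal positions in the two cameras, and that disparity encodes depth. The sketch below illustrates the textbook relation; the focal length, baseline, and disparity values are illustrative assumptions, not DJI Automotive's actual sensor parameters.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic pinhole stereo relation: Z = f * B / d.

    focal_px     - camera focal length in pixels
    baseline_m   - distance between the two cameras in meters
    disparity_px - horizontal pixel offset of a point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example: 1000 px focal length, 0.4 m baseline, 8 px disparity -> 50 m depth.
print(depth_from_disparity(1000.0, 0.4, 8.0))  # 50.0
```

Because depth falls out of geometry alone, this works with no GPS or map input, which is why a drone can keep estimating range and speed from its stereo pair when other sensors fail.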

After applying stereo vision technology to autonomous vehicles, DJI Automotive continues to optimize stereo vision sensors according to requirements of different autonomous driving levels.

In 2023, to meet the needs of NOA, DJI Automotive launched the second-generation inertial navigation stereo vision system, which eliminates the overall lens hood by adding a customized optical polarizer and removes the rigid connecting rod thanks to a better self-calibration algorithm. This makes the sensor easier to install, and the distance between the two cameras can be flexibly configured from 180 mm to 400 mm. Eliminating the rigid connecting rod is a major advance for stereo vision sensors, allowing stereo cameras to be applied in many more scenarios.
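The flexible 180-400 mm baseline matters because, to first order, depth uncertainty shrinks in proportion to the baseline. A minimal sketch of that relation, with an assumed focal length and sub-pixel disparity error (not published sensor specifications):

```python
def depth_error(z_m: float, focal_px: float, baseline_m: float,
                disp_err_px: float = 0.25) -> float:
    """First-order depth uncertainty of a stereo pair: dZ ~= Z^2 * dd / (f * B)."""
    return z_m ** 2 * disp_err_px / (focal_px * baseline_m)

# Compare the two ends of the configurable baseline range at 60 m range,
# assuming a 1000 px focal length and 0.25 px disparity error.
for b in (0.18, 0.40):
    print(f"baseline {b:.2f} m -> depth error at 60 m: {depth_error(60.0, 1000.0, b):.2f} m")
```

Under these assumptions the wider 400 mm baseline cuts the 60 m depth error from about 5.0 m to about 2.25 m, which is one reason a freely configurable camera spacing is useful across vehicle models.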

To meet the needs of L3 autonomous driving, in 2024 DJI Automotive introduced a LiDAR-vision system that combines LiDAR, a stereo sensor, a long-focus mono camera, and inertial navigation. Compared with the "LiDAR + front camera" solution currently common on the market, the system matches its full performance and replaces all of its functions while cutting costs by 30% to 40%. Thanks to the integrated design, the LiDAR-vision solution can also be built into the cabin as a whole, reducing overall installation costs.

The LiDAR-vision solution can further enhance the safety of longitudinal vehicle control. Thanks to LiDAR's precise ranging and its robustness to lighting conditions, the solution can further improve the safety and comfort of the intelligent driving system in scenarios such as close-range cut-ins, complex urban traffic flow, response to vulnerable road users (VRUs), arbitrary obstacle avoidance, detours, and VRUs at night.
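To see why precise ranging helps longitudinal control in a close-range cut-in, consider a basic time-to-collision (TTC) check, a standard quantity in longitudinal planning. The sketch below is a hedged illustration with hypothetical numbers; a production system fuses many more signals than a single range and speed pair.

```python
def time_to_collision(gap_m: float, ego_speed_mps: float, lead_speed_mps: float) -> float:
    """TTC = gap / closing speed; infinite when the gap is not closing."""
    closing_speed = ego_speed_mps - lead_speed_mps
    if closing_speed <= 0:
        return float("inf")
    return gap_m / closing_speed

# Hypothetical cut-in: a vehicle merges 12 m ahead, travelling 4 m/s slower
# than the ego vehicle -> 3 s before the gap closes.
print(time_to_collision(12.0, 20.0, 16.0))  # 3.0
```

An error of a couple of meters in the measured gap shifts this TTC by a substantial fraction of a second at close range, which is why LiDAR's ranging precision translates directly into smoother, safer braking decisions.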

Use drone technologies for data acquisition and simulation

Among the three autonomous driving data acquisition methods, acquisition by vehicles is the most common, but the proportion of effective data is low, the collecting vehicle easily disturbs the real behavior of surrounding vehicles, and data in sensor blind spots cannot be recorded. Another method is acquisition in the field, which suffers from low flexibility and insufficient reliability as a result of angle skew and low image accuracy.

According to in-depth research by fka, the automotive technology research institute of RWTH Aachen University, and DJI Automotive's own practice over the past two years, aerial survey data acquisition by drones has clear advantages. Drones can collect richer and more complete scenario data, directly capturing unobstructed aerial shots of all vehicles in the target vehicle's blind spots. This reflects realistic, interference-free human driving behavior and allows more efficient data collection on specific road sections and in special driving scenarios, for example on/off-ramps and frequent cut-ins.

Why does the implementation of vision-only autonomous driving suddenly accelerate?

Why has the pace of implementing vision-only technology solutions suddenly quickened since 2024? The answer is foundation models. Research shows that a truly autonomous driving system needs at least about 17 billion kilometers of road verification before it is production-ready, because even if existing technology can handle more than 95% of common driving scenarios, problems may still occur in the remaining 5% of corner cases.

Generally, learning a new corner case requires collecting more than 10,000 samples, and the entire cycle takes more than two weeks. Even if a team had 100 autonomous vehicles conducting road tests 24 hours a day, the time required to accumulate the data would be measured in hundreds of years - which is obviously unrealistic.
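The report's back-of-envelope estimate can be reproduced directly. The 60 km/h average test speed below is an assumption added for illustration; the fleet size, duty cycle, and 17-billion-km target come from the text.

```python
TARGET_KM = 17e9        # road verification target from the text
FLEET = 100             # test vehicles
HOURS_PER_DAY = 24      # round-the-clock testing
AVG_SPEED_KMH = 60      # assumed average test speed (illustrative)

km_per_year = FLEET * HOURS_PER_DAY * AVG_SPEED_KMH * 365
years = TARGET_KM / km_per_year
print(f"~{years:.0f} years of continuous testing")  # on the order of hundreds of years
```

At these assumptions the fleet covers about 52.6 million km per year, so the target takes roughly 320 years, consistent with the "hundreds of years" figure above.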

Foundation models are used to quickly reconstruct real scenarios and generate corner cases across various complex scenarios for model training. Foundation models (such as the Pangu model) can shorten the closed-loop cycle for autonomous driving corner cases from more than two weeks to two days.

Currently, DJI Automotive, Baidu, PhiGent Robotics, GAC, Tesla, and Megvii, among others, have launched vision-only autonomous driving solutions. This report summarizes and analyzes vision-only autonomous driving routes.

Table of Contents

1 Overview

  • 1.1 Profile
  • 1.2 Team
  • 1.3 Development History
  • 1.4 All-scenario Intelligent Driving User Journey Map
  • 1.5 Products Layout (1)
  • 1.6 Products Layout (2)
  • 1.7 Products Layout (3)
  • 1.8 DJI's Judgment on Evolution Trends of High-level Intelligent Driving

2 Core Technologies

  • 2.1 Inertial Navigation Stereo Vision: Developed to the 2nd Generation
  • 2.2 Stereo Vision Perception Technology (1)
  • 2.3 Stereo Vision Perception Technology (2)
  • 2.4 Parameter Comparison between Vision Sensors
  • 2.5 LiDAR-vision System and Specifications
  • 2.6 Intelligent Driving Domain Controller
  • 2.7 Domain Controller Middleware
  • 2.8 Lingxi Intelligent Driving System 2.0
  • 2.9 BEV Perception Technology

3 Intelligent Driving Solutions

  • 3.1 Intelligent Driving Solutions
  • 3.2 Solution 1
  • 3.3 Solution 2
  • 3.4 Solution 3
  • 3.5 Solution 4
  • 3.6 Solution 5
  • 3.7 Solution 6
  • 3.8 Solution Comparison: Parking Functions and Sensor Configurations
  • 3.9 Solution Comparison: Driving Functions and Sensor Configurations
  • 3.10 Lingxi Intelligent Driving Technology Route

4 Mass Production and Application of DJI Automotive's Intelligent Driving Solutions

  • 4.1 Application Case 1
  • 4.2 Application Case 2
  • 4.3 Application Case 3
  • 4.4 Lingxi Intelligent Driving 2.0 for Baojun Yunduo
  • 4.5 Application Case 4
  • 4.6 Application Case 5

5 How Does DJI Automotive Develop Its NOA System?

  • 5.1 Introduction to DJI Automotive's NOA Solution
  • 5.2 How to Choose Perception Routes for Global Vision
  • 5.3 How to Establish Environment Perception and Prediction Capabilities
  • 5.4 How to Establish High-precision Local Pose Estimation Capabilities
  • 5.5 DJI Automotive's Algorithms and Models to Enable NOA
  • 5.6 How DJI Automotive Realizes NOA (1)
  • 5.7 How DJI Automotive Realizes NOA (2)
  • 5.8 How DJI Automotive Realizes NOA (3)
  • 5.9 How DJI Automotive Realizes NOA (4)
  • 5.10 How DJI Automotive Realizes NOA (5)
  • 5.11 How DJI Automotive Realizes NOA (6)
  • 5.12 How DJI Automotive Realizes NOA (7)
  • 5.13 How DJI Automotive Realizes NOA (8)
  • 5.14 How DJI Automotive Realizes NOA (9)
  • 5.15 How DJI Automotive Realizes NOA (10)
  • 5.16 How DJI Automotive Realizes NOA (11)
  • 5.17 How to Ensure Reliability