Cover
Market Research Report
Commodity Code: 2007699

Embodied AI Robot Large Model (Including VLA) Research Report, 2026

Published: | Publisher: ResearchInChina | English, 480 Pages | Delivery: within 1-2 business days at the earliest



Product Code: BHY011

Research on Robot Large Models: World Models Are About to Become Standard, and OEMs Enter the Market to Accelerate Mass Production and Application

ResearchInChina has released the Embodied AI Robot Large Model (Including VLA) Research Report, 2026, which focuses on the research, analysis, and summary of the following content:

The basic concepts, industrial ecosystem map, multi-dimensional classification (application scope, capability modality, architecture), industry development drivers, key technology development directions, and commercialization modes of Embodied AI robot large models;

The layout planning, team building, core talents, large model products and their applications, detailed introduction and implementation status of Embodied AI robot large model products, Embodied AI ecosystem partners, and recent key dynamics of 11 tech giants in the Embodied AI robot field, including Alibaba Group, NVIDIA, Google DeepMind, OpenAI, Microsoft, Huawei, Tencent RoboticsX, Baidu, ByteDance, iFlytek, and SenseTime;

The profile, development history and planning, robot products and large model installation, detailed introduction of self-developed large models, large model ecosystem cooperation, and recent key dynamics of 10 well-known robot enterprises, including UBTECH Robotics, Unitree Robotics, AgiBot, Leju Robotics, Galbot, RobotEra, FigureAI, Sanctuary AI, 1X Technologies, and Neura Robotics;

The layout planning, team building, core talents, robot products and large model installation, summary of large model products, detailed introduction of Embodied AI robot large model products, Embodied AI ecosystem partners, and recent key dynamics of 11 OEMs in the Embodied AI robot field, including Tesla, Toyota, Honda, Hyundai, Xiaomi, XPeng, GAC Group, Chery, Leapmotor, BYD, and Dongfeng Motor. In addition, this report summarizes the layout of 13 other global OEMs in Embodied AI robot field.

Unlike traditional robot control algorithms, Embodied AI robot large models ("robot large models" for short) can make end-to-end or hierarchical decisions without precise modeling, and can operate in unstructured, open environments (homes, outdoors, cluttered desktops). Compared with general large models, robot large models focus more on the fusion and understanding of multi-modal information (vision + lidar + touch + text, etc.), aiming to complete closed-loop operations in the physical world and output motion commands such as joint angles, speeds, and grasping forces.

In recent years, the Embodied AI robot large model field has shown the following development trends:

1. Embodied AI Players Have Begun to Apply World Models

Currently, robot large models represented by Vision-Language-Action (VLA) models have made significant progress in the "perception-decision-execution" closed loop, enabling robots to understand instructions and generate actions. However, such models still face bottlenecks in coping with the high diversity and uncertainty of the physical world. In essence, they "imitate" patterns in the training data, lacking foresight about the consequences of their actions and an understanding of the underlying physical logic.

World models are introduced precisely to break this limitation. The core of a world model is to give robots the ability to "imagine the future". Trained on multi-modal data, it constructs an internal dynamic representation of the physical environment and can predict state changes over multiple future steps from the current state and planned actions. This means robots can shift from passive instruction followers to active decision-makers capable of "mental deduction". For example, when performing a "pouring water" task, a robot equipped with a world model can not only identify the cup and kettle but also predict the water-flow trajectory, cup tilt angle, and possible spills before acting, thereby planning a safer and more accurate action sequence.
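The predict-then-plan loop described above can be sketched in miniature: a stand-in world model rolls candidate action sequences forward in imagination, and the robot executes the sequence whose predicted outcome best matches the goal. Everything here (the `dynamics` function, the random-shooting planner, all parameters) is an illustrative assumption, not any vendor's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

def dynamics(state, action):
    """Stand-in for a learned world model: predicts the next state
    from the current state and a planned action. A real model would
    be a neural network trained on multi-modal robot data."""
    return state + 0.1 * action  # toy linear dynamics

def imagine(state, action_sequence):
    """Roll the world model forward to 'imagine' future states
    before any real-world execution."""
    trajectory = [state]
    for action in action_sequence:
        state = dynamics(state, action)
        trajectory.append(state)
    return trajectory

def plan(state, goal, num_candidates=64, horizon=5):
    """Pick the candidate action sequence whose imagined final
    state lands closest to the goal (random-shooting planning)."""
    best_seq, best_cost = None, np.inf
    for _ in range(num_candidates):
        seq = rng.uniform(-1.0, 1.0, size=(horizon, state.shape[0]))
        final = imagine(state, seq)[-1]
        cost = float(np.linalg.norm(final - goal))  # e.g. penalize spills here
        if cost < best_cost:
            best_seq, best_cost = seq, cost
    return best_seq, best_cost

state = np.zeros(2)
goal = np.array([0.3, -0.2])
seq, cost = plan(state, goal)
```

In a real system the dynamics function would be a learned multi-modal network, and the cost term would encode task-specific risks such as spills or collisions.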

Driving forces for the application of world models mainly come from three aspects:

Solving the data bottleneck: Collecting high-quality real robot data is extremely costly and limited in scale, and has become a core constraint on capability improvement. World models can serve as powerful "data generators" and "simulation engines", producing massive, controllable, and high-fidelity synthetic training scenarios and greatly reducing reliance on expensive real robot data.

Improving decision and generalization capabilities: Through prediction and deduction, world models give robots a degree of causal reasoning and physical intuition, allowing them to handle new scenarios and new objects not seen in training and to "learn by analogy".

Realizing the collaborative evolution of "cerebrum" and "cerebellum": The industry consensus is that future robots' intelligence will be the result of collaborative evolution of the "cerebrum" (high-level cognition and planning) and the "cerebellum" (low-level motion control). As a key component of the high-level "cerebrum", the world model forms a complementary relationship with execution-oriented models such as VLA, jointly constituting a complete intelligent system.
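The "data generator" role from the first point can be sketched as follows: random actions are rolled out entirely in imagination to produce (state, action, next_state) tuples, with no real robot time spent. The `world_model_step` function is a hypothetical stand-in for a learned dynamics model:

```python
import random

random.seed(0)

def world_model_step(state, action):
    """Stand-in learned dynamics; a real world model would be a
    neural network trained on multi-modal robot data."""
    return state + 0.1 * action

def generate_synthetic_dataset(num_episodes=100, horizon=20):
    """Use the world model as a 'data generator': roll out random
    actions in imagination to produce (s, a, s') training tuples."""
    dataset = []
    for _ in range(num_episodes):
        state = 0.0
        for _ in range(horizon):
            action = random.uniform(-1.0, 1.0)
            next_state = world_model_step(state, action)
            dataset.append((state, action, next_state))
            state = next_state
    return dataset

data = generate_synthetic_dataset()  # 100 episodes x 20 steps = 2000 tuples
```

A production pipeline would additionally filter the generated rollouts for physical plausibility before using them to train a policy.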

Many enterprises have developed their own world models, such as Alibaba's WorldVLA, NVIDIA's WAM, Tencent's Hunyuan 3D World Model, Unitree Robotics' UnifoLM-WMA-0, and AgiBot's GE-1. Among them, Unitree Robotics' UnifoLM-WMA-0, released and open-sourced around September 2025, is designed specifically for general robot learning. It has been adapted to the company's humanoid and quadruped robots and offers two modes: decision and simulation. The decision mode can predict future physical interactions (such as stacking stability and collision risks), correct actions, and improve robustness on complex tasks. The simulation mode can generate high-fidelity synthetic data to address the scarcity of real robot training data.

AgiBot's world model GE-1, released in August 2025, is a video-generative world model for robot control. With a closed-loop architecture of "video generation + policy learning + simulation evaluation", it realizes end-to-end reasoning from "seeing" to "thinking" to "acting". GE-1 works alongside AgiBot's GO-1 series base models: GO-1 focuses on general task planning and common-sense knowledge support, while GE-1 specializes in spatiotemporal prediction and action rehearsal, improving the task success rate and stability of the G2 robot in complex scenarios.

GE-1 was officially deployed on the industrial-grade interactive embodied operation robot G2 in October 2025, and AgiBot announced that it had won an order worth hundreds of millions of yuan from Longcheer Technology. The robot has performed tasks such as "making sandwiches", "pouring tea", and "wiping the desktop".

2. Robot Large Models Achieve Cross-Platform Applications

In the traditional robot development mode, each robot's software and algorithms must be specially developed and optimized for its unique hardware configuration (sensors, actuators, form factor), leading to high R&D costs, long cycles, and non-reusable capabilities. The cross-platform application of robot large models overcomes these drawbacks. By building a powerful end-to-end multi-modal foundation model, developers can implant transferable general intelligence into robots, crossing the limitations of different embodiments (such as humanoid, quadruped, and robotic arm), different tasks, and different environments, and realizing rapid generalization and deployment of capabilities.
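The cross-platform idea, one shared backbone plus lightweight per-embodiment adaptation heads, can be sketched as below. All class names, shapes, and the pad/truncate head are illustrative assumptions, not any vendor's architecture:

```python
from dataclasses import dataclass

@dataclass
class Embodiment:
    name: str
    action_dim: int  # joint/actuator count differs per platform

class CrossPlatformModel:
    """Sketch of 'one training, multi-machine adaptation': a shared
    backbone plus a small per-embodiment output head."""

    def __init__(self, latent_dim=8):
        self.latent_dim = latent_dim
        self.action_dims = {}  # embodiment name -> output head size

    def register(self, emb: Embodiment):
        # Only this mapping is embodiment-specific; the backbone
        # (encode) is shared and trained once across all platforms.
        self.action_dims[emb.name] = emb.action_dim

    def encode(self, observation):
        # Stand-in for a multi-modal encoder producing a
        # platform-agnostic latent representation.
        return [float(sum(observation))] * self.latent_dim

    def act(self, emb_name, observation):
        latent = self.encode(observation)
        dim = self.action_dims[emb_name]
        # Adaptation head: map the shared latent into this platform's
        # action space (pad/truncate as a toy stand-in).
        tiled = latent * (dim // len(latent) + 1)
        return tiled[:dim]

model = CrossPlatformModel()
model.register(Embodiment("humanoid", 6))
model.register(Embodiment("robotic_arm", 7))
obs = [0.2, 0.5]
```

The design point is that adding a new platform means registering one small head, not retraining the shared backbone.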

Starting from 2025, robot large models such as NVIDIA's GR00T series, Google DeepMind's Gemini Robotics, Microsoft's Rho-alpha, Huawei's CloudRobo, and RobotEra's ERA-42 all support cross-robot platform development and cross-scenario applications.

In Q3 2025, NVIDIA released the GR00T N1.6 large model, positioned as a general humanoid robot VLA large model. Through a unified multi-modal interface + modular adaptation layer + simulation-reality collaborative pipeline + hierarchical deployment architecture, it realizes the cross-platform application of "one training, multi-machine adaptation". It supports humanoid dual-arm/mobile robotic arms, warehouse AGVs, medical assistive robots, scientific research robots, etc. It can execute tasks for new objects/new scenarios without a large amount of data, and can be flexibly adapted to various application scenarios such as industrial manufacturing, logistics and warehousing, household and commercial services, medical care and health, and scientific research and development.

RobotEra's end-to-end VLA embodied large model ERA-42 was released in December 2024 and initially adapted to its dexterous hand XHAND1. In mid-2025, the model was successively applied across platforms to the wheeled service robot Q5 and the bipedal humanoid robot L7, enabling rapid adaptation to new tasks without pre-programming.

3. An Increasing Number of Robot Large Models Are Open-Sourced

The open-sourcing of large models is more than simple technology sharing. Open-source models gather the wisdom of global developers and can quickly overcome complex "long-tail problems" in the physical world. At the same time, open-sourcing breaks the traditional closed-source business model, allowing small and medium-sized enterprises to build quickly on open-source models and focus their resources on hardware innovation and scenario implementation, forming an industrial pattern of "giants build the platform, and hundreds of enterprises perform on it".

The core of open-sourcing is to lower the R&D threshold, accelerate technological iteration, build ecosystem barriers, promote large-scale implementation, and form a positive flywheel of "open source - ecosystem - data - more powerful models".

Xiaomi's VLA large model for Embodied AI robots, Xiaomi-Robotics-0, was officially open-sourced on February 12, 2026 under the Apache License 2.0 (which allows commercial use, modification, and distribution without copyleft "contagion"), with full-stack, unreserved open-sourcing of the complete code, pre-trained weights, technical documents, papers, deployment solutions, etc. Xiaomi-Robotics-0 reuses Xiaomi's autonomous driving perception/decision technology to realize technology interoperability between robots and automobiles. It adopts a Mixture of Experts (MoE) architecture that separates the "cerebrum" (vision-language understanding) from the "cerebellum" (action execution). This design mitigates the inference-latency problem that can affect traditional VLA models, making it better suited to consumer robot products that require real-time response.
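The cerebrum/cerebellum separation enables the hierarchical fast-slow pattern: an expensive planner runs at low frequency while a lightweight controller tracks its latest output on every control tick. This is a generic sketch of that pattern under assumed names, not Xiaomi's implementation:

```python
def slow_cerebrum(observation):
    """Stand-in for the vision-language 'cerebrum': expensive,
    runs at low frequency, outputs a high-level subgoal."""
    return {"subgoal": "reach", "target": observation["object_pos"]}

def fast_cerebellum(state, plan):
    """Stand-in for the action 'cerebellum': cheap, runs every
    control tick, tracks the latest subgoal in real time."""
    target = plan["target"]
    return [0.1 * (t - s) for s, t in zip(state, target)]  # P-control step

def control_loop(ticks=10, plan_every=5):
    state = [0.0, 0.0]
    observation = {"object_pos": [1.0, -1.0]}
    plan = None
    cerebrum_calls = 0
    for tick in range(ticks):
        if tick % plan_every == 0:  # slow path: re-plan occasionally
            plan = slow_cerebrum(observation)
            cerebrum_calls += 1
        action = fast_cerebellum(state, plan)  # fast path: every tick
        state = [s + a for s, a in zip(state, action)]
    return state, cerebrum_calls

final_state, cerebrum_calls = control_loop()
```

Because the slow model is consulted only once per `plan_every` ticks, its inference latency no longer bounds the control frequency.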

4. OEMs Enter the Market to Solve the Scarcity of Real Data for Embodied AI Robot Large Models and Provide Field Verification Scenarios

The entry of multiple OEMs into the Embodied AI and humanoid robot track brings massive industrial scenario data, automotive-grade sensor data, and a mature autonomous driving technology stack to Embodied AI large models (VLA models, world models, etc.). Algorithms such as BEV perception, multi-modal fusion, and end-to-end decision-making can be directly migrated to robots to train and improve the models' environmental understanding, task planning, and motion control capabilities. OEM production-line scenarios can verify the reliability and success rate of robot large models, expose model defects, and provide highly reliable real robot interaction data for future model correction, effectively narrowing the large gap between simulation and reality.
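The algorithm-migration claim can be illustrated with the standard transfer recipe: freeze a backbone pretrained on driving data and fit only a small task head on scarce robot data. The random-feature backbone below is a toy stand-in, not an actual BEV network:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for a perception backbone pretrained on driving data:
# a fixed (frozen) feature map. In practice this would be a
# BEV/multi-modal network whose weights are reused unchanged.
W_frozen = rng.normal(size=(4, 16))

def backbone(x):
    return np.tanh(x @ W_frozen)  # frozen features, never retrained

# A small amount of real robot data: only the lightweight task
# head is fit on it, which is the point of reusing the backbone.
X_robot = rng.normal(size=(32, 4))
y_robot = X_robot[:, :1] * 2.0  # toy supervision target

features = backbone(X_robot)
head, *_ = np.linalg.lstsq(features, y_robot, rcond=None)

pred = backbone(X_robot) @ head
mse = float(np.mean((pred - y_robot) ** 2))
```

Fitting only the head keeps the robot-data requirement small, which is exactly the scarcity problem the OEMs' pretrained stacks are meant to relieve.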

In addition, OEMs introduce automotive-grade safety standards and hardware co-design into robots, greatly improving the inference latency, reliability, and deployment efficiency of large models. The core supply chains of automobiles and robots (batteries, motors, sensors, domain controllers, etc.) overlap heavily; some institutions estimate the overlap rate exceeds 50%. Economies of scale greatly reduce the cost of core hardware, and model deployment costs fall in step.

For example, to solve the data problem, GAC Group draws on its autonomous driving data collection experience: it sends robots into real scenarios to collect real data while carrying out in-depth adaptation and field verification of core functions, forming a closed-loop data growth model of "learning by using, using by learning". On cost reduction, its robots reuse vehicle components (such as chips and lidar) and achieve 100% localization of key components. GAC has clearly planned to mass-produce its fourth-generation product GoMate Mini in 2027, taking the security scenario as the first commercial application field for its robots.

Table of Contents

1 Overview of Embodied AI Robot Large Models and Key Technology Development Directions

  • 1.1 Core Definitions of Embodied AI Robot Large Models
    • 1.1.1 Definition and Evolution of Embodied AI: Shifting from Weak Interaction to High Autonomy
    • 1.1.2 Definition of Embodied AI Robots: Autonomously Understanding the Environment and Completing Tasks via Artificial Intelligence
    • 1.1.3 Definition of Embodied AI Robot Large Models
  • 1.2 Global Industrial Ecosystem Map of Embodied AI Robot Large Models
  • 1.3 Classification of Embodied AI Robot Large Models
    • 1.3.1 By Application Scope
    • 1.3.2 By Capability Modality
    • 1.3.3 By Architectural Form
  • 1.4 Industry Development Drivers of Embodied AI Robot Large Models
    • 1.4.1 Overview
    • 1.4.2 Policies as the Core Engine
    • 1.4.3 Technology
    • 1.4.4 Market Demand
    • 1.4.5 Increased Capital Investment
    • 1.4.6 Industrial Collaboration
    • 1.4.7 Data Closed Loop Facilitates Model Iteration
    • 1.4.8 Aggregation of Interdisciplinary Talents
  • 1.5 Key Technology Development Directions of Embodied AI Robot Large Models
    • 1.5.1 Overview
    • 1.5.2 Multi-Modal Perception and Unified Representation
    • 1.5.3 World Model
    • 1.5.4 VLA End-to-End Architecture
    • 1.5.5 Hierarchical Fast-Slow System
    • 1.5.6 Enhancing Generalization Ability and Data Efficiency
    • 1.5.7 Safety and Reliability
    • 1.5.8 Lightweight and Edge Deployment
  • 1.6 Commercialization Modes of Embodied AI Robot Large Models
    • 1.6.1 Model Technology Output
    • 1.6.2 Integrated Hardware and Software Sales
    • 1.6.3 Scenario-Based Service Operation
    • 1.6.4 Data and Tool Ecosystem Services
    • 1.6.5 Key Strategies and Evolution Directions for Commercial Implementation of Embodied AI Robot Large Models

2 Global Major Players and Products: Tech Giant Camp

  • 2.1 Summary of Typical Embodied AI Large Model Products of Tech Giants (1)-(3)
  • 2.2 Alibaba Group
    • 2.2.1 Industrial Layout and Planning for Embodied AI Robots
    • 2.2.2 Large Model R&D and Engineering Team: Tongyi Lab
    • 2.2.3 Establishment of the "Robot and Embodied AI Business Unit": Detailed Introduction
    • 2.2.4 Establishment of the "Robot and Embodied AI Business Unit": 2026-2028 Business Plan
    • 2.2.5 Core Team Members and Their Resumes of Embodied AI Robot Large Models
    • 2.2.6 Large Model Product System
    • 2.2.7 Embodied AI Robot Large Models: Milestones in the Development
    • 2.2.8 Embodied AI Robot Large Models: Products Summary
    • 2.2.9 Embodied AI Robot Large Models: RynnBrain Series - The World's First Embodied AI Brain Foundation Model with Spatiotemporal Memory
    • 2.2.10 Embodied AI Robot Large Models: Flagship General Embodied Model RynnBrain30BA3B
    • 2.2.11 Embodied AI Robot Large Models: RynnVLA001
    • 2.2.12 Embodied AI Robot Large Models: RynnEC - Video Multi-Modal Embodied Cognition Model
    • 2.2.13 Embodied AI Robot Large Models: WorldVLA - Fully Autoregressive Embodied AI Large Model
    • 2.2.14 Embodied AI Robot Large Models: Summary of Implemented Robots
    • 2.2.15 Embodied AI Robot Large Models: Ecosystem Partners
  • 2.3 NVIDIA
    • 2.3.1 Profile
    • 2.3.2 Industrial Layout History of Embodied AI Robots
    • 2.3.3 Core Team for Embodied AI Robots
    • 2.3.4 Summary of Embodied AI Robot-Related Products
    • 2.3.5 Embodied AI Robot Large Models: Development History
    • 2.3.6 Embodied AI Robot Large Models: Products Summary
    • 2.3.7 Embodied AI Robot Large Models: Isaac GR00T - VLA Large Model
    • 2.3.8 Embodied AI Robot Large Models: Dream Zero - World Action Model
    • 2.3.9 Embodied AI Robot Large Models: Implementation Status
    • 2.3.10 Embodied AI Robot Large Models: Ecosystem Partners
    • 2.3.11 Embodied AI Robot Large Models: Key Dynamics
  • 2.4 Google DeepMind
    • 2.4.1 Core Team for Embodied AI Robots: Google DeepMind
    • 2.4.2 Profile
    • 2.4.3 Development History
    • 2.4.4 Core Research Directions: 10 Major Fields
    • 2.4.5 Core Team Members and Their Resumes
    • 2.4.6 Summary of Large Models
    • 2.4.7 Major Large Model: Gemini
    • 2.4.8 Embodied AI Robot Large Models: Gemini Robotics
  • 2.5 OpenAI
    • 2.5.1 Profile
    • 2.5.2 Financing History: Valuation Increased More Than 25 Times in Three Years
    • 2.5.3 Development History
    • 2.5.4 Organizational Structure
    • 2.5.5 Product Matrix
    • 2.5.6 Industrial Layout and Planning for Embodied AI Robots
    • 2.5.7 Core Team Members and Resumes of the Humanoid Robot Lab
    • 2.5.8 Embodied AI Robot Large Models: Products Summary
    • 2.5.9 Embodied AI Robot Large Models: GPT-5 Embodied Adaptation Version
    • 2.5.10 Embodied AI Robot Large Models: VLA Foundation Model
    • 2.5.11 Embodied AI Robot Large Models: Ecosystem Partners
  • 2.6 Microsoft
    • 2.6.1 Industrial Layout History and Planning of Embodied AI Robots
    • 2.6.2 Team Setup for Embodied AI Robots
    • 2.6.3 Core Members and Their Resumes of the Embodied AI Team
    • 2.6.4 Summary of Self-Developed Large Model Products
    • 2.6.5 Embodied AI Robot Large Models: R&D History
    • 2.6.6 Embodied AI Robot Large Models: Rho-alpha - VLA+ Model
    • 2.6.7 Embodied AI Robot Large Models: Ecosystem Partners
    • 2.6.8 Embodied AI Robot Large Models: Recent Key News and Dynamics
  • 2.7 Huawei
    • 2.7.1 Industrial Layout and Planning for Embodied AI Robots
    • 2.7.2 Panoramic Table of Core Teams and Platforms for Embodied AI Robots
    • 2.7.3 Core Team Members and Resumes of the Embodied AI Special Task Group
    • 2.7.4 Overview of Pangu Large Model Products
    • 2.7.5 Pangu Large Model Capabilities: Multi-Modal Technology
    • 2.7.6 Pangu Large Model Capabilities: Reasoning Technology
    • 2.7.7 Pangu Large Model AI Cloud Services
    • 2.7.8 Embodied AI Robot Large Models: Cloud Robo
    • 2.7.9 Embodied AI Robot Large Models: Ecosystem Partners
  • 2.8 Tencent RoboticsX
    • 2.8.1 Profile (1)-(2)
    • 2.8.2 Development History
    • 2.8.3 Embodied AI Robot Large Models: Products Summary
    • 2.8.4 Embodied AI Robot Large Models: Tairos-Perception
    • 2.8.5 Embodied AI Robot Large Models: Tairos-Planner
    • 2.8.6 Embodied AI Robot Large Models: Tairos-Action
    • 2.8.7 Embodied AI Robot Large Models: Ecosystem Partners
    • 2.8.8 Embodied AI Robot Large Models: Key Dynamics
  • 2.9 Baidu
    • 2.9.1 Industrial Layout History and Planning for Embodied AI Robots
    • 2.9.2 Introduction to Teams for Embodied AI Robots
    • 2.9.3 Summary of Large Model Products
    • 2.9.4 Embodied AI Robot Large Models: ERNIE Embodied Control Model
    • 2.9.5 Embodied AI Robot Large Models: Ecosystem Partners
    • 2.9.6 Embodied AI Robot Large Models: Key Dynamics
  • 2.10 ByteDance
    • 2.10.1 Industrial Layout History and Planning for Embodied AI Robots
    • 2.10.2 Introduction to Teams for Embodied AI Robots
    • 2.10.3 Core Team Members and Their Resumes of SeedRobotics for Embodied AI Robots
    • 2.10.4 Summary of Large Model Products
    • 2.10.5 Embodied AI Robot Large Models: Products Summary
    • 2.10.6 Embodied AI Robot Large Models: GR Series - Robot Cerebellum
    • 2.10.7 Embodied AI Robot Large Models: Robix - Robot Cerebrum
    • 2.10.8 Embodied AI Robot Large Models: M3-Agent - Multi-Modal Long-Term Memory
    • 2.10.9 Embodied AI Robot Large Models: Ecosystem Partners
  • 2.11 iFlytek
    • 2.11.1 Industrial Layout and Planning for Embodied AI Robots
    • 2.11.2 Related Teams/Enterprises for Embodied AI Robots
    • 2.11.3 Summary of Large Model Products
    • 2.11.4 Embodied AI Robot Large Models: Products Summary
    • 2.11.5 Embodied AI Robot Large Models: iFlyBot-VLM
    • 2.11.6 Embodied AI Robot Large Models: iFlyBot-VLA
    • 2.11.7 Embodied AI Robot Large Models: Ecosystem Partners
    • 2.11.8 Embodied AI Robot Large Models: Key Dynamics
  • 2.12 SenseTime
    • 2.12.1 Industrial Layout and Planning for Embodied AI Robots
    • 2.12.2 Related Teams/Enterprises for Embodied AI Robots
    • 2.12.3 Summary of Large Model Products
    • 2.12.4 Embodied AI Robot Large Models: Products Summary
    • 2.12.5 Embodied AI Robot Large Models: Wuneng Embodied AI Platform
    • 2.12.6 Embodied AI Robot Large Models: A1 - Embodied Super Brain
    • 2.12.7 Embodied AI Robot Large Models: Ecosystem Partners
    • 2.12.8 Embodied AI Robot Large Models: Key Dynamics

3 Global Major Players and Products: Robot Enterprise Camp

  • 3.1 Summary of Typical Embodied AI Large Model Products of Robot Enterprises (1)-(3)
  • 3.2 UBTECH Robotics (UBTECH)
    • 3.2.1 Profile
    • 3.2.2 Revenue
    • 3.2.3 Overview of Robot Products
    • 3.2.4 Core Technology System
    • 3.2.5 Development Strategy and Planning
    • 3.2.6 Embodied AI Robot Large Models: BrainNet Architecture
    • 3.2.7 Layout of Embodied AI Robot Large Models
    • 3.2.8 Embodied AI Robot Large Models: Core Information of Three Major Models
    • 3.2.9 Embodied AI Robot Large Models: Development History of Thinker Multi-Modal Large Model
    • 3.2.10 Embodied AI Robot Large Models: Thinker - Multi-Modal Large Model for Humanoid Robots
    • 3.2.11 Embodied AI Robot Large Models: Details of Large Model Installation on Humanoid Robots
    • 3.2.12 Embodied AI Robot Large Models: Ecosystem Partners
  • 3.3 Unitree Robotics
    • 3.3.1 Profile
    • 3.3.2 Market and Product Strategic Planning
    • 3.3.3 Embodied AI Robot Large Models: Development History
    • 3.3.4 Embodied AI Robot Large Models: Self-Developed UnifoLM Series
    • 3.3.5 Embodied AI Robot Large Models: UnifoLM-WMA-0
    • 3.3.6 Embodied AI Robot Large Models: UnifoLM-VLA-0
    • 3.3.7 Embodied AI Robot Large Models: Ecosystem Partners
    • 3.3.8 Embodied AI Robot Large Models: Details of Large Models Adapted to Robots
  • 3.4 AgiBot
    • 3.4.1 Profile
    • 3.4.2 Product Overview
    • 3.4.3 Embodied AI Robot Large Models: Five Self-Developed Core Models
    • 3.4.4 Embodied AI Robot Large Models: GO-1
    • 3.4.5 Embodied AI Robot Large Models: Guiguang Dongyu Large Model
    • 3.4.6 Embodied AI Robot Large Models: WorkGPT
    • 3.4.7 Embodied AI Robot Large Models: ActionGPT - Motion Large Model
    • 3.4.8 Embodied AI Robot Large Models: GE-1 - World Model
    • 3.4.9 Embodied AI Robot Large Models: Ecosystem Partners
    • 3.4.10 Embodied AI Robot Large Models: Details of Large Model Installation on Humanoid Robots
    • 3.4.11 "Project A"
  • 3.5 Leju Robotics
    • 3.5.1 Profile
    • 3.5.2 Development History
    • 3.5.3 Product Overview
    • 3.5.4 Development Strategy and Planning
    • 3.5.5 Embodied AI Robot Large Models: Development History
    • 3.5.6 Embodied AI Robot Large Models: Embodied AI Module (Self-Developed Multi-Modal by Leju) & Education Large Model
    • 3.5.7 Embodied AI Robot Large Models: Ecosystem Partners
    • 3.5.8 Embodied AI Robot Large Models: Details of Large Model Installation on Humanoid Robots
    • 3.5.9 Latest Dynamics
  • 3.6 Galbot
    • 3.6.1 Profile
    • 3.6.2 Core Team Members
    • 3.6.3 Product Overview
    • 3.6.4 Strategic Planning
    • 3.6.5 Embodied AI Robot Large Models: Summary of Self-Developed Large Models
    • 3.6.6 Embodied AI Robot Large Models: GraspVLA - Grasping Foundation Model
    • 3.6.7 Embodied AI Robot Large Models: Navigation Large Model
    • 3.6.8 Embodied AI Robot Large Models: GroceryVLA - Retail Scenario Large Model
    • 3.6.9 Embodied AI Robot Large Models: Ecosystem Partners
  • 3.7 RobotEra
    • 3.7.1 Profile
    • 3.7.2 Overview of Robot Products
    • 3.7.3 Four Stages of Embodied AI Robot Large Model Exploration
    • 3.7.4 Embodied AI Robot Large Models: ERA-42
    • 3.7.5 Embodied AI Robot Large Models: CtrlWorld - Controllable Generation World Model
    • 3.7.6 Embodied AI Robot Large Models: Joining Two Top Industry-University-Research Alliances Simultaneously
    • 3.7.7 Embodied AI Robot Large Models: Joint Open-Sourcing of AIGC Robot Large Model with Tsinghua University
    • 3.7.8 Embodied AI Robot Large Models: Ecosystem Partners
  • 3.8 FigureAI
    • 3.8.1 Profile
    • 3.8.2 Embodied AI Robot Large Models: Milestones in the Development
    • 3.8.3 Embodied AI Robot Large Models: Helix - End-to-End VLA General Embodied AI Model
    • 3.8.4 Embodied AI Robot Large Models: Details of Large Model Installation on Humanoid Robots
    • 3.8.5 Embodied AI Robot Large Models: Industry Chain Partners
  • 3.9 Sanctuary AI
    • 3.9.1 Profile
    • 3.9.2 Core Team Members and Their Resumes
    • 3.9.3 Embodied AI Robot Large Models: Development History
    • 3.9.4 Embodied AI Robot Large Models: Details of Large Model Installation on Humanoid Robots
    • 3.9.5 Embodied AI Robot Large Models: Summary of Self-developed Large Model
    • 3.9.6 Embodied AI Robot Large Models: Carbon(TM)v3
    • 3.9.7 Embodied AI Robot Large Models: LBM - Large Behavior Model
    • 3.9.8 Embodied AI Robot Large Models: Industry Chain Partners
  • 3.10 1X Technologies
    • 3.10.1 Profile
    • 3.10.2 Development History
    • 3.10.3 Core Team Members and Background
    • 3.10.4 Embodied AI Robot Large Models: R&D and Deployment History
    • 3.10.5 Embodied AI Robot Large Models: Summary of Self-Developed Large Models and Their Implementation Status
    • 3.10.6 Embodied AI Robot Large Models: Redwood AI
    • 3.10.7 Embodied AI Robot Large Models: 1X World Model
    • 3.10.8 Embodied AI Robot Large Models: Ecosystem Partners
  • 3.11 Neura Robotics
    • 3.11.1 Profile
    • 3.11.2 Embodied AI Robot Large Models: Development History
    • 3.11.3 Embodied AI Robot Large Models: Summary of Self-Developed Model System
    • 3.11.4 Embodied AI Robot Large Models: NEFM
    • 3.11.5 Embodied AI Robot Large Models: Ecosystem Partners
    • 3.11.6 Latest Dynamics: China Headquarters Settled in Xiaoshan, Hangzhou

4 Global Major Players and Products: Cross-Border OEMs Camp

  • 4.1 Summary of Typical Embodied AI Large Model Products of OEMs (1)-(4)
  • 4.2 Tesla
    • 4.2.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.2.2 Strategic Positioning in the Embodied AI Field
    • 4.2.3 Team Setup for Embodied AI Robots
    • 4.2.4 Core Team Members and Resumes of the Optimus Robot Team
    • 4.2.5 Embodied AI Robot Products and Large Model Deployment Status
    • 4.2.6 Embodied AI Robot Large Models: Summary of Large Model Products
    • 4.2.7 Embodied AI Robot Large Models: FSD - End-to-End Embodied Control Model
    • 4.2.8 Embodied AI Robot Large Models: Grok4 - Embodied Interaction Large Model
    • 4.2.9 Optimus Humanoid Robot Brain Adopting Dojo Supercomputer System
    • 4.2.10 Multiplexing FSD Software Algorithms for Robots
    • 4.2.11 AI Humanoid Robot Software Algorithms - Perception Algorithms
    • 4.2.12 AI Humanoid Robot Software Algorithms - Motion Planning
    • 4.2.13 Embodied AI Robot Large Models: Ecosystem Partners
    • 4.2.14 Embodied AI Robot Large Models: Key Dynamics
  • 4.3 Toyota
    • 4.3.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.3.2 Related Teams/Companies for Embodied AI Robots
    • 4.3.3 Core Members and Their Resumes of the Research Institute
    • 4.3.4 Embodied AI Robot Products and Large Model Deployment Status
    • 4.3.5 Embodied AI Robot Large Models: Summary of Large Model
    • 4.3.6 Embodied AI Robot Large Models: LBM
    • 4.3.7 Embodied AI Robot Large Models: Ecosystem Partners
    • 4.3.8 Embodied AI Robot Large Models: Key Dynamics
  • 4.4 Honda
    • 4.4.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.4.2 Related Teams/Companies for Embodied AI Robots
    • 4.4.3 Embodied AI Robot Products and Large Model Deployment Status
    • 4.4.4 Release of 2026 Core Technology Roadmap for Embodied Robots
    • 4.4.5 Overview of Embodied AI Robot Large Models
    • 4.4.6 Embodied AI Robot Large Models: Key Dynamics
  • 4.5 Hyundai
    • 4.5.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.5.2 AIRobotics Strategy: Partnering Human Progress
    • 4.5.3 Related Teams/Companies for Embodied AI Robots
    • 4.5.4 Holding a Controlling Stake in Boston Dynamics
    • 4.5.5 Boston Dynamics: Profile
    • 4.5.6 Boston Dynamics: Core Team Members and Resumes
    • 4.5.7 Embodied AI Robot Products and Large Model Deployment Status
    • 4.5.8 Embodied AI Robot Large Models: Overview of Large Model
    • 4.5.9 Embodied AI Robot Large Models: Ecosystem Partners
    • 4.5.10 Embodied AI Robot Large Models: Key Dynamics
  • 4.6 Xiaomi
    • 4.6.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.6.2 Related Teams/Companies for Embodied AI Robots
    • 4.6.3 Panoramic View of Investment Ecosystem in the Embodied AI Robot Field
    • 4.6.4 Embodied AI Robot Products and Large Model Deployment Status
    • 4.6.5 Embodied AI Robot Large Models: Overview of Large Models
    • 4.6.6 Embodied AI Robot Large Models: Xiaomi-Robotics-0
    • 4.6.7 Embodied AI Robot: Self-Developed Software Algorithms
    • 4.6.8 Empowerment of Embodied AI Robots by Automotive Technology
    • 4.6.9 Embodied AI Robot Large Models: Ecosystem Partners
    • 4.6.10 Embodied AI Robot Large Models: Key Dynamics
  • 4.7 XPeng
    • 4.7.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.7.2 Related Teams/Companies for Embodied AI Robots
    • 4.7.3 Key Personnel for Embodied AI Robots and Their Resumes
    • 4.7.4 Product Iteration History and Large Model Deployment Status of Embodied AI Robots
    • 4.7.5 Embodied AI Robot Large Models: Summary of Large Models
    • 4.7.6 Embodied AI Robot Large Models: VLT - Robot-Specific Decision Large Model
    • 4.7.7 Embodied AI Robot Large Models: 2nd-Generation VLA Physical World Large Model
    • 4.7.8 Embodied AI Robot Large Models: VLM - Multi-Modal Interaction Large Model
    • 4.7.9 Reuse of Automotive Algorithm Technology for Humanoid Robots
    • 4.7.10 Embodied AI Robot Large Models: Ecosystem Partners
    • 4.7.11 Embodied AI Robot Large Models: Key Dynamics
  • 4.8 GAC Group
    • 4.8.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.8.2 Related Teams/Companies for Embodied AI Robots
    • 4.8.3 Establishment of Huilun Technology: Responsible for Core R&D, Production and Sales of Robots
    • 4.8.4 Huilun Technology: Core Members and Their Resumes
    • 4.8.5 Embodied AI Robot Products and Large Model Deployment Status
    • 4.8.6 Embodied AI Robot Large Models: Summary of Large Models
    • 4.8.7 Embodied AI Robot Large Models: GoMate - General Multi-Modal Large Model
    • 4.8.8 Embodied AI Robot Large Models: Embodied AI Motion Control Small Model
    • 4.8.9 Embodied AI Robot Large Models: GoMate Mini - Security Vertical Large Model
    • 4.8.10 Application of Autonomous Driving Technology to Humanoid Robots
    • 4.8.11 Embodied AI Robot Large Models: Ecosystem Partners
    • 4.8.12 Embodied AI Robot Large Models: Summary of Key Dynamics
  • 4.9 Chery
    • 4.9.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.9.2 Related Teams/Companies for Embodied AI Robots
    • 4.9.3 Embodied AI Robot Products and Large Model Deployment Status
    • 4.9.4 Embodied AI Robot Large Models: Overview of Large Models
    • 4.9.5 Embodied AI Robot Large Models: Ecosystem Partners
  • 4.10 Leapmotor
    • 4.10.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.10.2 Related Teams/Companies for Embodied AI Robots
    • 4.10.3 Core Members of the Embodied AI Robot Team and Their Resumes
    • 4.10.4 Embodied AI Robot Products and Large Model Deployment Status
    • 4.10.5 Embodied AI Robot Large Models: Summary and Planning of Large Models
    • 4.10.6 Embodied AI Robot Large Models: Ecosystem Partners
    • 4.10.7 Embodied AI Robot Large Models: Key Dynamics
  • 4.11 BYD
    • 4.11.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.11.2 Related Teams/Companies for Embodied AI Robots
    • 4.11.3 Key Personnel for Embodied AI Robots and Their Resumes
    • 4.11.4 Investment Ecosystem in the Embodied AI Robot Field
    • 4.11.5 Embodied AI Robot Products and Large Model Deployment Status
    • 4.11.6 Summary of Embodied AI Robot Large Models
    • 4.11.7 Embodied AI Robot Large Models: Key Dynamics
  • 4.12 Dongfeng Motor
    • 4.12.1 Industrial Layout History and Planning for Embodied AI Robots
    • 4.12.2 Related Teams/Companies for Embodied AI Robots
    • 4.12.3 Embodied AI Robot Products and Large Model Deployment Status
    • 4.12.4 Embodied AI Robot Large Models: Taiji Large Model
    • 4.12.5 Embodied AI Robot Large Models: Ecosystem Partners
    • 4.12.6 Embodied AI Robot Large Models: Key Dynamics
  • 4.13 Summary of Other Major Global OEMs' Layouts in the Embodied AI Robot Field (1)-(4)