Networks and Power Requirements for AI Data Centers: A Ten-year Market Forecast and Technology Assessment

Market Research Report

Product Code: 1569518

Publication date: | Publisher: Communications Industry Researchers (CIR) | English | Delivered immediately upon completion of order

This report examines the network and power requirements of AI data centers. It provides an overview of the market and the AI landscape, identifies emerging opportunities, and presents ten-year forecasts for the number of AI data centers, AI servers and port speeds, AI servers by networking technology/protocol, data storage for AI data centers, power consumption by AI data centers, and cooling technologies in AI data centers.

The report begins with a careful assessment of the data center requirements of today's AI products (LLMs and virtual assistants) and examines how those requirements will evolve as, for example, video AI extends its influence. New revenues for the IT, networking, and energy sectors will be vast, but they are also threatened by AI hype. The report takes a realistic look at how the rise of AI inference and training will affect the data center.

Chapter Two focuses on the hyperscale data centers that dominate the market and shows how AI data centers will change as AI drives demand for lower latency. CIR argues that this trend will push the industry not only toward AI edge networks and higher-data-rate networking, but also toward greater attention to data center location. The chapter also examines who in the real estate industry stands to benefit from the rise of AI data centers.

Chapter Three covers the major rethink of data center design, layout, and equipment choices needed to meet the special demands of AI. The chapter focuses on how Ethernet is being adapted, in the form of Ultra Ethernet, to the new requirements of the AI data center, with the prospect of growing to 1.6T. It also covers how servers and storage boxes are being reimagined for the AI era.

We also explore how optical integration enables high-data-rate, low-latency data centers. The chapter does this in part by examining the key integration platforms: chiplets, silicon photonics, and co-packaged optics (CPO). It also takes a close look at AI processors as a business opportunity and at the future roles of CPUs, GPUs, FPGAs, ASICs, and specialized inference and training engines in the data center.

Chapter Four concludes that only nuclear power can "save" AI. It discusses how AI will create a market opportunity for small modular (nuclear) reactors, although wind power may claim only a small share of the AI market. The report also points out that there are many paths to effective cooling of AI data centers, but that most involve liquid cooling strategies. CIR predicts that whichever approach sets the new standard for data center cooling will do well.

Table of Contents

Executive Summary: AI Data Centers and Opportunities to Come

Chapter One: The Unstoppable Rise of the AI Data Center: The Realities of AI

  • Objective of this Report
    • Sources of Information
  • AI: The State of Play
  • LLMs: Future Customer Needs, Technical Needs and Opportunities
  • Virtual Assistants and the AI Infrastructure
  • A Growing Role for Video AI
  • Notes on Machine Learning
  • AI Software Services and AIaaS
  • What Can Possibly Go Wrong?

Chapter Two: Restructuring the Data Center for the AI Revolution: Emerging Opportunities

  • AI Data Centers Begin
  • The Rise of "East-West" Traffic in the AI Data Center
  • How AI Drives the Need for Low Latency in Data Centers
  • The Changing Geography of AI Data Centers: Location
  • AI and Edge Networks
  • Notes on Data Center Interconnection

Chapter Three: Supply: AI Networking: Hardware and the Available Technologies

  • A Preamble to Data Center Hardware
  • AI, Data Centers and the Semiconductor Sector
  • PICs, Interconnects and Optical Integration in the AI Data Center
  • Optical Networking Infrastructure for AI Data Centers
  • Ultra Ethernet in the AI Data Center: IEEE P802.3dj
  • The Future of Co-Packaged Optics in the AI Data Center
  • Rethinking Servers
  • Storage Requirements for AI Data Centers
  • Notes Toward High-Performance AI Data Centers

Chapter Four: Power and Cooling Requirements for AI Data Centers

  • Power and Cooling Requirements for AI Data Centers
  • Power Consumption by AI Data Centers
  • The Nuclear Option: Nuclear Miniaturized
  • Liquid Cooling: The Future of Cool AI Data Centers

Chapter Five: Ten-year Market Forecasts

  • Preamble to the Market Forecasts
  • The Number of AI Data Centers
  • Ten-year Forecast of AI Data Center Connectivity: Servers and Port Speeds
  • Ten-year Forecast of Data Storage for AI Data Centers
  • Ten-year Forecast of Cooling and Power for AI Data Centers

About the Author

Acronyms and Abbreviations Used in this Report


"Networks and Power Requirements for AI Data Centers: A Ten-year Market Forecast and Technology Assessment" is an up-to-the-minute market study forecasting business opportunities flowing from the new breed of AI data centers.

  • Report embraces a realistic take on AI: The report begins with a careful assessment of the data center requirements of today's AI products (LLMs and Virtual Assistants) and how these requirements will evolve as, for example, video AI makes its impact. New revenues for the IT, Networking and Energy sectors will be vast, but also threatened by AI hype. In this report, we take a realistic look at how the rise of AI inference and training will impact the data center
  • How AI data centers will deal with latency issues: Chapter Two focuses on the dominant Hyperscale Data Centers and shows how AI data centers will change as AI leads to demands for lower latency. CIR claims this trend will drive the industry to AI edge networks and higher data rate networking as well as to a growing attention to the location of data centers. This Chapter also examines who will benefit from the rise of AI data centers in the real estate industry
  • Novel products for networking, servers and storage for the AI era. Chapter Three looks at the major re-think in the design, layout and equipment choices for data centers to meet the special needs of AI. Thus, the Chapter focuses on how Ethernet is being adapted to match the emerging requirements of the AI data center, taking the form of Ultra Ethernet with the prospect of growing to 1.6T. It also covers how servers and storage boxes are being rethought for the AI era
  • How optical integration and novel processors will be an AI enabler. Chapter Three also examines how optical integration enables high-data rate, low latency data centers. The chapter accomplishes this goal in part by looking at key integration platforms such as chiplets, silicon photonics and co-packaged optics (CPO). It also takes a close look at AI processors as a business opportunity and the future role that will be played by CPUs, GPUs, FPGAs, ASICs and specialized inference and training engines in the data center
  • New power and cooling sources are vital for AI. Chapter Four concludes that only nuclear can "save" AI. This Chapter discusses how AI is creating a market opportunity for Small Modular (Nuclear) reactors, although wind power may have a small share of the AI market. Meanwhile, the report points out that there are many paths to effective cooling in the AI data center, although most can be characterized as liquid cooling strategies. CIR predicts that whichever approach sets the new standard for data center cooling will do well
  • Report contains detailed ten-year market forecasts. The report contains ten-year projections for the number of AI data centers, AI servers and port speeds, AI servers by networking technology/protocol, data storage for AI data centers, power consumption by AI data centers, and cooling technologies in AI data centers

The strategic analysis provided throughout the report is illustrated with case studies from the recent history of major equipment companies and service providers. This report will be essential reading for networking vendors, service providers, AI software firms, computer companies and investors.

Table of Contents

Executive Summary: AI Data Centers and Opportunities to Come

  • E.1. Summary of AI Data Center Evolution: Ten-year Market Forecasts
  • E.2. Chip Development Opportunities for AI Data Center
  • E.3. PICs, Interconnects, Optical Integration and AI
  • E.4. Connectivity Solutions in the AI Data Center
    • E.4.1. The Future of Co-Packaged Optics in the AI Data Center
  • E.5. Rethinking Servers
  • E.6. Storage Requirements for AI Data Centers
  • E.7. Power Consumption by AI Data Centers
    • E.7.1. Liquid Cooling: The Future of Cool AI Data Centers

Chapter One: The Unstoppable Rise of the AI Data Center: The Realities of AI

  • 1.1. Objective of this Report
    • 1.1.1. Sources of Information
  • 1.2. AI: The State of Play
    • 1.2.1. How AI Data Throughput Creates Opportunities in Data Centers
  • 1.3. LLMs: Future Customer Needs, Technical Needs and Opportunities
    • 1.3.1. Inference Requirements
    • 1.3.2. Training Requirements
    • 1.3.3. LLM Infrastructure Opportunities
  • 1.4. Virtual Assistants and the AI Infrastructure
  • 1.5. A Growing Role for Video AI
    • 1.5.1. Machine Perception (MP)
    • 1.5.2. Comparing Video Services and AI: Cautionary Tales
  • 1.6. Notes on Machine Learning
    • 1.6.1. Neural Networks and Deep Learning
    • 1.6.2. Consumer vs. Enterprise AI
    • 1.6.3. Importance of Consumer AI to Traffic Growth
    • 1.6.4. Impact of Enterprise AI
  • 1.7. AI Software Services and AIaaS
  • 1.8. What Can Possibly Go wrong?
    • 1.8.1. AI Hallucinations
    • 1.8.2. AI Underperforms
    • 1.8.3. A Future with Too Many Features

Chapter Two: Restructuring the Data Center for The AI Revolution: Emerging Opportunities

  • 2.1. AI Data Centers Begin
    • 2.1.1. The Critical Role of AI Clusters: How They Are Being Built Today
  • 2.2. The Rise of "East-West" Traffic in the AI Data Center
  • 2.3. How AI Drives the Need for Low Latency in Data Centers
    • 2.3.1. High Data Rate Interfaces as a Solution to the AI Data Center Latency Problem
  • 2.4. The Changing Geography of AI Data Centers: Location, Location, Location!
    • 2.4.1. Who is Playing the AI Data Center Game in the Real Estate Industry?
    • 2.4.2. Hyperscalers: Dominant Players in the AI Data Center Space
  • 2.5. AI and Edge Networks
    • 2.5.1. Some Notes on Edge Hardware
  • 2.6. Some Notes on Data Center Interconnection

Chapter Three Supply: AI Networking: Hardware and the Available Technologies

  • 3.1. A Preamble to Data Center Hardware
  • 3.2. AI, Data Centers and the Semiconductor Sector
    • 3.2.1. CPUs in the AI Data Center
    • 3.2.2. GPUs in the AI Data Center
    • 3.2.3. Inference and Training Engines: The Hyperscaler Response
    • 3.2.4. FPGAs in the AI Data Center
    • 3.2.5. ASICs
  • 3.3. PICs, Interconnects and Optical Integration in the AI Data Center
    • 3.3.1. Silicon Photonics in the AI Data Center
    • 3.3.2. Other Platforms for Interconnects in the AI Data Center
    • 3.3.3. Some Notes on Chiplets and Interconnects
  • 3.4. Optical Networking Infrastructure for AI Data Centers
  • 3.5. Ultra Ethernet in the AI Data Center: IEEE P802.3dj
    • 3.5.1. FEC and Latency
    • 3.5.2. Ultra Ethernet Consortium (UEC)
  • 3.6. The Future of Co-Packaged Optics in the AI Data Center
    • 3.6.1. Uncertainties about when CPO will happen in the AI Data Center
  • 3.7. Rethinking Servers
    • 3.7.1. Scale-out Networks for AI: Horizontal Scaling
    • 3.7.2. Scale-up Networks
  • 3.8. Storage Requirements for AI Data Centers
  • 3.9. Notes Toward High-Performance AI Data Centers

Chapter Four: Power and Cooling Requirements for AI Data Centers

  • 4.1. Power and Cooling Requirements for AI Data Centers
  • 4.2. Power Consumption by AI Data Centers
    • 4.2.1. Conventional and "Green" Power Solutions for Data Centers
  • 4.3. Nuclear Option: Nuclear Miniaturized
    • 4.3.1. Current Plans for Using Nuclear Power in the AI Sector
  • 4.4. Liquid Cooling: The Future of Cool AI Data Centers
    • 4.4.1. Evolution of Liquid Cooling
    • 4.4.2. Liquid Immersion Cooling
    • 4.4.3. Microconvective Cooling
    • 4.4.4. Direct Chip-chip Cooling
    • 4.4.5. Microchannel Cooling
    • 4.4.6. Oil Cooling

Chapter Five Ten-year Market Forecasts

  • 5.1. Preamble to the Market Forecasts
    • 5.1.1. Do We Have Hard Market Data for AI?
  • 5.2. How Many AI Data Centers are there?
    • 5.2.1. Worldwide AI Data Centers in Operation
  • 5.3. Ten-year Forecast of AI Data Center Connectivity: Servers and Port Speeds
    • 5.3.1. Ten-year Forecast of AI Servers
    • 5.3.2. Ten-year Forecast of AI Server Ports by Speed
    • 5.3.3. Ten-year Forecast of AI Server Ports by Technology Type/ Protocol
  • 5.4. Ten-year Forecast of Data Storage for AI Data Centers
  • 5.5. Ten-year Forecast of Cooling and Power for AI Data Centers

About the Author

Acronyms and Abbreviations Used in this Report

List of Exhibits

  • Exhibit E-1: Opportunities from AI Data Centers at a Glance ($ Millions, Except Data Centers)
  • Exhibit 1-1: Enterprise Applications for Virtual Assistants
  • Exhibit 1-2: Uses of Enterprise AI
  • Exhibit 2-1: Selected Opportunities Stemming from Rebuilding Data Centers
  • Exhibit 2-2: Solutions to Latency Problem
  • Exhibit 3-1: Connectivity Technologies for AI Data Centers
  • Exhibit 3-2: A CPO Future for AI?
  • Exhibit 4-1: Power and Cooling Solutions for AI Data Centers
  • Exhibit 5-1: Ten-Year Forecasts of AI Market ($ Billions)
  • Exhibit 5-2: Worldwide AI Data Centers in Operation
  • Exhibit 5-3: Worldwide AI Server Markets
  • Exhibit 5-4: Distribution of Ports Shipped by Speed ($ Million)
  • Exhibit 5-5: Distribution of Ports Shipped by Protocol ($ Million)
  • Exhibit 5-6: Forecast of Data Storage for AI Data Centers
  • Exhibit 5-7: Ten-year Forecast of Power Consumption
  • Exhibit 5-8: Ten-year Forecast of Cooling Technology in AI Data Centers