Networks and Power Requirements for AI Data Centers: A Ten-year Market Forecast and Technology Assessment
Market Research Report
Product Code: 1569518

Networks and Power Requirements for AI Data Centers: A Ten-year Market Forecast and Technology Assessment

Publication Date: | Publisher: Communications Industry Researchers (CIR) | English | Delivered immediately upon completion of order

Price
Description and Table of Contents

This report examines the networking and power requirements of AI data centers, providing an overview of the market and the AI landscape, emerging opportunities, and ten-year forecasts for the number of AI data centers, AI servers and port speeds, AI servers by networking technology/protocol, data storage for AI data centers, power consumption by AI data centers, and cooling technologies in AI data centers.

The report begins with a careful assessment of the data center requirements of today's AI products (LLMs and virtual assistants) and examines how those requirements will evolve as, for example, video AI expands its influence. New revenues for the IT, networking, and energy sectors will be vast, but they are also threatened by AI hype. This report takes a realistic look at how the rise of AI inference and training will affect the data center.

Chapter Two focuses on the hyperscale data centers that dominate the market and shows how AI data centers will change as AI drives demand for lower latency. CIR believes this trend will push the industry not only toward AI edge networks and higher-data-rate networking but also toward greater attention to data center location. The chapter also examines who in the real estate industry will benefit from the rise of AI data centers.

Chapter Three addresses the major rethinking of data center design, layout, and equipment choices needed to meet the special requirements of AI. The chapter focuses on how Ethernet is being adapted, in the form of Ultra Ethernet, to satisfy the new requirements of the AI data center, with data rates expected to grow to 1.6T. It also covers how servers and storage boxes are being reimagined for the AI era.

We also examine how optical integration enables high-data-rate, low-latency data centers, in part by looking at the key integration platforms: chiplets, silicon photonics, and co-packaged optics (CPO). The chapter also takes a close look at AI processors as a business opportunity and at the future roles that CPUs, GPUs, FPGAs, ASICs, and specialized inference and training engines will play in the data center.

Chapter Four concludes that only nuclear power can "save" AI. The chapter discusses how AI will create a market opportunity for small modular (nuclear) reactors, although wind power may hold only a small share of the AI market. Meanwhile, the report points out that there are many paths to effective cooling in the AI data center, though most involve liquid cooling strategies. CIR predicts that whoever sets the new standard for data center cooling will do well.

Table of Contents

Executive Summary: AI Data Centers and Opportunities to Come

Chapter One: The Unstoppable Rise of the AI Data Center: AI Gets Real

  • Objective of this Report
    • Sources of Information
  • AI: The State of Play
  • LLMs: Future Customer Needs, Technical Needs and Opportunities
  • Virtual Assistants and the AI Infrastructure
  • A Growing Role for Video AI
  • Notes on Machine Learning
  • AI Software Services and AIaaS
  • What Can Possibly Go Wrong?

Chapter Two: Restructuring the Data Center for the AI Revolution: Emerging Opportunities

  • AI Data Centers Begin
  • The Rise of "East-West" Traffic in the AI Data Center
  • How AI Drives the Need for Low Latency in Data Centers
  • The Changing Geography of AI Data Centers: Location
  • AI and Edge Networks
  • Some Notes on Data Center Interconnection

Chapter Three: Supply: AI Networking: Hardware and the Available Technologies

  • A Preamble to Data Center Hardware
  • AI, Data Centers and the Semiconductor Sector
  • PICs, Interconnects and Optical Integration in the AI Data Center
  • Optical Networking Infrastructure for AI Data Centers
  • Ultra Ethernet in the AI Data Center: IEEE P802.3dj
  • The Future of Co-Packaged Optics in the AI Data Center
  • Rethinking Servers
  • Storage Requirements for AI Data Centers
  • Notes Toward High-Performance AI Data Centers

Chapter Four: Power and Cooling Requirements for AI Data Centers

  • Power and Cooling Requirements for AI Data Centers
  • Power Consumption by AI Data Centers
  • The Nuclear Option: Nuclear Miniaturized
  • Liquid Cooling: The Future of Cool AI Data Centers

Chapter Five: Ten-year Market Forecasts

  • Preamble to the Market Forecasts
  • How Many AI Data Centers Are There?
  • Ten-year Forecast of AI Data Center Connectivity: Servers and Port Speeds
  • Ten-year Forecast of Data Storage for AI Data Centers
  • Ten-year Forecast of Cooling and Power for AI Data Centers

About the Author

Acronyms and Abbreviations Used in this Report

Description and Table of Contents

"Network and Power Requirements for AI Data Centers: A Ten-Year Market Forecast and Technology Forecast" is an up-to-the-minute market study forecasting business opportunities flowing from the new breed of AI data centers.

  • Report embraces a realistic take on AI: The report begins with a careful assessment of the data center requirements of today's AI products (LLMs and Virtual Assistants) and how these requirements will evolve as, for example, video AI makes its impact. New revenues for the IT, Networking and Energy sectors will be vast, but also threatened by AI hype. In this report, we take a realistic look at how the rise of AI inference and training will impact the data center
  • How AI data centers will deal with latency issues: Chapter Two focuses on the dominant Hyperscale Data Centers and shows how AI data centers will change as AI leads to demands for lower latency. CIR claims this trend will drive the industry to AI edge networks and higher data rate networking as well as to a growing attention to the location of data centers. This Chapter also examines who will benefit from the rise of AI data centers in the real estate industry
  • Novel products for networking, servers and storage for the AI era. Chapter Three looks at the major re-think in the design, layout and equipment choices for data centers to meet the special needs of AI. Thus, the Chapter focuses on how Ethernet is being adapted to match the emerging requirements of the AI data center, taking the form of Ultra Ethernet with the prospect of growing to 1.6T. It also covers how servers and storage boxes are being rethought for the AI era
  • How optical integration and novel processors will be an AI enabler. Chapter Three also examines how optical integration enables high-data rate, low latency data centers. The chapter accomplishes this goal in part by looking at key integration platforms such as chiplets, silicon photonics and co-packaged optics (CPO). It also takes a close look at AI processors as a business opportunity and the future role that will be played by CPUs, GPUs, FPGAs, ASICs and specialized inference and training engines in the data center
  • New power and cooling sources are vital for AI. Chapter Four concludes that only nuclear can "save" AI. This Chapter discusses how AI is creating a market opportunity for Small Modular (Nuclear) reactors, although wind power may have a small share of the AI market. Meanwhile, the report points out that effective cooling in the AI data center has many paths leading to it, although most can be characterized as liquid cooling strategies. CIR predicts that whoever creates the new standard for data center cooling will do well
  • Report contains detailed ten-year market forecasts. The report contains ten-year projections for the number of AI data centers, AI servers and port speeds, AI servers by networking technology/protocol, data storage for AI data centers, power consumption by AI data centers, and cooling technologies in AI data centers

The strategic analysis provided throughout the report is illustrated with case studies from the recent history of major equipment companies and service providers. This report will be essential reading for networking vendors, service providers, AI software firms, computer companies and investors.

Table of Contents

Executive Summary: AI Data Centers and Opportunities to Come

  • E.1. Summary of AI Data Center Evolution: Ten-year Market Forecasts
  • E.2. Chip Development Opportunities for AI Data Center
  • E.3. PICs, Interconnects, Optical Integration and AI
  • E.4. Connectivity Solutions in the AI Data Center
    • E.4.1. The Future of Co-Packaged Optics in the AI Data Center
  • E.5. Rethinking Servers
  • E.6. Storage Requirements for AI Data Centers
  • E.7. Power Consumption by AI Data Centers
    • E.7.1. Liquid Cooling: The Future of Cool AI Data Centers

Chapter One: The Unstoppable Rise of the AI Data Center: AI Gets Real

  • 1.1. Objective of this Report
    • 1.1.1. Sources of Information
  • 1.2. AI: The State of Play
    • 1.2.1. How AI Data Throughput Creates Opportunities in Data Centers
  • 1.3. LLMs: Future Customer Needs, Technical Needs and Opportunities
    • 1.3.1. Inference Requirements
    • 1.3.2. Training Requirements
    • 1.3.3. LLM Infrastructure Opportunities
  • 1.4. Virtual Assistants and the AI Infrastructure
  • 1.5. A Growing Role for Video AI
    • 1.5.1. Machine Perception (MP)
    • 1.5.2. Comparing Video Services and AI: Cautionary Tales
  • 1.6. Notes on Machine Learning
    • 1.6.1. Neural Networks and Deep Learning
    • 1.6.2. Consumer vs. Enterprise AI
    • 1.6.3. Importance of Consumer AI to Traffic Growth
    • 1.6.4. Impact of Enterprise AI
  • 1.7. AI Software Services and AIaaS
  • 1.8. What Can Possibly Go Wrong?
    • 1.8.1. AI Hallucinations
    • 1.8.2. AI Underperforms
    • 1.8.3. A Future with Too Many Features

Chapter Two: Restructuring the Data Center for The AI Revolution: Emerging Opportunities

  • 2.1. AI Data Centers Begin
    • 2.1.1. The Critical Role of AI Clusters: How They Are Being Built Today
  • 2.2. The Rise of "East-West" Traffic in the AI Data Center
  • 2.3. How AI Drives the Need for Low Latency in Data Centers
    • 2.3.1. High Data Rate Interfaces as a Solution to the AI Data Center Latency Problem
  • 2.4. The Changing Geography of AI Data Centers: Location, Location, Location!
    • 2.4.1. Who is Playing the AI Data Center Game in the Real Estate Industry?
    • 2.4.2. Hyperscalers: Dominant Players in the AI Data Center Space
  • 2.5. AI and Edge Networks
    • 2.5.1. Some Notes on Edge Hardware
  • 2.6. Some Notes on Data Center Interconnection

Chapter Three: Supply: AI Networking: Hardware and the Available Technologies

  • 3.1. A Preamble to Data Center Hardware
  • 3.2. AI, Data Centers and the Semiconductor Sector
    • 3.2.1. CPUs in the AI Data Center
    • 3.2.2. GPUs in the AI Data Center
    • 3.2.3. Inference and Training Engines: The Hyperscaler Response
    • 3.2.4. FPGAs in the AI Data Center
    • 3.2.5. ASICs
  • 3.3. PICs, Interconnects and Optical Integration in the AI Data Center
    • 3.3.1. Silicon Photonics in the AI Data Center
    • 3.3.2. Other Platforms for Interconnects in the AI Data Center
    • 3.3.3. Some Notes on Chiplets and Interconnects
  • 3.4. Optical Networking Infrastructure for AI Data Centers
  • 3.5. Ultra Ethernet in the AI Data Center: IEEE P802.3dj
    • 3.5.1. FEC and Latency
    • 3.5.2. Ultra Ethernet Consortium (UEC)
  • 3.6. The Future of Co-Packaged Optics in the AI Data Center
    • 3.6.1. Uncertainties about when CPO will happen in the AI Data Center
  • 3.7. Rethinking Servers
    • 3.7.1. Scale-out Networks for AI: Horizontal Scaling
    • 3.7.2. Scale-up Networks
  • 3.8. Storage Requirements for AI Data Centers
  • 3.9. Notes Toward High-Performance AI Data Centers

Chapter Four: Power and Cooling Requirements for AI Data Centers

  • 4.1. Power and Cooling Requirements for AI Data Centers
  • 4.2. Power Consumption by AI Data Centers
    • 4.2.1. Conventional and "Green" Power Solutions for Data Centers
  • 4.3. Nuclear Option: Nuclear Miniaturized
    • 4.3.1. Current Plans for Using Nuclear Power in the AI Sector
  • 4.4. Liquid Cooling: The Future of Cool AI Data Centers
    • 4.4.1. Evolution of Liquid Cooling
    • 4.4.2. Liquid Immersion Cooling
    • 4.4.3. Microconvective Cooling
    • 4.4.4. Direct Chip-chip Cooling
    • 4.4.5. Microchannel Cooling
    • 4.4.6. Oil Cooling

Chapter Five: Ten-year Market Forecasts

  • 5.1. Preamble to the Market Forecasts
    • 5.1.1. Do We Have Hard Market Data for AI?
  • 5.2. How Many AI Data Centers are there?
    • 5.2.1. Worldwide AI Data Centers in Operation
  • 5.3. Ten-year Forecast of AI Data Center Connectivity: Servers and Port Speeds
    • 5.3.1. Ten-year Forecast of AI Servers
    • 5.3.2. Ten-year Forecast of AI Server Ports by Speed
    • 5.3.3. Ten-year Forecast of AI Server Ports by Technology Type/Protocol
  • 5.4. Ten-year Forecast of Data Storage for AI Data Centers
  • 5.5. Ten-year Forecast of Cooling and Power for AI Data Centers

About the Author

Acronyms and Abbreviations Used in this Report

List of Exhibits

  • Exhibit E-1: Opportunities from AI Data Centers at a Glance ($ Millions, Except Data Centers)
  • Exhibit 1-1: Enterprise Applications for Virtual Assistants
  • Exhibit 1-2: Uses of Enterprise AI
  • Exhibit 2-1: Selected Opportunities Stemming from Rebuilding Data Centers
  • Exhibit 2-2: Solutions to Latency Problem
  • Exhibit 3-1: Connectivity Technologies for AI Data Centers
  • Exhibit 3-2: A CPO Future for AI?
  • Exhibit 4-1: Power and Cooling Solutions for AI Data Centers
  • Exhibit 5-1: Ten-Year Forecasts of AI Market ($ Billions)
  • Exhibit 5-2: Worldwide AI Data Centers in Operation
  • Exhibit 5-3: Worldwide AI Server Markets
  • Exhibit 5-4: Distribution of Ports Shipped by Speed ($ Million)
  • Exhibit 5-5: Distribution of Ports Shipped by Protocol ($ Million)
  • Exhibit 5-6: Forecast of Data Storage for AI Data Centers
  • Exhibit 5-7: Ten-year Forecast of Power Consumption
  • Exhibit 5-8: Ten-year Forecast of Cooling Technology in AI Data Centers