Meta founder and CEO Mark Zuckerberg speaks during the Meta Connect event at Meta headquarters in Menlo Park, California, on Sept. 27, 2023.
Josh Edelson | AFP | Getty Images
Meta is spending billions of dollars on Nvidia's popular computer chips, which are at the heart of artificial intelligence research and projects.
In an Instagram Reels post on Thursday, Zuckerberg said the company's "future roadmap" for AI requires it to build a "massive compute infrastructure." By the end of 2024, Zuckerberg said, that infrastructure will include 350,000 H100 graphics cards from Nvidia.
Zuckerberg did not say how many of the graphics processing units (GPUs) the company has already purchased, but the H100 did not hit the market until late 2022, and then only in limited supply. Analysts at Raymond James estimate Nvidia is selling the H100 for $25,000 to $30,000, and on eBay the cards can cost over $40,000. If Meta were paying at the low end of the price range, that would amount to close to $9 billion in expenditures.
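The "close to $9 billion" figure follows directly from the numbers above; a quick back-of-the-envelope check using the article's stated card count and Raymond James' estimated price range:

```python
# Rough estimate of Meta's potential H100 spend, using figures cited in the article:
# 350,000 cards at an estimated $25,000 (low end) to $30,000 (high end) each.
H100_COUNT = 350_000
LOW_PRICE = 25_000   # USD per card, low end of analysts' estimate
HIGH_PRICE = 30_000  # USD per card, high end of analysts' estimate

low_total = H100_COUNT * LOW_PRICE
high_total = H100_COUNT * HIGH_PRICE

print(f"Low-end estimate:  ${low_total / 1e9:.2f}B")   # $8.75B, i.e. "close to $9 billion"
print(f"High-end estimate: ${high_total / 1e9:.2f}B")  # $10.50B
```

This excludes the "other GPUs" Zuckerberg mentioned, so it is a floor on hardware spend, not a total.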
Additionally, Zuckerberg said Meta's compute infrastructure will comprise "almost 600k H100 equivalents of compute if you include other GPUs." In December, tech companies including Meta, OpenAI and Microsoft said they would use AMD's new Instinct MI300X AI computer chips.
Meta needs these heavy-duty computer chips as it pursues research in artificial general intelligence (AGI), which Zuckerberg said is a "long term vision" for the company. OpenAI and Google's DeepMind unit are also researching AGI, a futuristic form of AI comparable to human-level intelligence.
Meta's chief scientist Yann LeCun stressed the importance of GPUs during a media event in San Francisco last month.
"[If] you think AGI is in, the more GPUs you have to buy," LeCun said at the time. Regarding Nvidia CEO Jensen Huang, LeCun said, "There is an AI war, and he's supplying the weapons."
In Meta’s third-quarter earnings report, the company said that total expenses for 2024 will be in the range of $94 billion to $99 billion, driven in part by computing expansion.
"In terms of investment priorities, AI will be our biggest investment area in 2024, both in engineering and compute resources," Zuckerberg said on the call with analysts.
Zuckerberg said on Thursday that Meta plans to "open source responsibly" its yet-to-be-developed "general intelligence," an approach the company is also taking with its Llama family of large language models.
Meta is currently training Llama 3 and is also making its Fundamental AI Research team (FAIR) and GenAI research team work more closely together, Zuckerberg said.
Shortly after Zuckerberg's post, LeCun said in a post on X that "To accelerate progress, FAIR is now a sister organization of GenAI, the AI product division."
— CNBC’s Kif Leswing contributed to this report
Source: www.cnbc.com