Tech giant Meta recently announced the rollout of its in-house Artificial Intelligence (AI) chip, designed to deliver more compute power and speed up training for the company's unique workloads.
Described as "next-gen," the Meta Training and Inference Accelerator (MTIA) runs models that rank and recommend display ads across Meta's platforms. The chip has already been deployed in the company's data centers, where it is serving AI applications.
Announcing the rollout of the AI chip, Meta wrote in a blog post:
“The next generation of Meta’s large-scale infrastructure is being built with AI in mind, including supporting new generative AI (GenAI) products and services, recommendation systems, and advanced AI research. It’s an investment we expect will grow in the years ahead as the compute requirements to support AI models increase alongside the models’ sophistication. Last year, we unveiled the Meta Training and Inference Accelerator (MTIA) v1, our first-generation AI inference accelerator that we designed in-house with Meta’s AI workloads in mind, specifically our deep learning recommendation models that are improving a variety of experiences across our products.
“MTIA is a long-term venture to provide the most efficient architecture for Meta’s unique workloads. As AI workloads become increasingly important to our products and services, this efficiency will improve our ability to provide the best experiences for our users around the world. MTIA v1 was an important step in improving the compute efficiency of our infrastructure and better supporting our software developers as they build AI models that will facilitate new and better user experiences.”
Meta says the new version of MTIA more than doubles the compute and memory bandwidth of its predecessor while remaining tightly coupled to Meta's workloads. The new custom AI chip is currently live in 16 of the company's data center regions, delivering up to 3x better overall performance.
Lately, Meta and other tech giants such as Google, Microsoft, and Tesla have been investing in custom Artificial Intelligence (AI) hardware as they look to challenge the dominance of the chief GPU supplier, Nvidia. It is worth noting that within the span of a few weeks, Google parent Alphabet, Facebook parent Meta, and Intel have all announced plans to develop AI chips.
Google this week made its fifth-generation custom chip for training AI models, TPU v5p, generally available to Google Cloud customers, and revealed Axion, its first dedicated chip for running models. E-commerce giant Amazon also has several custom AI chip families under its belt, while Microsoft jumped into the fray last year with the Azure Maia AI Accelerator and the Azure Cobalt 100 CPU.
D.A. Davidson analyst Gil Luria notes that two-thirds of Nvidia's revenue comes from its top five customers, which include Meta, Microsoft, Google, and Amazon. All four have started making AI hardware in-house, which he says could seriously threaten Nvidia's revenue in the future.
In his words,
“Nvidia’s been able to extract a tremendous amount of windfall profits over the last couple of years, based on the fact that they had the right product at the right time,” Luria said in an interview. “But now that the market is the size that it is, [there are] multiple companies that are in a position to replace Nvidia with other products, most importantly Nvidia’s customers.”
This prediction played out recently when CNBC reported that Nvidia had entered “correction territory” after its shares briefly fell 10% from their most recent all-time closing high of more than $950 apiece. The stock closed at $853.54 on Tuesday, down 2% for the session.
Meta has introduced a new generation of custom chips, now in production, as it ramps up its artificial intelligence efforts and reduces its reliance on outside suppliers such as Nvidia. The announcement came a day after Intel unveiled a faster AI “accelerator,” and rivals including Google are likewise developing AI chips in-house; Meta is spending billions to catch up, TechCrunch notes. For now, the Meta Training and Inference Accelerator trains Meta's ranking and recommendation algorithms, but the goal is to use MTIA for generative AI such as its large language model, Llama.