
Meta Reportedly Testing First In-House Chipsets Designed for AI Training

Meta has reportedly deployed a limited number of chipsets, and plans to scale production if the test is successful.


Photo Credit: Meta

The new AI chipsets are said to be part of its Meta Training and Inference Accelerator (MTIA) family

Highlights
  • Meta’s new AI chipsets were developed in partnership with TSMC
  • The company reportedly wants to reduce its reliance on Nvidia GPUs
  • Last year, Meta unveiled custom chipsets for AI inference

Meta has reportedly begun testing its first in-house chipsets designed to train artificial intelligence (AI) models. As per the report, the company has deployed a limited number of the processors to test the performance and sustainability of the custom chipsets and, if the tests go well, it will begin large-scale production of the hardware. The processors are said to be part of the Menlo Park-based tech giant's Meta Training and Inference Accelerator (MTIA) family of chipsets.

Meta Reportedly Begins Testing In-House Chipsets for AI Training

According to a Reuters report, the tech giant developed these AI chipsets in collaboration with the chipmaker Taiwan Semiconductor Manufacturing Company (TSMC). Meta reportedly completed the tape-out, or final stage of the chip design process, recently and has now begun deploying the chips at a small scale.

This is not the company's first AI-focused chipset. Last year, it unveiled inference accelerators, processors designed specifically for AI inference. However, until now, Meta did not have any in-house hardware accelerators to train its Llama family of large language models (LLMs).

Citing unnamed sources within the company, the publication claimed that Meta's larger vision behind developing in-house chipsets is to bring down the infrastructure costs of deploying and running complex AI systems for internal usage, consumer-focused products, and developer tools.

Interestingly, in January, Meta CEO Mark Zuckerberg announced that the expansion of the company's Mesa Data Center in Arizona, US, was complete and that the facility had begun operations. The new training chipsets are likely being deployed at this location as well.

The report stated that the new chipsets will first be used with Meta's recommendation engine, which powers its various social media platforms, and the use case will later be expanded to generative AI products such as Meta AI.

In January, Zuckerberg revealed in a Facebook post that the company plans to invest as much as $65 billion (roughly Rs. 5,61,908 crore) in 2025 on AI-related projects. The planned expenditure covers the expansion of the Mesa Data Center as well as the hiring of more employees for its AI teams.



Akash Dutta