Zuckerberg says Meta will need 10x more computing power to train Llama 4 than Llama 3

Thursday, August 1, 2024

Meta, which develops Llama, one of the largest open-source foundational large language models, believes it will need significantly more computing power to train its models in the future. On Meta’s second-quarter earnings call on Tuesday, Mark Zuckerberg said that training Llama 4 will require roughly 10x more compute than was needed to train […]

© 2024 TechCrunch. All rights reserved. For personal use only.

via https://www.aiupnow.com

Ivan Mehta, Khareem Sudlow