Zuckerberg says Meta will need 10x more computing power to train Llama 4 than Llama 3

10:53 01.08.2024
Meta, which develops Llama, one of the largest open-source foundation large language models, believes it will need significantly more computing power to train future models. Mark Zuckerberg said on Meta’s second-quarter earnings call on Tuesday that training Llama 4 will require 10x more compute than was needed to train […]