Chinchilla Scaling Laws: Compute-Optimal Training and Resource Allocation for Large Language Models

Michael Brenndoerfer · November 2, 2025 · 15 min read · 3,525 words

A comprehensive guide to the Chinchilla scaling laws introduced in 2022. Learn how compute-optimal training balances model size and training data, where the roughly 20:1 token-to-parameter ratio comes from, and how these scaling laws transformed language model development by revealing that previous models were substantially undertrained.
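
To make the ratio concrete, here is a minimal Python sketch of how a compute budget splits into model size and training tokens. It combines the standard approximation C ≈ 6·N·D for training FLOPs with the Chinchilla rule of thumb of roughly 20 tokens per parameter; the function name `chinchilla_optimal` and the example budget are illustrative, not the paper's code.

```python
import math

def chinchilla_optimal(compute_flops: float, tokens_per_param: float = 20.0):
    """Split a training compute budget into a compute-optimal model size
    and token count, assuming C = 6 * N * D (training FLOPs) and the
    Chinchilla rule of thumb D = tokens_per_param * N.
    """
    # Substituting D = r * N into C = 6 * N * D gives C = 6 * r * N^2,
    # so N = sqrt(C / (6 * r)) and D = r * N.
    n_params = math.sqrt(compute_flops / (6.0 * tokens_per_param))
    n_tokens = tokens_per_param * n_params
    return n_params, n_tokens

# Example: a budget of ~5.76e23 FLOPs (roughly Chinchilla's own) recovers
# approximately 70B parameters and 1.4T tokens under these assumptions.
n, d = chinchilla_optimal(5.76e23)
print(f"params ≈ {n / 1e9:.0f}B, tokens ≈ {d / 1e12:.2f}T")
```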

This article is part of the free-to-read History of Language AI book.


About the author: Michael Brenndoerfer

All opinions expressed here are my own and do not reflect the views of my employer.

Michael currently works as an Associate Director of Data Science at EQT Partners in Singapore, where he drives AI and data initiatives across private capital investments.

With over a decade of experience spanning private equity, management consulting, and software engineering, he specializes in building and scaling analytics capabilities from the ground up. He has published research at leading AI conferences and has deep expertise in machine learning, natural language processing, and value creation through data.
