Scaling Laws Under the Microscope: Predicting Transformer Performance from Small Scale Experiments

Maor Ivgi, Yair Carmon, Jonathan Berant


Abstract
Neural scaling laws define a predictable power-law relationship between a model's parameter count and its performance after training. However, most research to date has not explicitly investigated whether scaling laws can be used to accelerate model development. In this work, we perform such an empirical investigation, starting from models with as few as 10K parameters, and evaluate downstream performance across 9 language understanding tasks. We find that scaling laws emerge at finetuning time in some NLP tasks, and that they can also be exploited for debugging convergence when training large models. Moreover, for tasks where scaling laws exist, they can be used to predict the performance of larger models, which enables effective model selection. However, revealing scaling laws requires careful hyperparameter tuning and multiple runs for uncertainty estimation, which incurs additional overhead, partially offsetting the computational benefits.
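To make the power-law idea concrete, the sketch below fits a curve of the commonly used form err(n) = a * n^(-b) + c to small-model results and extrapolates it to a larger model. This is an illustration, not the authors' exact procedure: the functional form, parameter counts, and error values are all assumptions made for the example.

import numpy as np
from scipy.optimize import curve_fit

def power_law(n, a, b, c):
    # Hypothesized scaling law: test error as a function of parameter count n.
    return a * np.power(n, -b) + c

# Hypothetical (parameter count, test error) pairs from small-scale runs.
n_params = np.array([1e4, 3e4, 1e5, 3e5, 1e6, 3e6])
test_err = np.array([0.48, 0.44, 0.39, 0.35, 0.31, 0.29])

# Fit the three power-law coefficients from the small-scale measurements.
popt, pcov = curve_fit(power_law, n_params, test_err,
                       p0=(1.0, 0.1, 0.2), maxfev=10000)
a, b, c = popt

# Extrapolate to a model far larger than any actually trained.
predicted = power_law(1e8, a, b, c)
print(f"fit: err(n) = {a:.3f} * n^(-{b:.3f}) + {c:.3f}")
print(f"predicted error at 1e8 params: {predicted:.3f}")

As the abstract cautions, in practice each point would come from multiple tuned runs so that the uncertainty of the extrapolation can be estimated.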
Anthology ID:
2022.findings-emnlp.544
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7354–7371
URL:
https://aclanthology.org/2022.findings-emnlp.544
DOI:
10.18653/v1/2022.findings-emnlp.544
Cite (ACL):
Maor Ivgi, Yair Carmon, and Jonathan Berant. 2022. Scaling Laws Under the Microscope: Predicting Transformer Performance from Small Scale Experiments. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 7354–7371, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Scaling Laws Under the Microscope: Predicting Transformer Performance from Small Scale Experiments (Ivgi et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.544.pdf
Video:
https://aclanthology.org/2022.findings-emnlp.544.mp4