The Effect of Model Capacity on the Emergence of In-Context Learning

Published in ICLR 2024 Workshop on Mathematical and Empirical Understanding of Foundation Models (ME-FoMo), 2024

Recommended citation: Berkan Ottlik, Narutatsu Ri, Daniel Hsu, Clayton Sanford. (2024). "The Effect of Model Capacity on the Emergence of In-Context Learning." ICLR 2024 Workshop on Mathematical and Empirical Understanding of Foundation Models (ME-FoMo). https://openreview.net/pdf?id=YZM9g0Mi9a

This paper investigates the relationship between model capacity and the emergence of in-context learning in transformers under a simplified statistical framework. When model capacity is sufficiently restricted, transformers shift from learning the Bayes-optimal estimator for the training task distribution to an estimator that generalizes to out-of-distribution tasks. This shift is attributed to the restricted model's inability to fully memorize the training task distribution. Further experiments examine how the transformer's hyperparameters affect its capacity for memorization.
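The contrast between the two estimators can be made concrete. Below is a minimal numpy sketch, assuming (as in related work on this kind of setting) that tasks are noisy linear regressions drawn from a finite pretraining pool: the Bayes-optimal predictor for the training distribution is then a posterior-weighted average over the pool (a form of memorization), while ridge regression is an estimator that remains suitable for unseen, out-of-distribution tasks. The linear-regression instantiation, dimensions, and noise level are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k, n = 8, 16, 12     # input dim, number of pretraining tasks, context length
noise = 0.25            # observation noise std

# Finite pool of pretraining tasks: w_1, ..., w_k ~ N(0, I_d)  (assumed setup)
task_pool = rng.normal(size=(k, d))

def sample_prompt(w):
    """In-context prompt: n labeled examples plus one query point."""
    X = rng.normal(size=(n, d))
    y = X @ w + noise * rng.normal(size=n)
    return X, y, rng.normal(size=d)

def discrete_bayes_predict(X, y, x_q):
    """Bayes-optimal under the *discrete* pretraining prior: posterior-weighted
    average over the finite task pool (relies on memorizing the pool)."""
    resid = y[None, :] - task_pool @ X.T                   # (k, n) residuals
    loglik = -0.5 * (resid ** 2).sum(axis=1) / noise ** 2  # per-task log-likelihood
    post = np.exp(loglik - loglik.max())
    post /= post.sum()
    return (post @ task_pool) @ x_q

def ridge_predict(X, y, x_q):
    """Bayes-optimal under a Gaussian prior w ~ N(0, I_d): ridge regression.
    Ignores the task pool, so it also handles unseen (OOD) tasks."""
    w_hat = np.linalg.solve(X.T @ X + noise ** 2 * np.eye(d), X.T @ y)
    return w_hat @ x_q

def mse(predict, w, trials=2000):
    """Average squared prediction error on the query point for task w."""
    errs = []
    for _ in range(trials):
        X, y, x_q = sample_prompt(w)
        errs.append((predict(X, y, x_q) - w @ x_q) ** 2)
    return float(np.mean(errs))

w_in = task_pool[0]            # task seen during pretraining
w_out = rng.normal(size=d)     # fresh out-of-distribution task
for name, predict in [("discrete Bayes", discrete_bayes_predict),
                      ("ridge", ridge_predict)]:
    print(f"{name}: in-dist MSE={mse(predict, w_in):.3f}, "
          f"OOD MSE={mse(predict, w_out):.3f}")
```

Running this, the discrete-prior predictor achieves lower error on tasks from the pretraining pool while ridge regression wins on the fresh task, illustrating the in-distribution vs. out-of-distribution trade-off between the two estimators that the paper studies.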