“Larger Model Is Not All You Need”
Google announces Scaling (Down) CLIP: A Comprehensive Analysis of Data, Architecture, and Training Strategies.
This paper investigates the performance of Contrastive Language-Image Pre-training (CLIP) when scaled down to…
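For readers unfamiliar with CLIP, the sketch below shows the symmetric contrastive (InfoNCE) objective it is trained with; the function name, tensor shapes, and temperature value are illustrative assumptions, not code from the paper.

```python
# Minimal sketch of a CLIP-style symmetric contrastive loss (illustrative only).
import torch
import torch.nn.functional as F

def clip_contrastive_loss(image_embeds: torch.Tensor,
                          text_embeds: torch.Tensor,
                          temperature: float = 0.07) -> torch.Tensor:
    """Symmetric cross-entropy over image-text similarity logits.

    image_embeds, text_embeds: (batch, dim) outputs of the two encoders.
    """
    # L2-normalize so dot products are cosine similarities.
    image_embeds = F.normalize(image_embeds, dim=-1)
    text_embeds = F.normalize(text_embeds, dim=-1)

    # (batch, batch) similarity matrix; matched pairs sit on the diagonal.
    logits = image_embeds @ text_embeds.t() / temperature
    targets = torch.arange(logits.size(0), device=logits.device)

    # Average the image-to-text and text-to-image cross-entropy terms.
    loss_i2t = F.cross_entropy(logits, targets)
    loss_t2i = F.cross_entropy(logits.t(), targets)
    return (loss_i2t + loss_t2i) / 2

# Example with random embeddings standing in for encoder outputs.
imgs = torch.randn(8, 512)
txts = torch.randn(8, 512)
print(clip_contrastive_loss(imgs, txts).item())
```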