All Tags
Browse through all available tags to find articles on topics that interest you.
Showing 1 result for this tag.
The Universal Weight Subspace Hypothesis
This paper demonstrates that deep neural networks, despite being trained on diverse tasks and from different initializations, converge to remarkably similar low-dimensional parametric subspaces. This finding has significant implications for model reusability, multi-task learning, and reducing the computational and environmental costs of large-scale neural models.
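The core idea can be illustrated with a small, hypothetical sketch (not the paper's actual method or data): if the weight vectors of several models really do lie near a shared low-dimensional subspace, then a PCA over the stacked weights should concentrate almost all variance in a handful of components. The dimensions, model count, and noise scale below are made-up assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: weight vectors from several "models" that share a
# k-dimensional subspace plus small model-specific noise.
dim, k, n_models = 1000, 5, 8
basis = np.linalg.qr(rng.normal(size=(dim, k)))[0]  # orthonormal shared basis
weights = np.stack([
    basis @ rng.normal(size=k) + 0.01 * rng.normal(size=dim)
    for _ in range(n_models)
])  # shape: (n_models, dim)

# PCA over the stacked weight vectors: how much variance do the
# top-k principal components explain?
centered = weights - weights.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
explained = (s ** 2) / (s ** 2).sum()
print(np.round(explained.cumsum()[:k], 3))
```

Under this toy assumption, the cumulative explained variance of the first k components is close to 1, which is the kind of signature a universal weight subspace would leave in real trained models.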