Huang, Guoxi, Fu, Hongtao and Bors, Adrian Gheorghe orcid.org/0000-0001-7838-0021 (2024) Masked Image Residual Learning for Scaling Deeper Vision Transformers. In: Advances in Neural Information Processing Systems (NeurIPS). MIT Press, pp. 57570-57582.
Abstract
Deeper Vision Transformers (ViTs) are more challenging to train. We expose a degradation problem in the deeper layers of ViTs when masked image modeling (MIM) is used for pre-training. To ease the training of deeper ViTs, we introduce a self-supervised learning framework called Masked Image Residual Learning (MIRL), which significantly alleviates the degradation problem and makes scaling ViTs along depth a promising direction for improving performance. We reformulate the pre-training objective for the deeper layers of a ViT as learning to recover the residual of the masked image. We provide extensive empirical evidence showing that deeper ViTs can be effectively optimized with MIRL and readily gain accuracy from increased depth. With the same level of computational complexity as ViT-Base and ViT-Large, we instantiate 4.5× and 2× deeper ViTs, dubbed ViT-S-54 and ViT-B-48. The deeper ViT-S-54, costing 3× less than ViT-Large, achieves performance on par with ViT-Large, and ViT-B-48 reaches 86.2% top-1 accuracy on ImageNet. Deeper ViTs pre-trained with MIRL also generalize well to downstream tasks such as object detection and semantic segmentation. In addition, MIRL is efficient: with less pre-training time, it yields performance competitive with other approaches. Code and pretrained models are available at: https://github.com/russellllaputa/MIRL.
Metadata
| Item Type: | Proceedings Paper |
| --- | --- |
| Authors/Creators: | Huang, Guoxi; Fu, Hongtao; Bors, Adrian Gheorghe |
| Copyright, Publisher and Additional Information: | This is an author-produced version of the published paper. Uploaded in accordance with the University's Research Publications and Open Access policy. |
| Dates: | |
| Institution: | The University of York |
| Academic Units: | The University of York > Faculty of Sciences (York) > Computer Science (York) |
| Depositing User: | Pure (York) |
| Date Deposited: | 20 Dec 2024 12:20 |
| Last Modified: | 20 Dec 2024 12:20 |
| Published Version: | https://doi.org/10.5555/3666122.3668633 |
| Status: | Published |
| Publisher: | MIT Press |
| Identification Number: | 10.5555/3666122.3668633 |
| Open Archives Initiative ID (OAI ID): | oai:eprints.whiterose.ac.uk:221037 |