Abstract
"In the field of continual learning, models are designed to learn tasks one after the other. While most research has centered on supervised continual learning, there is a growing interest in unsupervised continual learning, which makes use of the vast amounts of unlabeled data. Recent studies have highlighted the strengths of unsupervised methods, particularly self-supervised learning, in providing robust representations. The improved transferability of those representations built with self-supervised methods is often associated with the role played by the multi-layer perceptron projector. In this work, we depart from this observation and reexamine the role of supervision in continual representation learning. We reckon that additional information, such as human annotations, should not deteriorate the quality of representations. Our findings show that supervised models, when enhanced with a multi-layer perceptron head, can outperform self-supervised models in continual representation learning. This highlights the importance of the multi-layer perceptron projector in shaping feature transferability across a sequence of tasks in continual learning. The code is available on GitHub."
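A minimal sketch of the architecture the abstract describes: a supervised backbone followed by a multi-layer perceptron (MLP) projector, with the classification loss applied on top of the projection while the pre-projector features are the ones reused across tasks. This is an illustration in NumPy, not the authors' implementation; the dimensions and the two-layer ReLU projector are assumptions chosen to match common practice in self-supervised heads.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_projector(x, w1, w2):
    # Two-layer MLP projector with a ReLU nonlinearity, in the style of
    # the projection heads used by self-supervised methods.
    h = np.maximum(x @ w1, 0.0)
    return h @ w2

# Hypothetical dimensions: 512-d backbone features, 128-d projection, 10 classes.
feat_dim, proj_dim, n_classes = 512, 128, 10
w1 = rng.normal(scale=0.02, size=(feat_dim, feat_dim))
w2 = rng.normal(scale=0.02, size=(feat_dim, proj_dim))
w_cls = rng.normal(scale=0.02, size=(proj_dim, n_classes))

# Stand-in for the output of a CNN/ViT backbone on a batch of 32 images.
backbone_features = rng.normal(size=(32, feat_dim))

# Supervised head: the classifier sits on top of the MLP projection,
# so the supervised loss is not applied directly to the backbone features.
z = mlp_projector(backbone_features, w1, w2)
logits = z @ w_cls

# For continual transfer, the pre-projector backbone features are what
# gets evaluated and reused; the projector can be discarded.
transfer_features = backbone_features
print(logits.shape, transfer_features.shape)  # (32, 10) (32, 512)
```

The design point the paper highlights is that placing the MLP projector between the backbone and the supervised loss shields the backbone representation from task-specific specialization, which is what preserves transferability over a sequence of tasks.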
Citations
- CrossRef: 0
- Web of Science: 0
- Scopus: 0
Authors (4)
Full text: not available in the portal
Details
- Category: Conference activity
- Type: Publication in a peer-reviewed edited volume (including conference proceedings)
- Language: English
- Publication year: 2024
- Bibliographic description: Marczak D., Cygert S., Trzciński T., Twardowski B.: Revisiting Supervision for Continual Representation Learning, 2024
- DOI: 10.1007/978-3-031-72658-3_11
- Sources of funding: External to Gdańsk University of Technology ("Spoza PG")
- Verified by: Gdańsk University of Technology
Recommended for you
- MagMax: Leveraging Model Merging for Seamless Continual Learning — D. Marczak, B. Twardowski, T. Trzciński, +1 author
- MP3vec: A Reusable Machine-Constructed Feature Representation for Protein Sequences — S. R. Gupte, D. S. Jain, A. Srinivasan, +1 author
- Category Adaptation Meets Projected Distillation in Generalized Continual Category Discovery — G. Rypeść, D. Marczak, S. Cygert, +2 authors