
Is Self-Supervised Learning a Surrogate of Supervised Learning?

EasyChair Preprint 15263

5 pages
Date: October 18, 2024

Abstract

For decades, supervised learning has been the anchor of machine learning and deep learning, but the enormous amount of unannotated data, the high cost of annotation, the lengthy annotation cycle, and the need for expert annotators have been drawbacks. These drawbacks brought semi-supervised learning into the limelight, yet semi-supervised learning still requires a portion of annotated data. The expense of obtaining correctly annotated data has given rise to the nascent field of self-supervised learning. Self-supervised learning extracts useful information from vast amounts of unannotated data through a pretext task, and the learned representation is then used on a downstream task. This emerging strategy has become an active topic of research. The use of unannotated data to achieve supervised-level performance raises the question of whether self-supervised learning is a surrogate for supervised learning. In this work, we review the literature to address whether these two strategies should act as surrogates or be synergized, comparing their accuracy and their robustness to adversarial attacks and to out-of-distribution detection. Just as supervised learning faces problems with annotation, self-supervised learning calls for a good pretext task to support its downstream task. We recommend that, to settle the question posed, all of these factors be taken into account.
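To make the pretext/downstream distinction concrete, the following is a minimal illustrative sketch (not the specific method surveyed in the paper) of rotation prediction, one common pretext task: each unannotated image is rotated by a random multiple of 90 degrees, and the rotation index serves as a free label that no human had to annotate.

```python
import random

def rot90(matrix):
    # Rotate a 2D list 90 degrees clockwise.
    return [list(row) for row in zip(*matrix[::-1])]

def make_rotation_pretext(images, rng):
    # Turn unannotated images into (rotated image, rotation index) pairs.
    # The rotation index k in {0, 1, 2, 3} is the "pretext" label,
    # obtained for free from the data itself -- no human annotation.
    pairs = []
    for img in images:
        k = rng.randrange(4)
        rotated = img
        for _ in range(k):
            rotated = rot90(rotated)
        pairs.append((rotated, k))
    return pairs

# Synthetic stand-in for an unannotated dataset: 8 random 4x4 "images".
rng = random.Random(0)
unlabeled = [[[rng.random() for _ in range(4)] for _ in range(4)]
             for _ in range(8)]
pretext_data = make_rotation_pretext(unlabeled, rng)
```

A model trained to predict the rotation index from `pretext_data` learns features of the image content, and those features are what the downstream task (e.g. classification with few labels) reuses.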

Keyphrases: downstream, out-of-distribution, pretext, robust, self-supervised, semi-supervised learning, supervised learning

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@booklet{EasyChair:15263,
  author    = {Usman Khalid and Mehmet Kaya},
  title     = {Is Self-Supervised Learning a Surrogate of Supervised Learning?},
  howpublished = {EasyChair Preprint 15263},
  year      = {EasyChair, 2024}}