
Billion-scale-semi-supervised-learning

public
91 stars
22 forks
5 issues

Commits

List of commits on branch master.
Unverified
f3f2dc77dc58ec02dd5108a61dff3496b03ae0e8

Merge branch 'master' of https://github.com/leaderj1001/Billion-scale-semi-supervised-learning

leaderj1001 committed 6 years ago
Unverified
c9fd0fc6193063faa89faa14aa85f8108f52102b

[Update] order of classes

leaderj1001 committed 6 years ago
Verified
7fd6745f1c4d29d4d53325db5a871b48d6d6d7f8

Create README.md

plemeri committed 6 years ago
Verified
adb3c2e370fa1ea29d4c3d119d67e3ded2cf4e20

Update README.md

leaderj1001 committed 6 years ago
Verified
080a30f74906909654b5754db4567adee3d2c974

Update README.md

plemeri committed 6 years ago
Verified
c5b29bdcde8d1b8ea5947abb24459820bac4ee99

Add files via upload

plemeri committed 6 years ago

README

The README file for this repository.

Implementation of "Billion-scale semi-supervised learning for image classification" in PyTorch

Network Architecture

(network architecture diagram)

  • Step 1:
    • We train on the labeled data to get an initial teacher model.
  • Step 2:
    • For each class/label, we use the predictions of this teacher model to rank the unlabeled images and pick the top-K images to construct a new training dataset.
  • Step 3:
    • We use this data to train a student model, which typically differs from the teacher model; hence we can reduce the model's complexity at test time.
  • Step 4:
    • Finally, the pre-trained student model is fine-tuned on the initial labeled data to circumvent potential labeling errors.
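The ranking rule in Step 2 can be sketched in a few lines of plain Python. This is a minimal illustration with hypothetical names and input format, not the repository's actual implementation (which lives in make_sample_data.py):

```python
import heapq

def select_top_k(teacher_scores, k):
    """Sketch of Step 2: per-class top-K selection from teacher predictions.

    `teacher_scores` maps an image id to its per-class probability vector
    as predicted by the teacher model (hypothetical input format).
    Returns, for each class index, the ids of the k highest-scoring images.
    Note that one image may be selected for several classes.
    """
    num_classes = len(next(iter(teacher_scores.values())))
    new_training_data = {}
    for c in range(num_classes):
        # Rank all unlabeled images by the teacher's score for class c.
        top = heapq.nlargest(k, teacher_scores.items(), key=lambda kv: kv[1][c])
        new_training_data[c] = [image_id for image_id, _ in top]
    return new_training_data
```

For example, with scores {"a": [0.9, 0.1], "b": [0.2, 0.8]} and k=1, class 0 selects "a" and class 1 selects "b".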

Usage

  • Step 1:
    • If you already have pretrained weights for the teacher network, go to Step 2.
    • Otherwise, run the following command to train the teacher network.
    python main.py
    
  • Step 2:
    • Sample unlabeled data with the pretrained teacher network.
    python make_sample_data.py
    
  • Step 3:
    • Train the student network using the data sampled in Step 2.
    python student_train.py
    
  • Step 4:
    • Finally, fine-tune on the CIFAR-100 data using the student network trained on unlabeled data in Step 3.
    python main.py --student-network True
    
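Taken together, the four commands above form a fixed sequence; a small driver sketch (hypothetical, not part of this repository) that runs them in order could look like:

```python
import subprocess

# The four commands from the Usage section, in order.
PIPELINE = [
    ["python", "main.py"],                               # Step 1: train teacher
    ["python", "make_sample_data.py"],                   # Step 2: sample unlabeled data
    ["python", "student_train.py"],                      # Step 3: train student
    ["python", "main.py", "--student-network", "True"],  # Step 4: fine-tune student
]

def run_pipeline(runner=subprocess.run):
    """Run each step in order, stopping on the first failure (check=True)."""
    for cmd in PIPELINE:
        runner(cmd, check=True)
```

Passing a custom `runner` keeps the sketch testable without actually launching the training scripts.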

Unlabeled Data

  • Image crawler

Experiments

  • In the paper, K=16k, P=10, Dataset=ImageNet, Unlabeled Data: 1,000,000,000 images.
  • However, we do not have many GPUs, so we train on CIFAR-100 instead.
    • Ours) K=1000, P=10, Dataset=CIFAR-100, Unlabeled Data: about 150,000 images.
Datasets   Model                                Accuracy            Epoch  Training Time
CIFAR-100  ResNet-50                            76.36%              91     3h 31m
CIFAR-100  ResNet-50, semi-supervised learning  (work in progress)
  • For CIFAR-100, the image size is very small, so results are poor when the unlabeled data is downscaled to (32, 32).
    • We will solve this problem!
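As a rough sanity check on the scaled-down setup, the per-class budget K times the number of classes bounds the size of the constructed training set (numbers taken from this section):

```python
# Upper bound on the constructed training set: K images per class.
# An image may be picked for several classes, so this is a bound, not an exact count.
paper = {"num_classes": 1000, "k": 16_000}  # ImageNet, K=16k
ours = {"num_classes": 100, "k": 1_000}     # CIFAR-100, K=1000

paper_budget = paper["num_classes"] * paper["k"]  # 16,000,000 images
ours_budget = ours["num_classes"] * ours["k"]     # 100,000 images

# Our budget fits inside the ~150,000 unlabeled images collected by the crawler.
assert ours_budget <= 150_000
```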

Requirements

  • tqdm==4.31.1
  • torch==1.0.1
  • opencv==4.1.0

Reference