# SupContrast: Supervised Contrastive Learning
<p align="center">
<img src="figures/teaser.png" width="700">
</p>
This repo provides a reference implementation for the following papers in PyTorch, using CIFAR as an illustrative example:
(1) Supervised Contrastive Learning. [Paper](https://arxiv.org/abs/2004.11362)
(2) A Simple Framework for Contrastive Learning of Visual Representations. [Paper](https://arxiv.org/abs/2002.05709)
## Update
ImageNet model (small batch size with the trick of the momentum encoder) is released [here](https://www.dropbox.com/s/l4a69ececk4spdt/supcon.pth?dl=0).
## Loss Function
The loss function [`SupConLoss`](https://github.com/HobbitLong/SupContrast/blob/master/losses.py#L11) in `losses.py` takes `features` (L2-normalized) and `labels` as input, and returns the loss. If `labels` is `None` or not passed, it degenerates to the SimCLR unsupervised loss.
Usage:
```python
from losses import SupConLoss
# define the loss with a temperature `temp`
criterion = SupConLoss(temperature=temp)
# features: [bsz, n_views, f_dim]
# `n_views` is the number of crops from each image;
# features should be L2-normalized along the f_dim dimension
features = ...
# labels: [bsz]
labels = ...
# SupContrast
loss = criterion(features, labels)
# or SimCLR
loss = criterion(features)
...
```
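For intuition, the math behind `SupConLoss` can be sketched in NumPy. This is a simplified re-implementation for illustration, not the repo's code; it assumes features are already L2-normalized and that every anchor has at least one positive (always true when `n_views >= 2`):

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.1):
    """Simplified NumPy sketch of the supervised contrastive loss.

    features: [bsz, n_views, d], assumed L2-normalized along d.
    labels:   [bsz] integer class labels.
    """
    bsz, n_views, d = features.shape
    z = features.reshape(bsz * n_views, d)      # flatten views into anchors
    lab = np.repeat(labels, n_views)            # label of each anchor
    sim = z @ z.T / temperature                 # pairwise similarity logits
    sim = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    not_self = ~np.eye(len(z), dtype=bool)      # exclude each anchor itself
    pos_mask = (lab[:, None] == lab[None, :]) & not_self
    exp_sim = np.exp(sim) * not_self
    log_prob = sim - np.log(exp_sim.sum(axis=1, keepdims=True))
    # average log-probability over positives, then over anchors
    loss = -(log_prob * pos_mask).sum(axis=1) / pos_mask.sum(axis=1)
    return loss.mean()
```

The PyTorch version in `losses.py` follows the same structure, with extra options such as contrast modes and an explicit mask argument.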
## Comparison
Results on CIFAR-10:

| Method | Arch | Setting | Loss | Accuracy (%) |
|----------|:----:|:---:|:---:|:---:|
| SupCrossEntropy | ResNet50 | Supervised | Cross Entropy | 95.0 |
| SupContrast | ResNet50 | Supervised | Contrastive | 96.0 |
| SimCLR | ResNet50 | Unsupervised | Contrastive | 93.6 |
Results on CIFAR-100:

| Method | Arch | Setting | Loss | Accuracy (%) |
|----------|:----:|:---:|:---:|:---:|
| SupCrossEntropy | ResNet50 | Supervised | Cross Entropy | 75.3 |
| SupContrast | ResNet50 | Supervised | Contrastive | 76.5 |
| SimCLR | ResNet50 | Unsupervised | Contrastive | 70.7 |
Results on ImageNet (stay tuned):

| Method | Arch | Setting | Loss | Accuracy (%) |
|----------|:----:|:---:|:---:|:---:|
| SupCrossEntropy | ResNet50 | Supervised | Cross Entropy | - |
| SupContrast | ResNet50 | Supervised | Contrastive | 79.1 (MoCo trick) |
| SimCLR | ResNet50 | Unsupervised | Contrastive | - |
## Running
You can use `CUDA_VISIBLE_DEVICES` to select the GPUs to run on, and/or switch to CIFAR-100 with `--dataset cifar100`.
**(1) Standard Cross-Entropy**
```
python main_ce.py --batch_size 1024 \
--learning_rate 0.8 \
  --cosine --syncBN
```
**(2) Supervised Contrastive Learning**
Pretraining stage:
```
python main_supcon.py --batch_size 1024 \
--learning_rate 0.5 \
--temp 0.1 \
--cosine
```
<s>You can also specify `--syncBN` but I found it not crucial for SupContrast (`syncBN` 95.9% v.s. `BN` 96.0%). </s>
WARN: Currently, `--syncBN` has no effect, since the code uses `DataParallel` instead of `DistributedDataParallel`.
Linear evaluation stage:
```
python main_linear.py --batch_size 512 \
--learning_rate 5 \
--ckpt /path/to/model.pth
```
**(3) SimCLR**
Pretraining stage:
```
python main_supcon.py --batch_size 1024 \
--learning_rate 0.5 \
--temp 0.5 \
--cosine --syncBN \
--method SimCLR
```
The `--method SimCLR` flag simply stops `labels` from being passed to the `SupConLoss` criterion.
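To see what dropping `labels` changes, compare the positive masks the two objectives use. This is a small illustrative snippet (not the repo's code), following the `[bsz, n_views]` convention above:

```python
import numpy as np

bsz, n_views = 4, 2
not_self = ~np.eye(bsz * n_views, dtype=bool)

# SimCLR (no labels): each anchor's only positives are the other
# augmented views of the same source image
img_id = np.repeat(np.arange(bsz), n_views)
simclr_pos = (img_id[:, None] == img_id[None, :]) & not_self

# SupCon (with labels): positives are all views of all images
# sharing the anchor's class label
labels = np.repeat(np.array([0, 0, 1, 1]), n_views)
supcon_pos = (labels[:, None] == labels[None, :]) & not_self

print(simclr_pos.sum(axis=1))  # 1 positive per anchor
print(supcon_pos.sum(axis=1))  # 3 positives per anchor
```

With labels, every anchor gains all same-class samples as extra positives, which is the core difference between the two losses.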
Linear evaluation stage:
```
python main_linear.py --batch_size 512 \
--learning_rate 1 \
--ckpt /path/to/model.pth
```
On a custom dataset:
```
python main_supcon.py --batch_size 1024 \
--learning_rate 0.5 \
--temp 0.1 --cosine \
--dataset path \
--data_folder ./path \
--mean "(0.4914, 0.4822, 0.4465)" \
--std "(0.2675, 0.2565, 0.2761)" \
--method SimCLR
```
The `--data_folder` must be of the form `./path/label/xxx.png`, following the [`ImageFolder`](https://pytorch.org/docs/stable/torchvision/datasets.html#torchvision.datasets.ImageFolder) convention.
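The `--mean`/`--std` values should be the per-channel statistics of your dataset on the `[0, 1]` scale (matching torchvision's `ToTensor` scaling). A hypothetical helper for computing them, assuming images are loaded as `uint8` arrays:

```python
import numpy as np

def channel_stats(images):
    """Per-channel mean/std for the --mean/--std flags.

    `images`: iterable of HxWx3 uint8 arrays. Illustrative helper,
    not part of this repo.
    """
    pixels = np.concatenate([img.reshape(-1, 3) for img in images], axis=0)
    pixels = pixels.astype(np.float64) / 255.0  # same [0, 1] scaling as ToTensor
    return tuple(pixels.mean(axis=0)), tuple(pixels.std(axis=0))
```

Pass the resulting tuples as the `--mean` and `--std` arguments.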
## t-SNE Visualization
**(1) Standard Cross-Entropy**
<p align="center">
<img src="figures/SupCE.jpg" width="400">
</p>
**(2) Supervised Contrastive Learning**
<p align="center">
<img src="figures/SupContrast.jpg" width="800">
</p>
**(3) SimCLR**
<p align="center">
<img src="figures/SimCLR.jpg" width="800">
</p>
## Reference
```
@Article{khosla2020supervised,
title = {Supervised Contrastive Learning},
author = {Prannay Khosla and Piotr Teterwak and Chen Wang and Aaron Sarna and Yonglong Tian and Phillip Isola and Aaron Maschinot and Ce Liu and Dilip Krishnan},
journal = {arXiv preprint arXiv:2004.11362},
year = {2020},
}
```