# Graduate Paper Reading Notes (Outside My Research Direction)

**Repository Path**: penguink3/papers

## Basic Information

- **Project Name**: Graduate Paper Reading Notes (Outside My Research Direction)
- **Description**: Translations of and notes on research papers
- **Primary Language**: Unknown
- **License**: Not specified
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 1
- **Forks**: 1
- **Created**: 2021-09-17
- **Last Updated**: 2023-05-19

## Categories & Tags

**Categories**: Uncategorized

**Tags**: None

## README

# Paper Collection

## The Transformer Saga

1. [Attention is all you need](Transformer/Attention%20is%20all%20you%20need/transformer模型论文笔记.md): the paper that introduced the Transformer
2. [BERT](Transformer/Pre-training%20of%20DBT%20for%20Language%20understand/BERT模型论文笔记.md): pre-trains the Transformer for NLP by masking tokens in the input sequence
3. [ViT](Transformer/Vision-and-Language%20Transformer/VIT论文笔记.md): applies the Transformer to the vision domain
4. [MAE](Transformer/Masked%20Autoencoders%20Are%20Scalable%20Vision%20Learners/MAE论文笔记.md): pre-trains a ViT-style model for vision by masking patches of the input data

## Object Detection

1. [DETR](Transformer/Object%20Detection%20with%20Transformers/DETR论文笔记.md): applies the Transformer to object detection, framed as set prediction

## Contrastive Learning

1. [MoCo](Momentum%20Contrast%20for%20Unsupervised%20Visual%20Representation%20Learning/Moco论文阅读笔记.md): opened a new era of contrastive learning; its dynamic dictionary enlarges the pool of keys available for contrast

## Federated Learning

- [Federated Learning Collection](https://gitee.com/penguink3/Fed-Papers-Reading)