# ML-Papers-Explained

**Repository Path**: airwolf0992/ML-Papers-Explained

## Basic Information

- **Project Name**: ML-Papers-Explained
- **Description**: Mirrored from GitHub. This project presents research papers on key techniques in machine learning, covering both classic works and the latest advances, highlighting each paper's main innovations and discussing its impact on the research field and its scope for application.
- **Primary Language**: Python
- **License**: Not specified
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2023-01-19
- **Last Updated**: 2023-01-19

## Categories & Tags

**Categories**: Uncategorized

**Tags**: None

## README

# ML Papers Explained

Explanations of key concepts in ML.

## Full Index

- [BART](https://www.kaggle.com/discussions/general/374254)
- [BEiT](https://www.kaggle.com/discussions/general/371762)
- [BERT](https://www.kaggle.com/code/abhinand05/bert-for-humans-tutorial-baseline)
- [ConvMixer](https://www.kaggle.com/discussions/general/339146)
- [DistilBERT](https://dair.ai/posts/TL;DR_DistillBERT/)
- [DiT](https://www.kaggle.com/discussions/general/369149)
- [Donut](https://www.kaggle.com/discussions/general/374249)
- [Fast RCNN](https://www.kaggle.com/discussions/general/342959)
- [Faster RCNN](https://www.kaggle.com/discussions/general/342969)
- [Feature Pyramid Network](https://www.kaggle.com/discussions/general/371783)
- [Focal Loss (RetinaNet)](https://www.kaggle.com/discussions/general/371924)
- [Layout LM](https://www.kaggle.com/discussions/general/335699)
- [Layout LM v2](https://www.kaggle.com/discussions/general/335865)
- [Layout LM v3](https://www.kaggle.com/discussions/general/377655)
- [LiLT](https://www.kaggle.com/discussions/general/376900)
- [Masked Autoencoder](https://www.kaggle.com/discussions/general/339141)
- [MobileBERT](https://dair.ai/posts/Summary-MobileBERT/)
- [RCNN](https://www.kaggle.com/discussions/general/342956)
- [SentenceBERT](https://dair.ai/posts/TLDR_SentenceBERT/)
- [Swin Transformer](https://www.kaggle.com/discussions/general/374258)
- [TableNet](https://www.kaggle.com/discussions/general/338315)
- [TinyBERT](https://dair.ai/posts/TinyBERT-Size_does_matter,_but_how_you_train_it_can_be_more_important/)
- [Transformer](https://dair.ai/posts/attention-is-all-you-need/)
- [Vision Transformer](https://www.kaggle.com/discussions/general/338841)
- [XLNet](https://dair.ai/posts/XLNet_outperforms_BERT_on_several_NLP_Tasks/)

## By Category

### RCNNs

- [RCNN](https://www.kaggle.com/discussions/general/342956)
- [Fast RCNN](https://www.kaggle.com/discussions/general/342959)
- [Faster RCNN](https://www.kaggle.com/discussions/general/342969)

### Transformers

- [BERT](https://www.kaggle.com/code/abhinand05/bert-for-humans-tutorial-baseline)
- [DistilBERT](https://dair.ai/posts/TL;DR_DistillBERT/)
- [MobileBERT](https://dair.ai/posts/Summary-MobileBERT/)
- [SentenceBERT](https://dair.ai/posts/TLDR_SentenceBERT/)
- [TinyBERT](https://dair.ai/posts/TinyBERT-Size_does_matter,_but_how_you_train_it_can_be_more_important/)
- [Transformer](https://dair.ai/posts/attention-is-all-you-need/)
- [XLNet](https://dair.ai/posts/XLNet_outperforms_BERT_on_several_NLP_Tasks/)

### Layout Transformers

- [Layout LM](https://www.kaggle.com/discussions/general/335699)
- [Layout LM v2](https://www.kaggle.com/discussions/general/335865)
- [Layout LM v3](https://www.kaggle.com/discussions/general/377655)
- [LiLT](https://www.kaggle.com/discussions/general/376900)

### Document Information Processing

- [TableNet](https://www.kaggle.com/discussions/general/338315)
- [DiT](https://www.kaggle.com/discussions/general/369149)
- [Donut](https://www.kaggle.com/discussions/general/374249)

### Vision Transformers

- [BEiT](https://www.kaggle.com/discussions/general/371762)
- [Masked Autoencoder](https://www.kaggle.com/discussions/general/339141)
- [Swin Transformer](https://www.kaggle.com/discussions/general/374258)
- [Vision Transformer](https://www.kaggle.com/discussions/general/338841)

### Single Stage Object Detectors

- [Feature Pyramid Network](https://www.kaggle.com/discussions/general/371783)
- [Focal Loss (RetinaNet)](https://www.kaggle.com/discussions/general/371924)

---

Reach out on [Twitter](https://twitter.com/RitvikRastogi19) if you have any questions. If you are interested in contributing, feel free to open a PR.