# Awesome papers I read 🀩

Papers I've read. Might or might not also contain my thoughts on them, annotations, or whatever.

So far I've read 13 papers!

## Basics

### Transformers

| Paper | Publication date | Read date | Notes |
| --- | --- | --- | --- |
| Attention Is All You Need (AKA Transformers) | 2017-06 | 2023-12 | |

## Vision

| Paper | Publication date | Read date | Notes |
| --- | --- | --- | --- |
| An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale (AKA Vision Transformers, ViT) | 2020-10 | 2024-01 | |
| Swin Transformer: Hierarchical Vision Transformer using Shifted Windows | 2021-03 | 2024-01 | |

## Language

| Paper | Publication date | Read date | Notes |
| --- | --- | --- | --- |
| BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | 2018-10 | 2024-01 | |
| RoBERTa: A Robustly Optimized BERT Pretraining Approach | 2019-07 | 2024-01 | |

## Multimodal

| Paper | Publication date | Read date | Notes |
| --- | --- | --- | --- |
| Learning Transferable Visual Models From Natural Language Supervision (AKA CLIP) | 2021-03 | 2024-01 | |
| BLIP: Bootstrapping Language-Image Pre-training for Unified Vision-Language Understanding and Generation | 2022-01 | 2023 | |
| BLIP-2: Bootstrapping Language-Image Pre-training with Frozen Image Encoders and Large Language Models | 2023-01 | 2023 | |
| Sigmoid Loss for Language Image Pre-Training (AKA SigLIP) | 2023-03 | 2024-01 | |

## Audio

### Speech Recognition

| Paper | Publication date | Read date | Notes |
| --- | --- | --- | --- |
| wav2vec: Unsupervised Pre-training for Speech Recognition | 2019-04 | 2023-12 | Link |
| wav2vec 2.0: A Framework for Self-Supervised Learning of Speech Representations | 2020-06 | 2024-01 | Link |
| Robust Speech Recognition via Large-Scale Weak Supervision (AKA Whisper) | 2022-12 | 2024-01 | Link |

## Parameter-Efficient Training/Fine-tuning

| Paper | Publication date | Read date | Notes |
| --- | --- | --- | --- |
| LoRA: Low-Rank Adaptation of Large Language Models | 2021-06 | 2024-01 | |
