A Joyful AI Research Journey🌳😊


Key Links to Understanding Transformers in NLP

yjyuwisely 2024. 8. 20. 07:00

Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, Illia Polosukhin, 2017, "Attention Is All You Need", https://arxiv.org/abs/1706.03762

 

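The core operation the paper introduces is scaled dot-product attention: softmax(QKᵀ/√d_k)V. As a quick illustration (not code from the paper; shapes and names here are chosen for the example), a minimal NumPy sketch:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, as in Vaswani et al. (2017)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V

# Toy example: 3 queries attending over 4 key/value pairs of dimension 8.
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 8)), rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8)
```

The √d_k scaling keeps the dot products from growing with dimension, which would otherwise push the softmax into regions with vanishing gradients (Section 3.2.1 of the paper).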


Hugging Face Transformers usage guide (Quick tour): https://huggingface.co/transformers/usage.html

 


 

Tensor2Tensor (T2T) Introduction: https://colab.research.google.com/github/tensorflow/tensor2tensor/blob/master/tensor2tensor/notebooks/hello_t2t.ipynb?hl=en

 


 

Manuel Romero's notebook (basic_self-attention.ipynb), with a link to explanations by Raimi Karim: https://colab.research.google.com/drive/1rPk3ohrmVclqhH7uQ7qys4oznDdAhpzF

 

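The notebook builds self-attention step by step. The "self" part means queries, keys, and values are all projections of the same input. A small NumPy sketch of that idea (random toy weights and sizes chosen for illustration, not taken from the notebook):

```python
import numpy as np

# Self-attention: Q, K, and V are all derived from the same input X.
rng = np.random.default_rng(42)
X = rng.normal(size=(4, 6))  # 4 tokens, embedding dim 6 (toy sizes)
d_k = 6
W_q, W_k, W_v = (rng.normal(size=(6, d_k)) for _ in range(3))

Q, K, V = X @ W_q, X @ W_k, X @ W_v   # project the same X three ways
scores = Q @ K.T / np.sqrt(d_k)       # how strongly each token attends to each other token
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
out = weights @ V                     # each token becomes a weighted mix of value vectors
print(out.shape)  # (4, 6)
```

Each row of `weights` sums to 1, so every output token is a convex combination of the value vectors of all tokens, including itself.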

 

Google language research: https://research.google/teams/language/

 


 

Hugging Face Transformers documentation: https://huggingface.co/transformers/index.html

 


 

The Annotated Transformer: http://nlp.seas.harvard.edu/2018/04/03/attention.html

 

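Among the components the Annotated Transformer implements line by line are the sinusoidal positional encodings, PE(pos, 2i) = sin(pos/10000^(2i/d)) and PE(pos, 2i+1) = cos(pos/10000^(2i/d)). A minimal NumPy sketch of that formula (dimensions here are toy values for illustration, not the post's PyTorch code):

```python
import numpy as np

def positional_encoding(n_pos, d_model):
    """Sinusoidal positional encodings from 'Attention Is All You Need' (d_model must be even)."""
    pos = np.arange(n_pos)[:, None]                 # (n_pos, 1) positions
    i = np.arange(0, d_model, 2)[None, :]           # even embedding indices
    angles = pos / np.power(10000.0, i / d_model)   # one frequency per pair of dims
    pe = np.zeros((n_pos, d_model))
    pe[:, 0::2] = np.sin(angles)                    # even dims get sine
    pe[:, 1::2] = np.cos(angles)                    # odd dims get cosine
    return pe

pe = positional_encoding(50, 16)
print(pe.shape)  # (50, 16)
```

Because each dimension pair oscillates at a different geometric frequency, nearby positions get similar encodings while distant ones differ, and relative offsets correspond to fixed linear transformations.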

 

Jay Alammar, The Illustrated Transformer: http://jalammar.github.io/illustrated-transformer/

 

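A centerpiece of the Illustrated Transformer is multi-head attention: the model projects the input, splits it into several heads, attends in each head independently, then concatenates and mixes the results. A toy NumPy sketch of that flow (random weights and sizes invented for the example):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def multi_head_attention(X, n_heads, rng):
    """Toy multi-head self-attention: project, split into heads,
    attend per head, concatenate, and apply an output projection."""
    seq, d_model = X.shape
    d_head = d_model // n_heads
    W_q, W_k, W_v, W_o = (rng.normal(size=(d_model, d_model)) for _ in range(4))

    def heads(x):  # (seq, d_model) -> (n_heads, seq, d_head)
        return x.reshape(seq, n_heads, d_head).transpose(1, 0, 2)

    Q, K, V = heads(X @ W_q), heads(X @ W_k), heads(X @ W_v)
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)  # (n_heads, seq, seq)
    per_head = softmax(scores) @ V                       # (n_heads, seq, d_head)
    concat = per_head.transpose(1, 0, 2).reshape(seq, d_model)
    return concat @ W_o

rng = np.random.default_rng(0)
out = multi_head_attention(rng.normal(size=(5, 8)), n_heads=2, rng=rng)
print(out.shape)  # (5, 8)
```

Splitting into heads lets each one attend to different positions or relations at the same cost as a single full-width attention, which is the intuition the blog post illustrates visually.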

 
