A Joyful AI Research Journey🌳😊
Links to mBART
https://huggingface.co/facebook/mbart-large-50-many-to-many-mmt
facebook/mbart-large-50-many-to-many-mmt · Hugging Face
mBART-50 many-to-many multilingual machine translation. This model is a fine-tuned checkpoint of mBART-large-50; mbart-large-50-many-to-many-mmt is fine-tuned for multilingual machine translation. It was introduced in Multilingual Translation with Extensible Multilingual Pretraining and Finetuning.
huggingface.co
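Following the model card linked above, translating between any two of the 50 supported languages comes down to setting the tokenizer's source language and forcing the target language code as the first generated token. A minimal sketch (requires `transformers` and the ~2.3 GB checkpoint, so the actual call sits behind a main guard; the Korean example sentence is my own):

```python
# Sketch of many-to-many translation with facebook/mbart-large-50-many-to-many-mmt,
# following the Hugging Face model card linked above.
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

MODEL = "facebook/mbart-large-50-many-to-many-mmt"

def translate(text: str, src_lang: str, tgt_lang: str) -> str:
    """Translate `text` from `src_lang` to `tgt_lang` (codes like "ko_KR", "en_XX")."""
    tokenizer = MBart50TokenizerFast.from_pretrained(MODEL)
    model = MBartForConditionalGeneration.from_pretrained(MODEL)
    tokenizer.src_lang = src_lang  # prepends the source language token
    encoded = tokenizer(text, return_tensors="pt")
    # The target language is selected by forcing its code as the first decoded token.
    generated = model.generate(
        **encoded, forced_bos_token_id=tokenizer.lang_code_to_id[tgt_lang]
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]

if __name__ == "__main__":
    # Example sentence is illustrative; output depends on the downloaded weights.
    print(translate("안녕하세요.", "ko_KR", "en_XX"))
```

The `forced_bos_token_id` argument is what makes one checkpoint serve all 50×50 translation directions: the decoder is conditioned on the target language code rather than on a direction-specific model.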
https://huggingface.co/docs/transformers/en/model_doc/mbart
MBart and MBart-50
Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the pre and post processing steps while the latter silently ignores them.
huggingface.co
https://paperswithcode.com/method/mbart
Papers with Code - mBART Explained
mBART is a sequence-to-sequence denoising auto-encoder pre-trained on large-scale monolingual corpora in many languages using the BART objective. The input texts are noised by masking phrases and permuting sentences, and a single Transformer model is learned to recover the texts.
paperswithcode.com
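The two noising operations the excerpt above describes can be illustrated in a few lines. This is a toy sketch of my own (not Facebook's implementation): mask one contiguous phrase with a single `<mask>` token and shuffle sentence order, producing the corrupted input the model must learn to denoise.

```python
# Toy illustration of BART-style noising as used in mBART pre-training:
# phrase masking plus sentence permutation. Hypothetical helper names.
import random

MASK = "<mask>"

def mask_phrase(tokens, span_len, start):
    """Replace one contiguous span of tokens with a single mask token."""
    return tokens[:start] + [MASK] + tokens[start + span_len:]

def permute_sentences(sentences, rng):
    """Return the sentences in a shuffled order (original list untouched)."""
    shuffled = sentences[:]
    rng.shuffle(shuffled)
    return shuffled

rng = random.Random(1)  # fixed seed so the demo is repeatable
sents = ["the cat sat .", "it purred .", "then it slept ."]
noised = permute_sentences(sents, rng)
tokens = noised[0].split()
masked = mask_phrase(tokens, span_len=2, start=1)
```

The pre-training target is simply the original, un-noised text, so a single Transformer can be trained this way on monolingual data from many languages at once.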