Category: seq2seq (2)
A Joyful AI Research Journey🌳😊
ChatGPT, response to “Is it better to first understand seq2seq models in-depth and then use high-level libraries like Hugging Face or TensorFlow? Is this approach similar to studying theory first and then using a library?,” August 27, 2023, OpenAI. Yes, your understanding is on point. Let's delve into why this sequential approach of starting with seq2seq and then moving on to modern libraries li..
ChatGPT, response to “Can I use a seq2seq model for NMT using Hugging Face, Keras, or TensorFlow?” August 27, 2023, OpenAI. Yes, the seq2seq (sequence-to-sequence) model is a foundational architecture for NMT (Neural Machine Translation), and you can implement and train it using any of the mentioned frameworks: Hugging Face's Transformers, Keras, or TensorFlow. Here's a brief overview of how you..
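The excerpt above notes that seq2seq is the foundational encoder-decoder architecture behind NMT. As a minimal sketch of that data flow (not the post's own code, and not a trained model): an encoder RNN compresses the source token sequence into a context vector, and a decoder RNN generates target tokens greedily from that context. All sizes, names, and weights below are illustrative assumptions, so the "translation" it prints is meaningless; only the architecture is the point.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB, HID, EMB = 12, 16, 8   # toy vocabulary, hidden, and embedding sizes
BOS, EOS = 0, 1               # special begin/end-of-sequence token ids

E   = rng.normal(0, 0.1, (VOCAB, EMB))   # shared embedding table
Wxh = rng.normal(0, 0.1, (EMB, HID))     # input -> hidden
Whh = rng.normal(0, 0.1, (HID, HID))     # hidden -> hidden (recurrence)
Who = rng.normal(0, 0.1, (HID, VOCAB))   # hidden -> vocabulary logits

def rnn_step(token_id, h):
    """One vanilla-RNN step: embed the token, mix with the previous state."""
    return np.tanh(E[token_id] @ Wxh + h @ Whh)

def encode(src_ids):
    """Run the encoder over the source; the final state is the context."""
    h = np.zeros(HID)
    for t in src_ids:
        h = rnn_step(t, h)
    return h

def greedy_decode(context, max_len=10):
    """Generate target tokens one at a time, feeding each back in."""
    h, tok, out = context, BOS, []
    for _ in range(max_len):
        h = rnn_step(tok, h)
        tok = int(np.argmax(h @ Who))   # greedy choice at each step
        if tok == EOS:
            break
        out.append(tok)
    return out

translation = greedy_decode(encode([3, 5, 7]))
print(translation)
```

In practice you would not hand-roll this: Hugging Face Transformers, Keras, and TensorFlow (the frameworks the excerpt mentions) provide trained seq2seq models and training loops, and modern NMT replaces the plain RNN with attention-based Transformers. The toy above only makes the encoder/decoder split concrete.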