Lecture Review
1. Recurrent Neural Network and Language Modeling
1) Basics of Recurrent Neural Networks (RNNs)
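A minimal NumPy sketch of the vanilla RNN recurrence h_t = tanh(W_xh x_t + W_hh h_{t-1} + b); the weight names and toy dimensions below are illustrative, not from the lecture:

```python
import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b):
    # One vanilla RNN step: the new hidden state mixes the current input
    # with the previous hidden state through a tanh nonlinearity.
    return np.tanh(W_xh @ x_t + W_hh @ h_prev + b)

# Toy dimensions (illustrative): input size 4, hidden size 3.
rng = np.random.default_rng(0)
W_xh = rng.normal(size=(3, 4)) * 0.1
W_hh = rng.normal(size=(3, 3)) * 0.1
b = np.zeros(3)

h = np.zeros(3)                      # initial hidden state h_0
for x_t in rng.normal(size=(5, 4)):  # a sequence of 5 input vectors
    h = rnn_step(x_t, h, W_xh, W_hh, b)
```

The same weights are reused at every time step; only the hidden state h carries information forward through the sequence.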
2) Types of RNNs
- One-to-one: Standard Neural Networks
- One-to-many: Image Captioning
- Many-to-one: Sentiment Classification
- Many-to-many (sequence-to-sequence): Machine Translation, frame-level video classification

3) Character-level Language Model
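A character-level language model treats each character as a token and trains the RNN to predict the next character at every position. A tiny sketch of how the training pairs are set up, using the classic "hello" example (the vocabulary and encoding below are illustrative):

```python
import numpy as np

chars = sorted(set("hello"))                  # ['e', 'h', 'l', 'o']
char_to_ix = {c: i for i, c in enumerate(chars)}

def one_hot(c, size=len(chars)):
    # Each character is fed to the RNN as a one-hot vector.
    v = np.zeros(size)
    v[char_to_ix[c]] = 1.0
    return v

# At each time step the target is simply the next character:
text = "hello"
pairs = [(text[i], text[i + 1]) for i in range(len(text) - 1)]
print(pairs)  # [('h', 'e'), ('e', 'l'), ('l', 'l'), ('l', 'o')]
```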



- Backpropagation through time (BPTT): the RNN is unrolled over time and gradients flow backward through every step; for long sequences, training usually runs on truncated chunks (see the sketch below)
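Full BPTT over a long sequence is expensive, so in practice the sequence is split into fixed-size chunks and the computation graph is cut between them (truncated BPTT). A minimal PyTorch sketch, with illustrative shapes and dummy data:

```python
import torch
import torch.nn as nn

rnn = nn.RNN(input_size=4, hidden_size=3, batch_first=True)
head = nn.Linear(3, 4)
opt = torch.optim.SGD(list(rnn.parameters()) + list(head.parameters()), lr=0.1)

x = torch.randn(1, 20, 4)    # one long input sequence of length 20
y = torch.randn(1, 20, 4)    # dummy targets, same shape as predictions
h = torch.zeros(1, 1, 3)     # initial hidden state

chunk = 5                    # truncation length: backprop only 5 steps
for t in range(0, 20, chunk):
    out, h = rnn(x[:, t:t + chunk], h)
    loss = ((head(out) - y[:, t:t + chunk]) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    h = h.detach()           # cut the graph so gradients stop at the chunk boundary
```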




- Problem with RNNs: vanishing (or exploding) gradients — backpropagation multiplies by the same recurrent weight matrix at every time step, so the gradient shrinks or grows geometrically with sequence length
→ addressed by LSTM and GRU
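A toy illustration of the vanishing effect: backpropagating through T steps multiplies the gradient by roughly W_hh^T · diag(tanh') at each step, so with small recurrent weights its norm decays geometrically (all numbers below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
W_hh = rng.normal(size=(3, 3)) * 0.3   # small recurrent weights -> vanishing
grad = np.ones(3)
for t in range(50):
    grad = W_hh.T @ grad * 0.5          # 0.5 stands in for tanh' <= 1
print(np.linalg.norm(grad))             # ~0: the gradient signal has vanished
```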
2. LSTM and GRU


1) Long Short-Term Memory (LSTM)

- i: Input gate, whether to write to the cell (sigmoid function)
- f: Forget gate, whether to erase the cell (sigmoid function)
- o: Output gate, how much to reveal the cell (sigmoid function)
- g: Gate gate, how much to write to the cell (tanh function)
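These four gates combine into the cell and hidden state updates c_t = f ⊙ c_{t-1} + i ⊙ g and h_t = o ⊙ tanh(c_t). A minimal NumPy sketch of one LSTM step (the stacked weight layout is illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, b):
    # W stacks the four gate weight matrices row-wise: shape (4H, D + H).
    H = h_prev.shape[0]
    z = W @ np.concatenate([x_t, h_prev]) + b
    i = sigmoid(z[0:H])        # input gate: whether to write to the cell
    f = sigmoid(z[H:2*H])      # forget gate: whether to erase the cell
    o = sigmoid(z[2*H:3*H])    # output gate: how much to reveal the cell
    g = np.tanh(z[3*H:4*H])    # gate gate: how much to write to the cell
    c = f * c_prev + i * g     # cell state: additive, gated update
    h = o * np.tanh(c)         # hidden state passed to the next step
    return h, c
```

The additive update of the cell state c is what lets gradients flow across many time steps without repeatedly passing through a squashing matrix multiplication, which is how the LSTM mitigates vanishing gradients.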
2) Gated Recurrent Unit (GRU)
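The GRU merges the LSTM's cell state and hidden state into a single vector and uses two gates, update (z) and reset (r). A minimal NumPy sketch of one GRU step (the weight and bias names are illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, Wz, Wr, Wh, bz, br, bh):
    xh = np.concatenate([x_t, h_prev])
    z = sigmoid(Wz @ xh + bz)               # update gate
    r = sigmoid(Wr @ xh + br)               # reset gate: how much past to use
    h_tilde = np.tanh(Wh @ np.concatenate([x_t, r * h_prev]) + bh)
    return (1 - z) * h_prev + z * h_tilde   # convex mix of old and candidate
```

Because h_t is a convex combination of h_{t-1} and the candidate state, the GRU keeps the LSTM's additive gradient path while using fewer parameters.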


Reference: 2021/02/04 - [네이버 부스트캠프 AI Tech/학습정리 [T1209 최보미]] - Day14 학습정리 - Recurrent Neural Networks
Comments
Today we covered natural language processing with the RNNs we first learned about in week 3. To keep up with this week's lectures, I will need to review the week 3 material thoroughly.