Page 1

Lab 3: RNN, GAN

<Introduction to Machine Learning> Lab Session

이현도, 최원석, 김윤성

2019.11.07

Biointelligence Laboratory

School of Computer Science and Engineering

Seoul National University

Page 2

Contents

◼ RNN, LSTM

◼ NLP (Natural Language Processing)

◼ GAN, cGAN


Page 3

Recurrent Neural Network


Page 4

RNN, LSTM

◼ RNN

◼ Models sequence data

◼ Maintains a memory (hidden state) - see the sketch after this list

◼ Combines the new input with the existing memory to produce an output

◼ Repeats the same structure every time an input arrives

◼ LSTM

◼ Solves the RNN's long-term dependency problem

◼ Introduces a cell state

◼ Introduces gates that control the cell state
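As a concrete illustration of the bullets above, here is a minimal PyTorch sketch (not taken from the lab code; all sizes are made up) of an LSTM that carries a hidden state and a cell state across a sequence, applying the same weights at every time step:

```python
import torch
import torch.nn as nn

# Illustrative sketch: an LSTM over a length-5 sequence.
lstm = nn.LSTM(input_size=10, hidden_size=20)

seq = torch.randn(5, 1, 10)          # (seq_len, batch, input_size)
h0 = torch.zeros(1, 1, 20)           # initial hidden state ("memory" passed between steps)
c0 = torch.zeros(1, 1, 20)           # initial cell state (LSTM-specific long-term memory)

# The same weights are applied at every time step; the states carry information forward.
out, (hn, cn) = lstm(seq, (h0, c0))
print(out.shape)                     # torch.Size([5, 1, 20]) - one output per input step
```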

Page 5

Structure of an LSTM

[Figure: LSTM cell; O marks a gate that is open, - marks a gate that is closed]

Page 6

LSTM Model for Part-of-Speech Tagging

◼ Goal

◼ Design a model that, given a sentence, infers the part of speech of each word

◼ Knowing the preceding words makes it easier to infer the part of speech of the current word

◼ An LSTM model, which can remember the previous words, is a good fit

◼ input sequence: the sequence of words in the sentence

◼ output sequence: the sequence of part-of-speech tags, one per word

◼ Preprocessing

◼ What if we feed in the words as they are? (The, dog, ate, the, apple)

◼ Words have different lengths -> hard to turn into vectors

◼ Therefore, encode each word by mapping it to a single integer (see the table below)

Input sentence:    The  dog  ate  the  apple
Encoded sentence:    0    1    2    3    4
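A minimal sketch of this integer encoding; the dictionary name word_to_ix is our own choice for illustration, not necessarily what the lab notebook uses:

```python
# Map each distinct token to an integer index, reproducing the table above.
training_sentence = ["The", "dog", "ate", "the", "apple"]

word_to_ix = {}                      # hypothetical helper name, not from the slides
for word in training_sentence:
    if word not in word_to_ix:
        word_to_ix[word] = len(word_to_ix)

encoded = [word_to_ix[w] for w in training_sentence]
print(encoded)                       # [0, 1, 2, 3, 4]  ("The" and "the" are distinct tokens)
```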

Page 7

LSTM Model for Part-of-Speech Tagging

◼ The integers produced by the encoding step are embedded into an n-dimensional space and used as the input to the LSTM

◼ A fully connected layer is applied to the LSTM's output to compute a part-of-speech score for each word (see the table and sketch below)

Word    Encoded  Input embedding  Output embedding  Output score [article, noun, verb]
The     0        v0               w0                [-0.02, -4.16, -4.71]
dog     1        v1               w1                [-4.51, -0.02, -5.20]
ate     2        v2               w2                [-3.90, -4.87, -0.03]
the     3        v3               w3                [-0.05, -3.25, -4.24]
apple   4        v4               w4                [-4.28, -0.02, -5.48]
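Putting the two slides together, here is a hedged sketch of the embedding -> LSTM -> fully connected pipeline shown above. The class name LSTMTagger and all dimensions are illustrative assumptions; the actual model in Lab3_LSTM.ipynb may differ:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LSTMTagger(nn.Module):
    """Sketch: embed encoded words, run an LSTM, then score each word's POS tag."""
    def __init__(self, vocab_size, embedding_dim, hidden_dim, num_tags):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embedding_dim)  # integer -> n-dim vector (v_i)
        self.lstm = nn.LSTM(embedding_dim, hidden_dim)        # v_i -> w_i
        self.fc = nn.Linear(hidden_dim, num_tags)             # w_i -> per-tag scores

    def forward(self, encoded_sentence):                      # shape: (seq_len,)
        v = self.embed(encoded_sentence)                      # (seq_len, embedding_dim)
        w, _ = self.lstm(v.unsqueeze(1))                      # (seq_len, 1, hidden_dim)
        scores = self.fc(w.squeeze(1))                        # (seq_len, num_tags)
        return F.log_softmax(scores, dim=1)                   # log-probabilities per word

# 5-word vocabulary, 3 tags (article, noun, verb)
model = LSTMTagger(vocab_size=5, embedding_dim=6, hidden_dim=6, num_tags=3)
print(model(torch.tensor([0, 1, 2, 3, 4])).shape)             # torch.Size([5, 3])
```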

Page 8

Preprocessing

◼ Improving the preprocessing step

◼ So far each word was simply mapped to a distinct integer; can we map words 'better', according to how related they are?

◼ That is, is there a way to map similar words to nearby vectors and unrelated words to distant vectors?

◼ Word vectorization

◼ Word2Vec, GloVe

◼ ELMo, BERT

Page 9

Vectorization - Word2Vec

◼ Word2Vec

◼ Word Embedding: {set of words} -> R^n (feature space)

◼ Shallow two-layer NNLM

◼ CBOW, Skip-gram

◼ Words appearing in the same sentence are regarded as sharing a similar context

◼ Distributional Hypothesis: words with similar distributions have similar meanings

◼ The more related two words are, the higher the cosine similarity of their embeddings (i.e., the smaller their cosine distance)

◼ Library - Python Gensim

◼ https://pypi.org/project/gensim/
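A hedged Gensim sketch, trained on a toy corpus for illustration (the parameter name vector_size follows Gensim 4.x; older releases use size instead):

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens (in practice, a large tokenized corpus).
sentences = [["the", "dog", "ate", "the", "apple"],
             ["the", "cat", "ate", "the", "fish"]]

# sg=1 -> Skip-gram, sg=0 -> CBOW.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

vec = model.wv["dog"]                         # 50-dimensional embedding for "dog"
print(model.wv.most_similar("dog", topn=3))   # nearest words by cosine similarity
```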


Page 10

Vectorization - Word2Vec

[Figure: CBOW and Skip-gram model architectures]

Page 11

Vectorization - Word2Vec

◼ What you need for training

◼ A training corpus (database)

◼ Use the database you actually want to model (if it is sufficiently large)

◼ For Korean: the Korean Wikipedia dump, KorQuAD, the Namuwiki dump, …

◼ Morphological analysis (POS tagging)

◼ Splitting a raw corpus into morphemes (roots, prefixes/suffixes, parts of speech, etc.) and attaching POS information to each morpheme

◼ Library – KoNLPy

◼ https://konlpy-ko.readthedocs.io/ko/v0.4.3/
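A hedged KoNLPy sketch using the Okt tagger (one of several taggers KoNLPy provides; in older releases it was named Twitter). The example sentence and the exact tags shown are illustrative:

```python
from konlpy.tag import Okt   # one of KoNLPy's taggers (called Twitter in older releases)

okt = Okt()
# Split a raw sentence into morphemes and attach a POS tag to each.
print(okt.pos("아버지가 방에 들어가신다"))
# e.g. [('아버지', 'Noun'), ('가', 'Josa'), ('방', 'Noun'), ('에', 'Josa'), ('들어가신다', 'Verb')]
```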


Page 12

Word Embedding sources

◼ If you want to train embeddings yourself

◼ We strongly recommend the link below

◼ https://ratsgo.github.io/embedding/

◼ Using pretrained embeddings

◼ BERT - https://pypi.org/project/bert-embedding/

◼ GloVe - https://nlp.stanford.edu/projects/glove/

◼ Search Google for keywords such as "pre-trained word embedding"


Page 13

Bidirectional RNN (LSTM)

◼ Motivation

◼ For text data, backward inference (from the future toward the past, relative to the current position) can be as meaningful as forward inference (from the past toward the future)

◼ 나는 ____를 뒤집어쓰고 펑펑 울었다. ("I pulled the ____ over myself and cried my eyes out." - the blank is much easier to fill once you have read the words that come after it)

◼ However, the standard RNN architecture processes the data only in the forward direction

◼ Code (see the sketch below)

◼ LSTM(…, bidirectional=True)
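A short sketch of this flag with PyTorch's nn.LSTM (sizes are made up); note that the output feature dimension doubles because the forward and backward outputs are concatenated:

```python
import torch
import torch.nn as nn

# Bidirectional LSTM: the sequence is processed both forward and backward.
bilstm = nn.LSTM(input_size=10, hidden_size=20, bidirectional=True)

seq = torch.randn(5, 1, 10)   # (seq_len, batch, input_size)
out, (hn, cn) = bilstm(seq)
print(out.shape)              # torch.Size([5, 1, 40]) - forward and backward concatenated
print(hn.shape)               # torch.Size([2, 1, 20]) - one final hidden state per direction
```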


Page 14

Generative Adversarial Network


Page 15

Generative Models

◼ Discriminative Models

◼ Data X, label Y → given the data, predict the label: P(Y|X)

◼ It suffices to learn the decision boundary between labels for the given inputs

◼ Recovering the data from the label, P(X|Y), is a hard problem

◼ Generative Models

◼ The data-generation process: P(Y) → P(X|Y)

◼ Generates data based on the distribution of the real data

◼ i.e., learns the distribution of the real data

◼ Relatively complex, and requires assumptions about the underlying real-world phenomenon


Page 16

Generative Models

◼ A generative model learns the distribution of the real data under a few assumptions

◼ It then generates data by sampling from the learned probability distribution


Page 17

Adversarial Networks

◼ Adversarial networks

◼ A training scheme in which multiple different neural networks learn by competing with one another (i.e., adversarially)

◼ "Generative" Adversarial Network (GAN)

◼ Generator (generative model)

◼ A model that generates data resembling the real data

◼ Approximates the distribution of the real data -> returns samples drawn from that distribution

◼ Discriminator

◼ Determines whether its input is real data or data produced by the generator

Page 18

Objectives of Each Network

◼ Discriminator

◼ Correctly classify the generator's outputs (fake) and the real data (real)

◼ L_D = -(log(real) + log(1 - fake))

◼ real = D(x) → 1, fake = D(G(z)) → 0

◼ Generator

◼ Fool the discriminator

◼ fake = D(G(z)) → 1

◼ Optimization

◼ min_G max_D  E_{x~p_real}[log D(x)] + E_{z~N(0,1)}[log(1 - D(G(z)))]
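A minimal sketch of one training step that implements the objectives above with binary cross-entropy. The network architectures, batch size, and learning rates are illustrative assumptions, not the lab's actual settings:

```python
import torch
import torch.nn as nn

# Toy generator and discriminator (dimensions are illustrative).
G = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 784), nn.Tanh())
D = nn.Sequential(nn.Linear(784, 128), nn.LeakyReLU(0.2), nn.Linear(128, 1), nn.Sigmoid())
bce = nn.BCELoss()
opt_D = torch.optim.Adam(D.parameters(), lr=2e-4)
opt_G = torch.optim.Adam(G.parameters(), lr=2e-4)

x_real = torch.randn(16, 784)                  # stand-in for a batch of real data
z = torch.randn(16, 64)                        # z ~ N(0, 1)
ones, zeros = torch.ones(16, 1), torch.zeros(16, 1)

# Discriminator step: L_D = -(log D(x) + log(1 - D(G(z))))
loss_D = bce(D(x_real), ones) + bce(D(G(z).detach()), zeros)
opt_D.zero_grad(); loss_D.backward(); opt_D.step()

# Generator step: fool D, i.e. push D(G(z)) toward 1
loss_G = bce(D(G(z)), ones)
opt_G.zero_grad(); loss_G.backward(); opt_G.step()
```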

Page 19

Algorithm

[Figure: the GAN training algorithm - alternating gradient updates for the discriminator and the generator; Ian J. Goodfellow et al., NIPS, 2014]

Page 20

Conditional GAN (cGAN)

◼ A vanilla GAN has no mechanism for handling class information

◼ Even if you want to generate the digit 6, you cannot explicitly ask for a 6

◼ cGAN - adds extra conditioning information to the GAN

◼ The generator and discriminator are given additional information to refer to when generating and classifying data (see the sketch below)

◼ e.g., labels, attributes, features
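A hedged sketch of the conditioning idea: the label is embedded and concatenated with the generator's noise input (the discriminator is conditioned analogously). The class name, layer sizes, and the use of nn.Embedding are assumptions for illustration; Lab3_MNIST_cGAN.ipynb may condition differently (e.g., with one-hot labels):

```python
import torch
import torch.nn as nn

class ConditionalGenerator(nn.Module):
    """Sketch: generate a sample conditioned on a class label."""
    def __init__(self, noise_dim=64, num_classes=10, label_dim=10, out_dim=784):
        super().__init__()
        self.label_embed = nn.Embedding(num_classes, label_dim)   # label -> condition vector
        self.net = nn.Sequential(
            nn.Linear(noise_dim + label_dim, 128), nn.ReLU(),
            nn.Linear(128, out_dim), nn.Tanh())

    def forward(self, z, labels):
        # Concatenate the conditioning information with the noise before generating.
        cond = torch.cat([z, self.label_embed(labels)], dim=1)
        return self.net(cond)

G = ConditionalGenerator()
z = torch.randn(4, 64)
labels = torch.tensor([6, 6, 6, 6])   # explicitly ask for the digit 6
fake_sixes = G(z, labels)             # shape: (4, 784)
```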

Page 21

Conditional GAN (cGAN) [Mehdi Mirza et al., 2014]

[Figure: cGAN architecture]

Page 22

Training Instability of the Vanilla GAN

◼ Non-convergence

◼ As the generator and discriminator are trained in alternation, the parameters fail to converge

◼ Mode collapse

◼ The generator produces only certain kinds of data instead of covering the whole data distribution

◼ Diminished gradient

◼ The discriminator becomes far stronger than the generator, so the gradient that trains the generator vanishes to zero

◼ The distance metric used in the loss function is inappropriate and fails to produce meaningful gradients

◼ Highly sensitive to hyperparameter selection


Page 23

GAN zoo

◼ A list of models that address GAN training instability or add various capabilities

◼ https://deephunt.in/the-gan-zoo-79597dc8c347

◼ DCGAN - applies convolutional layers; the most basic, foundational model

◼ Stabilized training – Unrolled GAN, LSGAN, WGAN(-GP), SN-GAN, …

◼ Image translation – Pix2Pix, CycleGAN, StarGAN, …

◼ Latent disentanglement – cGAN, InfoGAN, …

◼ Quality - Progressively Growing GANs, BigGAN, …

◼ and so on…


Page 24

Codes

◼ GitHub repository with the lab code

◼ https://github.com/illhyhl1111/SNU_ML2019

◼ Colab links

◼ https://colab.research.google.com/github/illhyhl1111/SNU_ML2019/blob/master/Lab3_LSTM.ipynb

◼ https://colab.research.google.com/github/illhyhl1111/SNU_ML2019/blob/master/Lab3_MNIST_cGAN.ipynb
