
Python初心者がPyTorchをいじって機械学習の計算してみた PyCon mini Hiroshima 2018 / Python-newbies machine learning learning with PyTorch


Slides from my modest talk "Python初心者がPyTorchをいじって機械学習の計算してみた" (A Python newbie tinkered with PyTorch and tried some machine-learning computations) at PyCon mini Hiroshima 2018 (Saturday, October 6, 2018).

北䑓如法

October 08, 2018


Transcript

  1. What's more, at the end of the course, from Professor Andrew Ng:
     ▸ Those were the topics of this class and if you worked all the way through this course you should now consider yourself an expert in machine learning.
     ▸ As you know, machine learning is a technology that's having huge impact on science, technology and industry.
     ▸ And you're now well qualified to use these tools of machine learning to great effect.
     ▸ I hope that many of you in this class will find ways to use machine learning to build cool systems and cool applications and cool products.
     ▸ And I hope that you find ways to use machine learning not only to make your life better but maybe someday to use it to make many other people's life better as well.
  2. Supervised learning — machine learning
     ▸ We have data that corresponds to data: x_1, x_2, …, x_N (N is usually large) and y_1, y_2, …, y_N
     ▸ We want to find a function f that roughly realizes the correspondence x_n ↦ y_n, i.e. y_n = f(x_n) (see the sketch below)
     ▸ And we want that to hold for the unknown data we will obtain from now on as well
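A minimal sketch of what "finding f" can look like in PyTorch, on made-up toy data (this example is mine, not from the slides): fit a small model to the (x_n, y_n) pairs by gradient descent.

```python
import torch
import torch.nn as nn

# Toy data (assumed for illustration): y_n comes from 3*x_n + 0.5 plus noise.
x = torch.linspace(-1, 1, 100).unsqueeze(1)      # x_1, ..., x_N
y = 3 * x + 0.5 + 0.1 * torch.randn_like(x)      # y_1, ..., y_N

f = nn.Linear(1, 1)                               # the candidate function f
opt = torch.optim.SGD(f.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

for _ in range(200):
    loss = loss_fn(f(x), y)                       # how far f(x_n) is from y_n
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f.weight.item(), f.bias.item())             # should end up near 3 and 0.5
```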
  3. ▸ You can build a function that "responds" to something, and then compose it with another function
     ▸ [Plots on the slide: a function f(x) and the sigmoid curve; axis ticks omitted]
     ▸ s(x) = 1 / (1 + e^(-x)), and the composition s(f(x))
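As a small illustration (my own sketch, with an assumed linear f), the composition s(f(x)) can be written directly in PyTorch:

```python
import torch

def f(x):
    return 2.0 * x - 1.0                    # an assumed example linear function

def s(x):
    return 1.0 / (1.0 + torch.exp(-x))      # sigmoid: s(x) = 1 / (1 + e^(-x))

x = torch.linspace(-10.0, 10.0, steps=5)
print(s(f(x)))                              # s squashes f(x) into the range (0, 1)
```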
  4. PyTorch
     ▸ A framework for deep learning
     ▸ theano, Caffe, TensorFlow, mxnet, Chainer
     ▸ One of the many frameworks out there
     ▸ Comes from Torch (Torch is LuaJIT)
     ▸ Draws heavily on Chainer
     ▸ ("Feel free to call it a fork": https://www.reddit.com/r/MachineLearning/comments/74md00/n_how_to_use_chainer_for_theano_users/dnzkba1/)
  5. Getting a feel for how easy it is to write

     class Net(nn.Module):
         def __init__(self):
             super(Net, self).__init__()
             self.conv1 = nn.Conv2d(1, 10, kernel_size=5)
             self.conv2 = nn.Conv2d(10, 20, kernel_size=5)
             self.conv2_drop = nn.Dropout2d()
             self.fc1 = nn.Linear(320, 50)
             self.fc2 = nn.Linear(50, 10)

         def forward(self, x):
             x = F.relu(F.max_pool2d(self.conv1(x), 2))
             x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2))
             x = x.view(-1, 320)
             x = F.relu(self.fc1(x))
             x = F.dropout(x, training=self.training)
             x = self.fc2(x)
             return F.log_softmax(x, dim=1)

     ▸ __init__: how many of the ○ units from earlier to prepare, and how to wire them up
     ▸ forward: the functions that get composed when wiring them together
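For orientation, a minimal usage sketch of my own (it assumes the Net class from the slide above, with its usual torch imports, is already in scope): push a dummy MNIST-sized batch through the network.

```python
import torch

# Assumes Net (and its imports: torch.nn as nn, torch.nn.functional as F)
# is defined exactly as on the slide above.
net = Net()
dummy = torch.randn(4, 1, 28, 28)   # a batch of 4 fake 28x28 grayscale images
out = net(dummy)                     # log-probabilities over the 10 classes
print(out.shape)                     # torch.Size([4, 10])
```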
  6. Getting a feel for how easy it is to write
     ▸ random_split (NEW in 0.4.1)
     ▸ A handy method that splits the data while shuffling it

     train_size = int(0.8 * len(dataset))
     test_size = len(dataset) - train_size
     trainset, testset = torch.utils.data.random_split(dataset, [train_size, test_size])
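A self-contained sketch of the same split on made-up data (my own example, not from the talk), with the two halves wrapped in DataLoaders:

```python
import torch
from torch.utils.data import TensorDataset, DataLoader, random_split

# Made-up dataset: 1000 samples with 8 features and binary labels.
features = torch.randn(1000, 8)
labels = torch.randint(0, 2, (1000,))
dataset = TensorDataset(features, labels)

# Shuffle-and-split 80/20, then batch each part.
train_size = int(0.8 * len(dataset))
test_size = len(dataset) - train_size
trainset, testset = random_split(dataset, [train_size, test_size])

train_loader = DataLoader(trainset, batch_size=32, shuffle=True)
test_loader = DataLoader(testset, batch_size=32)
```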
  7. PyTorch is on the rise
     ▸ Frameworks mentioned @ICLR (International Conference on Learning Representations), 2018 ➡ 2019
     ▸ TensorFlow: 228 ➡ 266
     ▸ Keras: 42 ➡ 56
     ▸ PyTorch: 87 ➡ 252
     ▸ https://www.reddit.com/r/MachineLearning/comments/9kys38/r_frameworks_mentioned_iclr_20182019_tensorflow/
     ▸ Quite a lot of research code is written in PyTorch
     ▸ Online courses are increasingly using PyTorch, too
  8. How I did it
     ▸ My own Hatena Bookmark account: I have bookmarked over 40,000 pages
     ▸ I diligently tag my bookmarks
     ▸ Classify them by whether they carry the 「機械学習」 (machine learning) tag
     ▸ There are more than 1,000 of those
     ▸ Gather the data from my own bookmarks, then train
     ▸ text ➡ MeCab (morphological analysis) ➡ fastText ➡ turn into document vectors (see the sketch below)
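A rough sketch of that pipeline under assumed tools: the mecab-python3 binding for tokenization and the fasttext package with a pretrained Japanese model (the model path here is an assumption, not something stated in the talk). Document vectors are taken as the average of the word vectors.

```python
import MeCab      # mecab-python3: Japanese morphological analyzer
import fasttext   # the fasttext package (one possible choice)
import numpy as np

tagger = MeCab.Tagger("-Owakati")             # "-Owakati": output space-separated tokens
model = fasttext.load_model("cc.ja.300.bin")  # assumed pretrained Japanese word vectors

def document_vector(text: str) -> np.ndarray:
    """Tokenize with MeCab, then average the fastText word vectors."""
    tokens = tagger.parse(text).split()
    vectors = [model.get_word_vector(t) for t in tokens]
    return np.mean(vectors, axis=0) if vectors else np.zeros(model.get_dimension())

vec = document_vector("機械学習を勉強する")     # a 300-dimensional document vector
```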
  9. GAN?
     ▸ GAN = generative adversarial network (敵対的生成ネットワーク)
     ▸ A generator (G) vs. a discriminator (D)
     ▸ G and D learn while competing against each other
     ▸ It "generates" something (a toy training-loop sketch follows below)
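To make the G-vs-D idea concrete, here is a toy PyTorch training-loop sketch of my own (tiny made-up models and data, not the talk's code):

```python
import torch
import torch.nn as nn

# Toy setup: G maps 16-dim noise to 2-dim "data"; D scores real vs. fake.
G = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
D = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.randn(64, 2) + torch.tensor([3.0, 3.0])  # made-up "real" data
    fake = G(torch.randn(64, 16))                          # G's attempt to imitate it

    # Train D: push real samples toward label 1 and fakes toward label 0
    # (detach so this step does not update G).
    d_loss = loss_fn(D(real), torch.ones(64, 1)) + \
             loss_fn(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Train G: try to make D label the fakes as real.
    g_loss = loss_fn(D(fake), torch.ones(64, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()
```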