AI勉強会_Kerasハンズオン#1_分類 (AI Study Group: Keras Hands-on #1, Classification)
taashi
April 17, 2019
Technology
Internal study-group material
Keras
Google Colaboratory
(Under revision...)
- explanation of precision and recall to be added
Transcript
AI Study Group: Keras Hands-on #1 (Classification)
Google Colaboratory
Google Colaboratory (https://colab.research.google.com/) is a Jupyter notebook environment provided by Google.
Features: runtime sessions limited to about 12 hours, GPU and TPU support, Google Drive integration ... etc.
Using Google Colab
Open Google Colab and create a new Python notebook.
To use a GPU, change the notebook's runtime type and select GPU as the hardware accelerator.
Running cells
A cell is executed with 'shift + enter'.
A line prefixed with '!' is run as a Linux shell command, so Linux commands can be used from the notebook.
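For example (a minimal illustration; the specific commands are arbitrary, not part of the hands-on):

!ls /content          # run the Linux 'ls' command from a notebook cell
!pip install pillow   # install a package into the Colab runtime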
Mounting Google Drive
To read and write Google Drive files from Colab, mount the drive:

from google.colab import drive
drive.mount('/content/drive')

drive.mount from the google.colab package prints an authorization URL.
(1) Open the URL and sign in with the Google account whose Drive you want to use.
(2) Copy the authorization code that is displayed, paste it into the notebook prompt, and press 'enter'; Google Drive is then mounted under /content/drive.
Hands-on setup
The hands-on notebook and data are shared via Team Drives (the link is in the Slack channel). The shared notebook opens in PLAYGROUND mode, so changes are not kept unless you save your own copy.
Run the following cells in order (a sketch of the copy step is given after this list):
# [001] Install use items
# [002] Mount GoogleDrive
# [003] Copy data from google drive
# [004] Check directory structure
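As a rough sketch of what a copy cell such as # [003] might do; the source and destination paths here are hypothetical, not the workshop's actual layout:

import shutil

# Hypothetical paths: adjust to the folder actually shared for the hands-on.
DRIVE_DATA_DIR = '/content/drive/My Drive/keras_handson/data'
LOCAL_DATA_DIR = '/content/data'

# Copy the dataset from the mounted Google Drive to the local Colab disk;
# reading images from local disk is much faster than reading them from Drive.
shutil.copytree(DRIVE_DATA_DIR, LOCAL_DATA_DIR)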
Keras")* $&! '&
% $ )*#( )* $
:'<$ /1 !% '<9 '<7 *)0
(C F ) '<!% '<93" !5 '<9> (,?) '< 8) CF'<9A'< D+('24)(6. 8) FBC;&C'<-= (@+ #E)
What is Keras?
Keras is a high-level neural network library written in Python that runs on top of TensorFlow, CNTK, or Theano. It can execute on both CPU and GPU ... etc.
Preparing the data
Run the following cells:
# [005] Define constant value
# [006] Make data file list
# [007] Check dog data
# [008] Check cat data
The dataset is split by a marker in each file name (a sketch of this split follows the list):
'_t_XXX.png' ... training data (Train)
'_v_XXX.png' ... validation data (Validation)
'_e_XXX.png' ... evaluation data (Evaluate)
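A minimal sketch of splitting the file list by these markers; the directory layout, the per-class folder names ('dog', 'cat'), and DATA_DIR are assumptions made for illustration:

import glob

DATA_DIR = '/content/data'  # hypothetical location of the copied data

def split_by_marker(class_name):
    # Split one class's image files into train / validation / evaluate lists
    # using the '_t_', '_v_', '_e_' markers in the file names.
    files = sorted(glob.glob('{}/{}/*.png'.format(DATA_DIR, class_name)))
    train = [f for f in files if '_t_' in f]
    valid = [f for f in files if '_v_' in f]
    evaluate = [f for f in files if '_e_' in f]
    return train, valid, evaluate

dog_train, dog_valid, dog_eval = split_by_marker('dog')
cat_train, cat_valid, cat_eval = split_by_marker('cat')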
Teacher data
Class labels are turned into one-hot vectors: one class (e.g. dog) becomes [1, 0] and the other (cat) becomes [0, 1]. Both the inputs and the teacher (one-hot) data are handled as numpy arrays.
Keras provides a helper, to_categorical, that converts class ids into one-hot vectors:

import numpy as np
import keras.utils as ku

class_ids = np.array([1, 0, 1, 1, 2, 0])
one_hots = ku.to_categorical(class_ids)
print(one_hots)
# [[0. 1. 0.]
#  [1. 0. 0.]
#  [0. 1. 0.]
#  [0. 1. 0.]
#  [0. 0. 1.]
#  [1. 0. 0.]]

# [009] Make input and teacher data
Images read with cv2.imread are already numpy arrays, so they can be used as network inputs as-is.
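A minimal sketch of building the input and teacher arrays with cv2 and to_categorical. The grayscale loading, the dog=0 / cat=1 class-id assignment, and the 1/255 scaling are assumptions for illustration; the images are assumed to already be 64x64 pixels to match the Input((64, 64, 1)) network below.

import cv2
import numpy as np
import keras.utils as ku

def make_dataset(dog_files, cat_files):
    # Read images with cv2 and return (inputs, teachers) numpy arrays.
    images, class_ids = [], []
    for path in dog_files:
        images.append(cv2.imread(path, cv2.IMREAD_GRAYSCALE))
        class_ids.append(0)                      # assumed: dog -> class id 0
    for path in cat_files:
        images.append(cv2.imread(path, cv2.IMREAD_GRAYSCALE))
        class_ids.append(1)                      # assumed: cat -> class id 1
    inputs = np.array(images, dtype='float32')[..., np.newaxis] / 255.0
    teachers = ku.to_categorical(class_ids, num_classes=2)
    return inputs, teachers

train_inputs, train_teachers = make_dataset(dog_train, cat_train)
valid_inputs, valid_teachers = make_dataset(dog_valid, cat_valid)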
Defining a model in Keras
There are two ways to define a model in Keras:
1) the Sequential model
2) the functional API
This hands-on uses 2), the functional API.
Example of building a network with the functional API:

input_layer = Input((64, 64, 1))
cv_1_layer = Conv2D(32, 3)(input_layer)
cv_2_layer = Conv2D(64, 3)(cv_1_layer)
max_layer = MaxPooling2D()(cv_2_layer)
flat_layer = Flatten()(max_layer)
output_layer = Dense(2)(flat_layer)

Each Keras layer is called like a function on the output of the previous layer, which is how the layers are connected.
Input layer
input_layer = Input((64, 64, 1))
Input(input_shape)
input_shape: the shape of the input data
Convolution layer (Conv2D)
cv_1_layer = Conv2D(32, 3)(input_layer)
Conv2D(filters, kernel_size, strides=(1, 1), padding='valid', activation=None)
filters: number of filters (output channels)
kernel_size: size of the convolution kernel
strides: stride of the convolution
padding: 'valid' applies no padding; 'same' pads so that the output keeps the input size
activation: activation function
(An example of the padding options follows below.)
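A small sketch of how the padding option changes the output shape; the 64x64 grayscale input matches the Input example above:

from keras.layers import Input, Conv2D

x = Input((64, 64, 1))
# 'valid' applies no padding: a 64x64 input with a 3x3 kernel gives a 62x62 output
y_valid = Conv2D(filters=32, kernel_size=3, padding='valid')(x)   # (62, 62, 32)
# 'same' pads the input so the output keeps the spatial size
y_same = Conv2D(filters=32, kernel_size=3, padding='same')(x)     # (64, 64, 32)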
Pooling layer (MaxPooling2D)
max_layer = MaxPooling2D()(cv_layer)
MaxPooling2D(pool_size=(2, 2), strides=None)
pool_size: size of the pooling window
strides: stride of the pooling window (when None, it defaults to pool_size)
Flatten layer
flat_layer = Flatten()(max_layer)
Flatten()
Flattens the input into a one-dimensional vector; it takes no arguments.
Fully connected layer (Dense)
output_layer = Dense(2)(flat_layer)
Dense(units, activation=None)
units: number of output units
activation: activation function
Activation layer
act_layer = Activation(activation='relu')(cv_layer)
Activation(activation=None)
activation: the activation function to apply (e.g. 'relu', 'softmax')
Defining the network for this hands-on
Now build the dog/cat classification network from the layers above. Give the first layer (Input) the input_shape of the images, and give the output layer one unit per class.
# [010] Define Network
input_layer = Input(IMAGE_SHAPE)
cv1_1 = Conv2D(32, 3, padding='same', input_shape=IMAGE_SHAPE)(input_layer)
cv1_1 = Activation(activation='relu')(cv1_1)
cv1_2 = Conv2D(32, 3, padding='same')(cv1_1)
cv1_2 = Activation(activation='relu')(cv1_2)
cv1_max = MaxPooling2D()(cv1_2)
cv2_1 = Conv2D(64, 3, padding='same')(cv1_max)
cv2_1 = Activation(activation='relu')(cv2_1)
cv2_2 = Conv2D(64, 3, padding='same')(cv2_1)
cv2_2 = Activation(activation='relu')(cv2_2)
cv2_max = MaxPooling2D()(cv2_2)
flat_layer = Flatten()(cv2_max)
fc = Dense(2)(flat_layer)
output_layer = Activation(activation='softmax')(fc)
" Model# Keras model = Model(inputs=[input_layer],
outputs=[output_layer]) Model ! inputs ( ) outputs ( )( ) % $
Checking the model
model.summary() prints a table of the layers and their output shapes.

from keras.utils import plot_model
plot_model(model, to_file=MODEL_PNG_NAME)

to_file: name of the image file to write. plot_model requires dot (graphviz) to be installed.
Compiling the model
model.compile(optimizer=Adam(), loss='categorical_crossentropy', metrics=['accuracy'])
optimizer: the optimizer to use (Adam is imported from keras.optimizers)
loss: the loss function; categorical_crossentropy is used here because the teachers are one-hot class vectors
metrics: extra metrics to record, here ['accuracy']
Training (fit)
history = model.fit(train_inputs, train_teachers,
                    batch_size=20,
                    epochs=100,
                    validation_data=(valid_inputs, valid_teachers),
                    shuffle=True,
                    verbose=1,
                    callbacks=callbacks)
train_inputs: training input data
train_teachers: training teacher data (one-hot labels)
batch_size: mini-batch size
epochs: number of training epochs
validation_data: (inputs, teachers) tuple used for validation
shuffle: shuffle the training data every epoch
verbose: logging mode (1 shows a progress bar)
callbacks: list of callbacks (described later)
+ "& + model.save_weights(save_file_name) $+ & ' )*%#(!
model.save(save_file_name) )* $+ & , &
+ # [012] Train
In this hands-on the network is trained with the Adam optimizer and the categorical_crossentropy loss.
Checking loss and accuracy
# [013] Check loss and acc
Plot the loss and acc values recorded during training (a plotting sketch follows below):
loss: the loss on the training data
val loss: the loss on the validation data
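A minimal sketch of plotting the curves from the history object returned by fit. The metric keys 'acc' and 'val_acc' are those used by standalone Keras of this era; the figure styling is arbitrary.

import matplotlib.pyplot as plt

# history is the return value of model.fit(...)
plt.plot(history.history['loss'], label='loss (train)')
plt.plot(history.history['val_loss'], label='val loss (validation)')
plt.xlabel('epoch')
plt.ylabel('loss')
plt.legend()
plt.show()

plt.plot(history.history['acc'], label='acc (train)')
plt.plot(history.history['val_acc'], label='val acc (validation)')
plt.xlabel('epoch')
plt.ylabel('accuracy')
plt.legend()
plt.show()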
Loading a saved model
A model saved with model.save can be loaded in one step:

from keras.models import load_model
model = load_model(model_file)

Weights saved with model.save_weights require the network to be rebuilt first:

model = Model(inputs=[input_layer], outputs=[output_layer])
model.load_weights(model_file)
Prediction (predict)
model.predict(predict_inputs)
predict_inputs: the input data to run prediction on (a numpy array)
# [014] Load trained model
# [015] Predict (Train data)
# [016] Predict (Valid data)
# [017] Predict (Evaluate data)
(A sketch of converting the softmax outputs into class labels follows below.)
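A minimal sketch of turning the softmax outputs of predict into class labels; the class-name ordering is an assumption that must match the one-hot encoding used for the teachers.

import numpy as np

predictions = model.predict(valid_inputs)        # shape: (num_samples, 2)
predicted_ids = np.argmax(predictions, axis=1)   # index of the largest softmax value

class_names = ['dog', 'cat']                     # assumed to match the one-hot order
for probs, class_id in zip(predictions, predicted_ids):
    print(class_names[class_id], probs)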
Results
The accuracy on the evaluation data was 0.68.
To improve it, add more Convolution layers and insert Dropout (rate 0.5) into the network.
Dropout layer
drop_layer = Dropout()(cv_layer)
Dropout(rate)
rate: fraction of the input units to drop during training
(A sketch of a network with Dropout added follows below.)
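A sketch of one possible way to add Dropout (rate 0.5) to the network from # [010]; placing it after each pooling layer is a design choice for illustration, not the deck's exact improved network. IMAGE_SHAPE is assumed to be the constant defined in cell # [005].

from keras.layers import (Input, Conv2D, Activation, MaxPooling2D,
                          Dropout, Flatten, Dense)
from keras.models import Model

input_layer = Input(IMAGE_SHAPE)
cv1_1 = Activation('relu')(Conv2D(32, 3, padding='same')(input_layer))
cv1_2 = Activation('relu')(Conv2D(32, 3, padding='same')(cv1_1))
cv1_max = MaxPooling2D()(cv1_2)
cv1_drop = Dropout(0.5)(cv1_max)     # drop half of the activations during training
cv2_1 = Activation('relu')(Conv2D(64, 3, padding='same')(cv1_drop))
cv2_2 = Activation('relu')(Conv2D(64, 3, padding='same')(cv2_1))
cv2_max = MaxPooling2D()(cv2_2)
cv2_drop = Dropout(0.5)(cv2_max)
flat_layer = Flatten()(cv2_drop)
output_layer = Activation('softmax')(Dense(2)(flat_layer))
model = Model(inputs=[input_layer], outputs=[output_layer])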
Callbacks
Keras callbacks hook extra processing into training, for example saving the model whenever the monitored loss improves.
Create the callback objects beforehand and pass them to fit through the callbacks argument.
# " ModelCheckpoint " import keras.callbacks as kc kc.ModelCheckpoint(
filepath=MODEL_FILE_NAME, save_weights_only=True , save_best_only=True, period=1) filepath ( ) save_weights_only save_best_only period ! val_loss!
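A minimal sketch of wiring the callback into training, reusing the fit call shown earlier; the output file name is hypothetical.

import keras.callbacks as kc

MODEL_FILE_NAME = 'best_weights.h5'   # hypothetical output file name

checkpoint = kc.ModelCheckpoint(filepath=MODEL_FILE_NAME,
                                save_weights_only=True,
                                save_best_only=True,
                                period=1)

history = model.fit(train_inputs, train_teachers,
                    batch_size=20, epochs=100,
                    validation_data=(valid_inputs, valid_teachers),
                    shuffle=True, verbose=1,
                    callbacks=[checkpoint])   # callbacks are passed to fit as a list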
AutoEncoder etc
…
Keras