AI勉強会_Kerasハンズオン#1_分類 (AI Study Group: Keras Hands-on #1, Classification)
taashi
April 17, 2019
Technology
Material for an internal study group
Keras
Google Colaboratory
(Under revision...)
- To add: explanation of precision and recall
Transcript
AI Study Group: Keras Hands-on #1 (Classification)
Keras
Google Colaboratory (https://colab.research.google.com/) is a Jupyter notebook environment provided by Google.
Free to use (sessions up to about 12 hours), GPU and TPU runtimes available, integrates with Google Drive ... etc.
First, create a Python notebook in Google Colab.
(%"# ) : ! :
$ ! Notebook
$" !% GPU% #
GPU
%# & ’shift + enter’ "$ !
$’!’ " Linux !# Linux
ColabDrive&+ Google Drive! drive" Colb)' drive"%* from
google.colab import drive drive.mount('/content/drive') "( google.colabdrive ! Drive.mount$# Google Drive!
' !# URL (& ) Google
Drive% "$ (1)
$ # %! & ' " (’enter’ Google
Drive (2)
E@$#%"! <; *(7- ?.=384 :E@ 96 <;7 <;&'!Slack>
1, PLAYGROUND / 0C D5)+A - $# -2B
,89 Team Drives*(@2)/$#!7 "' 10Team Drives$#!5< (3:-
…) # [001] Install use items &% !.+ Team Drives $#!"' .+46 =; >? # [002] Mount GoogleDrive # [003] Copy data from google drive # [004] Check directory structure
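As a rough sketch of what cells [002]-[003] amount to (the Drive path below is purely hypothetical; the real Team Drives folder is given in the hands-on materials):

import shutil
# Copy the shared dataset from the mounted Drive into the local runtime.
shutil.copytree('/content/drive/HYPOTHETICAL_TEAM_DRIVE_PATH/data', '/content/data')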
Keras")* $&! '&
% $ )*#( )* $
:'<$ /1 !% '<9 '<7 *)0
(C F ) '<!% '<93" !5 '<9> (,?) '< 8) CF'<9A'< D+('24)(6. 8) FBC;&C'<-= (@+ #E)
Keras is a high-level neural network library written in Python that runs on top of TensorFlow, CNTK, or Theano.
It runs on both CPU and GPU ... etc. Networks are built by connecting layers.
Preparing the data:
# [005] Define constant value
# [006] Make data file list
# [007] Check dog data
# [008] Check cat data
File naming: '_t_XXX.png' = Train, '_v_XXX.png' = Validation, '_e_XXX.png' = Evaluate.
Teacher data: labels are given as one-hot vectors, e.g. one class as [1, 0] and the other as [0, 1]; only the element for the correct class is 1. The teacher data is held as a numpy array.
Keras can convert class IDs into one-hot vectors with keras.utils.to_categorical:

import numpy as np
import keras.utils as ku
class_ids = np.array([1, 0, 1, 1, 2, 0])
one_hots = ku.to_categorical(class_ids)
print(one_hots)
> [[0, 1, 0], [1, 0, 0], [0, 1, 0], [0, 1, 0], [0, 0, 1], [1, 0, 0]]

# [009] Make input and teacher data
The input images are read with cv2.imread into Numpy arrays.
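A minimal sketch of what [009] might look like (not the deck's actual code; the grayscale 64x64 assumption follows the Input shape used later, and make_dataset is a hypothetical helper):

import cv2
import numpy as np
import keras.utils as ku

def make_dataset(file_paths, class_ids):
    # Read each image as grayscale, resize to 64x64, stack into (n, 64, 64, 1) floats in [0, 1].
    images = [cv2.resize(cv2.imread(p, cv2.IMREAD_GRAYSCALE), (64, 64)) for p in file_paths]
    inputs = np.array(images, dtype='float32')[..., np.newaxis] / 255.0
    # Convert integer class IDs (0 or 1 for the two classes) into one-hot teacher vectors.
    teachers = ku.to_categorical(np.array(class_ids), num_classes=2)
    return inputs, teachers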
Building a model: Keras offers two ways to define a model, 1) the Sequential model and 2) the functional API. This hands-on uses 2), the functional API.
#&) % functional API$ % input_layer = Input((64,
64, 1)) cv_1_layer = Conv2D(32, 3)(input_layer) cv_2_layer = Conv2D(64, 3)(cv_1_layer) max_layer = MaxPooling2D()(cv_2_layer) flat_layer = Flatten()(max_layer) output_layer = Dense(2)(flat_layer) *'(" Keras !
Input&! input_layer = Input((64, 64, 1))
( Input(input_shape) #% input_shape ' $ "
Convolution layer (Conv2D):
cv_1_layer = Conv2D(32, 3)(input_layer)
Signature: Conv2D(filters, kernel_size, strides=(1, 1), padding='valid', activation=None)
- filters: number of filters (number of output channels)
- kernel_size: size of the convolution kernel
- strides: stride of the convolution
- padding: 'valid' (no padding) or 'same' (pad so the output keeps the input size)
- activation: activation function
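To see what padding does in practice, a minimal check (assuming the standalone keras package used throughout this deck):

from keras.layers import Input, Conv2D
from keras.models import Model

inp = Input((64, 64, 1))
valid_out = Conv2D(32, 3, padding='valid')(inp)  # no padding: 64 -> 62
same_out = Conv2D(32, 3, padding='same')(inp)    # zero-padded: stays 64
print(Model(inp, valid_out).output_shape)  # (None, 62, 62, 32)
print(Model(inp, same_out).output_shape)   # (None, 64, 64, 32)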
MaxPooling layer (MaxPooling2D):
max_layer = MaxPooling2D()(cv_layer)
Signature: MaxPooling2D(pool_size=(2, 2), strides=None)
- pool_size: size of the pooling window
- strides: stride of the pooling window (defaults to pool_size when None)
Flatten layer:
flat_layer = Flatten()(max_layer)
Signature: Flatten()
Takes no arguments; it flattens the input into a one-dimensional vector.
Dense (fully connected) layer:
output_layer = Dense(2)(flat_layer)
Signature: Dense(units, activation=None)
- units: number of output units
- activation: activation function
Activation layer:
act_layer = Activation(activation='relu')(cv_layer)
Signature: Activation(activation)
- activation: name of the activation function to apply
The network for this hands-on connects the layers above, starting from an Input layer (which takes input_shape) through to a softmax output.
# [010] Define Network
input_layer = Input(IMAGE_SHAPE)
cv1_1 = Conv2D(32, 3, padding='same', input_shape=IMAGE_SHAPE)(input_layer)
cv1_1 = Activation(activation='relu')(cv1_1)
cv1_2 = Conv2D(32, 3, padding='same')(cv1_1)
cv1_2 = Activation(activation='relu')(cv1_2)
cv1_max = MaxPooling2D()(cv1_2)
cv2_1 = Conv2D(64, 3, padding='same')(cv1_max)
cv2_1 = Activation(activation='relu')(cv2_1)
cv2_2 = Conv2D(64, 3, padding='same')(cv2_1)
cv2_2 = Activation(activation='relu')(cv2_2)
cv2_max = MaxPooling2D()(cv2_2)
flat_layer = Flatten()(cv2_max)
fc = Dense(2)(flat_layer)
output_layer = Activation(activation='softmax')(fc)
" Model# Keras model = Model(inputs=[input_layer],
outputs=[output_layer]) Model ! inputs ( ) outputs ( )( ) % $
Checking the model: model.summary() prints the model structure.
from keras.utils import plot_model
plot_model(model, to_file=MODEL_PNG_NAME)
plot_model writes a diagram of the model to the file given by to_file (rendered with dot).
Compiling the model:
model.compile(optimizer=Adam(), loss='categorical_crossentropy', metrics=['accuracy'])
- optimizer: optimization algorithm
- loss: loss function (here categorical crossentropy)
- metrics: list of metrics to track, e.g. ['accuracy']
Training: call fit.
history = model.fit(train_inputs, train_teachers, batch_size=20, epochs=100, validation_data=(valid_inputs, valid_teachers), shuffle=True, verbose=1, callbacks=callbacks)
- train_inputs: training input data
- train_teachers: training teacher (label) data
- batch_size: mini-batch size
- epochs: number of epochs
- validation_data: (inputs, teachers) used for validation
- shuffle: whether to shuffle the training data each epoch
- verbose: log verbosity (1 shows a progress bar)
- callbacks: list of callbacks (described later)
+ "& + model.save_weights(save_file_name) $+ & ' )*%#(!
model.save(save_file_name) )* $+ & , &
+ # [012] Train
In this hands-on we train with the Adam optimizer and the categorical_crossentropy loss.
# [013] Check loss and acc
Plot loss and acc against the epochs: loss is the training loss and val_loss is the validation loss (likewise acc and val_acc for accuracy).
0, 0,&"#$ +( "#$ from keras.models import load_model model
= load_model(model_file) !%-. '/ $ model = Model(inputs=[input_layer], outputs=[output_layer]) model.load_weights(model_file) )'/ $ "#-.* $
Prediction:
model.predict(predict_inputs)
- predict_inputs: input data to run prediction on
# [014] Load trained model
# [015] Predict (Train data)
# [016] Predict (Valid data)
# [017] Predict (Evaluate data)
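A small sketch (not from the slides) of turning the softmax output of predict into class predictions; predict_inputs and predict_teachers are assumed to be arrays prepared as in [009]:

import numpy as np

probs = model.predict(predict_inputs)          # shape (n_samples, 2): softmax scores
pred_classes = np.argmax(probs, axis=1)        # predicted class index per sample
true_classes = np.argmax(predict_teachers, axis=1)
print(np.mean(pred_classes == true_classes))   # simple accuracy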
Checking the results: accuracy reaches only around 0.6 to 0.68, so there is room for improvement. One idea is to add Dropout (rate 0.5) after the Convolution blocks.
Dropout layer:
drop_layer = Dropout()(cv_layer)
Signature: Dropout(rate)
- rate: fraction of the input units to drop during training
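A minimal functional-API sketch of where Dropout could go (the 0.5 rate follows the suggestion above; the toy network here is illustrative, not the [010] network):

from keras.layers import Input, Conv2D, MaxPooling2D, Dropout, Flatten, Dense
from keras.models import Model

inp = Input((64, 64, 1))
x = Conv2D(32, 3, padding='same', activation='relu')(inp)
x = MaxPooling2D()(x)
x = Dropout(0.5)(x)                      # randomly zero 50% of activations during training
x = Flatten()(x)
out = Dense(2, activation='softmax')(x)
model = Model(inputs=[inp], outputs=[out])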
Callbacks: Keras Callbacks run extra processing during training, for example saving the model whenever the loss improves. They are passed to fit through the callbacks argument.
ModelCheckpoint saves the model during training:
import keras.callbacks as kc
kc.ModelCheckpoint(filepath=MODEL_FILE_NAME, save_weights_only=True, save_best_only=True, period=1)
- filepath: path the model (weights) is saved to
- save_weights_only: save only the weights
- save_best_only: save only when the monitored value (val_loss) improves
- period: interval in epochs between checks
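Putting it together, a sketch of passing the checkpoint to fit (variable names reuse the cells above):

import keras.callbacks as kc

checkpoint = kc.ModelCheckpoint(
    filepath=MODEL_FILE_NAME,
    save_weights_only=True,   # store only the weights
    save_best_only=True,      # keep only the epoch with the best val_loss
    period=1)                 # check every epoch

history = model.fit(train_inputs, train_teachers,
                    batch_size=20, epochs=100,
                    validation_data=(valid_inputs, valid_teachers),
                    shuffle=True, verbose=1,
                    callbacks=[checkpoint])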
AutoEncoder etc
…
Keras