Confusion Matrix Explained
Samuel Bohman
October 24, 2017
This slide deck explains what a confusion matrix is and how to interpret it.
Transcript
Confusion Matrix Explained Samuel Bohman
What is a Confusion Matrix? A confusion matrix is a common way of describing the performance of a classification model: it breaks the model's predictions down into true positives, true negatives, false positives, and false negatives. It is called a confusion matrix because it shows where the model confuses one class with another.
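The slides that follow all refer to a single three-class fruit example (apples, oranges, pears). As a minimal sketch, the matrix shown on those slides can be written directly as a NumPy array, with rows as actual classes and columns as predicted classes to match the slide layout; in practice a helper such as scikit-learn's confusion_matrix builds the same rows-as-actual, columns-as-predicted layout from raw label vectors.

    import numpy as np

    # Confusion matrix from the slides: rows = actual class, columns = predicted class.
    classes = ["Apple", "Orange", "Pear"]
    cm = np.array([
        [50,  5, 50],   # actual Apple
        [10, 50, 20],   # actual Orange
        [ 5,  5,  0],   # actual Pear
    ])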
True Positives

                        Predicted class
                       Apple  Orange    Pear
Actual class  Apple       50       5      50
              Orange      10      50      20
              Pear         5       5       0

The model correctly classified 50 apples and 50 oranges.
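True positives sit on the diagonal of the matrix: cases where the predicted class equals the actual class. A quick sketch of reading them off (the matrix is re-declared here so the snippet runs on its own):

    import numpy as np

    classes = ["Apple", "Orange", "Pear"]
    cm = np.array([[50, 5, 50], [10, 50, 20], [5, 5, 0]])  # rows: actual, cols: predicted

    # The diagonal holds the correctly classified cases for each class.
    for name, tp in zip(classes, np.diag(cm)):
        print(f"True positives for {name}: {tp}")
    # True positives for Apple: 50
    # True positives for Orange: 50
    # True positives for Pear: 0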
True Negatives for Apple: The model correctly classified 75 cases as not belonging to class apple. These are all the cells outside both the Apple row and the Apple column (50 + 20 + 5 + 0 = 75).

True Negatives for Orange: The model correctly classified 105 cases as not belonging to class orange (50 + 50 + 5 + 0 = 105).

True Negatives for Pear: The model correctly classified 115 cases as not belonging to class pear (50 + 5 + 10 + 50 = 115).
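In other words, the true negatives for a class are every cell that lies outside both that class's row and its column. A small sketch of that calculation, reusing the same matrix:

    import numpy as np

    classes = ["Apple", "Orange", "Pear"]
    cm = np.array([[50, 5, 50], [10, 50, 20], [5, 5, 0]])  # rows: actual, cols: predicted

    total = cm.sum()
    for i, name in enumerate(classes):
        # Remove the class's row and column from the total; add the diagonal cell
        # back once because it was subtracted twice.
        tn = total - cm[i, :].sum() - cm[:, i].sum() + cm[i, i]
        print(f"True negatives for {name}: {tn}")
    # True negatives for Apple: 75
    # True negatives for Orange: 105
    # True negatives for Pear: 115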
False Positives for Apple: The model incorrectly classified 15 cases as apples. These are the off-diagonal cells of the Apple column: 10 oranges and 5 pears predicted as apple.

False Positives for Orange: The model incorrectly classified 10 cases as oranges (5 apples and 5 pears predicted as orange).

False Positives for Pear: The model incorrectly classified 70 cases as pears (50 apples and 20 oranges predicted as pear).
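So the false positives for a class are the off-diagonal cells of that class's column: everything predicted as the class that actually belongs elsewhere. A sketch with the same matrix:

    import numpy as np

    classes = ["Apple", "Orange", "Pear"]
    cm = np.array([[50, 5, 50], [10, 50, 20], [5, 5, 0]])  # rows: actual, cols: predicted

    for i, name in enumerate(classes):
        # Column i holds everything predicted as class i; subtract the correct ones.
        fp = cm[:, i].sum() - cm[i, i]
        print(f"False positives for {name}: {fp}")
    # False positives for Apple: 15
    # False positives for Orange: 10
    # False positives for Pear: 70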
False Negatives for Apple: The model incorrectly classified 55 cases as not belonging to class apple. These are the off-diagonal cells of the Apple row: 5 apples predicted as orange and 50 predicted as pear.

False Negatives for Orange: The model incorrectly classified 30 cases as not belonging to class orange (10 oranges predicted as apple and 20 predicted as pear).

False Negatives for Pear: The model incorrectly classified 10 cases as not belonging to class pear (5 pears predicted as apple and 5 predicted as orange).
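Symmetrically, the false negatives for a class are the off-diagonal cells of that class's row: actual members of the class that the model predicted as something else. A final sketch, again with the same matrix:

    import numpy as np

    classes = ["Apple", "Orange", "Pear"]
    cm = np.array([[50, 5, 50], [10, 50, 20], [5, 5, 0]])  # rows: actual, cols: predicted

    for i, name in enumerate(classes):
        # Row i holds everything that is actually class i; subtract the correct ones.
        fn = cm[i, :].sum() - cm[i, i]
        print(f"False negatives for {name}: {fn}")
    # False negatives for Apple: 55
    # False negatives for Orange: 30
    # False negatives for Pear: 10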