Improving Data Gathering And Research
Luca Matteis
November 26, 2011
How to improve data gathering using web scraping methodologies.
Transcript
IMPROVING DATA GATHERING & RESEARCH
Luca Matteis
What is Research?
"In the broadest sense of the word, the definition of
research includes any gathering of data, information and facts for the advancement of knowledge."
"Research is a process of steps used to collect and
analyze information to increase our understanding of a topic or issue"
Data is essential for research
Where do we get data from? Einstein got his data from his own experiments and from other people’s experiments. Information exchange took weeks, if not months.
Today we have the internet! Information exchange takes milliseconds. It works much better than anything Einstein had.
BUT THERE ARE STILL ISSUES
DATA IS SCATTERED ALL OVER THE WEB
[Slide: a scatter of example URLs (science.com, newton.com, national.com, biology.com, newscientist.com, astronomynow.com, space.com) illustrating data spread across many different sites]
Information that can be extremely valuable lives somewhere online, and we don’t know about it because we can’t find it
EVEN WITH GOOGLE, IT’S STILL HARD TO FIND WHAT WE NEED
Scientific data searching is facilitated if there is a central repository or data bank
When our information is centralized by context, we can more easily find what we’re looking for
We already have websites that centralize this information and allow us to find data that Google couldn’t
BUT THERE’S ROOM FOR IMPROVEMENT
How is this data currently being centralized?
Each center sends us their data in the form of Excel or Access files, through FTP or email
THIS IS AN ENTIRELY MANUAL PROCESS
Is this sustainable? This process needs to be automated
What are the advantages of automating the data exchange process?
• no human interference
• fewer communication hassles
• fewer human errors
• more accurate data
• more data
How do we automate? Centers no longer have to send us anything. We get it directly from their website
There’s no secret. Google, hotel sites, flight search engines and many others do this. It is called web scraping
How does it work?
We automatically navigate to the centers’ websites and fetch the information that we need. This is done by little scripts called spiders or web crawlers
What? Spiders?
“A Web crawler (or spider) is a computer program that browses the World Wide Web in a methodical, automated manner or in an orderly fashion.”
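To make the idea concrete, here is a minimal sketch of such a spider in Python. It is an illustration only, not the actual scripts behind the prototype: the target URL, the use of the requests and BeautifulSoup libraries, and the assumption that the center publishes its records in an HTML table are all hypothetical.

# Minimal web-scraping sketch (assumes a hypothetical center page
# that lists its records in an HTML table).
import requests                    # HTTP client
from bs4 import BeautifulSoup      # HTML parser

CENTER_URL = "http://example-center.org/accessions"  # hypothetical URL

def scrape_records(url):
    """Fetch the page and return one dict per table row."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")

    table = soup.find("table")
    headers = [th.get_text(strip=True) for th in table.find_all("th")]
    records = []
    for row in table.find_all("tr")[1:]:        # skip the header row
        cells = [td.get_text(strip=True) for td in row.find_all("td")]
        if cells:
            records.append(dict(zip(headers, cells)))
    return records

if __name__ == "__main__":
    for record in scrape_records(CENTER_URL):
        print(record)

A scheduler can run a script like this periodically, so the central repository stays up to date without anyone having to email files.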
This process allows us to reach more centers and gather more data
The main requirement: each center must have a website that displays their information. Without a website we wouldn’t be able to automate this exchange
Working prototype: http://seeds.iriscouch.com/ (PASSPORT DATA, CHARACTERIZATION, OTHER...)
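Since iriscouch.com hosted CouchDB databases, the prototype can presumably be read through CouchDB’s standard HTTP API. The sketch below shows what such a query might look like; the database name "seeds" and the shape of the documents are assumptions, not the prototype’s actual layout.

# Hedged sketch: reading documents from a CouchDB-backed prototype.
# The database name ("seeds") is an assumption.
import requests

BASE_URL = "http://seeds.iriscouch.com"   # prototype host from the slide
DB_NAME = "seeds"                         # hypothetical database name

def fetch_documents(limit=10):
    """Return up to `limit` documents via CouchDB's _all_docs endpoint."""
    url = f"{BASE_URL}/{DB_NAME}/_all_docs"
    response = requests.get(url,
                            params={"include_docs": "true", "limit": limit},
                            timeout=30)
    response.raise_for_status()
    return [row["doc"] for row in response.json()["rows"]]

if __name__ == "__main__":
    for doc in fetch_documents():
        print(doc.get("_id"), doc)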
RECAP
Automation of the data exchange process is the only sustainable solution.
With new technologies, web scraping has become a very reliable system.
The process is modular and will allow us to plug in systems such as GRIN-Global.
THANK YOU