Getting started with the spark core
For Hackware v0.3
claudiomettler
January 14, 2015
Transcript
getting started with the spark core
hardware
• 72MHz ARM Cortex M3
• TI CC3000 WiFi
• 7 analog IO (PWM out), 7 digital IO
• ~ S$ 55
• coming soon: Photon, 120MHz, Broadcom WiFi, half the price
setup, the easy and unreliable way
• install smartphone app
• power up spark
• use smartphone app to create cloud account and register the core
setup, the better way (USB)
npm install -g spark-cli
spark setup
ready?
• your spark core should have a name and be breathing cyan by now
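As a quick check from the command line, spark-cli can also list the cores on your account (assuming you are logged in; the exact output differs between spark-cli versions):
spark login
spark list   # shows each core with its name, ID and online/offline state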
the tinker firmware
• default firmware of the spark core
• allows access to all inputs/outputs
let’s ask the cloud
export SPARK_CORE_ID=53ff71065075535128311587
export SPARK_ACCESS_TOKEN=[get this from https://spark.io/build or "spark login"]
let’s ask the cloud
curl -H "Authorization: Bearer $SPARK_ACCESS_TOKEN" https://api.spark.io/v1/devices/$SPARK_CORE_ID/
Alternatively, include the token in the URL:
https://api.spark.io/v1/devices/$SPARK_CORE_ID/?access_token=$SPARK_ACCESS_TOKEN
output
{
  "id": "53ff71065075535128311587",
  "name": "schnitzel",
  "connected": true,
  "variables": {},
  "functions": [
    "digitalread",
    "digitalwrite",
    "analogread",
    "analogwrite"
  ],
  "cc3000_patch_version": "1.29"
}
using functions
curl https://api.spark.io/v1/devices/$SPARK_CORE_ID/digitalwrite?access_token=$SPARK_ACCESS_TOKEN -d params=D7,LOW
or include access_token in the POST data:
curl https://api.spark.io/v1/devices/$SPARK_CORE_ID/digitalwrite -d access_token=$SPARK_ACCESS_TOKEN -d params=D7,LOW
response
{
  "id": "53ff71065075535128311587",
  "name": "schnitzel",
  "last_app": null,
  "connected": true,
  "return_value": 1
}
the cloud API
• events, functions, variables
• http://docs.spark.io/api/
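Functions are covered on the previous slides; variables and events go through the same REST base. A minimal sketch against the endpoints documented there, reusing the environment variables from before; "temperature" is a placeholder name, since the tinker firmware does not expose any cloud variables:
# read a cloud variable (the name is a placeholder for whatever your firmware registers)
curl "https://api.spark.io/v1/devices/$SPARK_CORE_ID/temperature?access_token=$SPARK_ACCESS_TOKEN"
# subscribe to the server-sent event stream of this core
curl "https://api.spark.io/v1/devices/$SPARK_CORE_ID/events?access_token=$SPARK_ACCESS_TOKEN"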
the javascript API
• runs in the browser and in nodejs
• (npm|bower) install spark
• http://docs.spark.io/javascript/
voodoospark
• alternative firmware
• faster communication through a local TCP connection
• npm client module
• still uses the cloud for facilitating the local connection
DIY, noob mode
DIY, noob mode #2
DIY
spark compile myapp.ino
spark flash firmware_1421160254713.bin
DIY, hardcore mode
brew tap PX4/homebrew-px4
brew update
brew install gcc-arm-none-eabi-48
brew install dfu-util
git clone https://github.com/spark/core-firmware.git
git clone https://github.com/spark/core-common-lib.git
git clone https://github.com/spark/core-communication-lib.git
DIY, hardcore mode
DIY, hardcore mode
cd core-firmware/build
make
flash via USB
• press & hold the MODE button, tap and release the RST button until the LED flashes yellow
• spark core is now in DFU mode
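Before flashing, dfu-util (installed earlier via brew) can confirm the core really is in DFU mode; the exact output depends on your dfu-util version:
dfu-util -l   # the core should show up as a DFU device while the LED blinks yellow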
flash via USB
make program-dfu
flash via cloud
make program-cloud
create your own app
mkdir core-firmware/applications/myapp
vim core-firmware/applications/myapp/application.cpp
cd core-firmware/build
make APP=myapp
make APP=myapp program-dfu # or program-cloud
app structure
#include "application.h"

int example(String command);

void setup() {
    // set up cloud functions & variables, IO pin modes
    Spark.function("example", example);
    pinMode(D7, OUTPUT);
}

void loop() {
    // called continuously
}

int example(String command) {
    digitalWrite(D7, HIGH);
    delay(2000);
    digitalWrite(D7, LOW);
    return 1;
}
app structure
{
  "id": "53ff71065075535128311587",
  "name": "schnitzel",
  "connected": true,
  "variables": {},
  "functions": [
    "example"
  ],
  "cc3000_patch_version": "1.29"
}
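With that firmware flashed, the new function is called just like tinker's digitalwrite earlier; a hedged example reusing the environment variables from before (example() ignores the command string, so no params are passed):
curl https://api.spark.io/v1/devices/$SPARK_CORE_ID/example -d access_token=$SPARK_ACCESS_TOKEN
# the response should contain "return_value": 1 after the LED has blinked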
documentation
• http://docs.spark.io/firmware/
my current project
my current project
• websockets
• static site hosted on S3
• direct connection from browser to spark
• https://github.com/ponyfleisch/cloudlamp