I Tried the Airflow Tutorial
2023-06-30
ENECHANGE Tech Talk (internal study session)
iwamot
Transcript
I Tried the Airflow Tutorial 2023-06-30 ENECHANGE Tech Talk (internal study session) CTO Office: Takashi Iwamoto (岩本隆史)
I got involved in an Airflow project
A good opportunity, so I wanted to try MWAA (Amazon Managed Workflows for Apache Airflow) https://aws.amazon.com/jp/managed-workflows-for-apache-airflow/
Let's work through the tutorial https://docs.aws.amazon.com/mwaa/latest/userguide/quick-start.html
It took a really long time… https://docs.aws.amazon.com/mwaa/latest/userguide/quick-start.html#quick-start-createstack
With Docker, setup finishes in a few minutes:
curl -LfO 'https://airflow.apache.org/docs/apache-airflow/2.6.2/docker-compose.yaml'
mkdir -p ./dags ./logs ./plugins ./config
echo -e "AIRFLOW_UID=$(id -u)" > .env
docker compose up airflow-init
docker compose up
https://airflow.apache.org/docs/apache-airflow/stable/howto/docker-compose/
Plenty of sample DAGs, too
Ran the tutorial DAG
Success
Three tasks
Task 1: print the date
t1 = BashOperator(
    task_id="print_date",
    bash_command="date",
)
[2023-06-22, 06:52:22 UTC] {subprocess.py:75} INFO - Running command: ['/bin/bash', '-c', 'date']
[2023-06-22, 06:52:22 UTC] {subprocess.py:86} INFO - Output:
[2023-06-22, 06:52:22 UTC] {subprocess.py:93} INFO - Thu Jun 22 06:52:22 UTC 2023
[2023-06-22, 06:52:22 UTC] {subprocess.py:97} INFO - Command exited with return code 0
Task 2: sleep
t2 = BashOperator(
    task_id="sleep",
    depends_on_past=False,
    bash_command="sleep 5",
    retries=3,
)
[2023-06-22, 06:52:25 UTC] {subprocess.py:75} INFO - Running command: ['/bin/bash', '-c', 'sleep 5']
[2023-06-22, 06:52:25 UTC] {subprocess.py:86} INFO - Output:
[2023-06-22, 06:52:30 UTC] {subprocess.py:97} INFO - Command exited with return code 0
Task 3: using templates
templated_command = dedent(
    """
{% for i in range(5) %}
    echo "{{ ds }}"
    echo "{{ macros.ds_add(ds, 7) }}"
{% endfor %}
"""
)
t3 = BashOperator(
    task_id="templated",
    depends_on_past=False,
    bash_command=templated_command,
)
Rendered into ten echo commands:
echo "2023-06-22"
echo "2023-06-29"
echo "2023-06-22"
echo "2023-06-29"
echo "2023-06-22"
echo "2023-06-29"
echo "2023-06-22"
echo "2023-06-29"
echo "2023-06-22"
echo "2023-06-29"
Ten dates in the output:
[2023-06-22, 06:52:25 UTC] {subprocess.py:86} INFO - Output:
[2023-06-22, 06:52:25 UTC] {subprocess.py:93} INFO - 2023-06-22
[2023-06-22, 06:52:25 UTC] {subprocess.py:93} INFO - 2023-06-29
[2023-06-22, 06:52:25 UTC] {subprocess.py:93} INFO - 2023-06-22
[2023-06-22, 06:52:25 UTC] {subprocess.py:93} INFO - 2023-06-29
[2023-06-22, 06:52:25 UTC] {subprocess.py:93} INFO - 2023-06-22
[2023-06-22, 06:52:25 UTC] {subprocess.py:93} INFO - 2023-06-29
[2023-06-22, 06:52:25 UTC] {subprocess.py:93} INFO - 2023-06-22
[2023-06-22, 06:52:25 UTC] {subprocess.py:93} INFO - 2023-06-29
[2023-06-22, 06:52:25 UTC] {subprocess.py:93} INFO - 2023-06-22
[2023-06-22, 06:52:25 UTC] {subprocess.py:93} INFO - 2023-06-29
[2023-06-22, 06:52:25 UTC] {subprocess.py:97} INFO - Command exited with return code 0
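As a side note (not from the slides), the rendering step can be reproduced outside Airflow with plain Jinja2; the ds_add helper below only mimics airflow.macros.ds_add and is an illustrative stand-in.

# Illustrative only: reproduce what Airflow renders from templated_command
# using plain Jinja2. ds_add here mimics airflow.macros.ds_add.
from datetime import date, timedelta
from textwrap import dedent
from types import SimpleNamespace

import jinja2

def ds_add(ds: str, days: int) -> str:
    # Shift a "YYYY-MM-DD" date string by the given number of days.
    return (date.fromisoformat(ds) + timedelta(days=days)).isoformat()

templated_command = dedent(
    """
{% for i in range(5) %}
    echo "{{ ds }}"
    echo "{{ macros.ds_add(ds, 7) }}"
{% endfor %}
"""
)

rendered = jinja2.Template(templated_command).render(
    ds="2023-06-22",
    macros=SimpleNamespace(ds_add=ds_add),
)
print(rendered)  # ten echo lines alternating 2023-06-22 and 2023-06-29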
Task dependencies are declared with the >> operator: t1 >> [t2, t3]
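For context, the three operators and the dependency line sit inside one DAG file. A minimal sketch modeled on the official tutorial DAG, assuming Airflow 2.x; the default_args, schedule, and start_date are illustrative, not taken from the slides.

# Minimal sketch of the surrounding DAG file (assumed Airflow 2.x layout;
# default_args, schedule, and start_date are illustrative).
from datetime import datetime, timedelta
from textwrap import dedent

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    "tutorial",
    default_args={"retries": 1},
    description="A simple tutorial DAG",
    schedule=timedelta(days=1),
    start_date=datetime(2023, 6, 1),
    catchup=False,
) as dag:
    t1 = BashOperator(task_id="print_date", bash_command="date")
    t2 = BashOperator(task_id="sleep", depends_on_past=False, bash_command="sleep 5", retries=3)

    templated_command = dedent(
        """
    {% for i in range(5) %}
        echo "{{ ds }}"
        echo "{{ macros.ds_add(ds, 7) }}"
    {% endfor %}
    """
    )
    t3 = BashOperator(task_id="templated", depends_on_past=False, bash_command=templated_command)

    # t1 must finish before t2 and t3 can start.
    t1 >> [t2, t3]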
Ran another tutorial as well
Extract
@task()
def extract():
    data_string = '{"1001": 301.27, "1002": 433.21, "1003": 502.22}'
    order_data_dict = json.loads(data_string)
    return order_data_dict
Key: return_value
Value: {'1001': 301.27, '1002': 433.21, '1003': 502.22}
Transform
@task(multiple_outputs=True)
def transform(order_data_dict: dict):
    total_order_value = 0
    for value in order_data_dict.values():
        total_order_value += value
    return {"total_order_value": total_order_value}
Key: total_order_value
Value: 1236.7
Key: return_value
Value: {'total_order_value': 1236.7}
Load
@task()
def load(total_order_value: float):
    print(f"Total order value is: {total_order_value:.2f}")
[2023-06-22, 07:55:00 UTC] {logging_mixin.py:149} INFO - Total order value is: 1236.70
Task dependencies are resolved automatically:
order_data = extract()
order_summary = transform(order_data)
load(order_summary["total_order_value"])
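For reference, these @task functions live inside a @dag-decorated function, following the TaskFlow tutorial layout; a minimal sketch assuming Airflow 2.x, with an illustrative dag_id, schedule, and start_date.

# Minimal sketch of the surrounding TaskFlow DAG (assumed Airflow 2.x;
# dag_id, schedule, and start_date are illustrative).
import json

import pendulum
from airflow.decorators import dag, task

@dag(schedule=None, start_date=pendulum.datetime(2023, 6, 1, tz="UTC"), catchup=False)
def tutorial_taskflow_api():
    @task()
    def extract():
        data_string = '{"1001": 301.27, "1002": 433.21, "1003": 502.22}'
        return json.loads(data_string)

    @task(multiple_outputs=True)
    def transform(order_data_dict: dict):
        return {"total_order_value": sum(order_data_dict.values())}

    @task()
    def load(total_order_value: float):
        print(f"Total order value is: {total_order_value:.2f}")

    # Calling the decorated functions wires up the dependencies automatically.
    order_data = extract()
    order_summary = transform(order_data)
    load(order_summary["total_order_value"])

tutorial_taskflow_api()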
This is actually a feature introduced in Airflow 2.0:
@task
def hello_name(name: str):
    print(f'Hello {name}!')
hello_name('Airflow users')
Give it a try with Docker, it's easy