Open Software for Astrophysics, AAS241
Dan Foreman-Mackey
January 12, 2023
Science
Slides for my plenary talk at the 241st American Astronomical Society meeting.
Transcript
OPEN SOFTWARE FOR ASTROPHYSICS Dan Foreman-Mackey
case study: Gaussian Processes
AAS 225 / 2015 / Seattle
AAS 231 / 2018 / National Harbor
[Figure: raw and de-trended light curves (ppt) vs. time (days), N = 1000; reference: DFM+ (2017)]
[Figures: fits ignoring correlated noise vs. accounting for correlated noise; reference: Aigrain & DFM (2022)]
a Gaussian Process is a drop-in replacement for chi-squared
more details: Aigrain & Foreman-Mackey (2023) arXiv:2209.08940
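For context beyond the slides: with residual vector r and covariance matrix K built from the kernel, the Gaussian Process log-likelihood is

\ln\mathcal{L} = -\frac{1}{2}\left( r^\top K^{-1} r + \ln\det K + N\ln 2\pi \right)

which reduces to the familiar -\chi^2/2 (up to a constant) when K is diagonal with the measurement variances.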
[1] model building
[2] computational cost
k(t_n, t_m; θ): the "kernel" or "covariance" function
import george
import celerite
import tinygp
my first try: george (1)
import numpy as np

def log_likelihood(params, x, diag, r):
    # Build the dense N x N covariance matrix from the kernel (stand-in helper)
    K = build_kernel_matrix(params, x, diag)
    # "Goodness-of-fit" term: r^T K^{-1} r
    gof = r.T @ np.linalg.solve(K, r)
    # Log-determinant term
    norm = np.linalg.slogdet(K)[1]
    return -0.5 * (gof + norm)
from george.kernels import ExpSquaredKernel, Matern32Kernel

k1 = 1.5 * ExpSquaredKernel(2.3)
k2 = 5.5 * Matern32Kernel(0.1)
kernel = 0.5 * (k1 + k2)
from george import GP

gp = GP(kernel)
gp.compute(x, yerr)
gp.log_likelihood(y)
gp.fit(y) ???
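Not from the slides: a minimal sketch of what a hypothetical gp.fit(y) could wrap, assuming george's parameter-vector interface (get_parameter_vector / set_parameter_vector) plus scipy:

from scipy.optimize import minimize

def fit_gp(gp, y):
    # Hypothetical helper: maximize the GP log-likelihood over kernel parameters
    def neg_log_like(p):
        gp.set_parameter_vector(p)
        return -gp.log_likelihood(y)

    result = minimize(neg_log_like, gp.get_parameter_vector(), method="L-BFGS-B")
    gp.set_parameter_vector(result.x)
    return result

Whether a library should own this optimization loop, or leave it to the user, is exactly the API-design question raised here.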
the astronomical Python ecosystem + MANY MORE!
* API design (library vs scripts)
* don't reinvent the wheel
faster: celerite* (2)
(* yes, that truly is how you pronounce it…)
import numpy as np

def log_likelihood(params, x, diag, r):
    K = build_kernel_matrix(params, x, diag)
    gof = r.T @ np.linalg.solve(K, r)        # dense solve: O(N^3) as N grows
    norm = np.linalg.slogdet(K)[1]
    return -0.5 * (gof + norm)
"semi/quasi-separable" matrices
[Figure: computational cost (seconds) vs. number of data points N, comparing the direct solver with the O(N) celerite solver; reference: DFM, Agol, Ambikasaran, Angus (2017)]
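A minimal usage sketch (not shown on the slides) of the celerite interface, with parameter names as in the original celerite package; the newer celerite2 API differs slightly:

import numpy as np
import celerite
from celerite import terms

# Toy data as placeholders for a real light curve
x = np.sort(np.random.uniform(0, 10, 100))
yerr = 0.1 * np.ones_like(x)
y = np.sin(x) + yerr * np.random.randn(len(x))

# A stochastically driven, damped harmonic oscillator term
kernel = terms.SHOTerm(log_S0=0.0, log_Q=0.0, log_omega0=0.0)

gp = celerite.GP(kernel)
gp.compute(x, yerr)          # O(N) factorization instead of the dense O(N^3) solve
print(gp.log_likelihood(y))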
* interdisciplinary collaboration
* importance of implementation
restrictions:
[1] 1(-ish) dimensional input
[2] a specific type of kernel
modern infrastructure: tinygp (3)
what’s missing from the astronomical Python ecosystem?
[1] differentiable programming
[2] hardware acceleration
the broader numerical computing Python ecosystem + SO MANY MORE!
jax.readthedocs.io
import numpy as np

def linear_least_squares(x, y):
    A = np.vander(x, 2)
    return np.linalg.lstsq(A, y)[0]
import jax.numpy as jnp

def linear_least_squares(x, y):
    A = jnp.vander(x, 2)
    return jnp.linalg.lstsq(A, y)[0]
import jax
import jax.numpy as jnp

@jax.jit
def linear_least_squares(x, y):
    A = jnp.vander(x, 2)
    return jnp.linalg.lstsq(A, y)[0]
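Not on the slides: the payoff of the JAX version is that the same code gives exact gradients and compiles for CPU/GPU/TPU. A rough sketch with a hypothetical scalar loss:

import jax
import jax.numpy as jnp

def loss(params, x, y):
    # Hypothetical objective: mean squared residual of a straight-line fit
    m, b = params
    return jnp.mean((y - (m * x + b)) ** 2)

grad_loss = jax.jit(jax.grad(loss))   # autodiff gradient, compiled by XLA
# grad_loss(jnp.array([1.0, 0.0]), x, y) returns d(loss)/d(m, b)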
tinygp.readthedocs.io
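A usage sketch for tinygp (not on the slides; method and argument names from memory, so check the docs): it mirrors the george example above but is built on JAX, so the whole likelihood is differentiable and jit-able.

import numpy as np
from tinygp import kernels, GaussianProcess

# Toy data as placeholders for a real light curve
x = np.sort(np.random.uniform(0, 10, 100))
yerr = 0.1 * np.ones_like(x)
y = np.sin(x) + yerr * np.random.randn(len(x))

k1 = 1.5 * kernels.ExpSquared(scale=2.3)
k2 = 5.5 * kernels.Matern32(scale=0.1)
gp = GaussianProcess(0.5 * (k1 + k2), x, diag=yerr**2)
print(gp.log_probability(y))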
* I <3 JAX
* don't reinvent the wheel
the why & how of open software in astrophysics
credit: Adrian Price-Whelan // data: SAO/NASA ADS
takeaways
open software is foundational to astrophysics research
let's consider & discuss interface design and user interaction
leverage existing infrastructure & learn when to start fresh
get in touch! dfm.io github.com/dfm