stars.
Fundamental/Intrinsic Properties
1. T_eff
2. log g
3. [Fe/H], [α/Fe], C/O
Special Properties
1. dM/dt
2. B
3. A_V
4. f_spot
5. R_p/R_star(λ)
6. log gf, etc.
Extrinsic/Kinematic Properties
1. RV
2. v sin i
- Has a large library of observed template spectra
- Built with SciPy/AstroPy/Pandas
- Assumes all stars look like “normal” stars in your library
Yee et al. 2017
anything we’ve seen before.
→ Templates are scarce/non-existent.
→ Ground-truth labelling is difficult/impossible.
We have to model our spectra based on astrophysical theory.
noisy astronomical spectrum
2. A tunable theoretical model for how that spectrum could have been generated
…and outputs the-cloud-of-physical-properties-consistent-with-that-data
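The forward-modeling idea above can be sketched in a few lines (everything here is a hypothetical toy, not Starfish itself): a tunable model generates a spectrum, a Gaussian likelihood scores it against noisy data, and scanning the parameter traces out the cloud of properties consistent with the data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical toy "spectrum": a single absorption line whose depth and
# width stand in for physical parameters like Teff and vsini.
wavelength = np.linspace(-5, 5, 200)

def model_spectrum(depth, width):
    """Tunable theoretical model: continuum minus a Gaussian line."""
    return 1.0 - depth * np.exp(-0.5 * (wavelength / width) ** 2)

# Simulated noisy observation with known parameters.
sigma = 0.02
data = model_spectrum(0.5, 1.0) + rng.normal(0.0, sigma, wavelength.size)

def log_likelihood(depth, width):
    """Gaussian (chi-squared) likelihood of the data given the model."""
    resid = data - model_spectrum(depth, width)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Evaluate over a grid: the high-likelihood region is the cloud of
# physical properties consistent with the data.
depths = np.linspace(0.1, 0.9, 81)
best = max(depths, key=lambda d: log_likelihood(d, 1.0))
print(f"best-fit depth: {best:.2f}")  # near the true value 0.5
```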
to compute.
2. The models are imperfect.
3. The models have possibly many parameters.
4. Degeneracies among parameters give rise to similar spectra.
5. The data possess correlated noise (e.g. from Earth’s atmosphere).
6. The noise properties may not be perfectly known.
Solutions:
- Spectral emulation
- Gaussian Processes
- MCMC & Gibbs Sampling
- Local covariance kernels
- Noise scale inference
grids.
- The interpolation occurs on the weights of PCA eigenspectra computed from the grid volume.
- These weights tend to be smooth in the model parameters, giving a better reconstruction than linear interpolation of pixels.
Czekala et al. 2015
grids.
Emulation mitigates “piling up” at interpolated grid points (e.g. Cottaar et al. 2014).
[Figure: mean reconstructed model and per-pixel covariance matrix]
Czekala et al. 2015
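A minimal sketch of weight-space emulation, under simplifying assumptions: a 1-D toy grid, and plain linear interpolation of the PCA weights where Starfish actually uses Gaussian-process interpolation.

```python
import numpy as np

# Hypothetical 1-D model grid: spectra computed at a few temperatures.
# Real grids (and Starfish's emulator) are multi-dimensional; this is a sketch.
grid_teff = np.array([3000.0, 4000.0, 5000.0, 6000.0])
wavelength = np.linspace(0, 1, 300)

def synth(teff):
    """Stand-in for an expensive synthetic-spectrum computation."""
    return np.exp(-wavelength * teff / 2000.0) + 0.1 * np.sin(10 * wavelength * teff / 3000.0)

grid = np.array([synth(t) for t in grid_teff])           # shape (4, 300)

# PCA via SVD of the mean-subtracted grid.
mean_spectrum = grid.mean(axis=0)
U, S, Vt = np.linalg.svd(grid - mean_spectrum, full_matrices=False)
eigenspectra = Vt[:3]                                    # keep 3 components
weights = (grid - mean_spectrum) @ eigenspectra.T        # shape (4, 3)

def emulate(teff):
    """Interpolate the PCA *weights*, not the pixels, then reconstruct."""
    w = np.array([np.interp(teff, grid_teff, weights[:, k]) for k in range(3)])
    return mean_spectrum + w @ eigenspectra

approx = emulate(4500.0)
exact = synth(4500.0)
print("max reconstruction error:", np.abs(approx - exact).max())
```

Because the weights vary smoothly with temperature, interpolating them reconstructs off-grid spectra better than interpolating each pixel independently.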
slightly-off line strengths.
- A non-stationary kernel downweights routine outliers.
- Instrumental noise alone underestimates the residuals.
- The net effect of the Gaussian Process is to avoid overfitting noise spikes.
[Figure: Starfish covariance matrix vs. “chi-squared” diagonal matrix]
Czekala et al. 2015
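The covariance structure described above can be illustrated with toy kernels (amplitudes, lengthscales, and the line position are all made up; this is not Starfish's implementation):

```python
import numpy as np

# Sketch of a Starfish-style pixel covariance matrix: instrumental noise on
# the diagonal, a global stationary kernel for correlated residuals, and a
# local kernel that inflates the variance around a slightly-off spectral line.
wavelength = np.linspace(5000.0, 5010.0, 200)
sigma_inst = 0.01                       # per-pixel instrumental noise

dx = wavelength[:, None] - wavelength[None, :]

# Global stationary kernel: squared-exponential in wavelength separation.
a_global, l_global = 0.005, 0.5
K_global = a_global**2 * np.exp(-0.5 * (dx / l_global) ** 2)

# Local non-stationary kernel: extra covariance localized near a misfit line.
mu_line, a_local, l_local = 5004.0, 0.03, 0.2
taper = np.exp(-0.5 * ((wavelength - mu_line) / l_local) ** 2)
K_local = a_local**2 * np.outer(taper, taper)

C = np.diag(np.full(200, sigma_inst**2)) + K_global + K_local

# The diagonal is inflated near the misfit line, downweighting it in the fit.
i_line = np.argmin(np.abs(wavelength - mu_line))
print("variance at line / far away:", C[i_line, i_line] / C[0, 0])
```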
star
[Figure: sunspots seen on the Sun]
Giant starspots confound fundamental properties and are difficult to measure.
Somers et al. 2015; Roettenbacher et al. 2016
Plus:
- Temperature of the spot, T_spot
- Coverage fraction of spots, f_spot
14 total parameters fit with ensemble sampling with emcee, chunking the IGRINS spectrum into 42 segments matched to spectral order; 21 segments shown here.
→ github.com/BrownDwarf/welter
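A toy version of the two-temperature spot model, using blackbodies in place of the synthetic spectral grid (all parameter values hypothetical):

```python
import numpy as np

# Two-temperature starspot sketch: the composite spectrum is a
# flux-weighted mix of the ambient photosphere and cooler spot regions.
h, c, k = 6.626e-34, 2.998e8, 1.381e-23
wavelength = np.linspace(1.5e-6, 2.5e-6, 100)   # IGRINS-like H/K band (m)

def planck(T):
    """Blackbody surface intensity B_lambda(T)."""
    x = h * c / (wavelength * k * T)
    return 2 * h * c**2 / wavelength**5 / np.expm1(x)

def composite(T_amb, T_spot, f_spot):
    """Flux-weighted mixture of ambient and spotted surface."""
    return (1 - f_spot) * planck(T_amb) + f_spot * planck(T_spot)

# Hypothetical values: 30% coverage of 2800 K spots on a 4100 K photosphere
# dims the star, most strongly at shorter wavelengths.
spec = composite(4100.0, 2800.0, 0.3)
ratio = spec / planck(4100.0)
print("flux ratio at band edges:", ratio[0], ratio[-1])
```

The wavelength-dependent dimming is what lets a spectrum constrain T_spot and f_spot jointly, but it also degenerates with T_eff, which is why spots confound fundamental properties.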
are shrouded in dust and difficult to observe.
We used ~8 hours of Keck time on a single protostar to measure its spectrum.
Greene, Gully-Santiago, Barsony 2018
github.com/browndwarf/protostars
a ~1200 K disk possessing 4x the emitting area of the protostar.
We added 4 new parameters to Starfish:
1. Disk temperature
2. Disk emitting area
3. Extinction A_K
4. Extinction power law
Informs strategies for JWST.
github.com/browndwarf/protostars
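A blackbody sketch of the star + disk + extinction composite (illustrative parameter values; the real fit uses the synthetic grid, not blackbodies):

```python
import numpy as np

# Star + disk composite with a power-law extinction curve; simplified
# relative to the Starfish extension, with hypothetical parameter values.
h, c, k = 6.626e-34, 2.998e8, 1.381e-23
wavelength = np.linspace(1.0e-6, 5.0e-6, 200)   # near-infrared (m)

def planck(T):
    x = h * c / (wavelength * k * T)
    return 2 * h * c**2 / wavelength**5 / np.expm1(x)

T_star, T_disk = 3500.0, 1200.0
area_ratio = 4.0            # disk has 4x the emitting area of the protostar
A_K, beta = 1.5, 1.7        # K-band extinction (mag) and power-law slope

# Power-law extinction: A(lambda) = A_K * (lambda / 2.2 um)^-beta.
A_lambda = A_K * (wavelength / 2.2e-6) ** (-beta)
attenuation = 10 ** (-0.4 * A_lambda)

flux = attenuation * (planck(T_star) + area_ratio * planck(T_disk))

# The disk's contribution grows strongly toward longer wavelengths.
frac_disk = area_ratio * planck(T_disk) / (planck(T_star) + area_ratio * planck(T_disk))
print("disk fraction at 1 um vs 5 um:", frac_disk[0], frac_disk[-1])
```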
et al. in prep.
A sea of molecules blankets the spectra of brown dwarfs, making them difficult to interpret.
We’ve extended Starfish to brown dwarfs using the Sonora-Bobcat synthetic model grid (Marley et al. in prep).
Starfish enables retrieval-like analyses with physically self-consistent models.
cf. github.com/gully/jammer-Gl570D
to entry have led to high interest but low adoption
2. Tuning the blocked Gibbs sampler is subtle and slow
3. Training the spectral emulator is computationally demanding and slow
4. Physical extensions reside in undocumented forks
5. Not set up for auto-differentiation
6. No GPU acceleration
Addressed in v0.3!
Applying for NASA funding for support
is solving the N ~ 1000 Gaussian Process likelihood.
- We cannot use celerite* since the Starfish noise matrix is non-stationary.
- With modern GPUs we can get to N ~ 20,000 pixel spectra.
*Foreman-Mackey et al. 2017, github.com/dfm/celerite
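The bottleneck is the dense multivariate-normal likelihood, whose Cholesky factorization scales as O(N^3). A minimal sketch with a toy kernel:

```python
import time
import numpy as np

def gp_log_likelihood(residual, C):
    """Multivariate-normal log-likelihood via Cholesky factorization."""
    L = np.linalg.cholesky(C)                    # O(N^3) bottleneck
    alpha = np.linalg.solve(L, residual)         # triangular solve, O(N^2)
    log_det = 2.0 * np.sum(np.log(np.diag(L)))
    n = residual.size
    return -0.5 * (alpha @ alpha + log_det + n * np.log(2 * np.pi))

# Toy dense, non-stationary-friendly covariance: any full matrix works here,
# which is exactly why celerite's semi-separable trick does not apply.
N = 1000
x = np.linspace(0.0, 10.0, N)
dx = x[:, None] - x[None, :]
C = 0.1**2 * np.exp(-0.5 * (dx / 0.5) ** 2) + 0.01**2 * np.eye(N)
residual = np.zeros(N)

t0 = time.perf_counter()
ll = gp_log_likelihood(residual, C)
print(f"N={N} likelihood in {time.perf_counter() - t0:.3f} s")
```

Scaling this from N ~ 1,000 to N ~ 20,000 multiplies the factorization cost by roughly 20³ = 8000x, which is the gap GPU linear algebra is meant to close.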
(10+ parameters) is difficult.
- Hamiltonian Monte Carlo (e.g. NUTS) overcomes this challenge by using exact gradients.
- Autodiff dramatically simplifies writing physical extensions.
- Emcee begins to fail for tens of parameters.
[Figure: samples from a 250-dimensional correlated multivariate normal]
Hoffman & Gelman 2011, arxiv.org/abs/1111.4246
statmodeling.stat.columbia.edu/2017/03/15/ensemble-methods-doomed-fail-high-dimensions/
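A minimal hand-rolled HMC sketch on a correlated Gaussian, with an analytic gradient standing in for the autodiff gradient that JAX/NumPyro would provide (this is plain HMC, not NUTS):

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: a strongly correlated 2-D Gaussian, a miniature of the
# degeneracies that trip up ensemble samplers in high dimensions.
cov = np.array([[1.0, 0.9], [0.9, 1.0]])
prec = np.linalg.inv(cov)

def log_prob(q):
    return -0.5 * q @ prec @ q

def grad_log_prob(q):
    return -prec @ q           # exact gradient; autodiff would derive this

def hmc_step(q, step=0.15, n_leapfrog=20):
    p = rng.normal(size=q.size)                 # resample momentum
    q_new, p_new = q.copy(), p.copy()
    for _ in range(n_leapfrog):                 # leapfrog integration
        p_new += 0.5 * step * grad_log_prob(q_new)
        q_new += step * p_new
        p_new += 0.5 * step * grad_log_prob(q_new)
    # Metropolis accept/reject on the change in the Hamiltonian.
    dH = (log_prob(q_new) - 0.5 * p_new @ p_new) - (log_prob(q) - 0.5 * p @ p)
    return q_new if np.log(rng.uniform()) < dH else q

q = np.zeros(2)
samples = []
for _ in range(2000):
    q = hmc_step(q)
    samples.append(q)
samples = np.array(samples)
print("sample covariance:\n", np.round(np.cov(samples.T), 2))
```

The gradient-guided trajectories let each proposal move a long way through the correlated target, which is what keeps HMC efficient where random-walk and ensemble moves stall.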
2. Spectral emulation unlocks value from pre-computed synthetic grid models
3. Starfish has enabled new application domains (starspots, brown dwarf physical chemistry, protostars)
4. Future promise of JAX/NumPyro: autodiff & GPUs will allow us to ask new questions at the scientific frontier
sponsors
➔ Ian Czekala (UC Berkeley), Miles Lucas (U Hawaii), and Starfish contributors
➔ Greg Herczeg (KIAA-Beijing), Tom Greene & Mark Marley (NASA Ames), Caroline Morley (UT Austin) for funding Starfish development
➔ Austin Python Users Group, Beijing Python Meetup, SF Python Meetup