
Julia Lascar

(CEA Paris Saclay)

Title — Hyperspectral data fusion and source separation for X-ray astrophysics

Abstract — In astrophysics, X-ray telescopes can collect cubes of data called hyperspectral images. These data cubes have two spatial dimensions and one spectral dimension. The subject of my thesis is to implement algorithms to analyze X-ray hyperspectral images of supernova remnants. In particular, I have implemented a non-stationary unmixing algorithm (unmixing where each endmember varies spectrally) and a fusion algorithm to obtain the best resolutions from two generations of telescopes. The aim of my thesis is to combine the two, and thus obtain an algorithm that unmixes and fuses hyperspectral data simultaneously.

Article: https://arxiv.org/abs/2404.03490

Bio
Julia Lascar is a 2nd year PhD student working at the CEA with Jérôme Bobin and Fabio Acero. Her research focuses on implementing signal processing algorithms applied to hyperspectral X-ray astrophysics, with a particular interest in supernova remnants. She has a Master’s degree in astrophysics from McGill University, Montreal, and an M2 in statistics from ENSAI, Rennes. Before that, she studied Physics and Philosophy at King’s College London.

S³ Seminar

June 07, 2024

Transcript

  1. S3 SEMINAR. JULIA LASCAR, JÉRÔME BOBIN, FABIO ACERO. HYPERSPECTRAL DATA FUSION AND SOURCE SEPARATION FOR X-RAY ASTROPHYSICS. Julia Lascar | [email protected] | https://github.com/JMLascar/
  2. OUTLINE • Context: X-ray Astrophysics • Source Separation with Spectral Variabilities • Hyperspectral Fusion • Future work and perspectives
  3. CONTEXT • Supernova remnants: stars that exploded hundreds of years ago • Type Ia (thermonuclear explosion of a white dwarf in a binary system), e.g. Tycho • Type II (core collapse), e.g. Cassiopeia A
  4. CONTEXT • Hyperspectral images: 2 spatial dimensions + 1 energy dimension • For each pixel, a spectrum • Entangled physical components (see the sketch below)
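A minimal sketch of what such a data cube looks like in code, assuming a NumPy array layout of (x, y, energy); the shapes and the Poisson draw are illustrative, not the actual instrument format.

    import numpy as np

    # Illustrative hyperspectral cube: 2 spatial axes + 1 energy axis (assumed layout).
    n_x, n_y, n_E = 64, 64, 200
    cube = np.random.poisson(lam=2.0, size=(n_x, n_y, n_E)).astype(float)

    spectrum_one_pixel = cube[10, 20, :]   # for each pixel, a full spectrum
    broadband_image = cube.sum(axis=2)     # image integrated over the energy axis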
  5. CHALLENGES IN X-RAY ASTROPHYSICS • Poisson noise • Low signal-to-noise ratio, noise variabilities • High spectral variability • Non-analytical physical model --> Need tools that account for these specific challenges
  6. SUSHI: SEMI-BLIND UNMIXING WITH SPARSITY FOR HYPERSPECTRAL IMAGES. SOURCE SEPARATION WITH SPECTRAL VARIABILITIES
  7. UNMIXING: A NON-STATIONARY MODEL. Stationary model: the data cube is a sum of fixed spectra weighted by amplitude images, X = Spectrum_1 x Image_1 + Spectrum_2 x Image_2 + ... A more realistic model includes spatial variability: each component is a full cube, X = Cube_1 + Cube_2 + ... (see the sketch below)
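A minimal NumPy sketch contrasting the two mixing models; the shapes and variable names are illustrative, not the paper's notation.

    import numpy as np

    n_x, n_y, n_E, n_comp = 64, 64, 200, 2

    # Stationary model: one fixed spectrum and one amplitude image per component.
    spectra = np.random.rand(n_comp, n_E)
    images = np.random.rand(n_comp, n_x, n_y)
    X_stationary = np.einsum('ke,kxy->xye', spectra, images)

    # Non-stationary model: each component is a full cube whose spectrum
    # may change from pixel to pixel.
    component_cubes = np.random.rand(n_comp, n_x, n_y, n_E)
    X_nonstationary = component_cubes.sum(axis=0)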
  8. SUSHI: AN OVERVIEW • Plug in a learnt model for spectral variation • Apply a spatial regularisation on the parameter maps of the learnt model • Obtain, for each component, a cube that varies spatially and spectrally. [Figure: SUSHI output = Component 1 + Component 2 + ...]
  9. SUSHI: OVERVIEW. The amplitude maps A and latent parameters θ are estimated as A, θ = argmin_{A, θ} [ data fidelity(data | model(A, θ)) + spatial regularization(θ) ], where the model combines, for each component, an amplitude map A with a learnt spectral model driven by latent parameters θ (see the sketch below)
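A hedged sketch of an objective of this form: a Poisson data-fidelity term plus a sparsity term on the latent parameter maps. decoder(), wavelet() and all names here are placeholders, not the actual SUSHI API.

    import numpy as np

    def neg_poisson_loglike(data, model, eps=1e-9):
        # Negative Poisson log-likelihood, up to a data-only constant.
        return np.sum(model - data * np.log(model + eps))

    def sushi_like_objective(data, amplitudes, latents, decoder, wavelet, lam):
        # Model cube: sum over components of (amplitude map) x (decoded spectral cube).
        model = sum(A[..., None] * decoder(theta) for A, theta in zip(amplitudes, latents))
        data_fidelity = neg_poisson_loglike(data, model)
        spatial_reg = lam * sum(np.abs(wavelet(theta)).sum() for theta in latents)
        return data_fidelity + spatial_reg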
  10. LEARNT MODEL: INTERPOLATORY AUTO-ENCODER (IAE) • Learn to interpolate between anchor points (Bobin, J., Gertosio, R., Thiam, C., Bobin, C., 2021)
  11. LEARNT MODEL: INTERPOLATORY AUTO-ENCODER (IAE). [Figure: a spectrum is encoded from spectral space into latent space, then decoded back to spectral space.] (see the sketch below)
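A minimal sketch of the interpolation idea, assuming trained encode()/decode() networks and a small set of anchor spectra; the barycentric weighting shown here is a simplification of the IAE of Bobin et al. 2021, not its exact architecture.

    import numpy as np

    def iae_generate(weights, anchor_spectra, encode, decode):
        # Encode the anchor spectra, interpolate between their latent codes,
        # then decode the barycenter back to spectral space.
        anchor_codes = np.stack([encode(s) for s in anchor_spectra])  # (n_anchor, latent_dim)
        latent = weights @ anchor_codes                               # interpolation in latent space
        return decode(latent)                                         # generated spectrum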
  12. THE DECODER AS A GENERATIVE MODEL • D is differentiable • Not costly to call upon • Latent parameters vary smoothly with physical parameters --> spatial regularization makes sense
  13. SPATIAL REGULARIZATION OF LATENT PARAMETERS • Undecimated isotropic wavelet transform: starlet transform (useful in astrophysics) • Minimize the L1 norm of the starlet coefficients of the latent parameter maps • Optimized with Proximal Alternating Linearized Minimization (PALM, Bolte et al. 2014) • Objective: A, θ = argmin_{A, θ} [ likelihood + spatial regularization ] (see the sketch of the sparsity step below)
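A sketch of the sparsity step, assuming a starlet (undecimated isotropic wavelet) transform is available; starlet_transform()/starlet_inverse() are placeholders, and leaving the coarse scale unthresholded is a common convention rather than a detail taken from the slides.

    import numpy as np

    def soft_threshold(coeffs, thresh):
        return np.sign(coeffs) * np.maximum(np.abs(coeffs) - thresh, 0.0)

    def prox_starlet_l1(param_map, thresh, starlet_transform, starlet_inverse):
        # L1 proximal operator applied in the starlet domain of a latent parameter map.
        scales = starlet_transform(param_map)                  # list of detail scales + coarse scale
        detail = [soft_threshold(c, thresh) for c in scales[:-1]]
        return starlet_inverse(detail + [scales[-1]])          # coarse scale kept as is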
  14. SIMULATED TOY MODEL • Built from real images + numerical simulations of Cas A • Thermal component: varying redshift, temperature • Synchrotron component: constant photon index [Figure panels: thermal amplitude, synchrotron amplitude, photon index, velocity redshift (z), temperature (keV)]
  15. EXAMPLE OF SPECTRA FROM INDIVIDUAL PIXELS • Varying levels of noise / count statistics [Figure: flux vs. energy (keV)]
  17. SUSHI SUMMARY • SUSHI is a method to unmix hyperspectral images with spectral variations based on a physical model (one for each endmember). • As a surrogate model, SUSHI uses the decoder of an Interpolatory Auto-Encoder. • It applies a spatial sparsity regularization on the surrogate model’s parameter maps. • General framework applicable to hyperspectral images with spectral variabilities where a spectral model can be learnt. https://github.com/JMLascar/SUSHI
  18. FUSION • XMM-Newton (1999), Chandra (1999): √ Spatial, X Spectral • XRISM (2023): X Spatial, √ Spectral • Athena X-IFU (2037): √ Spatial, √ Spectral
  19. FUSION FORWARD MODEL. The source cube Z is convolved with a spatial 2D convolution kernel and a spectral 1D convolution kernel, passed through rebinning operators R_1 (XMM) and R_2 (XRISM), multiplied by the effective area, and corrupted by Poisson noise, yielding the two observed data cubes (see the sketch below)
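A hedged sketch of a forward operator of this shape; fftconvolve stands in for the instrument responses, and rebin(), the kernels and the effective area are illustrative placeholders rather than the real calibration products.

    import numpy as np
    from scipy.signal import fftconvolve

    def forward_model(Z, spatial_psf, spectral_kernel, effective_area, rebin):
        blurred = fftconvolve(Z, spatial_psf[:, :, None], mode='same')               # spatial 2D convolution
        blurred = fftconvolve(blurred, spectral_kernel[None, None, :], mode='same')  # spectral 1D convolution
        expected = effective_area * rebin(blurred)                                   # rebinning and effective area
        return np.random.poisson(np.clip(expected, 0, None))                         # Poisson noise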
  20. REGULARIZATION • Three methods: • L1 norm of 2D-1D wavelet coefficients • Low-rank approximation (using PCA) with Sobolev regularization (inspired by Guilloteau et al. 2020 on JWST) • Low-rank approximation with 2D wavelet regularization (low-rank sketch below)
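A minimal sketch of the low-rank step, assuming a plain PCA (via SVD) over the spectra; the rank and array layout are illustrative.

    import numpy as np

    def low_rank_project(cube, rank):
        # Project each pixel's spectrum onto the first `rank` PCA spectral components.
        n_x, n_y, n_E = cube.shape
        X = cube.reshape(-1, n_E)                       # (pixels, energy channels)
        mean = X.mean(axis=0)
        _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
        basis = Vt[:rank]                               # (rank, n_E) spectral subspace
        coeffs = (X - mean) @ basis.T                   # low-dimensional coefficient maps
        approx = (coeffs @ basis + mean).reshape(n_x, n_y, n_E)
        return coeffs.reshape(n_x, n_y, rank), basis, approx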
  21. ALGORITHM • Poisson negative log-likelihood • Most of the cost function is differentiable (except at 0) • Non-differentiable constraints, but convex • Proximal gradient descent / ISTA (see the sketch below) • The FFT of the kernels is computed at the start, then saved.
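A generic proximal gradient (ISTA-style) loop matching the ingredients listed above; grad_f and prox_g are placeholders for the Poisson data-fidelity gradient and the proximal operator of the chosen regularizer, and the fixed step size is a simplification.

    def proximal_gradient(x0, grad_f, prox_g, step, n_iter=200):
        x = x0.copy()
        for _ in range(n_iter):
            x = prox_g(x - step * grad_f(x), step)   # gradient step on the smooth term, then prox
        return x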
  22. FOUR TOY MODELS • Gaussian: 1 keV, 6 keV • Gaussian with rebinning: 1 keV • Realistic: 1 keV
  23. REALISTIC TOY MODEL (Note: a version with more noise is in progress)
  24. SUMMARY. Metrics compared across the four toy models (Gaussian 1 keV, Gaussian 6 keV, Realistic 1 keV, Gaussian with rebinning 1 keV). ASAM: average of the spectral angle map, PSNR: peak signal-to-noise ratio, acSSIM: average complement of the structural similarity index (see the metric sketch below)
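A hedged sketch of the metrics named above, using their common definitions; the exact conventions used for the table may differ.

    import numpy as np

    def spectral_angle_map(ref, est, eps=1e-12):
        # Angle (in degrees) between reference and estimated spectra, per pixel;
        # ASAM is the average of this map.
        num = np.sum(ref * est, axis=-1)
        den = np.linalg.norm(ref, axis=-1) * np.linalg.norm(est, axis=-1) + eps
        return np.degrees(np.arccos(np.clip(num / den, -1.0, 1.0)))

    def psnr(ref, est):
        # Peak signal-to-noise ratio in dB.
        mse = np.mean((ref - est) ** 2)
        return 10.0 * np.log10(ref.max() ** 2 / mse)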
  25. FUSION SUMMARY • Proposed a new algorithm for hyperspectral fusion adapted to X-ray imaging • Studied regularization in different regimes • The methods perform similarly at low spectral variability • Low rank is faster thanks to dimension reduction • W2D1D works best at high spectral variability • Though both can be biased • Need a method that reduces dimension and preserves variabilities --> include physical information
  26. FUTURE WORK • FUSION + SOURCE SEPARATION • Will have the acceleration obtained by dimension reduction • Will include physical information • Will perform source separation and fusion at the same time
  27. CONCLUSION • X-ray hyperspectral images are complex to analyse because of Poisson noise and high spectral variability • SUSHI is a method to unmix hyperspectral images with spectral variations based on a physical model (one for each endmember) • HIFReD is a fusion method that combines two hyperspectral images to obtain the best spatial and spectral resolutions. [email protected] https://github.com/JMLascar/SUSHI
  28. DECODER FUNCTION AS A GENERATIVE MODEL • D is differentiable • Not costly to call upon • Keeps a notion of neighborhood --> spatial regularization makes sense
  29. ALGORITHM OUTLINE • Until stopping criterion: • For each component C: • 1. Update latent parameters for C, keep all else fixed • Gradient descent for the latent parameters • Soft thresholding in the wavelet domain of the parameter maps • 2. Update amplitude for C, keep all else fixed • Gradient descent for the amplitude (see the schematic loop below)
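A schematic version of this alternating loop; the gradient functions, step sizes, proximal step and stopping test are placeholders, not the released implementation.

    def alternating_update(data, amplitudes, latents, grad_theta, grad_A,
                           prox_sparse, step_theta, step_A, n_iter=100):
        for _ in range(n_iter):                            # stand-in for the stopping criterion
            for c in range(len(latents)):
                # 1. Update the latent parameters of component c, all else fixed:
                latents[c] = latents[c] - step_theta * grad_theta(data, amplitudes, latents, c)
                latents[c] = prox_sparse(latents[c])       # soft thresholding in the wavelet domain
                # 2. Update the amplitude map of component c, all else fixed:
                amplitudes[c] = amplitudes[c] - step_A * grad_A(data, amplitudes, latents, c)
        return amplitudes, latents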
  30. CLASSIC METHOD • Physical model fit pixel by pixel with multiple variables • Treats pixels individually: • Performs poorly on low signal-to-noise pixels • Ignores correlations between pixels • Costly to call upon • Model is not differentiable • Improvement: spatial regularization on the parameters of a learnt spectral model (pixel-by-pixel sketch below)
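For contrast, a minimal sketch of such a pixel-by-pixel fit using scipy's curve_fit; the spectral model, starting point and error handling are illustrative.

    import numpy as np
    from scipy.optimize import curve_fit

    def fit_pixel_by_pixel(cube, energies, spectral_model, p0):
        # Fit an independent spectral model in every pixel, ignoring spatial correlations.
        n_x, n_y, _ = cube.shape
        params = np.full((n_x, n_y, len(p0)), np.nan)
        for i in range(n_x):
            for j in range(n_y):
                try:
                    params[i, j], _ = curve_fit(spectral_model, energies, cube[i, j], p0=p0)
                except RuntimeError:        # fit failed, e.g. in a low-count pixel
                    pass
        return params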
  31. STATE OF THE ART • First panchromatic / multispectral fusion, then multispectral / hyperspectral • Subspace projection • Unmixing, e.g. Yokoya et al. 2012, Prévost et al. 2022 • Low-rank approximation, e.g. Simões et al. 2015 • Spatial regularization • TV, sparse dictionary (Wei et al. 2015), deep learning (Uezato et al. 2020) • Astrophysics, mainly for JWST: • Guilloteau et al. 2020, low-rank approximation with Sobolev regularization • Pineau et al. 2023, exact solution for fusion with unmixing