Regularizations of Inverse Problems
Samuel Vaiter, September 06, 2013
GRETSI'13, Quartz, Brest, September 2013.
Transcript
Regularization of Inverse Problems: A Unified Analysis of Robustness.
Samuel Vaiter, CNRS, CEREMADE, Université Paris-Dauphine, France.
Joint work with M. Golbabaee, G. Peyré and J. Fadili.
Linear Inverse Problems (inpainting, denoising, super-resolution). Forward model: $y = \Phi x_0 + w$, where $y$ are the observations, $\Phi$ the operator, $x_0$ the input and $w$ the noise.
The Variational Approach. $x^\star \in \operatorname{argmin}_{x \in \mathbb{R}^N} \tfrac{1}{2}\|y - \Phi x\|_2^2 + \lambda J(x)$, balancing a data-fidelity term and a regularity term $J$. Examples of candidate $J$: sparsity ($\ell^1$), analysis-sparsity, group-sparsity, nuclear norm, Tikhonov, Total Variation, anti-sparse, polyhedral, $\ell^1$ + TV, atomic norms, decomposable norms, ...
Objectives. Given the prior model encoded by $J$, how close is $x^\star$ to $x_0$? Model selection and performance: in terms of SNR, and in terms of features.
Union of Linear Models. Union of models: $T \in \mathcal{T}$, a collection of linear spaces. Examples: sparsity, block sparsity, analysis sparsity, low rank. Objective: encode $\mathcal{T}$ in a function $J$.
Gauges. A gauge is a convex function $J : \mathbb{R}^N \to \mathbb{R}^+$ that is positively homogeneous: $J(\lambda x) = \lambda J(x)$ for all $\lambda \geq 0$. Its unit ball $C = \{x : J(x) \leq 1\}$ is convex, and the geometry of $C$ (its singular points) encodes the union of models $(T)_{T \in \mathcal{T}}$. Examples: $\|x\|_1$, $|x_1| + \|x_{2,3}\|_2$, $\|x\|_2$, $\|x\|_\infty$.
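The defining property is easy to check numerically. A small illustrative sketch (the helper name is mine, not from the talk) verifying positive homogeneity for the gauges mentioned above:

```python
import numpy as np

def is_positively_homogeneous(J, x, lams=(0.0, 0.5, 3.0)):
    # Check J(lam * x) == lam * J(x) for a few lam >= 0
    return all(np.isclose(J(lam * x), lam * J(x)) for lam in lams)
```

The $\ell^1$, $\ell^2$ and $\ell^\infty$ norms all pass, while a squared norm (which is not a gauge) fails.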
Subdifferential. $\partial J(x) = \{\eta \in \mathbb{R}^N : \forall x',\; J(x') \geq J(x) + \langle \eta,\, x' - x \rangle\}$. For $J = |\cdot|$: $\partial|\cdot|(0) = [-1, 1]$ and, for $x \neq 0$, $\partial|\cdot|(x) = \{\operatorname{sign}(x)\}$.
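The subgradient inequality can be tested pointwise on a grid; a small illustrative sketch (names are mine) recovering $\partial|\cdot|(0) = [-1,1]$ and $\partial|\cdot|(x) = \{\operatorname{sign}(x)\}$ for $x \neq 0$:

```python
import numpy as np

def is_subgradient(eta, x, grid):
    # Check J(x') >= J(x) + eta * (x' - x) for J = |.| at every test point x'
    return all(abs(xp) >= abs(x) + eta * (xp - x) - 1e-12 for xp in grid)
```

At $x = 0$ every $\eta \in [-1,1]$ passes; away from $0$ only $\operatorname{sign}(x)$ does.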
From the Subdifferential to the Model. Model subspace: $T_x = \operatorname{VectHull}(\partial J(x))^\perp$, where $\operatorname{VectHull}$ denotes the direction of the affine hull. Model vector: $e_x = \operatorname{Proj}_{T_x}(\partial J(x))$ (all subgradients share the same projection). For $J = \|\cdot\|_1$: $T_x = \{\eta : \operatorname{supp}(\eta) \subseteq \operatorname{supp}(x)\}$ and $e_x = \operatorname{sign}(x)$.
Regularizations and their Models.
$J(x) = \|x\|_1$: $e_x = \operatorname{sign}(x)$, $T_x = \{\eta : \operatorname{supp}(\eta) \subseteq \operatorname{supp}(x)\}$.
$J(x) = \sum_{b \in B} \|x_b\|$: $e_x = (\mathcal{N}(x_b))_{b \in B}$ with $\mathcal{N}(x_b) = x_b / \|x_b\|$, $T_x = \{\eta : \operatorname{supp}(\eta) \subseteq \operatorname{supp}(x)\}$.
$J(x) = \|x\|_*$ with $x = U \Lambda V^*$: $e_x = U V^*$, $T_x = \{\eta : U_\perp^* \eta V_\perp = 0\}$.
$J(x) = \|x\|_\infty$ with $I = \{i : |x_i| = \|x\|_\infty\}$: $e_x = |I|^{-1} \operatorname{sign}(x)$ supported on $I$, $T_x = \{\eta : \eta_I \in \mathbb{R}\,\operatorname{sign}(x_I)\}$.
Dual Certificates and Model Selection. $x^\star \in \operatorname{argmin}_{x \in \mathbb{R}^N} \tfrac{1}{2}\|y - \Phi x\|_2^2 + \lambda J(x)$. Hypotheses: $\operatorname{Ker} \Phi \cap T_{x_0} = \{0\}$ and $J$ regular enough. Tight dual certificates: $\bar{D} = \operatorname{Im} \Phi^* \cap \operatorname{ri}(\partial J(x_0))$. Minimal-norm pre-certificate: $\eta_0 = \Phi^* \Phi_{T_{x_0}}^{+,*} e_{x_0}$. Theorem [V. et al. 2013]: if $\eta_0 \in \bar{D}$, $\|w\|$ is small enough and $\lambda \sim \|w\|$, then $x^\star$ is the unique solution; moreover $T_{x^\star} = T_{x_0}$ and $\|x^\star - x_0\| = O(\|w\|)$. Special cases: $\ell^1$ [Fuchs 2004], $\ell^1$-$\ell^2$ [Bach 2008].
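For $J = \|\cdot\|_1$ the minimal-norm pre-certificate reduces to the Fuchs vector, and the condition $\eta_0 \in \bar{D}$ becomes $\|\eta_{0,I^c}\|_\infty < 1$. A sketch of its computation (the function name is mine, not from the talk):

```python
import numpy as np

def fuchs_precertificate(Phi, x0):
    # eta_0 = Phi^T (Phi_I^+)^T sign(x0_I) for J = ||.||_1;
    # support recovery at low noise if ||eta_0||_inf < 1 outside I
    I = np.flatnonzero(x0)
    eta0 = Phi.T @ np.linalg.pinv(Phi[:, I]).T @ np.sign(x0[I])
    crit = np.delete(np.abs(eta0), I).max()    # ||eta_0,I^c||_inf
    return eta0, crit
```

For an orthogonal $\Phi$, $\eta_0$ equals $\operatorname{sign}(x_0)$ on the support and vanishes elsewhere, so the criterion is trivially satisfied.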
Example: Sparse Deconvolution. $\Phi x = \sum_i x_i \varphi(\cdot - i)$, $J(x) = \|x\|_1$, $I = \{j : x_0[j] \neq 0\}$. Increasing the grid spacing reduces the correlation of the atoms, but also reduces the resolution. Support recovery: $\|\eta_{0,I^c}\|_\infty < 1 \iff \eta_0 \in \bar{D} \implies$ support recovery. (Plot: $\|\eta_{0,I^c}\|_\infty$ as a function of the spacing.)
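The trade-off can be illustrated numerically with Gaussian atoms: as the filter widens relative to the spike spacing, the atoms become more correlated and the Fuchs criterion degrades. A self-contained sketch (the filter, sizes and spike positions are my illustrative choices, not the talk's exact setup):

```python
import numpy as np

def deconv_criterion(sigma, n=64, spikes=(20, 40)):
    # Columns of Phi are unit-norm Gaussian filters of width sigma shifted
    # on a grid; returns ||eta_0,I^c||_inf (< 1 => support recovery)
    t = np.arange(n)
    Phi = np.exp(-(t[:, None] - t[None, :]) ** 2 / (2.0 * sigma ** 2))
    Phi /= np.linalg.norm(Phi, axis=0)
    I = np.array(spikes)
    # Fuchs pre-certificate with positive spikes (sign = +1)
    eta0 = Phi.T @ np.linalg.pinv(Phi[:, I]).T @ np.ones(len(I))
    return np.delete(np.abs(eta0), I).max()
```

A narrow filter keeps the criterion below 1; widening the filter increases it.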
Example: 1D TV Denoising. $J(x) = \|\nabla x\|_1$, $\Phi = \operatorname{Id}$, $I = \{i : (\nabla x_0)_i \neq 0\}$. The pre-certificate reads $\eta_0 = \operatorname{div}(u_0)$, where $(u_0)_j = \operatorname{sign}((\nabla x_0)_j)$ for $j \in I$. If $\|\eta_{0,I^c}\|_\infty < 1$: support stability. For a staircasing signal $x_0$ with $\|\eta_{0,I^c}\|_\infty = 1$: $\ell^2$-stability only.
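Since $\Phi = \operatorname{Id}$, 1D TV denoising can be solved through its dual, a Chambolle-style projected gradient on a dual variable $u$ constrained by $\|u\|_\infty \leq \lambda$; the optimal $u$ makes the certificate structure explicit. A minimal sketch (my implementation choice, not the talk's algorithm):

```python
import numpy as np

def tv_denoise_1d(y, lam, n_iter=5000):
    # min_x 0.5*||y - x||^2 + lam*||D x||_1 via its dual:
    # x = y - D^T u, where u solves min ||y - D^T u||^2 s.t. ||u||_inf <= lam
    n = len(y)
    D = np.diff(np.eye(n), axis=0)          # finite differences, (n-1) x n
    u = np.zeros(n - 1)
    step = 1.0 / np.linalg.norm(D, 2) ** 2
    for _ in range(n_iter):
        u = u - step * (D @ (D.T @ u - y))  # gradient step on the dual
        u = np.clip(u, -lam, lam)           # project onto the l_inf ball
    return y - D.T @ u, u
```

On a single-step signal the two plateaus shrink toward each other by $\lambda / (\text{plateau length})$, and the constraint $|u_j| \leq \lambda$ saturates exactly at the jump, mirroring the certificate condition on $\eta_0$.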
Conclusion. Gauges: encode linear models as singular points. Certificates: guarantees of model selection / $\ell^2$ robustness (see poster 208 for a pure robustness result). Thank you for your attention!