PREMIUM Benchmark Phase II

• Benchmark objective: review the methodology applied by TSOs and research institutes (14 participants) to quantify model parameter uncertainty
• Phase II objective: identify the influential parameters of a parameterized phenomenological reflood model
• Materials:
  - System thermal-hydraulics code (parameterized model)
  - FEBA separate-effect test facility: 120% ANS decay power; heated length 4.114 m; backpressure 4.1 bar; flooding rate 3.8 cm/s; inlet temperature 312 K
PREMIUM Benchmark Phase II

26 parameters specify the model in TRACE:
1. (4) boundary conditions (backpressure, inlet flow, etc.)
2. (9) material properties (conductivity, heat capacity, etc.)
3. (2) spacer grid model (convective HT enhancement, ΔP correlation)
4. (10) post-CHF closure relations describing the transfer terms between phases (IAFB wall HTC, DFFB interfacial drag, …)
5. (1) quench temperature

• Flow regimes during reflood: inverted annular film boiling (IAFB), dispersed flow film boiling (DFFB), and the transition regime (inverted slug)
• Each parameter is perturbed by a factor with an independent, uniform range of variation (set a priori)
• Uncertainty propagation by Monte Carlo sampling for the mid-assembly prediction in TRACE
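The Monte Carlo propagation step can be sketched as follows. Since TRACE itself cannot be run here, a cheap stand-in function (`surrogate_reflood_model`, with made-up weights and a hypothetical nominal peak cladding temperature) takes the place of the code; the independent uniform perturbation factors and the output statistics mirror the scheme described above.

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for one TRACE reflood run: maps a vector of 26 multiplicative
# perturbation factors to a scalar output (e.g., peak cladding temperature).
# The nominal value and weights are made up for illustration only.
def surrogate_reflood_model(factors):
    nominal_peak_T = 1100.0                        # hypothetical nominal PCT [K]
    weights = np.linspace(0.5, 2.0, factors.size)  # hypothetical sensitivities
    return nominal_peak_T + 50.0 * weights @ (factors - 1.0)

n_params, n_samples = 26, 1000
# Independent uniform perturbation factors with a priori ranges, e.g. [0.5, 2.0]
factors = rng.uniform(0.5, 2.0, size=(n_samples, n_params))

outputs = np.array([surrogate_reflood_model(f) for f in factors])
print(f"mean = {outputs.mean():.1f} K, std = {outputs.std():.1f} K")
print(f"95% range = [{np.percentile(outputs, 2.5):.1f}, "
      f"{np.percentile(outputs, 97.5):.1f}] K")
```

The empirical percentiles of the sampled outputs are the kind of statement this design supports: statements about the output, not yet about individual parameters.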
Uncertain Model Parameters: Origin

Excerpts from the TRACE manual:
1. "…the approximate value of the coefficient in Eq. (4-119) was determined from data comparisons with FLECHT-SEASET high flooding rate reflood data…" (p. 164)
2. "In TRACE, the above interfacial drag coefficient has been reduced by a factor of 3/4 to better match FLECHT-SEASET high flooding rate reflood data, so…" (p. 166)
3. "…in simulations of FLECHT-SEASET high flooding rate tests, this value appeared to be too large and thus was reduced to…" (p. 223)

Reflood separate-effect test facilities: RBHT (PSU, USA); ACHILLES (Winfrith, UK); FEBA (KIT, Germany); FLECHT-SEASET (Westinghouse, USA)

Was the calibration done on other, similar separate-effect tests?
Analysis of Computer Experiments: Application to TH Modelling

Ultimate goal: assign prior probabilities to the (TRACE reflood) model parameters and update them using the available experimental data.

Steps:
1. Many parameters are initially perceived to be important → Sensitivity Analysis
   • How do the parameters affect the output?
   • Which parameters are actually important?
   • Given some type of data, which parameters can or cannot be informed?
2. Running the code is expensive, and many runs are needed → Meta-modelling: is there a method to approximate the input/output relationship to save time and storage space?
3. Update the prior probabilities → Bayesian Calibration
Quantities of Interest (QoI)

Consider the temperature transient during reflood at mid-assembly: 600 s at Δt = 0.1 s = 6,000 data points. How to describe the impact of a parameter perturbation on the output?

1. Use the temperature at each time step
   • 26 inputs × 6,000 outputs = 156,000 sensitivity measures
   • Interpretability?
2. Use a predefined scalar function
   • Relevant to the application; problem-specific (min, max, average)
   • e.g., maximum cladding temperature; time of quenching (the time of maximum negative curvature)
Quantities of Interest (QoI)

3. Use a data-driven dimension-reduction technique: Principal Component Analysis (PCA)
   • Consider all the points simultaneously; capture the change in the overall shape
   • An orthogonal basis-function expansion; a new basis in a reduced space
   • Empirical (co)variance of the dataset:
     $C(t_1, t_2) = \frac{1}{N-1} \sum_{n=1}^{N} \left(y_n(t_1) - \bar{y}(t_1)\right)\left(y_n(t_2) - \bar{y}(t_2)\right)$
   • Eigendecomposition:
     $C(t_1, t_2) = \sum_{k=1}^{\infty} \lambda_k \, \phi_k(t_1) \, \phi_k(t_2)$
     with eigenvalues $\lambda_k$ and eigenfunctions (functional PCs) $\phi_k$
   • Expansion of each reflood curve:
     $y_n(t) = \bar{y}(t) + \sum_{k=1}^{\infty} \xi_{n,k} \, \phi_k(t)$
   • The PC scores $\xi_{n,k}$, associated with each reflood curve, carry its random character → quantities of interest describing the overall variation
Reduced Model Output

• The first 2 PCs account for …% of the overall temperature-transient variation
• The 1st PC relates to the variation of the temperature ramp
• The 2nd PC relates to the variation of the temperature descent
• Output dimension reduction: 6,000 → 2
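The PCA reduction of the transient curves can be sketched on synthetic data (the FEBA dataset is not included here, so ramp-and-quench-shaped toy curves stand in for the TRACE output):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 600.0, 6000)   # 600 s transient, dt = 0.1 s

# Toy "reflood-like" curve: a heat-up ramp followed by a quench-like drop;
# the random ramp rate and quench time stand in for parameter uncertainty.
def curve(ramp_rate, t_quench):
    heating = 600.0 + ramp_rate * np.minimum(t, t_quench)
    return np.where(t > t_quench, 400.0, heating)

curves = np.array([curve(rng.uniform(0.5, 1.5), rng.uniform(300.0, 500.0))
                   for _ in range(50)])          # 50 x 6000 data matrix

# PCA via thin SVD of the centered data matrix
mean_curve = curves.mean(axis=0)
centered = curves - mean_curve
U, s, Vt = np.linalg.svd(centered, full_matrices=False)

explained = s**2 / np.sum(s**2)     # eigenvalue ratios lambda_k / sum(lambda)
scores = centered @ Vt.T            # PC scores xi_{n,k} for each curve

# Truncated expansion: y_n(t) ~= mean(t) + sum_k xi_{n,k} * phi_k(t)
recon = mean_curve + scores[:, :2] @ Vt[:2]
print(f"first 2 PCs explain {100 * explained[:2].sum():.1f}% of the variation")
```

Each row of `scores[:, :2]` is the 2-dimensional QoI summarizing one 6,000-point curve.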
Analysis of Computer Experiments: Application to TH Modelling (roadmap recap) → Step 1: Sensitivity Analysis
Local Sensitivity Analysis

• Effect of a small perturbation around a nominal point, one parameter at a time (One-at-a-Time, OAT)
• Assumes linearity, monotonicity, and no interactions
• Approximates a partial derivative; intuitive interpretation
• But: the perturbation might not be small (grossly uncertain inputs), and linearity might not hold
• One-at-a-time perturbation excludes interaction effects
• The nominal point is not necessarily known
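A minimal OAT sketch, using a hypothetical three-parameter toy model in place of the code; the forward differences approximate the partial derivatives at the nominal point:

```python
import numpy as np

# Hypothetical smooth toy model standing in for the code output y = f(x)
def model(x):
    return x[0]**2 + 2.0 * x[1] + x[0] * x[2]

def oat_sensitivities(f, x0, rel_step=1e-6):
    """One-at-a-time forward differences around the nominal point x0:
    approximates the partial derivatives df/dx_i."""
    x0 = np.asarray(x0, dtype=float)
    y0 = f(x0)
    grads = np.empty(x0.size)
    for i in range(x0.size):
        h = rel_step * max(abs(x0[i]), 1.0)
        x = x0.copy()
        x[i] += h          # perturb only parameter i
        grads[i] = (f(x) - y0) / h
    return grads

g = oat_sensitivities(model, [1.0, 2.0, 3.0])
print(g)   # analytic gradient at (1, 2, 3) is [5, 2, 1]
```

Note that the x0·x2 interaction term never shows up as such in these one-at-a-time measures, which is exactly the limitation listed above.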
Design of Computer Experiments and Exploring the Parameter Space

• Local sensitivity of a complex model? Knowledge of the complex code output throughout the input parameter space is important for its validation, calibration, and (global) sensitivity analysis
• In an ideal world: discretize the parameter space, evaluate the model, and construct and analyze the output "surface"
• But the number of computations explodes easily. With k input parameters and p discretization levels, # of runs = p^k:

             k = 5        k = 10        k = 20
  p = 2      32           1,024         1.05 × 10^6
  p = 5      3,125        9.77 × 10^6   9.54 × 10^13
  p = 10     1.0 × 10^5   1.0 × 10^10   1.0 × 10^20
  p = 20     3.2 × 10^6   1.02 × 10^13  1.05 × 10^26

• Select a limited number of points judiciously = Design of Computer Experiments
Design of Computer Experiments

• Sampling-based uncertainty propagation does not depend on k → makes statements about the output
• Sensitivity analysis requires model exploration proportional to the number of parameters → makes statements about the parameters in relation to the output
• For model exploration, spread the points over as many combinations as possible
• Different purposes of (simulation) studies entail different kinds of design:
  - Simple Random Sampling (SRS): general purpose, uncertainty propagation
  - Space-filling designs (Latin Hypercube & variants, low-discrepancy sequences & variants): numerical integration, sensitivity analysis, metamodelling
  - Winding stairs: screening analysis
  - etc.
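A basic Latin Hypercube design can be built in a few lines; this is a plain (non-optimized) variant, without the correlation-reduction or maximin refinements the variants above add:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """Basic Latin Hypercube design on [0, 1]^d: each of the n_samples
    equal-probability strata of every dimension contains exactly one point."""
    # one random point per stratum, then shuffle the strata in each dimension
    samples = (np.arange(n_samples)[:, None]
               + rng.random((n_samples, n_dims))) / n_samples
    for j in range(n_dims):
        rng.shuffle(samples[:, j])
    return samples

rng = np.random.default_rng(1)
X = latin_hypercube(10, 2, rng)
# Verify the one-point-per-stratum property in the first dimension
print(np.sort((X[:, 0] * 10).astype(int)))   # -> [0 1 2 3 4 5 6 7 8 9]
```

Unlike simple random sampling, every one-dimensional projection of the design is guaranteed to be evenly stratified, which is what makes it attractive for metamodelling and numerical integration.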
Morris Screening Analysis

• Due to Morris (1991): a one-at-a-time design with a twist, trajectory + randomized design (winding-stairs design)
• For each random nominal point, the effect of a parameter perturbation is the elementary effect:
  $EE_i \equiv \frac{\Delta y}{\Delta x_i} = \frac{y(x_1, \ldots, x_i + \Delta, \ldots, x_k) - y(x_1, \ldots, x_i, \ldots, x_k)}{\Delta}$
• (Figure: 4 samples of a winding-stairs design of a 2-dimensional input parameter space discretized in 5 levels)
• The mean and variance of the elementary effects are sensitivity measures used for parameter-importance ranking; the least important parameters can be excluded
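The elementary-effect computation can be sketched with a simple trajectory variant (a simplification of the winding-stairs design, which additionally reuses runs across trajectories); the toy model's third parameter is inert and is correctly ranked last:

```python
import numpy as np

def morris_trajectory(f, k, levels, rng):
    """One Morris trajectory: start from a random point on the level grid
    and perturb each parameter once, recording the elementary effect EE_i."""
    delta = 1.0 / (levels - 1)
    x = rng.integers(0, levels, size=k) * delta    # random nominal grid point
    ee = np.empty(k)
    y = f(x)
    for i in rng.permutation(k):                   # random perturbation order
        step = delta if x[i] + delta <= 1.0 else -delta
        x[i] += step
        y_new = f(x)
        ee[i] = (y_new - y) / step
        y = y_new
    return ee

def model(x):   # toy model: x0 strong, x1 weak, x2 inert
    return 10.0 * x[0] + x[1] + 0.0 * x[2]

rng = np.random.default_rng(2)
ees = np.array([morris_trajectory(model, 3, 5, rng) for _ in range(20)])
mu_star = np.abs(ees).mean(axis=0)   # mean |EE|: importance ranking
sigma = ees.std(axis=0)              # std of EE: nonlinearity / interactions
print(mu_star)   # ranks x0 >> x1 >> x2 (inert)
```

A large mean flags an influential parameter; a large standard deviation flags nonlinearity or interactions; parameters with both near zero are candidates for fixing.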
TRACE Model Screening

• 1,080 TRACE runs for the screening
• 16 parameters were found to have a very small effect on the average temperature output
• Check runs with 50 MC samples, using only the parameters from each set: fixing the non-influential parameters does not significantly change the output variation
• The closure laws, boundary conditions, and spacer-grid model parameters are relatively influential (10 parameters)
• The 16 non-influential parameters are excluded from further analysis
Analysis of Computer Experiments: Application to TH Modelling (roadmap recap) → Step 1 (continued): variance-based Sensitivity Analysis
Variance Decomposition

The variance is often used as a representative characterization of the sampled output; its decomposition can inform us about the structure of a complex model. The Sobol' decomposition gives:

$V[Y] = \sum_{i=1}^{k} V_i + \sum_{1 \le i < j \le k} V_{ij} + \cdots$

(1st-order/main effects, 2nd-order effects, higher-order effects)

Sobol' indices:
• Main effect (1st order): $S_i \equiv V_{x_i}\!\left[E_{x_{\sim i}}[Y \mid x_i]\right] / V[Y]$ — the output variance cut if $x_i$ is fixed (parameter prioritization)
• Total effect: $ST_i \equiv 1 - V_{x_{\sim i}}\!\left[E_{x_i}[Y \mid x_{\sim i}]\right] / V[Y]$ — the output variance left if all parameters other than $x_i$ are fixed (parameter fixing)
• Interaction effect: $ST_i - S_i$ — the effect of a parameter on the output depends on the other parameters (parameter identifiability)
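The main- and total-effect indices can be estimated by Monte Carlo with the standard Saltelli/Jansen pick-freeze estimators. The toy model below has an interaction between its first and third parameters, so their total effects exceed their main effects:

```python
import numpy as np

def sobol_indices(f, k, n, rng):
    """Monte Carlo (Saltelli/Jansen) estimators of the main- and total-effect
    Sobol' indices, using two independent sample matrices A and B."""
    A = rng.random((n, k))
    B = rng.random((n, k))
    yA, yB = f(A), f(B)
    var = np.var(np.concatenate([yA, yB]))
    S, ST = np.empty(k), np.empty(k)
    for i in range(k):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                 # A with column i taken from B
        yABi = f(ABi)
        S[i] = np.mean(yB * (yABi - yA)) / var        # main effect (Saltelli)
        ST[i] = 0.5 * np.mean((yA - yABi)**2) / var   # total effect (Jansen)
    return S, ST

# Toy model with an x0*x2 interaction, inputs uniform on [0, 1]
def model(X):
    return X[:, 0] + X[:, 1] + X[:, 0] * X[:, 2]

rng = np.random.default_rng(3)
S, ST = sobol_indices(model, 3, 100_000, rng)
print("main :", S.round(2))
print("total:", ST.round(2))
```

For this model the analytic values are S ≈ (0.63, 0.28, 0.07) and ST ≈ (0.65, 0.28, 0.09); the gap ST − S for the first and third parameters is exactly the interaction effect used above as an identifiability warning.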
Parameters act differently for different aspects of the output: Temperature Ramp

• (Pie chart of sensitivity indices: four individual main effects of 21%, 21%, 25%, and 20%; other main effects 5%; interaction 8%; the DFFB parameters together account for 66%. Sum of main-effect indices = 92%.)
• The model is additive with respect to the temperature-rise variation before quenching
• Given temperature data, these parameters (and their uncertainties) can, by the structure of the model, theoretically be estimated
• The FEBA facility is not suitable for assessing the inverted annular film boiling (IAFB) model
Parameters act differently for different aspects of the output: Temperature Descent

• Small 1st-order effects (sum of main-effect indices = …%), for both the boundary conditions and the physical model parameters
• The difference between the total- and main-effect indices gives the total interaction effect
• The model is interacting with respect to the temperature-descent variation
• Parameter estimation might suffer from non-identifiability (non-unique calibrated parameters)
Analysis of Computer Experiments: Application to TH Modelling (roadmap recap) → Step 2: Meta-modelling
Gaussian Process Prior for Regression

• Each FEBA simulation takes 6-10 minutes and produces 200 MB of output (faster? smaller?)
• Gaussian process prior: a distribution over functions in function space, $f(\cdot) \sim \mathcal{GP}\!\left(m(\cdot), k(\cdot,\cdot)\right)$
• Used for non-linear, non-parametric Bayesian regression (to build an emulator/meta-model/proxy model)
• Versatile, and provides error estimates of the prediction at untried inputs
• The kernel function correlates the input parameters in the output space and controls the shape of the functions; prior + data → posterior
• The number of kernel parameters increases with the number of input parameters, so high-dimensional regression is harder
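A from-scratch sketch of GP regression with a squared-exponential kernel, using fixed, hand-picked hyperparameters (a real emulator would estimate the lengthscale and noise from the training data):

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=0.3, variance=1.0):
    """Squared-exponential kernel: correlates nearby inputs in output space."""
    d2 = np.sum((X1[:, None, :] - X2[None, :, :])**2, axis=-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X_train, y_train, X_test, noise=1e-6):
    """Posterior mean and variance of a zero-mean GP at untried inputs."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = rbf_kernel(X_train, X_test)
    Kss = rbf_kernel(X_test, X_test)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks.T @ alpha
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    return mean, np.diag(cov)

# 1D toy "expensive code": y = sin(2*pi*x), 12 training runs
X_train = np.linspace(0.0, 1.0, 12)[:, None]
y_train = np.sin(2.0 * np.pi * X_train).ravel()
X_test = np.array([[0.37], [0.90]])
mean, var = gp_posterior(X_train, y_train, X_test)
print(mean, var)   # posterior mean tracks sin(2*pi*x); variance is small near data
```

The posterior variance is the error estimate mentioned above: it shrinks to (almost) zero at the training points and grows away from them, which is what makes the GP usable as a trustworthy surrogate inside a Bayesian analysis.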
Space-Time Output of the FEBA Reflood Simulation

• Consider the full temperature-transient output from the FEBA simulation for a better-fidelity regression
• 142 axial locations × 6,000 time steps = 852,000 output dimensions
• (Figure: 9 samples from varying 2 input parameters, wall heat transfer & interfacial heat transfer)
• Think of the full output as an "image"
Dimension Reduction: Principal Components (Eigensurfaces)

• Principal Component Analysis on the full output matrix
• Explained variation of the leading eigensurfaces: 85%, 7.3%, 3.2%, 1.6%, 0.9%, 0.6%, 0.5%, 0.3%, 0.2%
• More than 95% of the output variation is captured by the first 3 eigensurfaces
• 852,000 dimensions → 3-10 uncorrelated dimensions (error estimates for the truncated eigensurfaces are available)
• $y_n(z, t) = \bar{y}(z, t) + \sum_{k} \xi_{n,k} \cdot \phi_k(z, t)$, with PC scores $\xi_{n,k}$ and eigensurfaces $\phi_k(z, t)$
• Is regression on the PC scores possible?
GP Regression and Output Reconstruction

• For a 2-parameter model (scoping study) of FEBA TRACE, it is possible
• 100-sample training dataset; GP regression of the first 3 PC scores on the 2 inputs (DFFB wall HTC & liquid-interfacial HTC)
• Predicting the 3 PC coefficients at an untried input (e.g., DFFB liquid-interfacial HTC = 0.8657, DFFB wall HTC = 0.8017) takes milliseconds
• GP prediction + reconstruction closely matches the original TRACE run
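The PCA + GP pipeline of this slide can be sketched end-to-end on a toy 2-parameter "transient" model. The real study regresses TRACE PC scores on the DFFB HTC multipliers; here a rank-3 analytic stand-in is used so that 3 principal components reconstruct the output exactly, and the only emulation error comes from the GP on the scores:

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0.0, 1.0, 500)

# Toy 2-parameter "full transient" standing in for a TRACE run. It is built
# from 3 basis shapes, so 3 principal components reconstruct it exactly.
def full_model(theta):
    a, b = theta
    return (600.0 + 300.0 * a * np.sin(np.pi * t)
            + 100.0 * b * t + 50.0 * a * b * t**2)

# 100-sample training set over the 2D input box
Theta = rng.uniform([0.5, 0.3], [1.5, 0.9], size=(100, 2))
Y = np.array([full_model(th) for th in Theta])       # 100 x 500 snapshot matrix

# PCA of the output matrix
y_mean = Y.mean(axis=0)
U, s, Vt = np.linalg.svd(Y - y_mean, full_matrices=False)
n_pc = 3
scores = (Y - y_mean) @ Vt[:n_pc].T                  # 100 x 3 PC scores

# Zero-mean GP regression of the PC scores on the inputs (RBF kernel)
def rbf(A, B, ls=0.3):
    d2 = np.sum((A[:, None, :] - B[None, :, :])**2, axis=-1)
    return np.exp(-0.5 * d2 / ls**2)

K_inv = np.linalg.inv(rbf(Theta, Theta) + 1e-6 * np.eye(len(Theta)))

def predict_curve(theta_new):
    ks = rbf(Theta, theta_new[None, :])              # 100 x 1
    score_pred = ks.T @ K_inv @ scores               # 1 x 3, in milliseconds
    return y_mean + (score_pred @ Vt[:n_pc]).ravel() # reconstruction

theta_new = np.array([0.87, 0.66])                   # untried input
y_pred = predict_curve(theta_new)
y_true = full_model(theta_new)
print(f"max emulation error: {np.abs(y_pred - y_true).max():.2f} K")
```

Because the GP operates on 3 scores instead of 500 (or 852,000) raw outputs, the emulator remains cheap to train and to evaluate, which is the point of combining the two techniques.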
Analysis of computer experiments provides a framework for a data-driven, black-box treatment of model assessment. Here it was applied to the (revisited) reflood simulation problem posed in the OECD/NEA PREMIUM benchmark.
1. Global sensitivity analysis provides additional insight into the input/output behavior of a complex model. The steps involved: input/output dimension reduction and variance decomposition.
2. Gaussian process modelling is a potentially versatile meta-modelling tool for approximating a complex and expensive model: useful in Bayesian uncertainty analysis, where the input parameter space is rigorously explored (more work is required to further assess its suitability for FEBA TRACE).