CIE Graduate Research Poster Session is organized by: Technical Committees of ASME - CIE Conference, www.asme.org

Deep Neural Network (DNN) Solvers for High-dimensional Elliptic Stochastic Partial Differential Equations (SPDEs)
CIE 2018 Graduate Research Poster
Sharmila Karumuri, Ph.D. Candidate, School of Mechanical Engineering, Purdue University
Advisors: Jitesh Panchal, Associate Professor; Ilias Bilionis, Assistant Professor

[Schematic: PDE $f(x, \xi) = 0$ with boundary conditions $T = 1$ at $x = 0$ and $T = 0$ at $x = 1$; solution strategies: minimizing an energy functional vs. collocation.]

Steady-state stochastic heat problem with input uncertainty, $x \in \Omega = [0, 1]$, $\xi \in \Xi$:
$-\nabla \cdot \left( a(x, \xi) \nabla T(x, \xi) \right) = f(x, \xi)$

1. Input Uncertainty: Dimensionality Reduction
The log-conductivity is modeled as a Gaussian process,
$g(x, \xi) \sim \mathrm{GP}(m(x), k(x, x'))$, with $m(x) = 0$ and exponential covariance $k(x, x') = e^{-|x - x'|/\alpha}$, $\alpha = 0.05$.
Truncated Karhunen-Loeve expansion (KLE), with $\xi_i \sim \mathcal{N}(0, 1)$:
$g(x, \xi) \approx m(x) + \sum_{i=1}^{d_\xi} \xi_i \sqrt{\lambda_i} \, \phi_i(x)$,
and the random conductivity is $a(x, \xi) = e^{g(x, \xi)}$.

2. Methodology
STEP 1: Approximate the solution with a DNN, $T(x, \xi) \approx \hat{T}(x, \xi, \theta)$, where $\theta$ are the parameters of the DNN.
STEP 2: Impose the boundary conditions through the trial solution
$\hat{T}(x, \xi, \theta) = (1 - x) + x(1 - x) N(x, \xi, \theta)$.

3. Loss Function
$\min_{\theta \in \Theta} L = \min_{\theta \in \Theta} \int_\Omega \int_\Xi \left( \tfrac{1}{2} a(x, \xi) \| \nabla \hat{T}(x, \xi, \theta) \|^2 - f(x, \xi) \hat{T}(x, \xi, \theta) \right) p(\xi) \, dx \, d\xi$
approximated by sampling:
$\min_{\theta \in \Theta} \frac{1}{N} \sum_{i=1}^{N} \left( \tfrac{1}{2} a(x_i, \xi_i) \| \nabla \hat{T}(x_i, \xi_i, \theta) \|^2 - f(x_i, \xi_i) \hat{T}(x_i, \xi_i, \theta) \right)$

4. Results
Verification against a finite-volume solver (FiPy) of the boundary value problem: $\xi \to$ FiPy $\to T(x, \xi)$.
[Figures: random conductivity; temperature response, truth vs. DNN prediction; uncertainty propagation of 10,000 samples of the random conductivity; comparison of the mean, the variance, and the solution PDF at $x = 0.5$ obtained from MCS and the DNN surrogate.]

Future work
- Automatically enforce boundary conditions.
- Validate on a 2D problem and solve an inverse problem.
- Explore further minimization algorithms and network architectures.
- Implement procedures such as L1/L2 regularization and dropout.

Conclusions
- The method extends readily to higher stochastic dimensions.
- Explored the energy-functional form of the loss.
- Eliminated the need for a physics solver and for discretization of the domain.

References:
- K. Sharmila (2018), "Deep Neural Network Solvers for High-dimensional Stochastic Elliptic Partial Differential Equations," presented at the SIAM Annual Conference.
- I. E. Lagaris, A. Likas, and D. I. Fotiadis (1998), "Artificial neural networks for solving ordinary and partial differential equations," IEEE Trans. Neural Networks, vol. 9, pp. 987-1000.

Problem Overview
- A simulator of a physical phenomenon is represented as a function $f : X \to Y$; $f$ could depend on material properties, initial conditions, etc.
- $f$ is some scalar quantity of interest.
- Obtained by solving the PDE numerically.
- Inputs $\xi$: uncertain and high-dimensional; evaluations of $f(\xi)$ may be noisy estimates, with Gaussian noise $\epsilon$.
- Interested in quantifying the uncertainty in $f(\xi)$.

The dimensionality $D$ of the input vector $\xi$ is large, potentially of the order of hundreds or thousands. Given a finite number of evaluations of the simulator, constructing a surrogate $\hat{f}$ for the true response surface $f$ becomes computationally infeasible without resorting to dimensionality reduction.

Uncertainty propagation: suppose the inputs $\xi$ to the function $f$ are not known exactly (a common scenario in numerous engineering tasks). We formalize our beliefs about $\xi$ using a suitable probability distribution.
Input uncertainty: $\xi \sim p(\xi)$.
Given our beliefs about $\xi$, we wish to characterize the statistical properties of the output $f(\xi)$, such as:
QoI mean: $\mu_f = \int f(\xi) \, p(\xi) \, d\xi$,
QoI variance: $\sigma_f^2 = \int \left( f(\xi) - \mu_f \right)^2 p(\xi) \, d\xi$,
and the probability density: $p_f(y) = \int \delta\!\left( y - f(\xi) \right) p(\xi) \, d\xi$.
This is formally known as the uncertainty propagation (UP) problem.
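In practice the UP integrals above are estimated by Monte Carlo. A minimal sketch with NumPy; the quadratic quantity of interest, the 5-dimensional standard-normal input, and the sample count are illustrative assumptions, not the poster's PDE-based setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(xi):
    # Illustrative scalar quantity of interest (a stand-in for the
    # PDE-based f in the poster): sum of squared inputs.
    return np.sum(xi ** 2, axis=-1)

# Draw samples from the assumed input distribution p(xi) = N(0, I), D = 5.
xi = rng.standard_normal((10_000, 5))
y = f(xi)

mu_f = y.mean()    # estimates mu_f  = integral of f(xi) p(xi) dxi
var_f = y.var()    # estimates sigma_f^2
# A normalized histogram approximates the density p_f(y).
pdf, edges = np.histogram(y, bins=50, density=True)
```

For this toy $f$, $y$ follows a chi-squared distribution with 5 degrees of freedom, so the estimates should land near mean 5 and variance 10 at this sample size.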
Motivation
- Classical methods (intrusive polynomial chaos, stochastic collocation, etc.) suffer from the curse of dimensionality: the computational cost of solving SPDEs grows exponentially with the number of stochastic dimensions.
- Monte Carlo, although independent of the dimensionality, converges very slowly in the number of samples of $f$.

Approach
Develop a DNN-based surrogate model to approximate the solution/output of the PDE under high-dimensional input uncertainty: $f(\xi) \approx \hat{f}(\xi)$, with $\hat{f}$ a DNN.

Key steps
1. Modelling the input uncertainty as a GP.
2. Solving the SPDE using the DNN framework.
3. Minimizing the loss function - an energy functional.
4. Quantifying the output uncertainty.

Model: Elliptic SPDE problem
$-\nabla \cdot \left( \kappa(x) \nabla u(x) \right) = \rho(x)$ on $x \in \Omega$,
$u(x) = g(x)$ on $\Gamma_d$,
$\nabla u(x) \cdot n = h(x)$ on $\Gamma_n$.

RQ: Solve high-dimensional stochastic partial differential equations.
We demonstrate our solver on a specific example, a stochastic heat equation.
[Schematic: random conductivity $a(x, \xi) \to$ DNN $\to$ temperature response $T(x, \xi)$.]

Impose boundary conditions. The solution is expanded in a fixed basis $q_i$ with stochastic coefficients $c_i$ estimated using the NN:
$N(x, \xi, \theta) = \sum_{i=1}^{M} c_i(\xi, \theta) \, q_i(x)$

[Panels: network configuration and parameters; comparison of statistics; verification: $a(x, \xi)$ vs. $x$, $T(x, \xi)$ vs. $x$.]
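As a sanity check on the trial-solution construction $\hat{T}(x, \xi, \theta) = (1 - x) + x(1 - x) N(x, \xi, \theta)$, a minimal sketch: the single-hidden-layer network with random weights and the scalar $\xi$ are illustrative assumptions (the poster's trained DNN and input dimension are not specified here), but the boundary values hold for any parameters by construction:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for the network N(x, xi, theta): one tanh hidden layer
# with random (untrained) weights; sizes are illustrative.
W = rng.standard_normal((8, 2))
b = rng.standard_normal(8)
v = rng.standard_normal(8)

def N(x, xi):
    # N(x, xi, theta) for fixed random theta = (W, b, v).
    return v @ np.tanh(W @ np.array([x, xi]) + b)

def T_hat(x, xi):
    # (1 - x) alone satisfies T(0) = 1 and T(1) = 0; the x(1 - x)
    # factor makes the network term vanish exactly on the boundary,
    # so the boundary conditions hold regardless of theta.
    return (1.0 - x) + x * (1.0 - x) * N(x, xi)
```

This is why only the interior residual (the energy functional) needs to be minimized: the Dirichlet conditions are never violated during training.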