Abstract

We present a unified framework for estimating stochastic parameters in general variational problems. This nonlinear inverse problem is formulated as a stochastic optimization problem using the output least-squares (OLS) objective, which minimizes the discrepancy between observed data and the computed solution. A key challenge in OLS-based formulations is the efficient computation of first- and second-order derivatives of the OLS functional, which depend on the corresponding derivatives of the parameter-to-solution map—often costly and difficult to evaluate, especially in stochastic settings. To address this, we develop a rigorous computational approach based on first- and second-order adjoint methods for inverse problems governed by stochastic variational problems. Specifically, we propose a new first-order adjoint method for computing the gradient of the OLS objective and introduce two novel second-order adjoint methods for Hessian evaluation. A stochastic Galerkin discretization framework is employed, enabling efficient implementation of the adjoint-based derivative computations. Numerical experiments demonstrate the accuracy and efficiency of the proposed computational framework.
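The first-order adjoint idea described in the abstract can be illustrated on a small finite-dimensional surrogate. The sketch below is a hypothetical example, not the thesis's actual formulation: it assumes a parameter-to-solution map defined by a linear system `A(a) u = f` with affine parameter dependence (as arises, for instance, from a stochastic Galerkin discretization), and shows how one extra adjoint solve yields the full gradient of the OLS objective, verified against central finite differences.

```python
import numpy as np

# Hypothetical surrogate (illustration only, not the thesis's operators):
# the parameter-to-solution map a -> u(a) is given by A(a) u = f with
# affine dependence A(a) = K0 + a[0]*K1 + a[1]*K2.

rng = np.random.default_rng(0)
n = 8

def spd(r):
    """Random symmetric positive definite matrix (keeps A(a) invertible)."""
    M = r.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

K0, K1, K2 = spd(rng), spd(rng), spd(rng)
f = rng.standard_normal(n)   # right-hand side
z = rng.standard_normal(n)   # "observed" data

def A(a):
    return K0 + a[0] * K1 + a[1] * K2

def ols(a):
    """Output least-squares objective J(a) = 0.5 * ||u(a) - z||^2."""
    u = np.linalg.solve(A(a), f)
    return 0.5 * np.dot(u - z, u - z)

def ols_grad_adjoint(a):
    """Gradient of J via a single adjoint solve.

    Differentiating A(a) u = f gives du/da_i = -A^{-1} (dA/da_i) u, so
    with the adjoint state p solving A(a)^T p = -(u - z) one obtains
    dJ/da_i = p^T (dA/da_i) u, avoiding one solve per parameter.
    """
    u = np.linalg.solve(A(a), f)
    p = np.linalg.solve(A(a).T, -(u - z))
    return np.array([p @ (K1 @ u), p @ (K2 @ u)])

a0 = np.array([0.3, 0.7])
g_adj = ols_grad_adjoint(a0)

# Central finite-difference check of the adjoint gradient.
h = 1e-6
g_fd = np.array([(ols(a0 + h * e) - ols(a0 - h * e)) / (2 * h)
                 for e in np.eye(2)])
print(np.allclose(g_adj, g_fd, atol=1e-6))
```

The design point is cost: the adjoint route prices the whole gradient at one forward solve plus one adjoint solve, whereas direct differentiation of the parameter-to-solution map costs one solve per parameter, which is what makes the approach attractive in high-dimensional stochastic settings.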

Library of Congress Subject Headings

Inverse problems (Differential equations); Image registration (Mathematics); Stochastic systems; Parameter estimation; Mathematical optimization

Publication Date

5-6-2025

Document Type

Thesis

Student Type

Graduate

Degree Name

Applied and Computational Mathematics (MS)

Department, Program, or Center

Mathematics and Statistics, School of

College

College of Science

Advisor

Akhtar A. Khan

Advisor/Committee Member

Ephraim Agyingi

Advisor/Committee Member

Christiane Tammer

Campus

RIT – Main Campus

Plan Codes

ACMTH-MS
