A joint project of the Graduate School, Peabody College, and the Jean & Alexander Heard Library

Title page for ETD etd-08252013-231229

Type of Document Dissertation
Author Ling, You
URN etd-08252013-231229
Title Uncertainty quantification in time-dependent reliability analysis
Degree PhD
Department Civil Engineering
Advisory Committee
  Advisor Name           Title
  Sankaran Mahadevan     Committee Chair
  Caglar Oskay           Committee Member
  Jayathi Y. Murthy      Committee Member
  Prodyot K. Basu        Committee Member
  Ronald D. Schrimpf     Committee Member
Keywords
  • Bayesian network
  • calibration
  • validation
  • model uncertainty
  • reliability
Date of Defense 2013-08-21
Availability unrestricted
The prediction of failure probability for engineering components and devices under service conditions often involves the use of predictive models and measurement data. The fact that no model is absolutely correct indicates the existence of "model uncertainty", while the availability of only a limited amount of data, or of imprecise measurements, introduces an additional source of uncertainty called "data uncertainty". These two sources can be classified as epistemic uncertainty (uncertainty due to lack of knowledge or information). Another important source of uncertainty is the natural variability of physical quantities. In time-dependent problems, uncertainty from all these sources can accumulate over time and thus significantly affect the reliability prediction. Hence, rigorous quantification of model uncertainty, data uncertainty, and natural variability needs to be included in the reliability analysis.

This research focuses on two activities to address model uncertainty: (1) model calibration and (2) model validation. Model calibration aims to quantify the uncertainty in the estimation of model parameters based on the available information on model input and output variables. Bayesian calibration under the well-known Kennedy and O'Hagan (KOH) framework is investigated and extended to multi-physics systems with various possible types of available information. Model validation is the procedure of assessing the predictability of models in the domain of intended use. Probabilistic model validation methods are developed in order to quantify the confidence in model prediction, with the capability to include fully characterized, partially characterized, and uncharacterized validation data.
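The mechanics of Bayesian model calibration can be illustrated with a minimal sketch. The model, data, and settings below are hypothetical (a one-parameter linear model with a Gaussian likelihood, sampled by random-walk Metropolis), not the multi-physics KOH formulation treated in the dissertation; the sketch only shows how uncertainty in a model parameter is quantified as a posterior distribution given noisy observations.

```python
import math
import random

def log_likelihood(theta, xs, ys, sigma=0.1):
    # Gaussian likelihood for a hypothetical model y = theta * x + eps,
    # eps ~ N(0, sigma^2); constants dropped since only differences matter.
    return sum(-0.5 * ((y - theta * x) / sigma) ** 2 for x, y in zip(xs, ys))

def metropolis(xs, ys, n_steps=5000, step=0.05, seed=0):
    # Random-walk Metropolis sampler for the posterior of theta
    # (flat prior assumed, so the posterior is proportional to the likelihood).
    rng = random.Random(seed)
    theta = 0.0
    ll = log_likelihood(theta, xs, ys)
    samples = []
    for _ in range(n_steps):
        prop = theta + rng.gauss(0.0, step)          # propose a nearby value
        ll_prop = log_likelihood(prop, xs, ys)
        if math.log(rng.random()) < ll_prop - ll:    # accept/reject step
            theta, ll = prop, ll_prop
        samples.append(theta)
    return samples

# Synthetic calibration data generated with a "true" theta of 2.0.
data_rng = random.Random(1)
xs = [i / 10 for i in range(1, 11)]
ys = [2.0 * x + data_rng.gauss(0.0, 0.1) for x in xs]

post = metropolis(xs, ys)
theta_mean = sum(post[2000:]) / len(post[2000:])     # discard burn-in
```

The spread of `post` after burn-in is the quantified parameter uncertainty; in the dissertation's setting this posterior would additionally account for a model discrepancy term and heterogeneous data sources.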

A Bayesian network-based probabilistic uncertainty quantification (UQ) framework is then developed to include the results of model calibration and validation in time-dependent reliability analysis of engineering systems, including: (1) time-dependent reliability prediction for a multi-physics MEMS device, and (2) prognosis of mechanical components in service with the inclusion of monitoring data (loading history and system health).
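The flavor of the time-dependent reliability computation can be sketched with a toy Monte Carlo example. The limit state here is hypothetical (a linearly degrading capacity against a random load checked at discrete times, with made-up distribution parameters); it is not the MEMS or prognosis model of the dissertation, but it shows how natural variability and an uncertain degradation parameter accumulate into a time-dependent probability of failure.

```python
import random

def time_dependent_pof(n_samples=20000, horizon=10.0, dt=0.5, seed=0):
    # Monte Carlo estimate of the probability that a component fails at any
    # inspection time in [0, horizon]. All distributions are illustrative.
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        r0 = rng.gauss(10.0, 1.0)   # initial capacity (natural variability)
        k = rng.gauss(0.3, 0.05)    # degradation rate (uncertain parameter)
        t, failed = 0.0, False
        while t <= horizon:
            load = rng.gauss(5.0, 1.0)   # stochastic load at each time step
            if r0 - k * t < load:        # limit state: capacity < demand
                failed = True
                break
            t += dt
        failures += failed
    return failures / n_samples

pf = time_dependent_pof()
```

Because the degraded capacity `r0 - k * t` shrinks with `t`, the failure probability over the horizon exceeds the instantaneous failure probability at any single time, which is why uncertainty accumulation matters in time-dependent reliability.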

  Filename                     Size      Approximate Download Time (Hours:Minutes:Seconds)
                                         28.8 Modem   56K Modem   ISDN (64 Kb)   ISDN (128 Kb)   Higher-speed Access
  Dissertation_Ling_2013.pdf   3.08 Mb   00:14:15     00:07:20    00:06:25       00:03:12        00:00:16
