A joint project of the Graduate School, Peabody College, and the Jean & Alexander Heard Library

Title page for ETD etd-11222005-184433


Type of Document: Dissertation
Author: Rebba, Ramesh
Author's Email Address: ramesh_rebba@yahoo.com
URN: etd-11222005-184433
Title: Model Validation and Design under Uncertainty
Degree: PhD
Department: Civil Engineering
Advisory Committee
  Advisor Name                Title
  Prof. Sankaran Mahadevan    Committee Chair
  Prof. Bruce Cooil           Committee Member
  Prof. Gautam Biswas         Committee Member
  Prof. Prodyot K. Basu       Committee Member
Keywords
  • Engineering -- Mathematical models -- Evaluation
  • verification
  • error estimation
  • Bayesian statistics
  • extrapolation
  • hypothesis testing
  • model validation
  • Reliability (Engineering)
Date of Defense: 2005-11-11
Availability: unrestricted
Abstract
Full-scale testing of large engineering systems to assess their performance can be expensive or even infeasible. With the growth of advanced computing capabilities, model-based simulation plays an increasingly important role in the design of such systems. When computational models are developed, the underlying assumptions and approximations introduce various types of errors into the code predictions. In order to accept a model prediction with confidence, the computational models need to be rigorously verified and validated. When the input parameters of the model are uncertain, the model prediction is also uncertain; in addition, the validation experiments contain measurement errors. Model validation therefore involves comparing a prediction with test data when both are uncertain. Appropriate validation metrics that address these uncertainties and errors are developed in this study, for both component-level and system-level models, using both classical and Bayesian statistics.
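
As an illustration of the kind of comparison involved (a minimal sketch, not the specific metrics developed in the dissertation), the following Python snippet evaluates a simple Bayes-factor validation metric under assumed Gaussian prediction uncertainty and measurement error; the data, distributions, and parameter values are all hypothetical.

    # Illustrative Bayes-factor validation metric; all numbers are hypothetical.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Model prediction sample reflecting input-parameter uncertainty (assumed)
    prediction = rng.normal(loc=10.0, scale=0.5, size=5000)

    # Validation measurements and their assumed measurement error
    test_data = np.array([10.3, 9.8, 10.6, 10.1])
    sigma_meas = 0.4

    def likelihood_under_model(y):
        # p(y | model), averaged over the prediction uncertainty
        return np.mean(stats.norm.pdf(y, loc=prediction, scale=sigma_meas))

    def likelihood_under_alternative(y, spread=5.0):
        # p(y | alternative): deliberately diffuse distribution (assumption)
        return stats.norm.pdf(y, loc=test_data.mean(), scale=spread)

    bayes_factor = (np.prod([likelihood_under_model(y) for y in test_data]) /
                    np.prod([likelihood_under_alternative(y) for y in test_data]))
    print(f"Bayes factor B = {bayes_factor:.2f} (B > 1 favours the model)")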

Another goal of model validation is to extend what is learned about the model’s predictive capability in the tested region to an inference about its predictive capability in the untested region of the actual application, and to quantify the confidence in that extrapolation. Sometimes the response quantity of interest in the target application differs from the validated response quantity. Validation inferences may need to be extrapolated from nominal to off-nominal (tail) conditions, or component-level data may have to be used to make a partial inference about the validity of a system-level prediction. For all of these cases, a Bayesian network methodology is developed to extrapolate inferences from the validation domain to the application domain.
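
The sketch below gives the flavor of such an extrapolation under strong simplifying assumptions: a shared model-bias parameter is updated from hypothetical component-level validation residuals and then propagated to a hypothetical system-level response at an off-nominal condition. It is a stand-in for, not a reproduction of, the Bayesian network methodology described above; the prior, the residuals, and the system model g() are all invented for illustration.

    # Hypothetical extrapolation from component-level validation to a
    # system-level prediction through a shared model-bias parameter.
    import numpy as np
    from scipy import stats

    # Component-level residuals (observed minus predicted) and assumed noise
    residuals = np.array([0.12, -0.05, 0.20, 0.08])
    sigma_obs = 0.15

    # Grid-based posterior for the bias b, with an assumed prior b ~ N(0, 0.5^2)
    b_grid = np.linspace(-1.0, 1.0, 2001)
    log_post = (stats.norm.logpdf(b_grid, 0.0, 0.5) +
                stats.norm.logpdf(residuals[:, None], b_grid, sigma_obs).sum(axis=0))
    post = np.exp(log_post - log_post.max())
    post /= np.trapz(post, b_grid)

    # Propagate the bias to a hypothetical system response at an
    # off-nominal application condition x_app
    x_app = 2.5
    def g(x, b):
        return 3.0 * x + b * x**2    # assumed dependence of the bias on the condition

    samples_b = np.random.default_rng(1).choice(b_grid, size=10000,
                                                p=post / post.sum())
    system_pred = g(x_app, samples_b)
    print(f"system response at x={x_app}: mean {system_pred.mean():.2f}, "
          f"95% interval ({np.percentile(system_pred, 2.5):.2f}, "
          f"{np.percentile(system_pred, 97.5):.2f})")

Here a grid-based update over a single bias parameter plays the conceptual role that a network of uncertain nodes would play in a fuller treatment.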

This study also proposes a methodology to estimate the errors in computational models and to include them in reliability-based design optimization (RBDO). Various sources of uncertainty, together with the errors and approximations arising in model form selection and numerical solution, are included in a first-order-based RBDO methodology.
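
A minimal sketch of a first-order reliability calculation that carries an explicit model-error random variable, in the spirit of the formulation described above; the limit state, the distributions, and the error statistics are hypothetical placeholders rather than the dissertation's actual formulation.

    # First-order (mean-value) reliability estimate with an explicit
    # model-error term; all quantities are hypothetical placeholders.
    import numpy as np
    from scipy import stats

    # Random load P ~ N(mu_P, sig_P) and assumed model-error eps ~ N(mu_eps, sig_eps)
    mu_P, sig_P = 100.0, 10.0
    mu_eps, sig_eps = 0.0, 5.0

    def limit_state_moments(d):
        # g = capacity(d) + eps - P, with a placeholder capacity model
        capacity = 40.0 * d
        mu_g = capacity + mu_eps - mu_P
        sig_g = np.sqrt(sig_eps**2 + sig_P**2)
        return mu_g, sig_g

    # Scan a few candidate designs and report the first-order reliability index
    for d in (3.0, 3.5, 4.0):
        mu_g, sig_g = limit_state_moments(d)
        beta = mu_g / sig_g
        pf = stats.norm.cdf(-beta)
        print(f"d = {d:.1f}: beta = {beta:.2f}, Pf = {pf:.2e}")

The loop simply scans a few candidate values of the design variable; in an actual RBDO setting the reliability constraint would sit inside an optimizer.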

Files
  Filename                Size      28.8 Modem   56K Modem   ISDN (64 Kb)   ISDN (128 Kb)   Higher-speed Access
  RebbaDissertation.pdf   1.79 Mb   00:08:16     00:04:15    00:03:43       00:01:51        00:00:09
  (Download times are approximate, in hours:minutes:seconds.)
