Concepedia

Concept

Uncertainty Quantification

Overview

Definition and Importance

Uncertainty quantification (UQ) refers to the study of uncertainty in computational models and simulations, particularly those used in scientific research and engineering. It is crucial for addressing the inherent uncertainties present in physical processes modeled by computer simulations, which is essential for performing reliable model-based inference, such as forecasting and prediction.[1.1] The significance of UQ has grown substantially, driven by its potential to enhance risk mitigation through scientific predictions. This evolution has fostered the integration of concepts from mathematics, statistics, and engineering, thereby strengthening the credibility of predictive risk assessments and informing decision-making processes.[2.1] The origins of research in uncertainty quantification can be traced back to approximately 1980, and over the past four decades UQ has become integral to many fields, demonstrating its applicability and importance in understanding complex physical processes.[3.1] A critical component of UQ is calibration, which ensures that both simulators and surrogate models yield useful and reliable predictions. This process involves collaboration among experts in applied mathematics, computer science, research software engineering, and specific application domains, facilitating the development of robust predictive models.[4.1]

Key Concepts in Uncertainty Quantification

Uncertainty quantification (UQ) involves characterizing and reducing uncertainties across various fields by integrating mathematics, statistics, and engineering to improve predictive risk assessments and inform decision-making processes.[14.1] Since 2000, the importance of UQ has been increasingly acknowledged, with applications across numerous domains highlighting its value in both academic and industrial contexts.[5.1] A core component of UQ is identifying and categorizing uncertainties from sources such as model parameters, input data, and inherent system variability.[5.1] This systematic approach transforms potential threats into measurable data, which is crucial for managing risks in complex projects.[6.1] Techniques like Monte Carlo simulations offer a comprehensive view of potential scenarios, aiding project managers in impact assessment and informed decision-making.[6.1] UQ also enhances public trust in science and environmental policy-making by providing stakeholders with reliable information, crucial in an era of declining institutional trust.[11.1] Integrating UQ into environmental decision-making involves quantifying uncertainties related to human input and establishing widely accepted risk-based performance criteria.[12.1] This not only improves decision-making transparency but also encourages public engagement in sustainability initiatives.[9.1] Historically, UQ has evolved from viewing uncertainty as undesirable to recognizing its importance in scientific inquiry.[16.1] Key milestones include the emergence of probability measures in the mid-1600s and the integration of UQ techniques into scientific and engineering applications.[13.1] Today, methods such as Bayesian approximation and ensemble learning are widely used in UQ, addressing challenges in recognizing and quantifying uncertainty in machine learning systems.[18.1]
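The Monte Carlo approach described above can be sketched in a few lines. The project-cost model below, its three uncertain factors, and all numbers are illustrative assumptions, not values drawn from the cited sources; the point is how sampling turns uncertain inputs into a distribution of outcomes that supports risk statements such as a 95th-percentile cost.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of simulated scenarios

# Illustrative project-cost model with three uncertain factors
# (the distributions and figures are assumptions for the sketch).
labor = rng.normal(loc=500.0, scale=50.0, size=n)        # roughly known labor cost
materials = rng.triangular(200.0, 250.0, 400.0, size=n)  # min / most likely / max
delay_penalty = rng.binomial(1, 0.2, size=n) * 100.0     # 20% chance of a fixed penalty

total_cost = labor + materials + delay_penalty

# Summaries used for risk assessment: expected cost and a tail percentile.
mean_cost = total_cost.mean()
p95 = np.percentile(total_cost, 95)  # "worst plausible" cost at the 95% level
print(f"mean={mean_cost:.1f}, p95={p95:.1f}")
```

The tail percentile, not the mean, is what drives contingency planning: it quantifies how bad the combined scenarios plausibly get.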

History

Early Developments in UQ

The quantification of uncertainty has its roots in the works of early mathematicians such as Pascal and Fermat, who linked it to broader themes of social rationalization and enlightenment. This historical perspective highlights that probability and statistics, despite their distinct origins, have consistently served as tools for understanding and managing the uncertainties inherent in social change.[49.1] Historically, the scientific community has had a complex relationship with the concept of uncertainty. Initially, uncertainty was viewed negatively, as a condition to be minimized or eliminated in scientific inquiry. This traditional perspective regarded uncertainty as an undesirable state, one that scientists sought to avoid at all costs.[50.1] Over time, however, the recognition of uncertainty as an integral aspect of scientific analysis has evolved, leading to the development of methodologies aimed at its assessment and quantification.

Evolution of UQ Methods

Uncertainty quantification (UQ) methods have evolved significantly over the years, driven by the need for effective risk mitigation through scientific prediction. The integration of concepts from mathematics, statistics, and engineering has been crucial in enhancing predictive assessments of risk and informing decision-making processes across various fields.[47.1] Since the year 2000, UQ research has gained substantial traction, being applied in numerous domains and receiving strong support from both academia and industry.[48.1] The development of UQ methods has been characterized by the introduction of various techniques, including Bayesian approximation and ensemble learning, which are widely utilized to address uncertainties in optimization and decision-making.[46.1] Bayesian methods, in particular, have emerged as a powerful framework for quantifying uncertainty in model predictions. They allow for the incorporation of prior knowledge, updating beliefs, and expressing uncertainty, thereby enhancing the reliability of predictions in complex models.[52.1] For instance, Bayesian deep learning combines probabilistic modeling with deep neural networks, facilitating uncertainty quantification in high-dimensional datasets.[54.1] Moreover, the classification of UQ methods into probabilistic and non-probabilistic categories has provided a structured approach to addressing different types of uncertainties, such as aleatory and epistemic uncertainties.[56.1] This classification is essential for selecting appropriate UQ methods tailored to specific engineering projects, as demonstrated in case studies involving applications like steel-concrete composite columns and floor slab panels.[58.1] The historical context of UQ can be traced back to the Enlightenment period, when figures such as Pascal and Fermat laid the groundwork for understanding and controlling uncertainties through probability and statistics.[63.1] This philosophical shift towards rationalization and enlightenment has significantly influenced the formalization of UQ methods, leading to their current applications in engineering and beyond.
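The Bayesian cycle of prior knowledge, belief updating, and expressed uncertainty can be illustrated with the simplest conjugate case: a Beta prior on a probability, updated by binomial observations. The prior and data below are illustrative assumptions, not taken from the cited studies.

```python
from math import sqrt

# Conjugate Beta-Binomial update: a prior belief about a failure
# probability is revised with observed test data (illustrative numbers).
alpha_prior, beta_prior = 2.0, 8.0   # prior: failures rare, prior mean = 0.2
failures, successes = 3, 47          # observed: 3 failures in 50 trials

# By conjugacy, the posterior is Beta(alpha + failures, beta + successes).
alpha_post = alpha_prior + failures
beta_post = beta_prior + successes

post_mean = alpha_post / (alpha_post + beta_post)
post_var = (alpha_post * beta_post) / (
    (alpha_post + beta_post) ** 2 * (alpha_post + beta_post + 1)
)
print(f"posterior mean={post_mean:.3f}, sd={sqrt(post_var):.3f}")
```

The posterior mean lies between the prior mean (0.2) and the observed rate (0.06), and the posterior variance expresses the remaining epistemic uncertainty, which shrinks as more data arrive.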


Recent Advancements

Probabilistic and Non-Probabilistic Models

Probabilistic models in uncertainty quantification (UQ) are increasingly popular for their capacity to incorporate randomness and variability in system parameters. These models are crucial in fields such as engineering and climate modeling, where decision-making often depends on predictions with inherent uncertainties. They enable the assessment of various outcomes' likelihoods based on uncertain inputs, thus supporting risk assessment and decision-making.[89.1] Advancements in probabilistic UQ methods have aimed at improving model accuracy and efficiency. Techniques such as Gaussian process regression and polynomial chaos expansion are now prominent for modeling with high-dimensional inputs.[100.1] These methods provide a nuanced understanding of uncertainty propagation, which is essential for informed decision-making based on model predictions.[98.1] Conversely, non-probabilistic models, which do not depend on probability distributions, have also evolved to address uncertainties. They often employ interval or fuzzy representations of uncertainty, offering an alternative perspective on managing variability in system parameters. These approaches are particularly beneficial in situations with limited data or poorly understood processes.[92.1] The incorporation of machine learning techniques into both probabilistic and non-probabilistic frameworks has further propelled UQ's development. Machine learning models, including artificial neural networks and physics-informed neural networks, enhance the predictive capabilities of UQ methods, facilitating real-time updates and refinement with new data.[99.1] This integration of traditional UQ methods with modern computational techniques illustrates the ongoing evolution of uncertainty quantification, broadening its applicability across diverse domains.[101.1]
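The non-probabilistic, interval-based perspective can be sketched with a bound computation: parameters are known only as intervals, and output bounds are obtained without assuming any distribution. The spring-mass model and interval widths below are illustrative assumptions; evaluating corners suffices here only because the model is monotone in each parameter, and a general model would need interval arithmetic or optimization.

```python
import itertools
import numpy as np

def model(k, m):
    """Natural frequency of a spring-mass system: omega = sqrt(k / m)."""
    return np.sqrt(k / m)

k_interval = (900.0, 1100.0)  # stiffness bounds (illustrative units)
m_interval = (0.9, 1.1)       # mass bounds (illustrative units)

# Evaluate the model at every corner of the parameter box; for a model
# monotone in each parameter, the extremes occur at the corners.
corners = [model(k, m) for k, m in itertools.product(k_interval, m_interval)]
omega_lo, omega_hi = min(corners), max(corners)
print(f"omega in [{omega_lo:.2f}, {omega_hi:.2f}]")
```

The result is a guaranteed enclosure of the response, which is exactly what interval approaches offer when data are too sparse to justify a probability distribution.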


Mathematical Foundations

Probability Theory in UQ

Probability theory plays a crucial role in uncertainty quantification (UQ) by providing the mathematical framework necessary for characterizing and propagating uncertainties in complex systems. UQ is fundamentally concerned with the quantitative description of the origins, characteristics, and propagation of uncertainties, which is achieved through the application of probability distributions to model these uncertainties.[124.1] The integration of probability theory into UQ allows for the construction of probabilistic models that quantify information and uncertainty in terms of probability distributions. These models are essential for explaining phenomena and observations, as they enable the modeling of known unknowns through the use of probability distributions.[145.1] In practice, uncertainties are represented on stochastic project models using various probability distributions, which can be tailored to the specific nature of the uncertainties involved.[146.1] Different types of probability distributions are employed depending on the characteristics of the system being analyzed. For instance, discrete distributions, such as binomial or Poisson distributions, are typically used for simple systems, while continuous distributions may be more appropriate for complex scenarios.[146.1] The choice of distribution significantly influences the modeling of uncertainty, as it determines how uncertainties are characterized and propagated through the model.[144.1] Moreover, probabilistic uncertainty analysis involves the characterization of uncertainties using probability distributions, which facilitates the application of standard probability techniques in propagating uncertainties within a model.[141.1] This approach not only aids in understanding the impact of uncertainties but also allows for the prioritization of uncertainty contributors to the overall risk assessment.[141.1]
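One such standard probability technique is first-order (delta-method) propagation, which approximates the output variance of a nonlinear model from the input variance and the model's derivative, and can be cross-checked by sampling. The model and input distribution below are illustrative assumptions for the sketch.

```python
import numpy as np

# Propagate a normal input uncertainty through a nonlinear model two ways:
# (1) first-order (delta-method) approximation, (2) Monte Carlo sampling.
rng = np.random.default_rng(0)

mu, sigma = 2.0, 0.1           # input: X ~ N(2, 0.1^2), illustrative
f = lambda x: x ** 3           # model response
df = lambda x: 3 * x ** 2      # analytic derivative of the response

# First-order propagation: Var[f(X)] ~ (f'(mu) * sigma)^2
var_delta = (df(mu) * sigma) ** 2

# Monte Carlo estimate of the same output variance
samples = f(rng.normal(mu, sigma, size=200_000))
var_mc = samples.var()

print(f"delta-method sd={np.sqrt(var_delta):.3f}, Monte Carlo sd={np.sqrt(var_mc):.3f}")
```

The two estimates agree closely here because the input uncertainty is small relative to the model's curvature; for wider inputs or stronger nonlinearity, the first-order approximation degrades and sampling (or higher-order expansions) becomes necessary.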

Bayesian Methods for Epistemic Uncertainty

Bayesian methods play a crucial role in addressing epistemic uncertainty, which arises from a lack of knowledge about the underlying model or parameters. These methods allow for the incorporation of prior knowledge and the updating of beliefs in light of new evidence, making them particularly effective for uncertainty quantification in complex systems. One significant aspect of Bayesian approaches is their ability to provide uncertainty estimates for data values, which is essential for obtaining confidence intervals for estimations. This is achieved through the relationship between Data Shapley and infinite-order U-statistics, which quantifies the uncertainty associated with changes in the data.[127.1] Furthermore, Bayesian methods are designed to be invariant to data transformations and model parameterizations, ensuring that the uncertainty quantification reflects the informativeness of the observed data for the underlying process.[128.1] In practical applications, Bayesian methods are often employed in model validation and reproducibility. A statistical framework known as history matching is utilized for global parameter searches by comparing model outputs to observed data, thereby facilitating model-based inference.[129.1] Additionally, surrogate models are frequently used in Markov chain Monte Carlo (MCMC) methods for likelihood evaluation, which helps mitigate the high computational costs associated with simulations.[130.1] The importance of uncertainty quantification in machine learning systems cannot be overstated. It enhances model robustness and interpretability, providing end-users with critical information regarding the reliability of predictions. For instance, deep learning models that incorporate uncertainty quantification can effectively distinguish between easy and difficult classification tasks, as demonstrated by the performance of Bayesian neural networks and ensemble models.[133.1] Moreover, uncertainty quantification methods have been shown to improve the overall robustness of deep learning models through comprehensive data experiments.[135.1] In engineering contexts, the selection of appropriate uncertainty quantification techniques is vital. Techniques such as Gaussian process regression, polynomial chaos expansion, and physics-informed neural networks have gained prominence for their effectiveness in forward and inverse uncertainty quantification.[139.1] These methods enable practitioners to address complex problems involving stochastic processes and random parameters, thereby enhancing decision-making and optimization under uncertainty.[138.1]
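The idea that models can distinguish easy from difficult inputs via disagreement can be sketched with a small ensemble: logistic regressions trained on bootstrap resamples of synthetic data, standing in for the Bayesian neural networks and deep ensembles of the cited work. All data and settings below are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Synthetic 1-D binary classification: the classes overlap near x = 0,
# so inputs near the boundary are genuinely harder to classify.
X = rng.normal(0.0, 1.0, size=(500, 1))
y = (X[:, 0] + rng.normal(0.0, 0.5, size=500) > 0).astype(int)

# Train an ensemble on bootstrap resamples of the data.
ensemble = []
for _ in range(20):
    idx = rng.integers(0, len(X), size=len(X))      # bootstrap resample
    clf = LogisticRegression().fit(X[idx], y[idx])
    ensemble.append(clf)

def predictive_std(x):
    """Spread of the ensemble's probabilities: higher means less certain."""
    probs = np.array([clf.predict_proba([[x]])[0, 1] for clf in ensemble])
    return probs.std()

easy, hard = predictive_std(3.0), predictive_std(0.0)
print(f"std far from boundary={easy:.3f}, near boundary={hard:.3f}")
```

Inputs near the class boundary produce larger disagreement between ensemble members than inputs deep inside a class, which is exactly the signal used to flag difficult cases for review.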

Applications In Engineering

UQ in Computational Simulations

Uncertainty quantification (UQ) plays a pivotal role in computational simulations across various engineering disciplines. The integration of machine learning (ML) techniques, such as Gaussian Process Regression (GPR) and physics-informed neural networks (PINNs), has significantly enhanced the capability of UQ in addressing complex engineering problems characterized by high levels of uncertainty. These ML methods serve as powerful tools for constructing surrogate models, which are fast approximations of original complex models, thereby facilitating efficient uncertainty analysis.[175.1] GPR, in particular, is a nonparametric regression method widely utilized in UQ for developing surrogate models. Its application extends to dynamic systems, where it quantifies uncertainty and addresses limitations inherent in traditional techniques found in the literature.[177.1] The use of GPR allows for a more nuanced understanding of the relationships between input variables and system responses, which is crucial for effective decision-making in engineering applications.[176.1] Moreover, UQ methodologies, including Bayesian approximation and ensemble learning techniques, have been employed to solve a variety of real-world problems in science and engineering.[174.1] These methods not only improve the reliability and accuracy of predictions but also enhance the robustness of simulations by providing a clearer picture of the uncertainties involved.[191.1] For instance, Monte Carlo simulations, a critical component of UQ, enable engineers to model the impact of different variables on system performance and durability, thus aiding in the design of products that can withstand various operational conditions.[192.1] Sensitivity analysis is another essential aspect of UQ in computational simulations. It quantifies the relationship between input parameters and output responses, helping engineers identify the most influential parameters and understand how variations in these parameters affect overall system behavior.[188.1] This analysis is particularly relevant in reliability-based design, where it informs the relationship between changes in reliability and the characteristics of uncertain variables.[187.1]
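The surrogate-modeling workflow can be sketched with scikit-learn's Gaussian process regressor standing in for a GPR surrogate of an expensive simulation. The toy response function, design points, and kernel settings below are illustrative assumptions; the key point is that the surrogate returns both a prediction and a standard deviation at every query point.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_model(x):
    """Stand-in for a costly simulation (illustrative toy response)."""
    return np.sin(3.0 * x) + 0.5 * x

# A handful of "expensive" evaluations at design points.
X_train = np.linspace(0.0, 2.0, 8).reshape(-1, 1)
y_train = expensive_model(X_train).ravel()

# Fit the GP surrogate (small alpha: near-noise-free interpolation).
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-8)
gpr.fit(X_train, y_train)

# Cheap queries on a dense grid, with predictive mean AND std.
X_query = np.linspace(0.0, 2.0, 200).reshape(-1, 1)
mean, std = gpr.predict(X_query, return_std=True)

# The predictive std collapses at training points and grows between them.
print(f"max predictive std={std.max():.4f}")
```

That spatially varying predictive standard deviation is what makes GP surrogates useful for UQ: it shows where the surrogate can be trusted and where additional expensive simulations would be most informative.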

Real-World Engineering Applications

Uncertainty quantification (UQ) is extensively applied across engineering domains, enhancing system reliability and performance. In aerospace engineering, UQ is vital for designing robust structures and systems, addressing challenges like limited experimental data, complex interactions between physical fields, and the high computational costs of high-fidelity simulations. These challenges make UQ and model validation essential for ensuring safety and performance in aerospace applications.[183.1] In earthquake engineering, UQ is crucial for performance-based structural assessments. A stochastic simulator for seismic uncertainty quantification addresses high-dimensional uncertainties, critical for evaluating structural performance under seismic loading.[178.1] This systematic approach to uncertainty is essential for risk assessment and for developing new design procedures and retrofits.[180.1] Integrating UQ into structural models significantly impacts the reliability of nonlinear time history results, especially as earthquake intensity increases.[181.1] By quantifying uncertainties, engineers can enhance the accuracy of deterministic analyses and better understand the effects of uncertain variables on structural demands.[182.1] UQ methodologies have evolved, with significant milestones marking their development. Since the 1980s, systematic uncertainty quantification has been critical for engineering design and analysis, leading to advancements in fields such as aerospace and mechanical applications and in reliability assessment.[168.1] Recent advancements in machine learning, such as Gaussian process regression and physics-informed neural networks, have further enhanced UQ capabilities, enabling more sophisticated analyses and applications in reliability assessments.[165.1]


Challenges In UQ

Dimensionality and Identifiability Issues

Dimensionality and identifiability issues are significant challenges in the field of uncertainty quantification (UQ), particularly in complex systems such as climate modeling and epidemiological studies. One of the primary concerns is the inherent variability of these systems and the lack of comprehensive knowledge about them, which can lead to various types of uncertainty, including observation error, stochastic uncertainty, parameter uncertainty, structural uncertainty, and model discrepancy.[206.1] These uncertainties complicate the estimation of model parameters and the overall reliability of predictions. In the context of climate modeling, the exclusion or downweighting of models that violate fundamental physical principles is crucial. This necessitates a robust understanding of the processes and feedbacks that are essential for accurate modeling.[205.1] Similarly, in epidemiological modeling, the complexity of disease transmission dynamics introduces additional layers of uncertainty, making it challenging to identify and quantify the relevant parameters effectively.[240.1] The calibration of stochastic epidemic compartmental models, for instance, highlights the difficulties in achieving reliable estimates due to the stochastic nature of disease transmission.[240.1] Moreover, the dimensionality of the parameter space can exacerbate identifiability issues. As the number of parameters increases, the likelihood of encountering non-identifiable models rises, complicating the estimation process and potentially leading to misleading conclusions.[206.1] This is particularly evident in hierarchical Bayesian modeling frameworks, which, while providing a structured approach to integrate uncertainties, still face challenges in ensuring that all parameters are identifiable and that the model remains computationally feasible.[233.1] In aerospace engineering, the integration of UQ methods into the design process also grapples with dimensionality and identifiability challenges. The need for accurate uncertainty estimates must be balanced with computational cost, as prioritizing speed over thoroughness can lead to suboptimal designs that compromise safety and performance.[243.1] Thus, addressing dimensionality and identifiability issues is critical for enhancing the reliability and effectiveness of uncertainty quantification across various fields.
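Non-identifiability can be made concrete with a deliberately simplified toy model (not taken from the cited studies) in which two parameters enter only as a product: the data then constrain the product but say nothing about the individual parameters, so very different parameter pairs fit identically well.

```python
import numpy as np

# Toy non-identifiable model: y = (a * b) * x. Only the product a*b
# is constrained by the data, so (a, b) cannot be recovered separately.
rng = np.random.default_rng(7)

x = np.linspace(0.0, 1.0, 50)
y = 6.0 * x + rng.normal(0.0, 0.01, size=x.size)   # true product a*b = 6

def sse(a, b):
    """Sum of squared residuals for a candidate parameter pair (a, b)."""
    return float(np.sum((y - a * b * x) ** 2))

# Wildly different pairs with the same product fit the data identically.
fits = [sse(2.0, 3.0), sse(1.0, 6.0), sse(0.5, 12.0)]
print([f"{v:.4f}" for v in fits])
```

A Bayesian analysis of this model would show a ridge in the posterior along a*b = const: the product has a tight posterior while each factor's marginal is dominated by the prior, which is the practical signature of an identifiability problem.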

Incomplete Model Responses

Incomplete model responses in uncertainty quantification (UQ) arise from various challenges associated with the inherent complexities of modeling real-world systems. One significant issue is the reliance on surrogate models for likelihood evaluation in Markov Chain Monte Carlo (MCMC)-based model uncertainty quantification, which is often necessary to mitigate the high computational costs of simulations.[219.1] However, this approach can lead to incomplete responses if the surrogate models do not adequately capture the underlying dynamics of the system being studied. Moreover, the quantification of uncertainty sources is crucial for understanding the overall uncertainty in the quantity of interest. This involves a systematic quantification of model-related uncertainties across different scientific fields, including the biological and physical sciences.[212.1] Modularized global sensitivity analysis and efficient Gaussian mixture copula (GMC) approximation techniques are employed to compute sensitivity indices rapidly, yet they may not fully address the complexities involved in certain models.[213.1] The communication of these uncertainties to policymakers and the public is another critical aspect that can be hampered by incomplete model responses. Effective communication must characterize, assess, and convey the limits of scientific statements clearly, as emphasized by the Science Advice for Policy by European Academies.[214.1] Engaging in a two-way dialogue between scientists and policymakers can help illuminate the various dimensions of uncertainty, thereby enhancing the application of scientific findings.[215.1] Furthermore, the framing of evidence to align with the biases, beliefs, and priorities of policymakers is essential for effective communication.[217.1] This approach acknowledges the bounded rationality of policy actors, who often rely on simplified heuristics to process complex information.[217.1] Thus, addressing the challenges of incomplete model responses in UQ requires not only robust modeling techniques but also strategic communication efforts to ensure that uncertainties are effectively conveyed and understood.

Future Directions

Recent advancements in uncertainty quantification (UQ) methods have significantly influenced engineering applications, particularly in structural dynamics and machine learning. A notable trend is the increasing emphasis on surrogate modeling techniques, which have gained traction for their ability to manage uncertainties in high-dimensional spaces. These methods, such as polynomial chaos expansion and Gaussian process regression, are particularly useful in scenarios where traditional modeling approaches face challenges due to the complexity and dimensionality of the data.[249.1] In structural dynamics, state-of-the-art UQ methods focus on structural response characterization and model calibration. Researchers have adopted both forward and inverse UQ frameworks to effectively address uncertainties in these areas.[247.1] Furthermore, the integration of machine learning techniques, including artificial neural networks and physics-informed neural networks, has emerged as a powerful approach to enhance predictive accuracy and computational efficiency in UQ.[256.1] These machine learning models are particularly adept at handling the inherent uncertainties present in data and models, thereby improving the robustness of predictions in real-world applications.[253.1] Moreover, the application of machine learning in UQ is evolving to include innovative strategies for addressing imbalanced datasets, which are common in regression tasks. The development of algorithms such as the UQ-driven imbalanced regression algorithm (UQDIR) aims to enhance prediction accuracy by leveraging epistemic uncertainty quantification.[255.1] This reflects a broader trend of utilizing UQ to refine machine learning methodologies, ultimately leading to more reliable and efficient engineering solutions.[256.1]
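The polynomial chaos idea can be sketched for a single standard normal input using probabilists' Hermite polynomials, which are orthogonal under the Gaussian weight with norm n!. The toy response and truncation order below are illustrative assumptions; the payoff is that the output mean and variance fall directly out of the spectral coefficients and can be checked against closed-form values.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, e

# Minimal polynomial chaos expansion (PCE) for X ~ N(0, 1), using
# probabilists' Hermite polynomials He_n with <He_n, He_n> = n!.
def model(x):
    return np.exp(x)   # toy "expensive" response (illustrative)

order = 8
nodes, weights = hermegauss(32)            # Gauss quadrature for weight e^{-x^2/2}
weights = weights / np.sqrt(2.0 * np.pi)   # normalize to integrate against N(0,1)

# Spectral coefficients c_n = E[model(X) He_n(X)] / n!
coeffs = []
for n in range(order + 1):
    basis = hermeval(nodes, [0.0] * n + [1.0])   # He_n at the quadrature nodes
    coeffs.append(np.sum(weights * model(nodes) * basis) / factorial(n))

# Mean and variance come straight from the coefficients.
pce_mean = coeffs[0]
pce_var = sum(c ** 2 * factorial(n) for n, c in enumerate(coeffs) if n > 0)

print(f"PCE mean={pce_mean:.4f} (exact {np.exp(0.5):.4f})")
print(f"PCE var={pce_var:.4f} (exact {e ** 2 - e:.4f})")
```

For exp(X) the exact mean is e^{1/2} and the exact variance is e^2 - e; the order-8 truncation already matches both to several digits, illustrating the spectral convergence that makes PCE attractive as a surrogate for smooth responses.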

Integration of UQ with Other Disciplines

The integration of uncertainty quantification (UQ) with advanced machine learning (ML) techniques is an evolving area of research that addresses the complexities of quantifying errors and uncertainties in ML-based inference. Neural networks (NNs) are significantly altering the computational landscape by effectively combining data with mathematical models in fields such as science and engineering, particularly in solving challenging inverse and ill-posed problems that traditional methods cannot address. However, the quantification of uncertainties in NN-based inference presents greater challenges compared to conventional approaches.[269.1] Recent advancements in ML, including techniques such as Gaussian process regression, physics-informed neural networks, and Bayesian neural networks, have gained traction in both theoretical and practical applications. These models are particularly relevant in reliability analysis, where they are categorized into data-driven neural networks and physics-informed neural networks (PINNs). The integration of these ML strategies with UQ is crucial for applications such as probabilistic model updating and optimization under uncertainty.[270.1] Uncertainty quantification is especially critical in high-risk domains like healthcare and autonomous systems, where decision-making processes must consider various forms of uncertainty. This includes distinguishing between aleatoric uncertainty, which arises from inherent variability, and epistemic uncertainty, which stems from a lack of knowledge. The mathematical foundations and methods for quantifying these uncertainties are essential for enhancing the reliability of AI systems.[271.1] The American Society of Mechanical Engineers (ASME) has established the VVUQ 70 subcommittee, which focuses on the verification, validation, and uncertainty quantification of machine learning algorithms. This initiative aims to develop standards that facilitate the assessment and quantification of the credibility of ML algorithms applied to mechanistic problems, thereby promoting a more systematic integration of UQ with ML.[272.1] In the context of structural dynamics, incorporating UQ into model calibration processes is vital for improving the reliability of structural models. Various forms of uncertainty, including parametric, predictive, and model-form uncertainties, must be addressed to enhance the robustness of decision-making in structural analysis and model updating. Techniques such as Bayesian estimation and the Stochastic Finite Element Method (SFEM) are employed to account for these uncertainties, thereby improving model predictions and reducing the likelihood of structural failures.[282.1]


References


https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8059558/

[1] The importance of uncertainty quantification in model reproducibility Uncertainty quantification in computer models is important for a number of reasons. Firstly, the analysis of physical processes based on computer models is riddled with uncertainty, which has to be addressed to perform 'trustworthy' model-based inference such as forecasting (predictions) [ 1 ].


https://link.springer.com/referencework/10.1007/978-3-319-12385-1

[2] Handbook of Uncertainty Quantification | SpringerLink The topic of Uncertainty Quantification (UQ) has witnessed massive developments in response to the promise of achieving risk mitigation through scientific prediction. It has led to the integration of ideas from mathematics, statistics and engineering being used to lend credence to predictive assessments of risk but also to design actions (by


https://onlinelibrary.wiley.com/doi/full/10.1155/2020/6068203

[3] Basic Framework and Main Methods of Uncertainty Quantification The research on the uncertainty in the deterministic engineering modeling of complex physical processes dates back to around 1980 . After nearly four decades of development, uncertainty quantification (UQ) has played an important role and has been successfully applied in many fields.


https://www.turing.ac.uk/research/interest-groups/uncertainty-quantification

[4] Uncertainty quantification | The Alan Turing Institute Uncertainty quantification ... Calibration is an important step in ensuring useful and reliable predictions can be made by both simulators and surrogate models. ... applied mathematics, computer science, research software engineering and domain-area expertise. Through discussion and collaboration, the group will facilitate the development and


https://onlinelibrary.wiley.com/doi/full/10.1155/2020/6068203

[5] Basic Framework and Main Methods of Uncertainty Quantification Since 2000, the research of uncertainty quantification (UQ) has been successfully applied in many fields and has been highly valued and strongly supported by academia and industry. This review firstly discusses the sources and the types of uncertainties and gives an overall discussion on the goal, practical significance, and basic framework of


https://www.rosemet.com/quantifying-risk/

[6] Quantifying Risk: 7 Steps For Precise Project Management - ROSEMET LLC Using advanced statistical methods and robust techniques in data analysis makes risk quantification a crucial tool for managing uncertainties effectively, particularly in complex, high-impact projects where understanding the full scope of potential risks is essential for successful project outcomes. Quantifying risks in a project management context involves a structured, systematic approach that turns potential threats and uncertainties into measurable, manageable data. By considering a range of values for each risk factor, Monte Carlo simulations provide a comprehensive view of potential scenarios, helping project managers understand the full spectrum of impacts. While quantifying risks is a highly effective approach to managing uncertainties in project management, several alternative methods can also provide valuable insights and support decision-making.


https://pubmed.ncbi.nlm.nih.gov/38026007/

[9] The effects of communicating uncertainty around statistics, on public trust We also show that this minimal impact of numeric uncertainty on trustworthiness is also present when communicating future, projected COVID-19 statistics (Study 2; N = 2,309). Conversely, we find statements about the mere existence of uncertainty, without quantification, can reduce both perceived trustworthiness of the numbers and of their source.


https://www.sciencedirect.com/science/article/pii/S2589004222017849

[11] Insights into the quantification and reporting of model-related ... With quantitative science now highly influential in the public sphere 3 and the results from models translating into action, we must support our conclusions with sufficient rigor. Incomplete consideration of model uncertainties can lead to false conclusions with real-world impacts and an erosion of public trust in science. 16, 18, 22 In 2019, Seibold et al. 23 reported substantial declines in


https://www.sciencedirect.com/science/article/pii/S0304380008003554

[12] Future research challenges for incorporation of uncertainty in ... Some of the important issues that need to be addressed in relation to the incorporation of uncertainty in environmental decision-making processes include: (1) the development of methods for quantifying the uncertainty associated with human input; (2) the development of appropriate risk-based performance criteria that are understood and accepted by a range of disciplines; (3) improvement of fuzzy environmental decision-making through the development of hybrid approaches (e.g., fuzzy-rule-based models combined with probabilistic data-driven techniques); (4) development of methods for explicitly conveying uncertainties in environmental decision-making through the use of Bayesian probability theory; (5) incorporating adaptive management practices into the environmental decision-making process, including model divergence correction; (6) the development of approaches and strategies for increasing the computational efficiency of integrated models, optimization methods, and methods for estimating risk-based performance measures; and (7) the development of integrated frameworks for comprehensively addressing uncertainty as part of the environmental decision-making process.


https://link.springer.com/referencework/10.1007/978-3-319-12385-1

[13] Handbook of Uncertainty Quantification | SpringerLink The topic of Uncertainty Quantification (UQ) has witnessed massive developments in response to the promise of achieving risk mitigation through scientific prediction. It has led to the integration of ideas from mathematics, statistics and engineering being used to lend credence to predictive assessments of risk but also to design actions (by

https://www.afit.edu/STAT/statcoe_files/4_0328_LazarusUQBP_2_2.pdf

[14] PDF Uncertainty Quantification (UQ) is the science of the characterization and reduction of uncertainties (Saouma & Hariri-Ardebili, 2021). UQ is not a standalone field of study, but it is ... The key to UQ is to accompany an out-of-distribution prediction with lower confidence or uncertainty. Bayesian neural networks (BNNs) and evidential deep

https://www.sciencedirect.com/science/article/pii/S1026309811000551

[16] An evolution of uncertainty assessment and quantification Brief history of uncertainty assessment and quantification. From a historical point of view, the issue of uncertainty has not always been embraced within the scientific community. In the traditional view of science, uncertainty represents an undesirable state, a state that must be avoided at all costs.

https://cset.georgetown.edu/wp-content/uploads/CSET-Key-Concepts-in-AI-Safety-Reliable-Uncertainty-Quantification-in-Machine-Learning.pdf

[18] PDF Unfortunately, designing machine learning systems that can recognize their limits is more challenging than it may appear at first glance. In fact, enabling machine learning systems to "know what they don't know"—known in technical circles as "uncertainty quantification"—is an open and widely studied research problem within machine

https://www.sciencedirect.com/science/article/pii/S1566253521001081

[46] A review of uncertainty quantification in deep learning: Techniques ... Uncertainty quantification (UQ) methods play a pivotal role in reducing the impact of uncertainties during both optimization and decision making processes. They have been applied to solve a variety of real-world problems in science and engineering. Bayesian approximation and ensemble learning techniques are two widely-used types of uncertainty quantification (UQ) methods. In this regard

https://link.springer.com/referencework/10.1007/978-3-319-12385-1

[47] Handbook of Uncertainty Quantification | SpringerLink The topic of Uncertainty Quantification (UQ) has witnessed massive developments in response to the promise of achieving risk mitigation through scientific prediction. It has led to the integration of ideas from mathematics, statistics and engineering being used to lend credence to predictive assessments of risk but also to design actions (by engineers, scientists and investors) that are

https://onlinelibrary.wiley.com/doi/full/10.1155/2020/6068203

[48] Basic Framework and Main Methods of Uncertainty Quantification Since 2000, the research of uncertainty quantification (UQ) has been successfully applied in many fields and has been highly valued and strongly supported by academia and industry. This review firstly discusses the sources and the types of uncertainties and gives an overall discussion on the goal, practical significance, and basic framework of the research of UQ. Then, the core ideas and

https://link.springer.com/chapter/10.1007/978-94-015-7873-8_3

[49] The Quantification of Uncertainty after 1700: Statistics Socially ... The quantification of uncertainty has since the time of Pascal and Fermat been tied to a program of social rationalization and enlightenment. Probability and statistics, though their historical roots are distinct, have always been united in this: that they provide a way of understanding and hence controlling the uncertainties of change.

https://www.sciencedirect.com/science/article/pii/S1026309811000551

[50] An evolution of uncertainty assessment and quantification Brief history of uncertainty assessment and quantification. From a historical point of view, the issue of uncertainty has not always been embraced within the scientific community. In the traditional view of science, uncertainty represents an undesirable state, a state that must be avoided at all costs.

https://ieeexplore.ieee.org/document/10696308

[52] A Review on Bayesian Methods for Uncertainty Quantification in Machine ... This research study reviews the statistical fundamentals of machine learning with a focus on Bayesian methods to quantify the uncertainty in model predictions. Bayesian statistics provides a framework for incorporating prior knowledge, updating beliefs, and expressing uncertainty in predictions. This research study will explore Bayesian techniques applied to various aspects of machine learning
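The Bayesian workflow summarized in [52] (incorporate a prior, update on data, express posterior uncertainty) can be illustrated with the simplest conjugate case, a Beta prior updated by Bernoulli observations. The prior parameters and the observation sequence below are made up for illustration, not taken from the cited review.

```python
def beta_bernoulli_update(alpha, beta, observations):
    """Conjugate Bayesian update: Beta(alpha, beta) prior + Bernoulli data."""
    successes = sum(observations)
    failures = len(observations) - successes
    return alpha + successes, beta + failures

def beta_mean_and_var(alpha, beta):
    """Mean and variance of a Beta(alpha, beta) distribution."""
    mean = alpha / (alpha + beta)
    var = alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1))
    return mean, var

# Uniform prior Beta(1, 1); observe 7 successes in 10 trials (illustrative data).
a, b = beta_bernoulli_update(1.0, 1.0, [1, 1, 1, 0, 1, 1, 0, 1, 0, 1])
mean, var = beta_mean_and_var(a, b)
# Posterior mean moves toward the observed rate; posterior variance shrinks
# below the prior variance (1/12), i.e. the data reduced the uncertainty.
```

The same update-shrink pattern is what the more elaborate Bayesian machinery (MCMC, Bayesian neural networks) generalizes to non-conjugate models.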

https://www.ijnrd.org/papers/IJNRD2403577.pdf

[54] PDF Additionally, Bayesian deep learning merges probabilistic modeling with deep neural networks, allowing for uncertainty quantification in complex, high-dimensional datasets. Practical applications across various domains highlight the importance of Bayesian inference methods in enhancing the reliability and robustness of data-driven analyses.

https://link.springer.com/chapter/10.1007/978-981-13-9806-3_3

[56] Uncertainty Quantification and Robust Optimization in Engineering Depending on the properties of the input uncertainties, UQ methods can be divided into probabilistic (for aleatory) and non-probabilistic (for epistemic) methods. The uncertainty quantification methods described in this article can be employed if the probability distribution functions of the described uncertainties is known or defined.

https://www.sciencedirect.com/science/article/pii/S0141029618322338

[58] Efficient uncertainty quantification method applied to structural fire ... The case studies in Case A (application on a steel-concrete composite column) and Case B (application to a floor slab panel in tensile membrane action) show the potential effectiveness of the MaxEnt method for uncertainty quantification in conjunction with advanced SFE modelling tools (in this case SAFIR).

https://link.springer.com/chapter/10.1007/978-94-015-7873-8_3

[63] The Quantification of Uncertainty after 1700: Statistics Socially ... The quantification of uncertainty has since the time of Pascal and Fermat been tied to a program of social rationalization and enlightenment. Probability and statistics, though their historical roots are distinct, have always been united in this: that they provide a way of understanding and hence controlling the uncertainties of change.

https://onlinelibrary.wiley.com/doi/full/10.1155/2020/6068203

[89] Basic Framework and Main Methods of Uncertainty Quantification Since 2000, the research of uncertainty quantification (UQ) has been successfully applied in many fields and has been highly valued and strongly supported by academia and industry. This review firstly discusses the sources and the types of uncertainties and gives an overall discussion on the goal, practical significance, and basic framework of

https://www.sciencedirect.com/science/article/pii/S1566253521001081

[92] A review of uncertainty quantification in deep learning: Techniques ... We discussed various applications of uncertainty quantification methods. We summarized major open challenges and research gaps in uncertainty quantification. Uncertainty quantification (UQ) methods play a pivotal role in reducing the impact of uncertainties during both optimization and decision making processes.

https://www.sciencedirect.com/science/article/pii/S0375650522002565

[98] Improvement of accuracy with uncertainty quantification in the ... Recently, these methods were applied in the engineering field because of their versatility and effectiveness. Vogt et al. (2014) estimated the permeability of enhanced geothermal systems using a typical data assimilation method called the ensemble Kalman filter (EnKF). Applications in the material science and hydrology fields have also been reported (Oka and Ohno, 2020; Vrugt et al., 2005).

https://link.springer.com/article/10.1007/s44379-024-00011-x

[99] A survey on machine learning approaches for uncertainty quantification ... Recently, machine learning (ML) techniques, including Gaussian process regression, artificial neural networks, physics-informed neural networks, and many others, have garnered significant attention in both theoretical research and practical applications. A plethora of ML models are currently available, including the Kriging model, Gaussian process regression (GPR), polynomial chaos expansion (PCE), support vector machine (SVM), artificial neural network (ANN), Bayesian neural network (BNN), physics-informed neural network (PINN), among others. In reliability analysis, we review two prominent categories of ML techniques: data-driven neural networks and PINNs. Section 3 then shifts to ML strategies for inverse UQ analysis, covering key areas such as probabilistic model updating (PMU) and design optimization under uncertainty.

https://www.sciencedirect.com/science/article/pii/S004578252400762X

[100] A Review of Recent Advances in Surrogate Models for Uncertainty Quantification of High-Dimensional Engineering Applications Challenges in surrogate modeling for high-dimensional spaces are comprehended. High-dimensional benchmark functions assessing the surrogate models are provided. Nonetheless, as the complexity of the problem increases and the number of input variables grows, the computational burden of constructing an efficient surrogate model also rises, leading to the so-called curse of dimensionality in uncertainty propagation from inputs to outputs. This paper reviews the developments of the past years in surrogate modeling for high-dimensional inputs, with the goal of quantifying output uncertainty.

https://www.sciencedirect.com/science/article/pii/S0888327023007045

[101] Uncertainty quantification in machine learning for engineering design ... Two case studies are developed to demonstrate the implementation of UQ methods and benchmark their performance in predicting battery life using early-life data (case study 1) and turbofan engine RUL using online-accessible measurements (case study 2).

https://www.afit.edu/STAT/statcoe_files/4_0328_LazarusUQBP_2_2.pdf

[124] PDF Uncertainty Quantification: An Overview. December 2022 Joseph Lazarus, Ctr Corinne Weeks, Ctr ... Uncertainty Quantification (UQ) is the science of the characterization and reduction of ... 2021). UQ is not a standalone field of study, but it is incorporated within related fields such as, but not limited to, mathematics, statistics, and

https://arxiv.org/abs/2407.19373

[127] Uncertainty Quantification of Data Shapley via Statistical Inference This paper establishes the relationship between Data Shapley and infinite-order U-statistics and addresses this limitation by quantifying the uncertainty of Data Shapley with changes in data distribution from the perspective of U-statistics. We make statistical inferences on data valuation to obtain confidence intervals for the estimations.

https://www.frontiersin.org/journals/ecology-and-evolution/articles/10.3389/fevo.2020.00035/full

[128] How Should We Quantify Uncertainty in Statistical Inference? 1. Uncertainty quantification should be invariant to both data transformation and parameterization of the model. 2. Uncertainty quantification should reflect the informativeness of the observed data for the underlying process. 3. Uncertainty quantification should be amenable to be probed empirically for possible violations.

https://royalsocietypublishing.org/doi/10.1098/rsta.2020.0071

[129] The importance of uncertainty quantification in model reproducibility ... We argue that uncertainty quantification is crucial for computer model validation and reproducibility. We present a statistical framework, termed history matching, for performing global parameter search by comparing model output to the observed data. ... To perform model-based inference and to learn about the relationships between parameters x

https://onlinelibrary.wiley.com/doi/full/10.1155/2020/6068203

[130] Basic Framework and Main Methods of Uncertainty Quantification Therefore, surrogate models are commonly used for likelihood evaluation in MCMC-based model uncertainty quantification to alleviate the high computational cost of simulations . Besides, when the exact probability is not critical and only the low-order moments such as the mean and the variance are important, various approximate Bayesian

https://pmc.ncbi.nlm.nih.gov/articles/PMC7274324/

[133] Evaluation of Uncertainty Quantification in Deep Learning The results, furthermore, support the previous observation by Hendrycks and Gimpel that deep learning models that only use the softmax activation function to quantify the uncertainty are overconfident when faced with out of the distribution samples. However, the autoencoder correctly uses the uncertainty quantification to separate all notMNIST samples from the MNIST samples, while the Bayesian neural network and the ensemble of neural networks can correctly separate classified MNIST samples from the other two cases. It is shown that the uncertainty quantification of some models (the Bayesian neural network and the ensemble of neural networks) can be used to distinguish between samples that are easy to classify and those that are difficult.
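The ensemble behaviour described in [133] (prediction spread separating familiar from unfamiliar inputs) can be reproduced in miniature with a bootstrap ensemble of linear fits: member disagreement is small near the training data and grows far outside it. The synthetic data, the bootstrap scheme, and the query points below are illustrative assumptions, not the cited paper's setup.

```python
import random
import statistics

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

def bootstrap_ensemble(xs, ys, n_models=50, seed=2):
    """Fit one line per bootstrap resample of the training data."""
    rng = random.Random(seed)
    models = []
    for _ in range(n_models):
        idx = [rng.randrange(len(xs)) for _ in range(len(xs))]
        models.append(fit_line([xs[i] for i in idx], [ys[i] for i in idx]))
    return models

def predict_with_uncertainty(models, x):
    """Ensemble mean prediction and spread (disagreement) at x."""
    preds = [a * x + b for a, b in models]
    return statistics.fmean(preds), statistics.stdev(preds)

# Synthetic training data on x in [0, 1] (assumed: y = 2x + Gaussian noise).
data_rng = random.Random(0)
xs = [i / 50 for i in range(51)]
ys = [2 * x + data_rng.gauss(0, 0.1) for x in xs]

models = bootstrap_ensemble(xs, ys)
_, sigma_in = predict_with_uncertainty(models, 0.5)   # inside training range
_, sigma_out = predict_with_uncertainty(models, 5.0)  # far outside it
```

The spread `sigma_out` is much larger than `sigma_in`, which is exactly the signal deep ensembles exploit to flag out-of-distribution samples.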

https://www.sciencedirect.com/science/article/pii/S0306261923002532

[135] Uncertainty quantification-based robust deep learning for building ... Second, we adopt uncertainty quantification methods to improve the overall robustness of deep learning model. Comprehensive data experiments are conducted based on the reference modeling problem of chiller, and five widely-used uncertainty quantification methods are compared under distribution shift scenarios.

https://engineering.lehigh.edu/ise/uncertainty-quantification-and-complex-systems

[138] Uncertainty Quantification and Complex Systems Uncertainty is ubiquitous in modeling complex systems in various scientific and engineering problems that involve stochastic processes, random parameters, unknown physics, or noise. Uncertainty Quantification (UQ) is the science of quantitative characterization and reduction of uncertainties in real world problems. The behavior of such problems

https://link.springer.com/article/10.1007/s44379-024-00011-x

[139] A survey on machine learning approaches for uncertainty quantification ... Recently, machine learning (ML) techniques, including Gaussian process regression, artificial neural networks, physics-informed neural networks, and many others, have garnered significant attention in both theoretical research and practical applications. A plethora of ML models are currently available, including the Kriging model, Gaussian process regression (GPR), polynomial chaos expansion (PCE), support vector machine (SVM), artificial neural network (ANN), Bayesian neural network (BNN), physics-informed neural network (PINN), among others. In reliability analysis, we review two prominent categories of ML techniques: data-driven neural networks and PINNs. Section 3 then shifts to ML strategies for inverse UQ analysis, covering key areas such as probabilistic model updating (PMU) and design optimization under uncertainty.

https://ntrs.nasa.gov/api/citations/20180004832/downloads/20180004832.pdf

[141] PDF techniques that can be used and gives example applications. Quantification of uncertainties also allows prioritization of the uncertainty contributors to the overall ... Probabilistic uncertainty analysis is the analysis of uncertainties by characterizing the uncertainty using a probability distribution. The probability distribution

https://ntrs.nasa.gov/api/citations/20180004832/downloads/20180004832.pdf

[144] PDF Techniques are given for constructing an uncertainty distribution based on succinctly characterizing the uncertainty. Characterizing uncertainties with probability distributions allows standard probability techniques to be used in propagating uncertainties in a model. The use of uncertainty distributions allows parameter
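As a minimal sketch of the propagation idea in [141] and [144] (characterize input uncertainties with probability distributions, then push samples through the model and summarize the output distribution), the following uses a hypothetical load/stiffness model and assumed input distributions; none of the numbers come from the cited NASA report.

```python
import random
import statistics

def propagate(model, samplers, n=50_000, seed=1):
    """Forward-propagate input distributions through `model` by sampling."""
    rng = random.Random(seed)
    outputs = [model(*(s(rng) for s in samplers)) for _ in range(n)]
    outputs.sort()
    return {
        "mean": statistics.fmean(outputs),
        "std": statistics.stdev(outputs),
        "p05": outputs[int(0.05 * n)],   # empirical 5th percentile
        "p95": outputs[int(0.95 * n)],   # empirical 95th percentile
    }

# Hypothetical response model: deflection ~ load / stiffness.
model = lambda load, stiffness: load / stiffness

samplers = [
    lambda rng: rng.gauss(1000.0, 100.0),    # load: Normal(1000, 100), assumed
    lambda rng: rng.uniform(4000.0, 6000.0), # stiffness: Uniform(4000, 6000), assumed
]

summary = propagate(model, samplers, n=50_000)
```

The resulting mean, standard deviation, and percentile band are the "standard probability techniques" the excerpt refers to: once inputs are distributions, the output is one too.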

https://www.sciencedirect.com/science/article/pii/S1566253519301976

[145] A tutorial on uncertainty modeling for machine reasoning A probabilistic model is a mathematical model for explaining a phenomenon (and observations): it quantifies information and uncertainty in terms of probability distributions. It is a powerful framework but it only allows us to model the known unknowns (expressed via probability distributions).

https://www.pmi.org/learning/library/capturing-judgments-uncertainty-probability-distributions-4623

[146] Capturing judgments about risks and uncertainties Uncertainty is represented on stochastic project models by probability distributions. This article discusses the general nature of probability distributions as they are used in project management, and in what situations it is appropriate to apply which distributions. Discrete distributions are usually produced by simple systems, and usually represented by binomial or Poisson distributions

https://link.springer.com/article/10.1007/s44379-024-00011-x

[165] A survey on machine learning approaches for uncertainty quantification ... Recently, machine learning (ML) techniques, including Gaussian process regression, artificial neural networks, physics-informed neural networks, and many others, have garnered significant attention in both theoretical research and practical applications. A plethora of ML models are currently available, including the Kriging model, Gaussian process regression (GPR), polynomial chaos expansion (PCE), support vector machine (SVM), artificial neural network (ANN), Bayesian neural network (BNN), physics-informed neural network (PINN), among others. In reliability analysis, we review two prominent categories of ML techniques: data-driven neural networks and PINNs. Section 3 then shifts to ML strategies for inverse UQ analysis, covering key areas such as probabilistic model updating (PMU) and design optimization under uncertainty.

https://onlinelibrary.wiley.com/doi/full/10.1155/2020/6068203

[168] Basic Framework and Main Methods of Uncertainty Quantification The research on the uncertainty in the deterministic engineering modeling of complex physical processes dates back to around 1980 . After nearly four decades of development, uncertainty quantification (UQ) has played an important role and has been successfully applied in many fields.

https://www.sciencedirect.com/science/article/pii/S1566253521001081

[174] A review of uncertainty quantification in deep learning: Techniques ... They have been applied to solve a variety of real-world problems in science and engineering. Bayesian approximation and ensemble learning techniques are two widely-used types of uncertainty quantification (UQ) methods.

https://www.researchgate.net/profile/Alexander-Jima/publication/388743105_Uncertainty_Quantification_in_Multiphysics_Simulation_Using_Machine_Learning-Based_Surrogate_Models_for_Enhanced_Reliability_Assessment/links/67a480d1207c0c20fa7b537f/Uncertainty-Quantification-in-Multiphysics-Simulation-Using-Machine-Learning-Based-Surrogate-Models-for-Enhanced-Reliability-Assessment.pdf

[175] PDF Machine learning (ML) techniques have shown great promise in addressing these challenges. By constructing surrogate models—fast approximations of the original complex models—ML algorithms can

https://arxiv.org/html/2502.03090v2

[176] Gaussian Processes Regression for Uncertainty Quantification: An ... Gaussian Process Regression (GPR) is a powerful nonparametric regression method that is widely used in Uncertainty Quantification (UQ) for constructing surrogate models. ... Computer models and simulations play an essential role in many real-world applications, ranging from engineering design, scientific exploration, to public policy making.

https://arc.aiaa.org/doi/10.2514/6.2025-2131

[177] Advancing Uncertainty Quantification in Dynamical Systems: A ... This paper introduces the application of Gaussian Process Regression (GPR), a probabilistic machine learning technique, to quantify uncertainty in dynamic systems and address limitations inherent in the techniques found in the literature. Although Monte Carlo (MC) methods are widely used for uncertainty quantification and propagation, they require an extensive sampling regime to achieve

https://onlinelibrary.wiley.com/doi/full/10.1002/eqe.4265

[178] Earthquake Engineering & Structural Dynamics - Wiley Online Library This paper introduces a stochastic simulator for seismic uncertainty quantification, which is crucial for performance-based earthquake engineering. The proposed simulator extends the recently developed dimensionality reduction-based surrogate modeling method (DR-SM) to address high-dimensional ground motion uncertainties and the high

https://engineering.purdue.edu/~frosch/ftp/Talbott/11+-+References/files/Whitepaper.pdf

[180] PDF As risk assessment is an essential part in loss estimate and development of strategy for new design procedures, retrofits, and rehabilitations, a systematic treatment of uncertainty is essential. A critical review of currently available methodologies for uncertainty analysis and application to earthquake engineering has been conducted.

https://onlinelibrary.wiley.com/doi/10.1002/eqe.3711

[181] Uncertainty quantification in the calibration of numerical elements in ... Modeling uncertainty in structural models can greatly affect the reliability of nonlinear time history results, which are central to performance-based earthquake engineering. A crucial source of modeling uncertainty is the uncertainty in the parameters of constitutive models, which simulate the hysteretic behavior of key structural components.

https://www.sciencedirect.com/science/article/pii/S0141029621002467

[182] Stochastic uncertainty quantification of seismic performance of complex ... The structural demands present larger uncertain intervals with the increase in earthquake intensity, thus, the calculation results in the deterministic analyses can be further improved by the uncertainty quantification. Finally, the effects of uncertain variables on the structural demands are discussed.

https://www.mdpi.com/journal/aerospace/special_issues/Uncertainty_Quantification

[183] Uncertainty Quantification in Aerospace Engineering - MDPI Hence, Uncertainty Quantification (UQ) is a crucial aspect in the reliability or robust design of aerospace structures and systems. However, the lack of knowledge or experimental data, the coupling between different physical fields and the expensive computational cost of high-fidelity simulation make UQ and model validation extremely challenging.

https://onlinelibrary.wiley.com/doi/abs/10.1002/nme.2543

[187] Reliability sensitivity analysis with random and interval variables In reliability analysis and reliability-based design, sensitivity analysis identifies the relationship between the change in reliability and the change in the characteristics of uncertain variables.

https://hogonext.com/how-to-conduct-sensitivity-analysis-in-finite-element-analysis/

[188] How to Conduct Sensitivity Analysis in Finite Element Analysis Sensitivity analysis is a fundamental aspect of engineering analysis and design that focuses on quantifying the relationship between input parameters and output responses of a model. It helps engineers identify the most influential parameters and understand how variations in these parameters affect the overall behavior of the system. In the context of FEA, sensitivity analysis is particularly
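A minimal one-at-a-time sensitivity sweep, in the spirit of the parameter-influence analysis described in [187] and [188], can be sketched as follows. The cantilever-deflection formula and the nominal values are illustrative assumptions, not drawn from the cited sources.

```python
def oat_sensitivity(model, nominal, deltas):
    """One-at-a-time finite-difference sensitivity of `model` to each parameter."""
    base = model(**nominal)
    sens = {}
    for name, delta in deltas.items():
        perturbed = dict(nominal)
        perturbed[name] += delta
        # Approximate partial derivative: (f(x + delta) - f(x)) / delta.
        sens[name] = (model(**perturbed) - base) / delta
    return sens

# Hypothetical response: cantilever tip deflection d = P * L^3 / (3 * E * I).
def deflection(P, L, E, I):
    return P * L**3 / (3 * E * I)

nominal = {"P": 1000.0, "L": 2.0, "E": 2.0e11, "I": 1.0e-6}
# Perturb each parameter by 1% of its nominal value.
sens = oat_sensitivity(deflection, nominal, {k: 0.01 * v for k, v in nominal.items()})
```

The signs alone already rank influence qualitatively: deflection grows with load and span, and shrinks with stiffness (`E`) and section inertia (`I`), which is the kind of screening FEA sensitivity studies use before committing to a full variance-based analysis.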

https://onemoneyway.com/en/dictionary/monte-carlo-simulation/

[191] Monte Carlo simulation: how it works, applications, and benefits In engineering, Monte Carlo simulation is critical in reliability analysis and product design. Engineers use it to model the impact of different variables on systems' performance and durability, helping them design products that can withstand various operational conditions. ... The flexibility and robustness of Monte Carlo simulation make it

https://eracons.com/resources/monte-carlo-simulation

[192] How to use Monte Carlo simulation for reliability analysis? Monte Carlo simulation divides the number of samples with system failure by the total number of random samples generated to estimate the probability of failure in a ... (MCS) is a simple, straightforward and robust method. The principal steps in a MCS are: (1) A set of random variables $\mathbf{X}$ that follow joint probability density function
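The failure-probability estimator described in [192] (failed samples divided by total samples) amounts to a few lines of code. The limit-state function `g` and the input distributions below are hypothetical, chosen only to make the sketch runnable; failure is the conventional `g(x) <= 0`.

```python
import random

def mc_failure_probability(g, sample_inputs, n=100_000, seed=0):
    """Estimate P(failure) = P(g(x) <= 0) as (# failed samples) / n."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if g(sample_inputs(rng)) <= 0.0)
    return failures / n

# Illustrative limit state: capacity R minus demand S; failure when R <= S.
def g(x):
    r, s = x
    return r - s

def sample_inputs(rng):
    # Assumed distributions: R ~ Normal(10, 1), S ~ Normal(7, 1.5) (hypothetical).
    return rng.gauss(10.0, 1.0), rng.gauss(7.0, 1.5)

pf = mc_failure_probability(g, sample_inputs, n=200_000)
```

Because the estimate is a sample proportion, its standard error shrinks like 1/sqrt(n), which is why rare-failure problems need the variance-reduction or surrogate-based schemes discussed elsewhere in this list.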

https://link.springer.com/chapter/10.1007/978-3-319-70766-2_34

[205] Uncertainty Quantification Using Multiple Models—Prospects and Challenges 34.2 Challenges for Uncertainty Quantification in Climate Modeling. ... Background knowledge is important for considering whether to exclude or downweight models which violate basic physical principles (such as conservation of water or energy), or which lack representations of processes or feedbacks that are known to play an important role for

https://pmc.ncbi.nlm.nih.gov/articles/PMC7612598/

[206] Challenges in estimation, uncertainty quantification and elicitation ... In this paper, we identify and discuss four broad challenges in the estimation paradigm relating to infectious disease modelling, namely the Uncertainty Quantification framework, data challenges in estimation, model-based inference and prediction, and expert judgement. Efficient and timely estimation in parametric models of epidemiological processes for real-world systems is highly challenging, but fundamental to scientific understanding, forecasting and decision-making under uncertainty (Shea et al., 2020a). Key sources of uncertainty include inherent variation in natural systems and our lack of knowledge about these systems, typically broken down into: observation error or bias (where the process of data collection is imperfect); stochastic uncertainty (where inherent randomness in the transmission process impacts outcomes of interest); parameter uncertainty (where data are insufficient to fully identify model inputs); structural uncertainty (where the choice of model structure is unknown); and model discrepancy (reflecting differences between the reality and the mathematical approximation to it that the model provides).

https://pmc.ncbi.nlm.nih.gov/articles/PMC9712693/

[212] Insights into the quantification and reporting of model-related ... We introduce the "sources of uncertainty" framework, using it to conduct a systematic audit of model-related uncertainty quantification from seven scientific fields, spanning the biological, physical, and political sciences.

https://www.reliability-studies.vanderbilt.edu/research/uncertainty_quantification.php

[213] Uncertainty Quantification | Research | Risk, Reliability, and ... Quantification of the contribution of each uncertainty source to the uncertainty in the quantity of interest; modularized global sensitivity analysis and efficient Gaussian mixture copula (GMC) approximation of the joint distribution for fast computation of sensitivity indices .

https://www.bennettinstitute.cam.ac.uk/wp-content/uploads/2020/12/How_to_communicate_effectively_to_policy_makers.pdf

[214] PDF Published August 2020, Bennett Institute for Public Policy, Cambridge (www.bennettinstitute.cam.ac.uk). An extensive body of literature emphasizes that the inaccessibility of ‘science’ can be an important barrier to knowledge exchange and impact between scientists and decision makers. In addition, the Science Advice for Policy by European Academies has recently undertaken an evidence review in this area and concluded that scientific uncertainty can be communicated effectively by characterising, assessing and conveying the limits of scientific statements clearly.

https://www.researchgate.net/publication/225242748_Communicating_uncertainty_to_policy_makers

[215] Communicating uncertainty to policy makers - ResearchGate Dialogue between scientists and policymakers can communicate the many dimensions to uncertainty (Patt, 2009) and, furthermore, could inform the application of the science so the events studied are

https://www.nature.com/articles/s41599-017-0046-8

[217] How to communicate effectively with policymakers: combine ... - Nature Policy actors may deal collectively with bounded rationality by telling simple stories to help ‘process information, communicate, and reason’ (McBeth et al., 2014) and an ‘evidence-gathering’ process may serve to reinforce collective identity or what people already believe (Lewis, 2013, p 13–15; Stone, 1989). Such general advice is already common in policy studies as part of a package of possible measures: ‘learn and follow the “rules of the game” [of policy networks] to improve strategies and help build up trust; form coalitions with actors with similar aims and beliefs; and frame the evidence to appeal to the biases, beliefs, and priorities of policy makers’ (Cairney et al., 2016; see also Weible et al., 2012; Stoker 2010, pp 55–57).

https://onlinelibrary.wiley.com/doi/full/10.1155/2020/6068203

[219] Basic Framework and Main Methods of Uncertainty Quantification Therefore, surrogate models are commonly used for likelihood evaluation in MCMC-based model uncertainty quantification to alleviate the high computational cost of simulations . Besides, when the exact probability is not critical and only the low-order moments such as the mean and the variance are important, various approximate Bayesian


https://www.sciencedirect.com/science/article/pii/S088832702200437X

[233] Hierarchical Bayesian uncertainty quantification of Finite Element ... This paper develops a Hierarchical Bayesian Modeling (HBM) framework for uncertainty quantification of Finite Element (FE) models based on modal information. This framework uses an existing Fast Fourier Transform (FFT) approach to identify experimental modal parameters from time-history data and employs a class of maximum-entropy probability


https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6517086/

[240] Accurate quantification of uncertainty in epidemic parameter estimates ... Stochastic transmission dynamic models are needed to quantify the uncertainty in estimates and predictions during outbreaks of infectious diseases. We previously developed a calibration method for stochastic epidemic compartmental models, called Multiple Shooting for Stochastic Systems (MSS), and demonstrated its competitive performance against


https://pubs.aip.org/aip/sci/article/2019/43/431102/364461/Enabling-optimization-under-uncertainty-in

[243] Enabling optimization under uncertainty in aerospace design With an ever-present demand for lighter, faster, more efficient spacecraft, the aerospace industry is in need of optimal design methods. Unfortunately, traditional methodologies are deterministic and don't account for uncertainties in the final product.


https://www.sciencedirect.com/science/article/pii/S0266892023000966

[247] Recent advances in uncertainty quantification in structural response ... This article briefly reviews state-of-the-art studies of uncertainty quantification in structural dynamics, with particular emphasis on structural response characterization, model calibration, and system identification. According to the implementation structure, forward and inverse uncertainty quantification methods and frameworks are adopted to manage the uncertainties in these tasks, including the arbitrary polynomial chaos expansion method for uncertainty quantification and global sensitivity analysis in structural dynamics.


https://www.sciencedirect.com/science/article/pii/S004578252400762X

[249] A Review of Recent Advances in Surrogate Models for Uncertainty Quantification of High-Dimensional Engineering Applications - ScienceDirect. Challenges in surrogate modeling for high-dimensional spaces are comprehended, and high-dimensional benchmark functions for assessing the surrogate models are provided. As the complexity of the problem increases and the number of input variables grows, the computational burden of constructing an efficient surrogate model also rises, leading to the so-called curse of dimensionality in uncertainty propagation from inputs to outputs. This paper reviews the developments of the past years in surrogate modeling for high-dimensional inputs, with the goal of quantifying output uncertainty.


https://www.sciencedirect.com/science/article/pii/S1566253521001081

[253] A review of uncertainty quantification in deep learning: Techniques ... Uncertainty quantification (UQ) methods play a pivotal role in reducing the impact of uncertainties during both optimization and decision making processes. They have been applied to solve a variety of real-world problems in science and engineering. Bayesian approximation and ensemble learning techniques are two widely-used types of uncertainty quantification (UQ) methods.
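The ensemble-learning approach to UQ mentioned in [253] can be illustrated with a minimal sketch. The toy data, the bootstrap-plus-polynomial ensemble, and the probe points are all invented for illustration; deep ensembles in the survey follow the same pattern with neural networks: train several models, and read the spread of their predictions as (epistemic) uncertainty.

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy training data from a hypothetical target y = sin(2*pi*x).
x = rng.uniform(0.0, 1.0, 40)
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1, x.size)

# Ensemble learning: fit several polynomial regressors on bootstrap
# resamples of the data; each member is one "model" of the ensemble.
models = []
for _ in range(25):
    idx = rng.integers(0, x.size, x.size)  # bootstrap resample
    models.append(np.poly1d(np.polyfit(x[idx], y[idx], deg=5)))

# Predictive mean = ensemble average; uncertainty = ensemble spread.
x_test = np.array([0.5, 1.5])  # 1.5 lies far outside the training range
preds = np.array([m(x_test) for m in models])
mean, std = preds.mean(axis=0), preds.std(axis=0)

# The members agree inside the data and disagree when extrapolating,
# so the spread is much larger at x = 1.5 than at x = 0.5.
print(std[0], std[1])
```

The usage pattern mirrors what the survey describes for deep ensembles: the same disagreement-between-members signal flags inputs where the model should not be trusted.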


https://www.sciencedirect.com/science/article/pii/S0957417424023935

[255] Uncertainty quantification driven machine learning for improving model ... In this article, we propose using epistemic uncertainty quantification (UQ) of machine learning models to identify rare samples in imbalanced regression problems for balancing the dataset, and develop a UQ-driven imbalanced regression algorithm (UQDIR) for improving the prediction accuracy of machine learning models in regression tasks when working with imbalanced datasets. Transfer learning, the reuse of a pretrained model on a new problem, is another model-based approach for working with imbalanced classification datasets (Al-Stouhi and Reddy, 2016; Singh et al., 2021; Taherkhani et al., 2020).


https://link.springer.com/article/10.1007/s44379-024-00011-x

[256] A survey on machine learning approaches for uncertainty quantification ... Recently, machine learning (ML) techniques, including Gaussian process regression, artificial neural networks, physics-informed neural networks, and many others, have garnered significant attention in both theoretical research and practical applications. A plethora of ML models are currently available, including the Kriging model, Gaussian process regression (GPR), polynomial chaos expansion (PCE), support vector machine (SVM), artificial neural network (ANN), Bayesian neural network (BNN), and physics-informed neural network (PINN), among others. In reliability analysis, we review two prominent categories of ML techniques: data-driven neural networks and PINNs. Section 3 then shifts to ML strategies for inverse UQ analysis, covering key areas such as probabilistic model updating (PMU) and design optimization under uncertainty.


https://www.sciencedirect.com/science/article/abs/pii/S0021999122009652

[269] Uncertainty quantification in scientific machine learning: Methods ... Neural networks (NNs) are currently changing the computational paradigm on how to combine data with mathematical laws in physics and engineering in a profound way, tackling challenging inverse and ill-posed problems not solvable with traditional methods. However, quantifying errors and uncertainties in NN-based inference is more complicated than in traditional methods.



https://arxiv.org/abs/2501.03282

[271] From Aleatoric to Epistemic: Exploring Uncertainty Quantification Techniques in Artificial Intelligence (arXiv:2501.03282) - Uncertainty quantification (UQ) is a critical aspect of artificial intelligence (AI) systems, particularly in high-risk domains such as healthcare, autonomous systems, and financial technology, where decision-making processes must account for uncertainty. This review explores the evolution of uncertainty quantification techniques in AI, distinguishing between aleatoric and epistemic uncertainties, and discusses the mathematical foundations and methods used to quantify these uncertainties.
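The aleatoric/epistemic distinction reviewed in [271] can be made concrete with a small numeric sketch. The heteroscedastic toy data, the linear mean model, the bootstrap ensemble, and the two probe points are all invented for illustration: aleatoric uncertainty is each member's estimate of the data noise, epistemic uncertainty is the disagreement between members.

```python
import numpy as np

rng = np.random.default_rng(2)

# Heteroscedastic data: the noise level grows with x (aleatoric uncertainty).
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = x + rng.normal(0.0, 0.05 + 0.3 * x)

bins = np.array([0.25, 0.75])  # two probe locations
member_means, member_vars = [], []
for _ in range(30):
    idx = rng.integers(0, x.size, x.size)          # bootstrap resample
    coef = np.polyfit(x[idx], y[idx], deg=1)       # linear mean model
    mu = np.polyval(coef, bins)
    resid = y[idx] - np.polyval(coef, x[idx])
    # Crude local noise estimate: residual variance near each probe point.
    var = np.array([resid[np.abs(x[idx] - b) < 0.25].var() for b in bins])
    member_means.append(mu)
    member_vars.append(var)

member_means = np.array(member_means)
member_vars = np.array(member_vars)

aleatoric = member_vars.mean(axis=0)   # average predicted noise variance
epistemic = member_means.var(axis=0)   # disagreement between members

# Aleatoric uncertainty is larger at x = 0.75, where the data are noisier;
# epistemic uncertainty would shrink with more data, aleatoric would not.
print(aleatoric, epistemic)
```

The decomposition (total predictive variance = mean of member variances + variance of member means) is the standard ensemble form of the split the review discusses.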


https://arxiv.org/html/2503.17385v1

[272] Uncertainty Quantification for Data-Driven Machine Learning Models in ... The American Society of Mechanical Engineers (ASME) VVUQ 70 subcommittee on "Verification, Validation, and Uncertainty Quantification (VVUQ) of Machine Learning" aims to coordinate, promote, and foster the development of standards that provide procedures for assessing and quantifying the credibility of ML algorithms applied to mechanistic


https://onlinelibrary.wiley.com/doi/full/10.1155/2020/6068203

[282] Basic Framework and Main Methods of Uncertainty Quantification Second, by controlling the uncertainty of the important input factors, the designer can reduce the uncertainty of the model output with the minimum economic and time cost, so as to improve the robustness of the model prediction or reduce the failure probability of the structure system to the greatest extent and directly achieve the optimal