Dang, C., Valdebenito, M. A., Faes, M. G. R., Wei, P. F., & Beer, M. (2022). Structural reliability analysis: A Bayesian perspective. Struct. Saf., 99, 102259.
Abstract: Numerical methods play a dominant role in structural reliability analysis, and the goal has long been to produce a failure probability estimate with a desired level of accuracy using a minimum number of performance function evaluations. In the present study, we attempt to offer a Bayesian perspective on the estimation of the failure probability integral, as opposed to the classical frequentist perspective. For this purpose, a principled Bayesian Failure Probability Inference (BFPI) framework is first developed, which allows one to quantify, propagate and reduce the numerical uncertainty in the failure probability caused by discretization error. In particular, the posterior variance of the failure probability is derived in a semi-analytical form, and the Gaussianity of the posterior failure probability distribution is investigated numerically. Then, a Parallel Adaptive-Bayesian Failure Probability Learning (PA-BFPL) method is proposed within this Bayesian framework. In the PA-BFPL method, a variance-amplified importance sampling technique is presented to evaluate the posterior mean and variance of the failure probability, and an adaptive parallel active learning strategy is proposed to identify multiple updating points at each iteration. A novel advantage of PA-BFPL is therefore that both prior knowledge and parallel computing can be used to make inferences about the failure probability. Four numerical examples are investigated, indicating the potential benefits of adopting a Bayesian approach to failure probability estimation.
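A minimal sketch of the underlying Bayesian idea, under simplified assumptions: a Gaussian process surrogate is placed on a toy linear performance function, and the posterior mean of the failure probability, E[Pf | data] = ∫ Φ(−m(x)/s(x)) p(x) dx, is estimated by crude Monte Carlo. This is not the authors' BFPI/PA-BFPL implementation; the performance function, design size and sample counts are purely illustrative.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Toy performance function in 2D standard normal space (failure when g <= 0);
# exact Pf = Phi(-3/sqrt(2)) ~ 1.7e-2.
def g(x):
    return 3.0 - x[:, 0] - x[:, 1]

# Small design of experiments and a GP surrogate for g.
X_train = 2.0 * rng.standard_normal((20, 2))
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
gp.fit(X_train, g(X_train))

# Posterior mean of the failure probability under the GP prior:
# E[Pf | data] = integral of Phi(-m(x)/s(x)) over p(x), here by crude Monte Carlo.
X_mc = rng.standard_normal((100_000, 2))
m, s = gp.predict(X_mc, return_std=True)
pf_mean = norm.cdf(-m / np.maximum(s, 1e-12)).mean()
print(f"posterior-mean failure probability ~ {pf_mean:.3e}")
```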
|
Dang, C., Wei, P. F., Faes, M. G. R., Valdebenito, M. A., & Beer, M. (2022). Interval uncertainty propagation by a parallel Bayesian global optimization method. Appl. Math. Model., 108, 220–235.
Abstract: This paper is concerned with approximating the scalar response of a complex computational model subjected to multiple interval input variables. Such a task is formulated as finding both the global minimum and maximum of a computationally expensive black-box function over a prescribed hyper-rectangle. On this basis, a novel non-intrusive method, called 'triple-engine parallel Bayesian global optimization', is proposed. The method begins by assuming a Gaussian process prior (which can also be interpreted as a surrogate model) over the response function. The main contribution lies in developing a novel infill sampling criterion, the triple-engine pseudo expected improvement strategy, to identify multiple promising points for minimization and/or maximization from the past observations at each iteration. By doing so, the identified points can be evaluated on the real response function in parallel. Another potential benefit is that both the lower and upper bounds of the model response are obtained with a single run of the developed method. Four numerical examples of varying complexity are investigated to compare the proposed method against some existing techniques, and the results indicate that significant computational savings can be achieved by making full use of prior knowledge and parallel computing.
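The sketch below illustrates the general flavor of the approach with a plain two-point-per-iteration expected-improvement scheme: one GP-based infill point targets the minimum and one the maximum, so both can be evaluated in parallel. The paper's triple-engine pseudo expected improvement criterion is not reproduced; the response function, box and budgets are illustrative.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)

# Toy response over a hyper-rectangle of interval variables (illustrative only).
def f(x):
    return np.sin(3.0 * x[:, 0]) + 0.5 * np.cos(2.0 * x[:, 1])

lb, ub = np.array([-1.0, -1.0]), np.array([1.0, 1.0])
X = rng.uniform(lb, ub, size=(8, 2))
y = f(X)

def expected_improvement(mu, sd, best, minimize=True):
    # Standard EI; for maximization the sign of the improvement is flipped.
    imp = (best - mu) if minimize else (mu - best)
    z = imp / np.maximum(sd, 1e-12)
    return imp * norm.cdf(z) + sd * norm.pdf(z)

for _ in range(15):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True).fit(X, y)
    cand = rng.uniform(lb, ub, size=(2_000, 2))
    mu, sd = gp.predict(cand, return_std=True)
    # One infill point for the minimum and one for the maximum per iteration,
    # so both can be run on the true model in parallel.
    x_min = cand[np.argmax(expected_improvement(mu, sd, y.min(), minimize=True))]
    x_max = cand[np.argmax(expected_improvement(mu, sd, y.max(), minimize=False))]
    X_new = np.vstack([x_min, x_max])
    X, y = np.vstack([X, X_new]), np.concatenate([y, f(X_new)])

print(f"estimated response interval: [{y.min():.3f}, {y.max():.3f}]")
```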
|
Dang, C., Wei, P. F., Faes, M. G. R., Valdebenito, M. A., & Beer, M. (2022). Parallel adaptive Bayesian quadrature for rare event estimation. Reliab. Eng. Syst. Saf., 225, 108621.
Abstract: Various numerical methods have been extensively studied and used for reliability analysis over the past several decades. However, how to understand the effect of numerical uncertainty (i.e., the numerical error due to the discretization of the performance function) on the failure probability remains a challenging issue. The active learning probabilistic integration (ALPI) method offers a principled approach to quantify, propagate and reduce this numerical uncertainty via computation within a Bayesian framework, an aspect that has not been fully investigated in the context of probabilistic reliability analysis. In this study, a novel method termed 'Parallel Adaptive Bayesian Quadrature' (PABQ) is proposed on the theoretical basis of ALPI, aiming to broaden its scope of application. First, the Monte Carlo method used in ALPI is replaced with an importance ball sampling technique so as to reduce the sample size needed for rare failure event estimation. Second, a multi-point selection criterion is proposed to enable parallel distributed processing. Four numerical examples are studied to demonstrate the effectiveness and efficiency of the proposed method. It is shown that PABQ can effectively assess small failure probabilities (e.g., as low as 10⁻⁷) with a minimum number of iterations by taking advantage of parallel computing.
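A minimal sketch of the two ingredients highlighted in the abstract, a GP-based posterior failure probability combined with an importance sampling density concentrated near the failure region, assuming a toy linear performance function; the inflated Gaussian below is only a crude stand-in for the importance ball sampling technique of the paper.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)

# Toy rare-event problem in 2D standard normal space (failure when g <= 0);
# exact Pf = Phi(-5.5/sqrt(2)) ~ 5e-5.
def g(x):
    return 5.5 - x[:, 0] - x[:, 1]

X = 3.0 * rng.standard_normal((30, 2))   # spread-out design of experiments
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.5), normalize_y=True).fit(X, g(X))

# Importance density: an inflated Gaussian centred towards the failure region
# (a crude stand-in for importance ball sampling).
q_mean, q_cov = np.array([2.0, 2.0]), 4.0 * np.eye(2)
Xq = rng.multivariate_normal(q_mean, q_cov, size=200_000)
w = multivariate_normal(np.zeros(2), np.eye(2)).pdf(Xq) / multivariate_normal(q_mean, q_cov).pdf(Xq)

m, s = gp.predict(Xq, return_std=True)
pf = np.mean(w * norm.cdf(-m / np.maximum(s, 1e-12)))
print(f"importance-sampling estimate of the posterior-mean Pf ~ {pf:.2e}")
```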
|
Ding, C., Dang, C., Valdebenito, M. A., Faes, M. G. R., Broggi, M., & Beer, M. (2023). First-passage probability estimation of high-dimensional nonlinear stochastic dynamic systems by a fractional moments-based mixture distribution approach. Mech. Syst. Sig. Process., 185, 109775.
Abstract: First-passage probability estimation of high-dimensional nonlinear stochastic dynamic systems is a significant task in many fields of science and engineering, but it still remains an open challenge. The present paper develops a novel approach, termed 'fractional moments-based mixture distribution', to address this challenge. The approach captures the extreme value distribution (EVD) of the system response using the concepts of fractional moments and mixture distributions. In this context, the fractional moment itself is by definition a high-dimensional integral with a complicated integrand. To compute the fractional moments efficiently, a parallel adaptive sampling scheme that allows for sample size extension is developed using refined Latinized stratified sampling (RLSS). In this manner, both variance reduction and parallel computing are possible when evaluating the fractional moments. The EVD of interest is then reconstructed from the knowledge of a few low-order fractional moments. By introducing an extended inverse Gaussian distribution and a log extended skew-normal distribution, a flexible mixture distribution model is proposed, whose fractional moments are derived in analytic form. By fitting a set of fractional moments, the EVD can be recovered via the proposed mixture model, and the first-passage probabilities for different thresholds can then be obtained straightforwardly from the recovered EVD. The performance of the proposed method is verified by three examples, consisting of two test examples and one engineering problem.
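A minimal sketch of the fractional-moment matching idea under heavy simplification: plain Monte Carlo replaces the RLSS scheme, and a single lognormal model (whose fractional moments are analytic) stands in for the extended mixture distribution proposed in the paper. All numbers are illustrative.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# Toy extreme-value sample of a positive response (stand-in for the EVD of a
# stochastic dynamic system); here the true EVD is lognormal(mu=1, sigma=0.3).
Y = np.exp(1.0 + 0.3 * rng.standard_normal(20_000))

# Low-order fractional moments M_a = E[Y^a] estimated from the sample.
alphas = np.array([0.5, 1.0, 1.5, 2.0])
M = np.array([np.mean(Y ** a) for a in alphas])

# Fit a lognormal by fractional-moment matching:
# log M_a = a*mu + 0.5*a^2*sigma^2 is linear in (mu, sigma^2) -> least squares.
A = np.column_stack([alphas, 0.5 * alphas ** 2])
mu, sigma2 = np.linalg.lstsq(A, np.log(M), rcond=None)[0]

# First-passage probability for a threshold b, read off the fitted EVD.
b = 4.0
pf = 1.0 - norm.cdf((np.log(b) - mu) / np.sqrt(sigma2))
print(f"fitted mu={mu:.3f}, sigma={np.sqrt(sigma2):.3f}, P(Y > {b}) ~ {pf:.3e}")
```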
|
Faes, M. G. R., Valdebenito, M. A., Yuan, X. K., Wei, P. F., & Beer, M. (2021). Augmented reliability analysis for estimating imprecise first excursion probabilities in stochastic linear dynamics. Adv. Eng. Softw., 155, 102993.
Abstract: Imprecise probability allows quantifying the level of safety of a system while taking into account the effects of both aleatory and epistemic uncertainty. The practical estimation of an imprecise probability is usually quite demanding from a numerical viewpoint, as both types of uncertainty must be propagated separately, leading in practical cases to a nested implementation known as the double-loop approach. In view of this issue, this contribution presents an alternative approach that avoids the double loop by replacing the imprecise probability problem with an augmented, purely aleatory reliability analysis. Then, with the help of Bayes' theorem, it is possible to recover from the augmented reliability problem an expression for the failure probability as an explicit function of the imprecise parameters, which ultimately allows calculating the imprecise probability. The implementation of the proposed framework is investigated in the context of imprecise first excursion probability estimation for uncertain linear structures subject to imprecisely defined stochastic quantities and crisp stochastic loads. The associated augmented reliability problem is solved by means of Directional Importance Sampling, leading to improved accuracy at reduced numerical cost. The application of the proposed approach is investigated by means of two examples. The results obtained indicate that the proposed approach can be highly efficient and accurate.
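A toy illustration of the Bayes-theorem decoupling on a one-dimensional problem: a single augmented simulation (imprecise parameter sampled from an auxiliary prior together with the aleatory variable) yields the failure probability as an explicit function of the imprecise parameter. The kernel density estimate and the closed-form reference are illustrative choices, not part of the paper's Directional Importance Sampling implementation.

```python
import numpy as np
from scipy.stats import norm, gaussian_kde

rng = np.random.default_rng(4)

# Toy problem: response x ~ N(theta, 1), failure when x > 3, with an imprecise
# parameter theta in [0, 1]. Exact reference: Pf(theta) = 1 - Phi(3 - theta).
N = 2_000_000
theta = rng.uniform(0.0, 1.0, N)          # auxiliary prior over the imprecise parameter
x = theta + rng.standard_normal(N)        # single augmented (aleatory + auxiliary) run
fail = x > 3.0

pf_aug = fail.mean()                      # failure probability in the augmented space
post = gaussian_kde(theta[fail])          # p(theta | failure) from the failed samples

# Bayes' theorem: Pf(theta) = pf_aug * p(theta | F) / p(theta), with p(theta) = 1 on [0, 1].
for t in (0.2, 0.5, 0.8):
    print(f"theta={t}: Pf ~ {pf_aug * post(t)[0]:.2e}  (exact {1 - norm.cdf(3 - t):.2e})")
```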
|
Jerez, D. J., Jensen, H. A., Valdebenito, M. A., Misraji, M. A., Mayorga, F., & Beer, M. (2022). On the use of Directional Importance Sampling for reliability-based design and optimum design sensitivity of linear stochastic structures. Probabilistic Eng. Mech., 70, 103368.
Abstract: This contribution focuses on reliability-based design and optimum design sensitivity of linear dynamical structural systems subject to Gaussian excitation. Directional Importance Sampling (DIS) is implemented for reliability assessment, which allows first-order derivatives of the failure probabilities to be obtained as a byproduct of the sampling process. Thus, gradient-based solution schemes can be adopted by virtue of this feature. In particular, a class of feasible-direction interior point algorithms is implemented to obtain optimum designs, while a direction-finding approach is considered to obtain optimum design sensitivity measures as a post-processing step of the optimization results. To show the usefulness of the approach, an example involving a building structure is studied. Overall, the reliability sensitivity analysis framework enabled by DIS provides a potentially useful tool for addressing a practical class of design optimization problems.
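A minimal illustration of the "gradient as a byproduct" feature using a generic score-function importance sampling estimator on a scalar toy problem; this is not DIS itself, and the design variable, densities and limit state are all illustrative.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

# Design variable: the mean mu of a Gaussian response x ~ N(mu, 1); failure when x > 3.
# Exact reference: Pf(mu) = 1 - Phi(3 - mu) and dPf/dmu = phi(3 - mu).
mu, N = 0.5, 200_000

# Importance sampling density centred at the limit state.
xq = rng.normal(3.0, 1.0, N)
w = norm.pdf(xq, mu, 1.0) / norm.pdf(xq, 3.0, 1.0)
ind = (xq > 3.0).astype(float)

pf = np.mean(ind * w)                     # failure probability
dpf = np.mean(ind * w * (xq - mu))        # dPf/dmu from the SAME samples (score-function trick)
print(f"Pf ~ {pf:.3e} (exact {1 - norm.cdf(3 - mu):.3e})")
print(f"dPf/dmu ~ {dpf:.3e} (exact {norm.pdf(3 - mu):.3e})")
```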
|
Ni, P. H., Jerez, D. J., Fragkoulis, V. C., Faes, M. G. R., Valdebenito, M. A., & Beer, M. (2022). Operator Norm-Based Statistical Linearization to Bound the First Excursion Probability of Nonlinear Structures Subjected to Imprecise Stochastic Loading. ASCE-ASME J. Risk Uncertain. Eng. Syst. A-Civ. Eng., 8(1), 04021086.
Abstract: This paper presents a highly efficient approach for bounding the responses and probability of failure of nonlinear models subjected to imprecisely defined stochastic Gaussian loads. Typically, such computations involve solving a nested double-loop problem, where the propagation of the aleatory uncertainty has to be performed for each realization of the epistemic parameters. Apart from near-trivial cases, such computation is generally intractable without resorting to surrogate modeling schemes, especially in the context of performing nonlinear dynamical simulations. The recently introduced operator norm framework allows for breaking this double loop by determining a priori those values of the epistemically uncertain parameters that produce the bounds on the probability of failure. However, the method in its current form is applicable only to linear models, due to the assumptions adopted in the derivation of the involved operator norms. In this paper, the operator norm framework is extended and generalized by resorting to the statistical linearization methodology, so that bounds on the responses and failure probability of nonlinear models subjected to imprecise stochastic Gaussian loads can also be obtained.
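For context, the sketch below shows classical statistical linearization on its own, applied to a Duffing oscillator with unit mass under Gaussian white noise; it does not reproduce the operator norm framework or its extension, and all parameter values are illustrative.

```python
import numpy as np

# Duffing oscillator with unit mass under Gaussian white noise (two-sided PSD S0):
#   x'' + c x' + k (x + eps x^3) = w(t)
# Statistical linearization replaces k (x + eps x^3) by k_eq x with
#   k_eq = k (1 + 3 eps sigma_x^2), and for the linear system sigma_x^2 = pi S0 / (c k_eq);
# iterating these two relations to a fixed point gives the linearized response statistics.
c, k, eps, S0 = 0.2, 1.0, 0.5, 0.05

sigma2 = np.pi * S0 / (c * k)             # start from the linear (eps = 0) variance
for _ in range(100):
    k_eq = k * (1.0 + 3.0 * eps * sigma2)
    sigma2_new = np.pi * S0 / (c * k_eq)
    if abs(sigma2_new - sigma2) < 1e-12:
        break
    sigma2 = sigma2_new

print(f"equivalent stiffness k_eq ~ {k_eq:.4f}, stationary variance sigma_x^2 ~ {sigma2:.4f}")
```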
|
Song, J. W., Wei, P. F., Valdebenito, M. A., Faes, M., & Beer, M. (2021). Data-driven and active learning of variance-based sensitivity indices with Bayesian probabilistic integration. Mech. Syst. Sig. Process., 163, 108106.
Abstract: Variance-based sensitivity indices play an important role in scientific computation and data mining; thus, the significance of developing numerical methods for efficient and reliable estimation of these indices from (expensive) computer simulators and/or data cannot be overstated. In this article, the estimation of these sensitivity indices is treated as a statistical inference problem. Two principal lemmas are first proposed as rules of thumb for making the inference. After that, the posterior features of all the (partial) variance terms involved in the main and total effect indices are derived analytically (though not in closed form) based on Bayesian Probabilistic Integration (BPI). This yields a data-driven method for estimating the sensitivity indices as well as the associated discretization errors. Further, to improve the efficiency of the developed method for expensive simulators, an acquisition function named Posterior Variance Contribution (PVC) is utilized to realize optimal designs of experiments, based on which an adaptive BPI method is established. The application of this framework is illustrated for the calculation of the main and total effect indices, but the two proposed lemmas also apply to the calculation of interaction effect indices. The performance of the developed method is demonstrated by an illustrative numerical example and three engineering benchmarks with finite element models.
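As a point of reference for the quantities being inferred, the sketch below estimates main and total effect indices for the Ishigami function with standard Monte Carlo pick-freeze estimators; the Bayesian Probabilistic Integration and PVC machinery of the paper is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(6)

# Ishigami test function with the usual constants a = 7, b = 0.1;
# inputs are independent and uniform on [-pi, pi].
def f(X):
    return np.sin(X[:, 0]) + 7.0 * np.sin(X[:, 1]) ** 2 + 0.1 * X[:, 2] ** 4 * np.sin(X[:, 0])

d, N = 3, 200_000
A = rng.uniform(-np.pi, np.pi, (N, d))
B = rng.uniform(-np.pi, np.pi, (N, d))
fA, fB = f(A), f(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                               # "pick-freeze": swap column i only
    fABi = f(ABi)
    S_main = np.mean(fB * (fABi - fA)) / var          # first-order (main effect) index
    S_tot = 0.5 * np.mean((fA - fABi) ** 2) / var     # total effect index
    print(f"x{i + 1}: S_main ~ {S_main:.3f},  S_total ~ {S_tot:.3f}")
```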
|
Valdebenito, M. A., Wei, P. F., Song, J. W., Beer, M., & Broggi, M. (2021). Failure probability estimation of a class of series systems by multidomain Line Sampling. Reliab. Eng. Syst. Saf., 213, 107673.
Abstract: This contribution proposes an approach for assessing the failure probability associated with a particular class of series systems. The type of systems considered involves components whose response is linear with respect to a number of Gaussian random variables, and component failure occurs whenever this response exceeds prescribed deterministic thresholds. We propose multidomain Line Sampling as an extension of classical Line Sampling that handles a large number of components at once. By taking advantage of the linearity of the performance functions involved, multidomain Line Sampling explores the interactions that occur between the failure domains associated with individual components in order to produce an estimate of the failure probability. The performance and effectiveness of multidomain Line Sampling are illustrated by means of two test problems and an application example, indicating that the technique is well suited for treating problems comprising both a large number of random variables and a large number of components.
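A bare-bones Line Sampling sketch for a small series system with linear performance functions in standard normal space, using a single important direction and exploiting the fact that, along each sampled line, the union of the component failure sets is a half-line; the multidomain bookkeeping of the paper for many components is not reproduced, and the system below is illustrative.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)

# Series system with two linear components in 2D standard normal space:
#   g_k(x) = b_k - a_k . x, system failure when any g_k <= 0.
a = np.array([[1.0, 1.0], [2.0, 0.5]])
a = a / np.linalg.norm(a, axis=1, keepdims=True)   # unit row vectors, so beta_k = b_k
b = np.array([3.0, 3.5])

# Important direction: here simply that of the most critical component.
alpha = a[np.argmin(b)]

N = 5_000
pf_lines = np.empty(N)
for i in range(N):
    x = rng.standard_normal(2)
    x_perp = x - (x @ alpha) * alpha               # project onto the hyperplane normal to alpha
    # Distance along alpha to each component's limit state (all projections a_k . alpha > 0 here),
    # so the system failure set on this line is the half-line beyond the closest limit state.
    c = (b - a @ x_perp) / (a @ alpha)
    pf_lines[i] = norm.cdf(-np.min(c))
print(f"Line Sampling estimate of the system Pf ~ {pf_lines.mean():.3e}")
```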
|
Yuan, X. K., Faes, M. G. R., Liu, S. L., Valdebenito, M. A., & Beer, M. (2021). Efficient imprecise reliability analysis using the Augmented Space Integral. Reliab. Eng. Syst. Saf., 210, 107477.
Abstract: This paper presents an efficient approach to compute the bounds on the reliability of a structure subjected to uncertain parameters described by means of imprecise probabilities. These imprecise probabilities arise from epistemic uncertainty in the definition of the hyper-parameters of a set of random variables that describe aleatory uncertainty in some of the structure's properties. Typically, such a calculation involves the solution of a so-called double-loop problem, where a crisp reliability problem is repeatedly solved to determine which realization of the epistemic uncertainties yields the worst or best case with respect to structural safety. The approach in this paper decouples this double loop by virtue of the Augmented Space Integral. The core idea of the method is to infer a functional relationship between the epistemically uncertain hyper-parameters and the probability of failure; this functional relationship can then be used to determine the best- and worst-case behavior of the probability of failure. Three case studies are included to illustrate the effectiveness and efficiency of the developed methods.
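A toy version of the decoupling idea: a single sampling run in the augmented space (hyper-parameter drawn from an auxiliary distribution together with the aleatory variable) yields an empirical functional relationship between the hyper-parameter and the failure probability, from which the reliability bounds follow. The conditional-frequency binning below is only a crude stand-in for the Augmented Space Integral estimator of the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(8)

# Aleatory variable x ~ N(0, sigma) with an imprecise standard deviation sigma in [0.8, 1.2];
# failure when x > 2.5. Exact reference: Pf(sigma) = 1 - Phi(2.5 / sigma).
N = 2_000_000
sigma = rng.uniform(0.8, 1.2, N)               # auxiliary distribution over the hyper-parameter
x = sigma * rng.standard_normal(N)             # single augmented sampling run
fail = x > 2.5

# Empirical functional relationship Pf(sigma): conditional failure frequency per bin.
edges = np.linspace(0.8, 1.2, 21)
idx = np.digitize(sigma, edges) - 1
pf_sigma = np.array([fail[idx == k].mean() for k in range(20)])

print(f"Pf bounds over sigma in [0.8, 1.2]: [{pf_sigma.min():.2e}, {pf_sigma.max():.2e}]")
print(f"exact bounds:                       [{1 - norm.cdf(2.5 / 0.8):.2e}, {1 - norm.cdf(2.5 / 1.2):.2e}]")
```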
|
Yuan, X. K., Liu, S. L., Faes, M., Valdebenito, M. A., & Beer, M. (2021). An efficient importance sampling approach for reliability analysis of time-variant structures subject to time-dependent stochastic load. Mech. Syst. Sig. Process., 159, 107699.
Abstract: Structural performance is affected by deterioration processes and external loads. Both effects may change over time, posing a challenge for conducting reliability analysis. In this context, this contribution aims at assessing the reliability of structures for which some parameters are modeled as random variables, possibly including deterioration processes, and which are subjected to stochastic load processes. The approach is developed within the framework of importance sampling and is based on the concept of composite limit states, whereby the time-dependent reliability problem is transformed into a series system with multiple performance functions. Then, an efficient two-step importance sampling density function is proposed, which splits the time-invariant parameters (random variables) from the time-variant ones (stochastic processes). This importance sampling scheme is geared towards a particular class of problems in which the performance of the structural system depends linearly on the stochastic load at any fixed time, which allows the reliability of the series system to be calculated most efficiently. Practical examples illustrate the performance of the proposed approach.
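The composite limit-state idea is sketched below on a toy deteriorating structure under independent yearly load pulses: the time-dependent problem is discretized and treated as a series system over the time instants, here evaluated with plain Monte Carlo rather than the two-step importance sampling of the paper; the model and numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(9)

# Toy time-variant problem over a 10-year horizon, discretized yearly:
# deteriorating resistance R(t) = R0 * (1 - 0.01 t) and Gaussian load pulses S_t.
# Composite limit state: failure if min_t [R(t) - S_t] <= 0 (series system in time).
N, T = 500_000, 10
t = np.arange(T)

R0 = rng.normal(6.0, 0.5, N)                     # time-invariant random resistance
S = rng.normal(3.0, 1.0, (N, T))                 # stochastic load process (iid pulses here)
g = R0[:, None] * (1.0 - 0.01 * t) - S           # performance at every time instant

pf_cum = (g.min(axis=1) <= 0).mean()             # cumulative (time-variant) failure probability
pf_inst = (g[:, -1] <= 0).mean()                 # point-in-time Pf at year 10, for contrast
print(f"cumulative Pf over 10 years ~ {pf_cum:.3e}, point-in-time Pf at year 10 ~ {pf_inst:.3e}")
```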
|
Yuan, X. K., Liu, S. L., Valdebenito, M. A., Faes, M. G. R., Jerez, D. J., Jensen, H. A., et al. (2021). Decoupled reliability-based optimization using Markov chain Monte Carlo in augmented space. Adv. Eng. Softw., 157, 103020.
Abstract: An efficient framework is proposed for reliability-based design optimization (RBDO) of structural systems. The RBDO problem is expressed in terms of the minimization of the failure probability with respect to design variables that correspond to distribution parameters of random variables, e.g. the mean or standard deviation. Generally, this problem is quite demanding from a computational viewpoint, as repeated reliability analyses are involved. Hence, in this contribution, an efficient framework for solving a class of RBDO problems without even a single reliability analysis is proposed. It makes full use of an established functional relationship between the probability of failure and the distribution design parameters, termed the failure probability function (FPF). By introducing an instrumental variability associated with the distribution design parameters, the target FPF is found to be proportional to a posterior distribution of the design parameters conditional on the occurrence of failure in an augmented space. This posterior distribution is derived and expressed as an integral, which can be estimated through simulation. An advanced Markov chain algorithm is adopted to efficiently generate samples that follow this posterior distribution. In addition, an algorithm that re-uses information is combined with sequential approximate optimization to improve efficiency. Numerical examples illustrate the performance of the proposed framework.
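A small-scale sketch of the augmented-space idea: a random-walk Metropolis chain (standing in for the advanced Markov chain algorithm of the paper) samples the pair (design parameter, random variable) conditional on failure, and the failure probability function is recovered from the marginal posterior of the design parameter up to a single normalizing constant. The limit state, instrumental prior and tuning values are illustrative.

```python
import numpy as np
from scipy.stats import norm, gaussian_kde

rng = np.random.default_rng(10)

# Design variable: the mean theta in [0, 2] of x ~ N(theta, 1); failure when x > 3.
# Exact FPF for reference: Pf(theta) = 1 - Phi(3 - theta).
def log_target(theta, x):
    # log of 1_F(x) * p(x | theta) * p(theta), up to an additive constant
    if not (0.0 <= theta <= 2.0) or x <= 3.0:
        return -np.inf
    return -0.5 * (x - theta) ** 2

# Random-walk Metropolis on the augmented pair (theta, x), seeded in the failure domain.
theta, x, chain = 1.5, 3.5, []
for _ in range(200_000):
    th_p, x_p = theta + 0.3 * rng.standard_normal(), x + 0.3 * rng.standard_normal()
    if np.log(rng.uniform()) < log_target(th_p, x_p) - log_target(theta, x):
        theta, x = th_p, x_p
    chain.append(theta)

# One crude estimate of the augmented failure probability fixes the normalizing constant.
th_mc = rng.uniform(0.0, 2.0, 500_000)
pf_aug = (th_mc + rng.standard_normal(500_000) > 3.0).mean()

post = gaussian_kde(np.array(chain[20_000:]))      # p(theta | failure) after burn-in
for t in (0.5, 1.0, 1.5):                          # FPF(theta) = pf_aug * p(theta|F) / p(theta), p(theta) = 1/2
    print(f"theta={t}: FPF ~ {2.0 * pf_aug * post(t)[0]:.2e}  (exact {1 - norm.cdf(3 - t):.2e})")
```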
|
Yuan, X. K., Liu, S. L., Valdebenito, M. A., Gu, J., & Beer, M. (2021). Efficient procedure for failure probability function estimation in augmented space. Struct. Saf., 92, 102104.
Abstract: An efficient procedure is proposed to estimate the failure probability function (FPF) with respect to design variables, which correspond to distribution parameters of basic structural random variables. The proposed procedure is based on the concept of an augmented reliability problem, which treats the design variables as uncertain by assigning them a prior distribution and transforms the FPF into an expression that involves the posterior distribution of those design variables. The novel contribution of this work consists of expressing this target posterior distribution as an integral, allowing it to be estimated by means of sampling without any distribution fitting, which leads to an efficient estimation of the FPF. The proposed procedure is implemented within three different simulation strategies: Monte Carlo simulation, importance sampling and subset simulation; for each of these cases, expressions for the coefficient of variation of the FPF estimate are derived. Numerical examples illustrate the performance of the proposed approaches.
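A compact illustration of estimating an FPF from a single simulation run, together with a coefficient of variation for each estimate: samples drawn at one nominal design are reweighted (importance-sampling style) to any other value of the design parameter. The toy limit state and the choice of nominal design are illustrative, and the estimator is a generic reweighting scheme rather than the specific estimators derived in the paper.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(11)

# FPF of a toy problem: x ~ N(theta, 1), failure when x > 3, design parameter theta.
# Samples are drawn once at a nominal design theta0 and reweighted for any other theta.
theta0, N = 1.5, 500_000
x = rng.normal(theta0, 1.0, N)

for theta in (0.5, 1.0, 1.5, 2.0):
    h = (x > 3.0) * norm.pdf(x, theta, 1.0) / norm.pdf(x, theta0, 1.0)
    pf = h.mean()
    cov = h.std(ddof=1) / (np.sqrt(N) * pf)        # coefficient of variation of the FPF estimate
    print(f"theta={theta}: Pf ~ {pf:.3e} (CoV {cov:.1%}), exact {1 - norm.cdf(3 - theta):.3e}")
```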
|