Dang, C., Valdebenito, M. A., Faes, M. G. R., Wei, P. F., & Beer, M. (2022). Structural reliability analysis: A Bayesian perspective. Struct. Saf., 99, 102259.
Abstract: Numerical methods play a dominant role in structural reliability analysis, and the goal has long been to produce a failure probability estimate with a desired level of accuracy using a minimum number of performance function evaluations. In the present study, we offer a Bayesian perspective on failure probability integral estimation, as opposed to the classical frequentist perspective. For this purpose, a principled Bayesian Failure Probability Inference (BFPI) framework is first developed, which allows one to quantify, propagate and reduce the numerical uncertainty in the failure probability due to discretization error. In particular, the posterior variance of the failure probability is derived in a semi-analytical form, and the Gaussianity of the posterior failure probability distribution is investigated numerically. Then, a Parallel Adaptive Bayesian Failure Probability Learning (PABFPL) method is proposed within the Bayesian framework. In the PABFPL method, a variance-amplified importance sampling technique is presented to evaluate the posterior mean and variance of the failure probability, and an adaptive parallel active learning strategy is proposed to identify multiple updating points at each iteration. Thus, a novel advantage of PABFPL is that both prior knowledge and parallel computing can be used to make inference about the failure probability. Four numerical examples are investigated, indicating the potential benefits of adopting a Bayesian approach to failure probability estimation.
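As a conceptual aside on the Bayesian viewpoint advocated above: treating the failure probability itself as an uncertain quantity with a posterior distribution can be illustrated with a deliberately simple sketch. This is not the BFPI framework (which places a Gaussian process on the performance function); it merely uses a conjugate Beta posterior over a Monte Carlo failure count, with a toy limit state, sample size, and seed chosen arbitrarily.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical performance function: failure when g(x) <= 0.
def g(x):
    return 3.0 - x  # failure probability is P(X > 3) for X ~ N(0, 1)

n = 100_000
x = rng.standard_normal(n)
failures = int(np.sum(g(x) <= 0))

# Treat P_f as uncertain: with a flat Beta(1, 1) prior, the posterior
# after observing `failures` out of `n` trials is Beta(a, b).
a, b = 1 + failures, 1 + n - failures
post_mean = a / (a + b)
post_std = np.sqrt(a * b / ((a + b) ** 2 * (a + b + 1)))

print(post_mean, post_std)  # posterior mean and spread of P_f
```

The posterior standard deviation quantifies how much estimation uncertainty remains, which is the quantity a Bayesian method can then actively try to reduce.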

Dang, C., Wei, P. F., Faes, M. G. R., Valdebenito, M. A., & Beer, M. (2022). Interval uncertainty propagation by a parallel Bayesian global optimization method. Appl. Math. Model., 108, 220–235.
Abstract: This paper is concerned with approximating the scalar response of a complex computational model subjected to multiple input interval variables. This task is formulated as finding both the global minimum and maximum of a computationally expensive black-box function over a prescribed hyper-rectangle. On this basis, a novel non-intrusive method, called 'triple-engine parallel Bayesian global optimization', is proposed. The method begins by assuming a Gaussian process prior (which can also be interpreted as a surrogate model) over the response function. The main contribution lies in developing a novel infill sampling criterion, i.e., a triple-engine pseudo expected improvement strategy, to identify multiple promising points for minimization and/or maximization based on the past observations at each iteration. By doing so, these identified points can be evaluated on the real response function in parallel. Besides, another potential benefit is that both the lower and upper bounds of the model response can be obtained with a single run of the developed method. Four numerical examples with varying complexity are investigated to demonstrate the proposed method against some existing techniques, and the results indicate that significant computational savings can be achieved by making full use of prior knowledge and parallel computing.
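The Gaussian process prior and infill-sampling loop described above can be sketched at toy scale. The snippet below runs plain sequential, single-point expected improvement for minimization only, so it is a stand-in for standard Bayesian global optimization rather than the paper's triple-engine pseudo expected improvement strategy; the objective, kernel length-scale, candidate grid, and iteration budget are all illustrative assumptions.

```python
import numpy as np
from math import erf, pi, sqrt

rng = np.random.default_rng(1)

def f(x):  # hypothetical expensive black-box response on [0, 2]
    return np.sin(3 * x) + 0.5 * x

lo, hi = 0.0, 2.0
X = [float(v) for v in rng.uniform(lo, hi, 3)]   # initial design
y = [float(f(v)) for v in X]

def k(a, b, ell=0.3):                            # squared-exponential kernel
    return np.exp(-0.5 * ((a - b) / ell) ** 2)

def gp_posterior(xs):
    """GP posterior mean/std at points xs, conditioned on (X, y)."""
    Xa = np.array(X)
    K = k(Xa[:, None], Xa[None, :]) + 1e-6 * np.eye(len(Xa))
    Ks = k(np.asarray(xs)[:, None], Xa[None, :])
    mu = Ks @ np.linalg.solve(K, np.array(y))
    v = 1.0 - np.sum((Ks @ np.linalg.inv(K)) * Ks, axis=1)
    return mu, np.sqrt(np.maximum(v, 1e-12))

def expected_improvement(mu, s, best):
    z = (best - mu) / s                          # improvement over current min
    Phi = 0.5 * (1.0 + np.array([erf(zi / sqrt(2.0)) for zi in z]))
    phi = np.exp(-0.5 * z**2) / sqrt(2.0 * pi)
    return s * (z * Phi + phi)

cand = np.linspace(lo, hi, 201)
for _ in range(10):                              # sequential infill loop
    mu, s = gp_posterior(cand)
    x_new = float(cand[int(np.argmax(expected_improvement(mu, s, min(y))))])
    X.append(x_new)
    y.append(float(f(x_new)))

print(min(y))   # best observed value after 13 evaluations in total
```

The parallel, min-and-max version in the paper replaces this single-point criterion with one that proposes several points per iteration for both optimization directions.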

Dang, C., Wei, P. F., Faes, M. G. R., Valdebenito, M. A., & Beer, M. (2022). Parallel adaptive Bayesian quadrature for rare event estimation. Reliab. Eng. Syst. Saf., 225, 108621.
Abstract: Various numerical methods have been extensively studied and used for reliability analysis over the past several decades. However, how to understand the effect of numerical uncertainty (i.e., numerical error due to the discretization of the performance function) on the failure probability is still a challenging issue. The active learning probabilistic integration (ALPI) method offers a principled approach to quantify, propagate and reduce numerical uncertainty via computation within a Bayesian framework, which has not been fully investigated in the context of probabilistic reliability analysis. In this study, a novel method termed 'Parallel Adaptive Bayesian Quadrature' (PABQ) is proposed on the theoretical basis of ALPI, aimed at broadening its scope of application. First, the Monte Carlo method used in ALPI is replaced with an importance ball sampling technique so as to reduce the sample size needed for rare failure event estimation. Second, a multi-point selection criterion is proposed to enable parallel distributed processing. Four numerical examples are studied to demonstrate the effectiveness and efficiency of the proposed method. It is shown that PABQ can effectively assess small failure probabilities (e.g., as low as 10⁻⁷) with a minimum number of iterations by taking advantage of parallel computing.
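To see why importance sampling matters for rare failure events like those targeted above, note that hitting a probability of order 10⁻⁷ by crude Monte Carlo requires on the order of 10⁹ samples, whereas shifting the sampling density towards the failure region needs only a few thousand. The sketch below uses a simple mean-shifted Gaussian importance density on a linear toy limit state (not the paper's importance ball sampling technique); the reliability index, sample size, and seed are assumptions.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(2)

beta = 4.5                   # hypothetical reliability index
def g(x):                    # failure when g(x) <= 0, X ~ N(0, 1)
    return beta - x

# Sample from an importance density centered at the design point x* = beta,
# and reweight by the likelihood ratio phi(x) / phi(x - beta).
n = 20_000
x = rng.normal(beta, 1.0, n)
log_w = -0.5 * x**2 + 0.5 * (x - beta) ** 2
pf_hat = float(np.mean((g(x) <= 0) * np.exp(log_w)))

pf_exact = 0.5 * (1 - erf(beta / sqrt(2)))   # P(X > beta) ~ 3.4e-6
print(pf_hat, pf_exact)
```

With 20,000 samples the relative error of this estimator is on the order of a few percent, while crude Monte Carlo with the same budget would typically observe zero failures.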

Ding, C., Dang, C., Valdebenito, M. A., Faes, M. G. R., Broggi, M., & Beer, M. (2023). First-passage probability estimation of high-dimensional nonlinear stochastic dynamic systems by a fractional moments-based mixture distribution approach. Mech. Syst. Sig. Process., 185, 109775.
Abstract: First-passage probability estimation of high-dimensional nonlinear stochastic dynamic systems is a significant task to be solved in many science and engineering fields, but it still remains an open challenge. The present paper develops a novel approach, termed 'fractional moments-based mixture distribution', to address this challenge. The approach is implemented by capturing the extreme value distribution (EVD) of the system response with the concepts of fractional moments and mixture distributions. In this context, the fractional moment itself is by definition a high-dimensional integral with a complicated integrand. To efficiently compute the fractional moments, a parallel adaptive sampling scheme that allows for sample size extension is developed using refined Latinized stratified sampling (RLSS). In this manner, both variance reduction and parallel computing are possible when evaluating the fractional moments. From the knowledge of low-order fractional moments, the EVD of interest can then be reconstructed. Based on introducing an extended inverse Gaussian distribution and a log extended skew-normal distribution, a flexible mixture distribution model is proposed, for which the fractional moments are derived in analytic form. By fitting a set of fractional moments, the EVD can be recovered via the proposed mixture model. Accordingly, the first-passage probabilities under different thresholds can be obtained straightforwardly from the recovered EVD. The performance of the proposed method is verified by three examples consisting of two test examples and one engineering problem.
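The notion of a fractional moment, i.e. E[Y^α] for non-integer α, is easy to make concrete by sampling. The sketch below uses a lognormal stand-in for the extreme-value response, for which fractional moments are available in closed form as a check; the distribution parameters, moment orders, and sample size are arbitrary assumptions, and none of the RLSS machinery is reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in for an extreme-value response: Y lognormal, whose
# fractional moments E[Y^a] = exp(a*mu + 0.5*(a*sigma)^2) are analytic.
mu, sigma = 0.2, 0.5
y = np.exp(rng.normal(mu, sigma, 200_000))

alphas = np.array([0.25, 0.5, 0.75, 1.5])     # non-integer moment orders
m_hat = np.array([np.mean(y**a) for a in alphas])
m_true = np.exp(alphas * mu + 0.5 * (alphas * sigma) ** 2)
print(m_hat, m_true)
```

In the paper these sampled low-order fractional moments are the inputs to a mixture-distribution fit of the EVD; here they are simply compared against the closed form.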

Faes, M. G. R., & Valdebenito, M. A. (2021). Fully decoupled reliability-based optimization of linear structures subject to Gaussian dynamic loading considering discrete design variables. Mech. Syst. Sig. Process., 156, 107616.
Abstract: Reliability-based optimization (RBO) offers the possibility of finding an optimal design for a system according to a prescribed criterion while explicitly taking into account the effects of uncertainty. However, due to the necessity of solving a reliability problem nested within an optimization procedure, the corresponding computational cost is usually high, impeding the applicability of the methods. This computational cost is enlarged even further when one or several design variables must belong to a discrete set, due to the requirement of resorting to integer programming optimization algorithms. To alleviate this issue, this contribution proposes a fully decoupled approach for a specific class of problems, namely minimization of the failure probability of a linear system subjected to an uncertain dynamic load of the Gaussian type, under the additional constraint that the design variables are integer-valued. Specifically, by using the operator norm framework developed by the authors in previous work, this paper shows that by reducing the RBO problem with discrete design variables to the solution of a single deterministic optimization problem followed by a single reliability analysis, a large gain in numerical efficiency can be obtained without compromising the accuracy of the resulting optimal design. The application and capabilities of the proposed approach are illustrated by means of three examples.

Faes, M. G. R., Valdebenito, M. A., Yuan, X. K., Wei, P. F., & Beer, M. (2021). Augmented reliability analysis for estimating imprecise first excursion probabilities in stochastic linear dynamics. Adv. Eng. Softw., 155, 102993.
Abstract: Imprecise probability allows quantifying the level of safety of a system taking into account the effect of both aleatory and epistemic uncertainty. The practical estimation of an imprecise probability is usually quite demanding from a numerical viewpoint, as it is necessary to propagate both types of uncertainty separately, leading in practical cases to a nested implementation in the so-called double-loop approach. In view of this issue, this contribution presents an alternative approach that avoids the double loop by replacing the imprecise probability problem with an augmented, purely aleatory reliability analysis. Then, with the help of Bayes' theorem, it is possible to recover an expression for the failure probability as an explicit function of the imprecise parameters from the augmented reliability problem, which ultimately allows calculating the imprecise probability. The implementation of the proposed framework is investigated within the context of imprecise first excursion probability estimation of uncertain linear structures subject to imprecisely defined stochastic quantities and crisp stochastic loads. The associated augmented reliability problem is solved within the context of Directional Importance Sampling, leading to improved accuracy at reduced numerical cost. The application of the proposed approach is investigated by means of two examples. The results obtained indicate that the proposed approach can be highly efficient and accurate.

Fina, M., Lauff, C., Faes, M. G. R., Valdebenito, M. A., Wagner, W., & Freitag, S. (2023). Bounding imprecise failure probabilities in structural mechanics based on maximum standard deviation. Struct. Saf., 101, 102293.
Abstract: This paper proposes a framework to calculate the bounds on the failure probability of linear structural systems whose performance is affected by both random variables and interval variables. This kind of problem is known to be very challenging, as it demands coping with aleatoric and epistemic uncertainty explicitly. Inspired by the framework of the operator norm theorem, it is proposed to consider the maximum standard deviation of the structural response as a proxy for detecting the crisp values of the interval parameters which yield the bounds of the failure probability. The scope of application of the proposed approach comprises linear structural systems whose properties may be affected by both aleatoric and epistemic uncertainty and which are subjected to (possibly imprecise) Gaussian loading. Numerical examples indicate that the application of such a proxy leads to substantial numerical advantages when compared to a traditional double-loop approach for coping with imprecise failure probabilities. In fact, the proposed framework allows the propagation of aleatoric and epistemic uncertainty to be decoupled.

Ni, P. H., Jerez, D. J., Fragkoulis, V. C., Faes, M. G. R., Valdebenito, M. A., & Beer, M. (2022). Operator Norm-Based Statistical Linearization to Bound the First Excursion Probability of Nonlinear Structures Subjected to Imprecise Stochastic Loading. ASCE-ASME J. Risk Uncertain. Eng. Syst., Part A: Civ. Eng., 8(1), 04021086.
Abstract: This paper presents a highly efficient approach for bounding the responses and probability of failure of nonlinear models subjected to imprecisely defined stochastic Gaussian loads. Typically, such computations involve solving a nested double-loop problem, where the propagation of the aleatory uncertainty has to be performed for each realization of the epistemic parameters. Apart from near-trivial cases, such computation is generally intractable without resorting to surrogate modeling schemes, especially in the context of performing nonlinear dynamical simulations. The recently introduced operator norm framework allows for breaking this double loop by determining a priori those values of the epistemic uncertain parameters that produce bounds on the probability of failure. However, the method in its current form is only applicable to linear models due to the assumptions adopted in the derivation of the involved operator norms. In this paper, the operator norm framework is extended and generalized by resorting to the statistical linearization methodology, so that bounds on the first excursion probability of nonlinear structures subjected to imprecise stochastic loading can be computed.

Song, J. W., Wei, P. F., Valdebenito, M. A., Faes, M., & Beer, M. (2021). Data-driven and active learning of variance-based sensitivity indices with Bayesian probabilistic integration. Mech. Syst. Sig. Process., 163, 108106.
Abstract: Variance-based sensitivity indices play an important role in scientific computation and data mining; thus, the significance of developing numerical methods for efficient and reliable estimation of these sensitivity indices based on (expensive) computer simulators and/or data cannot be overemphasized. In this article, the estimation of these sensitivity indices is treated as a statistical inference problem. Two principal lemmas are first proposed as rules of thumb for making the inference. After that, the posterior features of all the (partial) variance terms involved in the main and total effect indices are analytically derived (although not in closed form) based on Bayesian Probabilistic Integration (BPI). This forms a data-driven method for estimating the sensitivity indices as well as the involved discretization errors. Further, to improve the efficiency of the developed method for expensive simulators, an acquisition function, named Posterior Variance Contribution (PVC), is utilized to realize optimal designs of experiments, based on which an adaptive BPI method is established. The application of this framework is illustrated for the calculation of the main and total effect indices, but the two principal lemmas also apply to the calculation of interaction effect indices. The performance of the development is demonstrated by an illustrative numerical example and three engineering benchmarks with finite element models.
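For orientation, the main-effect (first-order Sobol') indices targeted above can be estimated by a plain pick-freeze Monte Carlo scheme, which is the baseline that BPI improves upon for expensive simulators. The sketch below applies a Jansen-type estimator to an additive toy model whose indices are known analytically (S1 = 0.2, S2 = 0.8); the model, sample size, and seed are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def f(x):                     # additive toy model: Var = 1 + 4 = 5
    return x[:, 0] + 2.0 * x[:, 1]

n, d = 200_000, 2
A = rng.standard_normal((n, d))     # two independent sample matrices
B = rng.standard_normal((n, d))
fA, fB = f(A), f(B)
var = np.var(np.concatenate([fA, fB]))

S = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]             # A with column i taken from B
    # Jansen estimator of the first-order index S_i = V_i / V
    S.append(1.0 - np.mean((fB - f(ABi)) ** 2) / (2.0 * var))
print(S)
```

The cost here is (d + 2) n model runs, which is exactly what becomes prohibitive for finite element simulators and motivates the surrogate-based, actively learned estimates in the paper.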

Yuan, X. K., Faes, M. G. R., Liu, S. L., Valdebenito, M. A., & Beer, M. (2021). Efficient imprecise reliability analysis using the Augmented Space Integral. Reliab. Eng. Syst. Saf., 210, 107477.
Abstract: This paper presents an efficient approach to compute the bounds on the reliability of a structure subjected to uncertain parameters described by means of imprecise probabilities. These imprecise probabilities arise from epistemic uncertainty in the definition of the hyperparameters of a set of random variables that describe aleatory uncertainty in some of the structure's properties. Typically, such a calculation involves the solution of a so-called double-loop problem, where a crisp reliability problem is repeatedly solved to determine which realization of the epistemic uncertainties yields the worst or best case with respect to structural safety. The approach in this paper aims at decoupling this double loop by virtue of the Augmented Space Integral. The core idea of the method is to infer a functional relationship between the epistemically uncertain hyperparameters and the probability of failure. Then, this functional relationship can be used to determine the best- and worst-case behavior with respect to the probability of failure. Three case studies are included to illustrate the effectiveness and efficiency of the developed methods.

Yuan, X. K., Liu, S. L., Faes, M., Valdebenito, M. A., & Beer, M. (2021). An efficient importance sampling approach for reliability analysis of time-variant structures subject to time-dependent stochastic load. Mech. Syst. Sig. Process., 159, 107699.
Abstract: Structural performance is affected by deterioration processes and external loads. Both effects may change over time, posing a challenge for conducting reliability analysis. In this context, this contribution aims at assessing the reliability of structures in which some parameters are modeled as random variables, possibly including deterioration processes, and which are subjected to stochastic load processes. The approach is developed within the framework of importance sampling and is based on the concept of composite limit states, whereby the time-dependent reliability problem is transformed into a series system with multiple performance functions. Then, an efficient two-step importance sampling density function is proposed, which splits the time-invariant parameters (random variables) from the time-variant ones (stochastic processes). This importance sampling scheme is geared towards a particular class of problems where the performance of the structural system exhibits a linear dependency with respect to the stochastic load at fixed time. This allows calculating the reliability associated with the series system most efficiently. Practical examples illustrate the performance of the proposed approach.
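The composite limit state reformulation mentioned above, i.e. turning first passage over a time horizon into a series-system event across discretized time instants, can be sketched with crude Monte Carlo (the paper's two-step importance sampling density is not reproduced here). The toy linear response, threshold, time discretization, and seed are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Discretize time and treat first passage as a series system over the
# per-instant performance functions g_k = b - response(t_k);
# the structure fails if any g_k <= 0 over the horizon.
n, m = 100_000, 50                      # Monte Carlo samples, time instants
b = 2.5                                 # hypothetical response threshold

r = rng.standard_normal(n)[:, None]     # time-invariant random variable
load = rng.standard_normal((n, m))      # crude stand-in for a stochastic load
response = 0.5 * r + 0.5 * load        # toy linear response at each instant

g = b - response                        # m performance functions per sample
fail_series = np.any(g <= 0.0, axis=1)  # series-system (first-passage) event
pf = float(np.mean(fail_series))
print(pf)
```

Because the shared variable r correlates the instants, the first-passage probability is well below m times the per-instant probability; the paper's importance sampling exploits exactly this split between r and the load process.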

Yuan, X. K., Liu, S. L., Valdebenito, M. A., Faes, M. G. R., Jerez, D. J., Jensen, H. A., et al. (2021). Decoupled reliability-based optimization using Markov chain Monte Carlo in augmented space. Adv. Eng. Softw., 157, 103020.
Abstract: An efficient framework is proposed for reliability-based design optimization (RBDO) of structural systems. The RBDO problem is expressed in terms of the minimization of the failure probability with respect to design variables which correspond to distribution parameters of random variables, e.g., the mean or standard deviation. Generally, this problem is quite demanding from a computational viewpoint, as repeated reliability analyses are involved. Hence, in this contribution, an efficient framework for solving a class of RBDO problems without even a single reliability analysis is proposed. It makes full use of an established functional relationship between the probability of failure and the distribution design parameters, which is termed the failure probability function (FPF). By introducing an instrumental variability associated with the distribution design parameters, the target FPF is found to be proportional to a posterior distribution of the design parameters conditional on the occurrence of failure in an augmented space. This posterior distribution is derived and expressed as an integral, which can be estimated through simulation. An advanced Markov chain algorithm is adopted to efficiently generate samples that follow the aforementioned posterior distribution. In addition, an algorithm that reuses information is proposed in combination with sequential approximate optimization to improve the efficiency. Numerical examples illustrate the performance of the proposed framework.
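The augmented-space identity underlying the FPF, namely that p_F(θ) is proportional to the posterior density of the design parameters conditional on failure, can be checked numerically on a toy problem where p_F(θ) is known in closed form. The sketch below uses direct sampling and a histogram instead of the paper's Markov chain algorithm; the instrumental uniform distribution, limit state, bin count, and sample size are assumptions.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(5)

# Target FPF: p_F(theta) = P(X > 3) with X ~ N(theta, 1).
# Augmented-space trick: give theta an instrumental uniform density,
# sample (theta, X) jointly, then recover the FPF via Bayes' theorem:
#   p_F(theta) = P(F) * pi(theta | F) / pi(theta).
lo, hi = 0.0, 2.0
n = 2_000_000
theta = rng.uniform(lo, hi, n)
x = rng.normal(theta, 1.0)
fail = x > 3.0

PF = float(np.mean(fail))                       # P(F) under the prior
hist, edges = np.histogram(theta[fail], bins=10, range=(lo, hi), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
pf_theta = PF * hist / (1.0 / (hi - lo))        # FPF at the bin centers

pf_exact = np.array([0.5 * (1 - erf((3 - t) / sqrt(2))) for t in centers])
print(pf_theta, pf_exact)
```

A single simulation run thus yields the whole failure probability function over the design domain, which is what allows the RBDO loop to proceed without further reliability analyses.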
