|
Chicoisne, R., Espinoza, D., Goycoolea, M., Moreno, E., & Rubio, E. (2012). A New Algorithm for the Open-Pit Mine Production Scheduling Problem. Oper. Res., 60(3), 517–528.
Abstract: For the purpose of production scheduling, open-pit mines are discretized into three-dimensional arrays known as block models. Production scheduling consists of deciding which blocks should be extracted, when they should be extracted, and what to do with the blocks once they are extracted. Blocks that are close to the surface should be extracted first, and capacity constraints limit the production in each time period. Since the 1960s, it has been known that this problem can be cast as an integer programming model. However, the large size of some real instances (3-10 million blocks, 15-20 time periods) has made these models impractical for use in real planning applications, thus leading to the use of numerous heuristic methods. In this article we study a well-known integer programming formulation of the problem that we refer to as C-PIT. We propose a new decomposition method for solving the linear programming relaxation (LP) of C-PIT when there is a single capacity constraint per time period. This algorithm is based on exploiting the structure of the precedence-constrained knapsack problem and runs in O(mn log n) in which n is the number of blocks and m a function of the precedence relationships in the mine. Our computations show that we can solve, in minutes, the LP relaxation of real-sized mine-planning applications with up to five million blocks and 20 time periods. Combining this with a quick rounding algorithm based on topological sorting, we obtain integer feasible solutions to the more general problem where multiple capacity constraints per time period are considered. Our implementation obtains solutions within 6% of optimality in seconds. A second heuristic step, based on local search, allows us to find solutions within 3% in one hour on all instances considered. For most instances, we obtain solutions within 1-2% of optimality if we let this heuristic run longer. Previous methods have been able to tackle only instances with up to 150,000 blocks and 15 time periods.
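As an illustration of the rounding step mentioned above, the sketch below is a minimal, self-contained Python version of a topological-sort rounding heuristic: blocks are visited in a topological order of the precedence graph and greedily assigned to the earliest period with remaining capacity, never before their predecessors. Function and variable names are placeholders, and the paper's actual procedure works from the C-PIT LP solution rather than this plain greedy rule.

    # Illustrative sketch only: greedy rounding in topological order of the
    # precedence DAG; the real heuristic rounds the C-PIT LP-relaxation solution.
    from collections import defaultdict, deque

    def topological_rounding(tonnage, preds, capacity, periods):
        # preds[b] lists the blocks that must be extracted no later than block b.
        indegree = {b: 0 for b in tonnage}
        successors = defaultdict(list)
        for b, ps in preds.items():
            for p in ps:
                successors[p].append(b)
                indegree[b] += 1
        order, queue = [], deque(b for b, d in indegree.items() if d == 0)
        while queue:                          # Kahn's algorithm: topological order
            b = queue.popleft()
            order.append(b)
            for s in successors[b]:
                indegree[s] -= 1
                if indegree[s] == 0:
                    queue.append(s)

        remaining = [capacity] * periods      # leftover capacity per period
        schedule = {}                         # block -> extraction period
        for b in order:
            if any(p not in schedule for p in preds.get(b, [])):
                continue                      # an unextracted predecessor blocks b
            earliest = max((schedule[p] for p in preds.get(b, [])), default=0)
            for t in range(earliest, periods):
                if tonnage[b] <= remaining[t]:
                    schedule[b] = t
                    remaining[t] -= tonnage[b]
                    break
        return schedule

For example, topological_rounding({1: 4, 2: 3}, {2: [1]}, capacity=5, periods=2) extracts block 1 in period 0 and block 2 in period 1.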
|
|
|
Espinoza, D., & Moreno, E. (2014). A primal-dual aggregation algorithm for minimizing conditional value-at-risk in linear programs. Comput. Optim. Appl., 59(3), 617–638.
Abstract: Recent years have seen growing interest in coherent risk measures, especially in Conditional Value-at-Risk (CVaR). Since CVaR is a convex function, it is suitable as an objective for optimization problems when we desire to minimize risk. In the case that the underlying distribution has discrete support, this problem can be formulated as a linear programming (LP) problem. Over more general distributions, recent techniques, such as the sample average approximation method, allow one to approximate the solution by solving a series of sampled problems, although the latter approach may require a large number of samples when the risk measures concentrate on the tail of the underlying distributions. In this paper we propose an automatic primal-dual aggregation scheme to exactly solve these specially structured LPs with a very large number of scenarios. The algorithm aggregates scenarios and constraints in order to solve a smaller problem, which is automatically disaggregated using the information of its dual variables. We compare this algorithm with other common approaches found in related literature, such as an improved formulation of the full problem, cut-generation schemes and other problem-specific approaches available in commercial software. Extensive computational experiments are performed on portfolio and general LP instances.
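For background, the full (unaggregated) discrete-scenario LP that the paper's scheme is designed to avoid building explicitly is the standard Rockafellar-Uryasev CVaR formulation. The SciPy sketch below solves it on synthetic portfolio data; all data and names are made up for illustration, and the aggregation scheme itself is not shown.

    # Full discrete-scenario CVaR-minimization LP (Rockafellar-Uryasev form); toy data.
    import numpy as np
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)
    S, n, alpha = 1000, 5, 0.95              # scenarios, assets, confidence level
    R = rng.normal(0.001, 0.02, (S, n))      # scenario returns; loss_s = -R[s] @ x

    # Decision vector: [x_1..x_n, t, u_1..u_S]; minimize t + sum(u) / ((1 - alpha) * S).
    c = np.concatenate([np.zeros(n), [1.0], np.full(S, 1.0 / ((1 - alpha) * S))])

    # u_s >= loss_s - t, rewritten as -R[s] @ x - t - u_s <= 0.
    A_ub = np.hstack([-R, -np.ones((S, 1)), -np.eye(S)])
    b_ub = np.zeros(S)

    A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(S)]).reshape(1, -1)  # weights sum to 1
    b_eq = [1.0]
    bounds = [(0, None)] * n + [(None, None)] + [(0, None)] * S             # x >= 0, t free, u >= 0

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=bounds, method="highs")
    weights, min_cvar = res.x[:n], res.fun   # optimal portfolio and its CVaR of loss

One LP row per scenario is exactly what makes the full model grow with the sample size and motivates aggregating scenarios and constraints.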
|
|
|
Espinoza, D., Goycoolea, M., & Moreno, E. (2015). The precedence constrained knapsack problem: Separating maximally violated inequalities. Discret Appl. Math., 194, 65–80.
Abstract: We consider the problem of separating maximally violated inequalities for the precedence constrained knapsack problem. Though we consider maximally violated constraints in a very general way, special emphasis is placed on induced cover inequalities and induced clique inequalities. Our contributions include a new partial characterization of maximally violated inequalities, a new safe shrinking technique, and new insights on strengthening and lifting. This work follows on the work of Boyd (1993), Park and Park (1997), van de Leensel et al. (1999) and Boland et al. (2011). Computational experiments show that our new techniques and insights can be used to significantly improve the performance of cutting plane algorithms for this problem.
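For reference, a commonly stated form of the precedence constrained knapsack problem and of an induced cover inequality (the generic objects studied here, not the paper's strengthened or lifted versions) is:

    \max_{x \in \{0,1\}^N} \Big\{ \textstyle\sum_{i \in N} c_i x_i \;:\; \sum_{i \in N} a_i x_i \le b, \;\; x_i \le x_j \ \ \forall (i,j) \in A \Big\},

where an arc (i, j) in A means that item i may be selected only if item j is also selected. If a set C of items is such that C, together with all items forced in by the precedence relation, has total weight exceeding the capacity b, then every feasible solution must omit at least one element of C, giving the induced cover inequality

    \sum_{i \in C} x_i \le |C| - 1.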
|
|
|
Espinoza, D., Goycoolea, M., Moreno, E., & Newman, A. (2013). MineLib: a library of open pit mining problems. Ann. Oper. Res., 206(1), 93–114.
Abstract: Similar to the mixed-integer programming library (MIPLIB), we present a library of publicly available test problem instances for three classical types of open pit mining problems: the ultimate pit limit problem and two variants of open pit production scheduling problems. The ultimate pit limit problem determines a set of notional three-dimensional blocks containing ore and/or waste material to extract to maximize value subject to geospatial precedence constraints. Open pit production scheduling problems seek to determine when, if ever, a block is extracted from an open pit mine. A typical objective is to maximize the net present value of the extracted ore; constraints include precedence and upper bounds on operational resource usage. Extensions of this problem can include (i) lower bounds on operational resource usage, (ii) the determination of whether a block is sent to a waste dump, i.e., discarded, or to a processing plant, i.e., to a facility that derives salable mineral from the block, (iii) average grade constraints at the processing plant, and (iv) inventories of extracted but unprocessed material. Although open pit mining problems have appeared in academic literature dating back to the 1960s, no standard representations exist, and there are no commonly available corresponding data sets. We describe some representative open pit mining problems, briefly mention related literature, and provide a library consisting of mathematical models and sets of instances, available on the Internet. We conclude with directions for use of this newly established mining library. The library serves not only as a suggestion of standard expressions of and available data for open pit mining problems, but also as encouragement for the development of increasingly sophisticated algorithms.
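As a concrete illustration of the first problem class in the library, the ultimate pit limit problem is the classical maximum-weight closure problem and can be solved with a single minimum-cut computation (Picard's reduction). The networkx sketch below uses toy data; the block values, the requires dictionary and all names are invented for illustration and are not MineLib instances.

    # Ultimate pit limit as a maximum-weight closure via a min-cut reduction; toy data only.
    import networkx as nx

    values = {1: 10.0, 2: -3.0, 3: -2.0, 4: 5.0}   # net value of each block
    requires = {1: [2, 3], 4: [3]}                 # block -> blocks that must also be mined

    G = nx.DiGraph()
    for b, w in values.items():
        if w > 0:
            G.add_edge("s", b, capacity=w)         # profit lost if the block is left out
        elif w < 0:
            G.add_edge(b, "t", capacity=-w)        # cost paid if the block is mined
    for b, preds in requires.items():
        for p in preds:
            G.add_edge(b, p)                       # no capacity attribute: infinite, never cut

    cut_value, (source_side, _) = nx.minimum_cut(G, "s", "t")
    pit = sorted(b for b in source_side if b != "s")
    # pit == [1, 2, 3, 4]; pit value = sum of positive values (15) minus cut_value (5) = 10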
|
|
|
Lagos, G., Espinoza, D., Moreno, E., & Vielma, J. P. (2015). Restricted risk measures and robust optimization. Eur. J. Oper. Res., 241(3), 771–782.
Abstract: In this paper we consider characterizations of the robust uncertainty sets associated with coherent and distortion risk measures. In this context we show that if we are willing to enforce the coherent or distortion axioms only on random variables that are affine or linear functions of the vector of random parameters, we may consider some new variants of the uncertainty sets determined by the classical characterizations. We also show that in the finite probability case these variants are simple transformations of the classical sets. Finally we present results of computational experiments that suggest that the risk measures associated with these new uncertainty sets can help mitigate estimation errors of the Conditional Value-at-Risk.
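For context, the classical (unrestricted) starting point in the finite-probability case is the well-known robust representation of CVaR: for a loss Z taking value z_i with probability p_i and confidence level \alpha,

    \mathrm{CVaR}_{\alpha}(Z) \;=\; \max\Big\{ \textstyle\sum_i q_i z_i \;:\; 0 \le q_i \le \tfrac{p_i}{1-\alpha}, \;\; \sum_i q_i = 1 \Big\},

so that minimizing the CVaR of a loss is a worst-case expectation problem over a polyhedral uncertainty set; the restricted variants studied in the paper modify sets of this kind.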
|
|
|
Rivera Letelier, O., Espinoza, D., Goycoolea, M., Moreno, E., & Munoz, G. (2020). Production Scheduling for Strategic Open Pit Mine Planning: A Mixed-Integer Programming Approach. Oper. Res., 68(5), 1425–1444.
Abstract: Given a discretized representation of an ore body known as a block model, the open pit mining production scheduling problem that we consider consists of defining which blocks to extract, when to extract them, and how or whether to process them, in such a way as to comply with operational constraints and maximize net present value. Although it has been established that this problem can be modeled with mixed-integer programming, the number of blocks used to represent real-world mines (millions) has made solving large instances nearly impossible in practice. In this article, we introduce a new methodology for tackling this problem and conduct computational tests using real problem sets ranging in size from 20,000 to 5,000,000 blocks and spanning 20 to 50 time periods. We consider both direct block scheduling and bench-phase scheduling problems, with capacity, blending, and minimum production constraints. Using new preprocessing and cutting planes techniques, we are able to reduce the linear programming relaxation value by up to 33%, depending on the instance. Then, using new heuristics, we are able to compute feasible solutions with an average gap of 1.52% relative to the previously computed bound. Moreover, after four hours of running a customized branch-and-bound algorithm on the problems with larger gaps, we are able to further reduce the average from 1.52% to 0.71%.
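As background, a simplified version of the kind of formulation involved is the common "extracted-by" model with only precedence and capacity constraints (the blending and minimum-production constraints treated in the paper are omitted here): with x_{b,t} = 1 if block b has been extracted by period t, x_{b,0} = 0, block value v_b, tonnage w_b, discount rate r and period capacity C_t,

    \max \sum_{b,t} \frac{v_b}{(1+r)^t}\,(x_{b,t} - x_{b,t-1})
    \quad \text{s.t.} \quad
    x_{b,t-1} \le x_{b,t}, \qquad
    x_{b,t} \le x_{b',t} \ \ \forall b' \in \mathrm{pred}(b), \qquad
    \sum_b w_b\,(x_{b,t} - x_{b,t-1}) \le C_t, \qquad
    x_{b,t} \in \{0,1\}.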
|
|
|
Munoz, G., Espinoza, D., Goycoolea, M., Moreno, E., Queyranne, M., & Rivera Letelier, O. (2018). A study of the Bienstock-Zuckerberg algorithm: applications in mining and resource constrained project scheduling. Comput. Optim. Appl., 69(2), 501–534.
Abstract: We study a Lagrangian decomposition algorithm recently proposed by Dan Bienstock and Mark Zuckerberg for solving the LP relaxation of a class of open pit mine project scheduling problems. In this study we show that the Bienstock-Zuckerberg (BZ) algorithm can be used to solve LP relaxations corresponding to a much broader class of scheduling problems, including the well-known Resource Constrained Project Scheduling Problem (RCPSP), and multi-modal variants of the RCPSP that consider batch processing of jobs. We present a new, intuitive proof of correctness for the BZ algorithm that works by casting the BZ algorithm as a column generation algorithm. This analysis allows us to draw parallels with the well-known Dantzig-Wolfe decomposition (DW) algorithm. We discuss practical computational techniques for speeding up the performance of the BZ and DW algorithms on project scheduling problems. Finally, we present computational experiments independently testing the effectiveness of the BZ and DW algorithms on different sets of publicly available test instances. Our computational experiments confirm that the BZ algorithm significantly outperforms the DW algorithm for the problems considered. Our computational experiments also show that the proposed speed-up techniques can have a significant impact on the solve time. We provide some insights on what might be explaining this significant difference in performance.
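For reference, the Dantzig-Wolfe scheme against which the BZ algorithm is compared alternates between a restricted master problem over a few extreme points x^k of the subproblem polyhedron and a pricing problem (generic sketch in standard notation, not tied to the paper's):

    \min_{\lambda \ge 0}\; \sum_k (c^{\top}x^{k})\,\lambda_k
    \quad \text{s.t.} \quad
    \sum_k (A x^{k})\,\lambda_k \ge b \ \ (\pi), \qquad
    \sum_k \lambda_k = 1 \ \ (\mu),

where pricing minimizes (c - A^{\top}\pi)^{\top} x over the subproblem polyhedron and the optimizer x^{*} enters as a new column whenever its reduced cost (c - A^{\top}\pi)^{\top}x^{*} - \mu is negative; the loop stops when no such column exists.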
|
|
|