Parallel distributed‐memory simplex for large‐scale stochastic LP problems

Article ID: iaor20133979
Volume: 55
Issue: 3
Start Page Number: 571
End Page Number: 596
Publication Date: Jul 2013
Journal: Computational Optimization and Applications
Authors: Lubin Miles, Hall J. A. Julian, Petra Cosmin G., Anitescu Mihai
Keywords: programming: probabilistic, computational analysis: parallel computers
Abstract:

We present a parallelization of the revised simplex method for large extensive forms of two‐stage stochastic linear programming (LP) problems. These problems have been considered too large to solve with the simplex method; instead, decomposition approaches based on Benders decomposition or, more recently, interior‐point methods are generally used. However, these approaches do not provide optimal basic solutions, which allow for efficient hot‐starts (e.g., in a branch‐and‐bound context) and can provide important sensitivity information. Our approach exploits the dual block‐angular structure of these problems inside the linear algebra of the revised simplex method in a manner suitable for high‐performance distributed‐memory clusters or supercomputers. While this paper focuses on stochastic LPs, the work is applicable to all problems with a dual block‐angular structure. Our implementation is competitive in serial with highly efficient sparsity‐exploiting simplex codes and achieves significant relative speed‐ups when run in parallel. Additionally, very large problems with hundreds of millions of variables have been successfully solved to optimality. This is the largest‐scale parallel sparsity‐exploiting revised simplex implementation developed to date and the first truly distributed solver. It is built on a novel analysis of the linear algebra for dual block‐angular LP problems solved with the revised simplex method and a novel parallel scheme for applying product‐form updates.
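For context, the dual block‐angular structure referred to in the abstract is the constraint‐matrix pattern of the extensive form of a two‐stage stochastic LP: one first‐stage block coupled to one block per scenario only through the first‐stage variables. The following is a generic sketch of that form (the notation A, T_i, W_i is standard in the stochastic programming literature and is assumed here, not quoted from the paper):

\[
\min\; c^\top x + \sum_{i=1}^{N} q_i^\top y_i
\quad\text{s.t.}\quad
\begin{pmatrix}
A      &     &        &     \\
T_1    & W_1 &        &     \\
\vdots &     & \ddots &     \\
T_N    &     &        & W_N
\end{pmatrix}
\begin{pmatrix} x \\ y_1 \\ \vdots \\ y_N \end{pmatrix}
=
\begin{pmatrix} b \\ h_1 \\ \vdots \\ h_N \end{pmatrix},
\qquad x \ge 0,\; y_i \ge 0,
\]

where x holds the first‐stage variables, y_i the second‐stage (recourse) variables for scenario i, and only the T_i columns tie the scenario blocks to the first stage. It is this sparsity pattern of the basis matrix that the paper's revised simplex linear algebra and product‐form updates are designed to exploit in a distributed‐memory setting.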
