Found 2 Documents
Using machine learning to predict the number of alternative solutions to a minimum cardinality set covering problem
Emerick, Brooks; Lu, Yun; Vasko, Francis J.
International Journal of Industrial Optimization, Vol. 2, No. 1 (2021)
Publisher : Universitas Ahmad Dahlan

DOI: 10.12928/ijio.v2i1.2948

Abstract

Although the characterization of alternative optimal solutions for linear programming problems is well known, such characterizations for combinatorial optimization problems are essentially non-existent. This is the first article to qualitatively predict the number of alternative optima for a classic NP-hard combinatorial optimization problem, namely the minimum cardinality (also called unicost) set covering problem (MCSCP). In the MCSCP, a set must be covered by a minimum number of subsets selected from a specified collection of subsets of that set. The MCSCP has numerous industrial applications in which a secondary objective must be optimized once the size of a minimum cover has been determined; knowing how many minimum covers exist is therefore valuable. In this article, for the first time, a machine learning methodology is presented that generates categorical regression trees to predict, qualitatively (extra-small, small, medium, large, or extra-large), the number of solutions to an MCSCP. Using the machine learning toolbox of MATLAB®, 600,000 unique random MCSCPs were generated and used to construct regression trees, whose prediction quality was then tested on 5,000 different MCSCPs. For the five-output model, the average accuracy of being at most one category off from the prediction was 94.2%.
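The quantity the abstract predicts can be made concrete with a tiny, self-contained sketch: enumerate covers of increasing cardinality until one exists; that cardinality is the minimum cover size, and the number of covers of that size is the number of alternative optima. The function name and toy instance below are illustrative, not from the paper, and this brute-force enumeration is feasible only for very small instances, which is precisely why the authors turn to machine-learning prediction for realistic MCSCPs.

```python
from itertools import combinations

def min_cover_count(universe, subsets):
    # Try cover sizes k = 1, 2, ... and stop at the first k for which
    # at least one cover exists; that k is the minimum cardinality, and
    # the number of size-k covers is the number of alternative optima.
    n = len(subsets)
    for k in range(1, n + 1):
        optima = [combo for combo in combinations(range(n), k)
                  if set().union(*(subsets[i] for i in combo)) >= universe]
        if optima:
            return k, len(optima)
    return None, 0  # the subsets do not cover the universe

# Toy instance (illustrative): two distinct minimum covers of size 2,
# {1,2}+{3,4,5} and {1,3,5}+{2,4}.
universe = {1, 2, 3, 4, 5}
subsets = [{1, 2}, {3, 4, 5}, {1, 3, 5}, {2, 4}]
size, count = min_cover_count(universe, subsets)
```

The enumeration grows combinatorially with the number of subsets, so a predictive model of the count, as in the paper, is the only practical option at scale.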
Filling the gap in weighted set covering problem test instances: implications for both researchers and practitioners
Vasko, Francis J.; Lu, Yun; Song, Myung Soon; Rando, Dominic
International Journal of Industrial Optimization, Vol. 6, No. 1 (2025)
Publisher : Universitas Ahmad Dahlan

DOI: 10.12928/ijio.v6i1.10836

Abstract

Since 1990, the quality of approximate solution methods for weighted set covering problems (WSCPs) has been measured by how well they solve the 65 WSCPs available in Beasley's OR-Library. A 2024 paper showed that guaranteed optimal solutions can easily be obtained for 55 of these WSCPs using general-purpose integer programming software. These 55 WSCPs have at most 500 rows and 5,000 columns and were solved in a few seconds on a standard PC. The remaining 10 WSCPs, however, have 1,000 rows and 10,000 columns and either required considerably more than 1,000 seconds to reach guaranteed optima (data set NRG) or yielded no optima at all (data set NRH). The purpose of this short paper is to quantify the solution times needed, with general-purpose integer programming software, for WSCPs larger than 500 rows and 5,000 columns but smaller than 1,000 rows and 10,000 columns. This matters because the gap in both size and solution time is so large: times jump from a few seconds for the 55 "smaller" WSCPs to very long runs for the two largest data sets. To fill this gap, 40 new WSCP instances are defined and their solution times analyzed to determine when WSCPs, based on size and density, can be expected to be solved to optimality in a timely manner with general-purpose integer programming software such as Gurobi or CPLEX.
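The problem handed to Gurobi or CPLEX in this paper is the ILP: minimize the total weight of selected columns subject to every row being covered, with binary selection variables. Neither commercial solver is shown here, but the formulation can be sketched with a stdlib-only exhaustive search over a toy instance (the function name and instance are illustrative, not one of the paper's 40 new instances); real WSCPs at the sizes discussed require an actual ILP solver.

```python
from itertools import combinations

def solve_wscp(universe, subsets, weights):
    # Exhaustive stand-in for the ILP  min sum_j w_j x_j  subject to
    # covering every element of the universe, x_j binary: check every
    # selection of subsets and keep the cheapest one that covers.
    best_weight, best_cover = float("inf"), None
    for k in range(1, len(subsets) + 1):
        for combo in combinations(range(len(subsets)), k):
            if set().union(*(subsets[i] for i in combo)) >= universe:
                w = sum(weights[i] for i in combo)
                if w < best_weight:
                    best_weight, best_cover = w, combo
    return best_weight, best_cover

# Toy instance: the single subset covering everything (weight 5) costs
# more than combining the two cheaper subsets {1,2} and {3,4} (weight 4).
universe = {1, 2, 3, 4}
subsets = [{1, 2}, {3, 4}, {1, 2, 3, 4}, {2, 3}]
weights = [2, 2, 5, 1]
best_weight, best_cover = solve_wscp(universe, subsets, weights)
```

Unlike the unicost MCSCP above, the weighted objective means the optimal cover need not be the smallest one, which is what makes branch-and-bound in general-purpose ILP software the method of choice at the row and column counts the paper studies.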