Overview

The (unweighted) Max Cut problem is to partition the vertex set of a given graph G=(V,E) into two sets S and V \ S so as to maximize the total number of edges between those two sets. Such a partition is called a maximum cut. Computing a maximum cut of a graph is a well-known NP-hard problem in computer science; its weighted variant is one of Karp's 21 NP-complete problems.
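To make the objective concrete, the following small C++ sketch (illustrative only, not part of the released code) evaluates the cut value of a given partition, i.e., counts the edges with one endpoint in S and one in V \ S:

#include <cstddef>
#include <iostream>
#include <utility>
#include <vector>

// Count the edges crossing the partition (S, V \ S).
// in_S[v] is true iff vertex v belongs to S.
std::size_t cut_value(const std::vector<std::pair<int, int>>& edges,
                      const std::vector<bool>& in_S) {
    std::size_t crossing = 0;
    for (const auto& [u, v] : edges) {
        if (in_S[u] != in_S[v]) ++crossing;  // one endpoint on each side
    }
    return crossing;
}

int main() {
    // A 4-cycle 0-1-2-3-0: placing {0,2} on one side cuts all four edges.
    std::vector<std::pair<int, int>> edges = {{0, 1}, {1, 2}, {2, 3}, {3, 0}};
    std::vector<bool> in_S = {true, false, true, false};
    std::cout << cut_value(edges, in_S) << '\n';  // prints 4
}

Max Cut asks for the partition maximizing this value over all 2^|V| choices of S.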

Kernelization is a general theoretical framework for preprocessing instances of NP-hard problems into (generally smaller) instances of bounded size via the repeated application of data reduction rules. For the fundamental Max Cut problem, kernelization algorithms are theoretically highly efficient for various parameterizations. However, the efficacy of these reduction rules in practice (to aid solving highly challenging benchmark instances to optimality) remains entirely unexplored. We engineered a new suite of efficient data reduction rules that subsume most of the previously published rules, and demonstrate their significant impact on benchmark data sets, including synthetic instances and data sets from the VLSI and image segmentation application domains. Our experiments reveal that current state-of-the-art solvers can be sped up by up to multiple orders of magnitude when combined with our data reduction rules. On social and biological networks in particular, kernelization enables us to solve four instances that were previously unsolved in a ten-hour time limit with state-of-the-art solvers; three of these instances are now solved in less than two seconds. Here we provide the implementation of these algorithms.
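As a much simplified illustration of the data reduction idea (a folklore sketch, not the rule suite engineered in the paper), the following C++ code removes isolated vertices and pendant (degree-1) vertices from an unweighted instance: a degree-1 vertex can always be placed on the side opposite its single neighbour, so its edge is cut in some optimal solution, and the optimum of the original graph equals the optimum of the reduced graph plus the accumulated offset:

#include <cstddef>
#include <iostream>
#include <queue>
#include <vector>

// Simplified data reduction for unweighted Max Cut (NOT the paper's rule set):
// drop isolated vertices and remove pendant vertices while crediting their
// edge to 'offset'.
struct ReductionResult {
    std::vector<bool> removed;  // removed[v] == true if v was reduced away
    std::size_t offset = 0;     // cut edges guaranteed by the reductions
};

ReductionResult reduce_low_degree(const std::vector<std::vector<int>>& adj) {
    const std::size_t n = adj.size();
    std::vector<int> degree(n);
    for (std::size_t v = 0; v < n; ++v) degree[v] = static_cast<int>(adj[v].size());

    ReductionResult result;
    result.removed.assign(n, false);

    std::queue<int> work;  // vertices whose current degree is at most 1
    for (std::size_t v = 0; v < n; ++v)
        if (degree[v] <= 1) work.push(static_cast<int>(v));

    while (!work.empty()) {
        int v = work.front();
        work.pop();
        if (result.removed[v] || degree[v] > 1) continue;
        result.removed[v] = true;
        if (degree[v] == 1) {
            result.offset += 1;  // the pendant edge can always be cut
            for (int u : adj[v]) {           // decrement the remaining neighbour
                if (result.removed[u]) continue;
                if (--degree[u] <= 1) work.push(u);
            }
        }
    }
    return result;
}

int main() {
    // Path 0-1-2: both endpoints are pendant; the whole path reduces away
    // and both edges are guaranteed to be cut.
    std::vector<std::vector<int>> adj = {{1}, {0, 2}, {1}};
    ReductionResult r = reduce_low_degree(adj);
    std::cout << "offset = " << r.offset << '\n';  // prints 2
}

The reductions implemented in this project are more involved and also cover weighted instances; see the paper for the full rule set.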



News:
8th January 2024: We moved this website to GitHub.
8th January 2020: Initial release of the code.
7th January 2020: Our paper was presented at ALENEX'20.
26th May 2019: Published the arXiv report. Link to PDF.


License

The program is licensed under the MIT License.
If you publish results using our algorithms, please acknowledge our work by citing the following paper (PDF):

@inbook{doi:10.1137/1.9781611976007.3,
  author    = {Damir Ferizovic and Demian Hespe and Sebastian Lamm and Matthias Mnich and Christian Schulz and Darren Strash},
  title     = {Engineering Kernelization for Maximum Cut},
  booktitle = {2020 Proceedings of the Workshop on Algorithm Engineering and Experiments (ALENEX)},
  year      = {2020},
  pages     = {27--41},
  doi       = {10.1137/1.9781611976007.3},
  url       = {https://epubs.siam.org/doi/abs/10.1137/1.9781611976007.3},
  eprint    = {https://epubs.siam.org/doi/pdf/10.1137/1.9781611976007.3}
}

Download

  • The code is available on GitHub.

Support

  • Write us an email if you need support!
  • We welcome any comments and error reports (as well as bug fixes or feature requests) that you send us.

Other Open Source Projects

  • KaHIP -- Karlsruhe High Quality Partitioning
  • ParHIP -- Parallel High Quality Partitioning
  • KaMIS -- Karlsruhe Maximum Independent Sets
  • KaLP -- Karlsruhe Longest Paths
  • VieClus -- Vienna Graph Clustering
  • KaDraw -- Karlsruhe Graph Drawing
  • VieM -- Vienna Process Mapping
  • KaGen -- Karlsruhe Graph Generation
  • KaHyPar -- Karlsruhe Hypergraph Partitioning