Parameterized approximation algorithm

A parameterized approximation algorithm is a type of algorithm that aims to find approximate solutions to NP-hard optimization problems in time that is polynomial in the input size times a function of a specific parameter. These algorithms are designed to combine the best aspects of both traditional approximation algorithms and fixed-parameter tractability.

In traditional approximation algorithms, the goal is to find solutions that are at most a certain factor [math]\displaystyle{ \alpha }[/math] away from the optimal solution, known as an [math]\displaystyle{ \alpha }[/math]-approximation, in polynomial time. On the other hand, parameterized algorithms are designed to find exact solutions to problems, but with the constraint that the running time is bounded by a function of a specific parameter [math]\displaystyle{ k }[/math] times a polynomial in the input size. The parameter describes some property of the input and is small in typical applications. The problem is said to be fixed-parameter tractable (FPT) if there is an algorithm that can find the optimum solution in [math]\displaystyle{ f(k)n^{O(1)} }[/math] time, where [math]\displaystyle{ f(k) }[/math] is a function independent of the input size [math]\displaystyle{ n }[/math].
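
To illustrate the [math]\displaystyle{ f(k)n^{O(1)} }[/math] runtime form, the following minimal Python sketch shows a standard textbook example (not otherwise discussed in this article): it decides the Vertex Cover problem by branching on the two endpoints of an uncovered edge. Since the recursion depth is at most [math]\displaystyle{ k }[/math] and each call takes polynomial time, the total running time is of the form [math]\displaystyle{ 2^k n^{O(1)} }[/math].

    # Textbook bounded-search-tree algorithm for Vertex Cover (illustration only).
    # The recursion depth is at most k and each call branches in two ways, so the
    # total running time is O(2^k * m), which is of the form f(k) * n^{O(1)}.
    def has_vertex_cover(edges, k):
        """Return True iff the graph given by 'edges' has a vertex cover of size <= k."""
        if not edges:
            return True   # no edges left: the empty set is a cover
        if k == 0:
            return False  # some edge remains, but no budget is left
        u, v = edges[0]   # any vertex cover must contain u or v; branch on both
        without_u = [(a, b) for (a, b) in edges if u not in (a, b)]
        without_v = [(a, b) for (a, b) in edges if v not in (a, b)]
        return has_vertex_cover(without_u, k - 1) or has_vertex_cover(without_v, k - 1)

    # A cycle on four vertices has a vertex cover of size 2, but none of size 1.
    cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
    assert has_vertex_cover(cycle, 2) and not has_vertex_cover(cycle, 1)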

A parameterized approximation algorithm aims to find a balance between these two approaches by finding approximate solutions in FPT time: the algorithm computes an [math]\displaystyle{ \alpha }[/math]-approximation in [math]\displaystyle{ f(k)n^{O(1)} }[/math] time, where [math]\displaystyle{ f(k) }[/math] is a function independent of the input size [math]\displaystyle{ n }[/math]. This approach aims to overcome the limitations of both paradigms: it provides stronger guarantees on the solution quality than traditional polynomial-time approximations, while retaining the efficient running times of FPT algorithms. An overview of the research area studying parameterized approximation algorithms can be found in the survey of Marx[1] and the more recent survey by Feldmann et al.[2]

Obtainable approximation ratios

The full potential of parameterized approximation algorithms is utilized when a given optimization problem is shown to admit an [math]\displaystyle{ \alpha }[/math]-approximation algorithm running in [math]\displaystyle{ f(k)n^{O(1)} }[/math] time, while in contrast the problem neither has a polynomial-time [math]\displaystyle{ \alpha }[/math]-approximation algorithm (under some complexity assumption, e.g., [math]\displaystyle{ P\neq NP }[/math]), nor an FPT algorithm for the given parameter [math]\displaystyle{ k }[/math] (i.e., it is at least W[1]-hard).

For example, some problems that are APX-hard and W[1]-hard admit a parameterized approximation scheme (PAS), i.e., for any [math]\displaystyle{ \varepsilon\gt 0 }[/math] a [math]\displaystyle{ (1+\varepsilon) }[/math]-approximation can be computed in [math]\displaystyle{ f(k,\varepsilon)n^{g(\varepsilon)} }[/math] time for some functions [math]\displaystyle{ f }[/math] and [math]\displaystyle{ g }[/math]. This then circumvents the lower bounds in terms of polynomial-time approximation and fixed-parameter tractability. A PAS is similar in spirit to a polynomial-time approximation scheme (PTAS) but additionally exploits a given parameter [math]\displaystyle{ k }[/math]. Since the degree of the polynomial in the runtime of a PAS depends on a function [math]\displaystyle{ g(\varepsilon) }[/math], the value of [math]\displaystyle{ \varepsilon }[/math] is assumed to be arbitrary but constant in order for the PAS to run in FPT time. If this assumption is unsatisfying, [math]\displaystyle{ \varepsilon }[/math] is treated as a parameter as well to obtain an efficient parameterized approximation scheme (EPAS), which for any [math]\displaystyle{ \varepsilon\gt 0 }[/math] computes a [math]\displaystyle{ (1+\varepsilon) }[/math]-approximation in [math]\displaystyle{ f(k,\varepsilon)n^{O(1)} }[/math] time for some function [math]\displaystyle{ f }[/math]. This is similar in spirit to an efficient polynomial-time approximation scheme (EPTAS).

k-cut

The k-cut problem has no polynomial-time [math]\displaystyle{ (2-\varepsilon) }[/math]-approximation algorithm for any [math]\displaystyle{ \varepsilon\gt 0 }[/math], assuming [math]\displaystyle{ P\neq NP }[/math] and the small set expansion hypothesis.[3] It is also W[1]-hard parameterized by the number [math]\displaystyle{ k }[/math] of required components.[4] However an EPAS exists, which computes a [math]\displaystyle{ (1+\varepsilon) }[/math]-approximation in [math]\displaystyle{ (k/\varepsilon)^{O(k)}n^{O(1)} }[/math] time.[5]

Steiner Tree

The Steiner tree problem is FPT parameterized by the number of terminals.[6] However, for the "dual" parameter consisting of the number [math]\displaystyle{ k }[/math] of non-terminals contained in the optimum solution, the problem is W[2]-hard (due to a folklore reduction from the Dominating Set problem). Steiner Tree is also known to be APX-hard.[7] However, there is an EPAS computing a [math]\displaystyle{ (1+\varepsilon) }[/math]-approximation in [math]\displaystyle{ 2^{O(k^2/\varepsilon^4)}n^{O(1)} }[/math] time.[8]
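
The FPT algorithm behind the first statement is the classical Dreyfus–Wagner dynamic program over subsets of terminals cited above. The following is a compact, unoptimized Python sketch of this dynamic program (the graph representation and function name are chosen for illustration); for [math]\displaystyle{ t }[/math] terminals it runs in roughly [math]\displaystyle{ 3^{t}n + 2^{t}n^2 }[/math] time after an all-pairs shortest-path computation.

    # Sketch of the Dreyfus-Wagner dynamic program: dp[S][v] is the minimum cost
    # of a tree connecting vertex v with the set S of terminals (S is a bitmask).
    def steiner_tree_cost(n, edges, terminals):
        """n: number of vertices 0..n-1, edges: list of (u, v, weight) of an
        undirected graph, terminals: list of terminal vertices.
        Returns the cost of a minimum Steiner tree connecting all terminals."""
        INF = float("inf")
        # All-pairs shortest paths via Floyd-Warshall (O(n^3), polynomial in n).
        dist = [[INF] * n for _ in range(n)]
        for v in range(n):
            dist[v][v] = 0
        for u, v, w in edges:
            dist[u][v] = min(dist[u][v], w)
            dist[v][u] = min(dist[v][u], w)
        for m in range(n):
            for i in range(n):
                for j in range(n):
                    if dist[i][m] + dist[m][j] < dist[i][j]:
                        dist[i][j] = dist[i][m] + dist[m][j]
        t = len(terminals)
        dp = [[INF] * n for _ in range(1 << t)]
        for i, term in enumerate(terminals):       # base case: single terminals
            for v in range(n):
                dp[1 << i][v] = dist[term][v]
        for S in range(1, 1 << t):                 # subsets in increasing order
            if S & (S - 1) == 0:
                continue                           # singletons already done
            # Step 1: merge two subtrees for a split of S at a common vertex u.
            merged = [INF] * n
            A = (S - 1) & S
            while A:
                B = S ^ A
                if A < B:                          # consider each unordered split once
                    for u in range(n):
                        merged[u] = min(merged[u], dp[A][u] + dp[B][u])
                A = (A - 1) & S
            # Step 2: attach the merged subtree to v via a shortest path.
            for v in range(n):
                dp[S][v] = min(dist[v][u] + merged[u] for u in range(n))
        return min(dp[(1 << t) - 1][v] for v in range(n))

    # Example: a path 0-1-2-3 with unit weights; connecting terminals {0, 3} costs 3.
    example_edges = [(0, 1, 1), (1, 2, 1), (2, 3, 1)]
    assert steiner_tree_cost(4, example_edges, [0, 3]) == 3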

Strongly-connected Steiner subgraph

It is known that the Strongly Connected Steiner Subgraph problem is W[1]-hard parameterized by the number [math]\displaystyle{ k }[/math] of terminals,[9] and also does not admit an [math]\displaystyle{ O(\log^{2-\varepsilon} n) }[/math]-approximation in polynomial time (under standard complexity assumptions).[10] However a 2-approximation can be computed in [math]\displaystyle{ 3^{k}n^{O(1)} }[/math] time.[11] Furthermore, this is best possible, since no [math]\displaystyle{ (2-\varepsilon) }[/math]-approximation can be computed in [math]\displaystyle{ f(k)n^{O(1)} }[/math] time for any function [math]\displaystyle{ f }[/math], under Gap-ETH.[12]

k-median and k-means

For the well-studied metric clustering problems of k-median and k-means parameterized by the number [math]\displaystyle{ k }[/math] of centers, it is known that no [math]\displaystyle{ (1+2/e-\varepsilon) }[/math]-approximation for k-Median and no [math]\displaystyle{ (1+8/e-\varepsilon) }[/math]-approximation for k-Means can be computed in [math]\displaystyle{ f(k)n^{O(1)} }[/math] time for any function [math]\displaystyle{ f }[/math], under Gap-ETH.[13] Matching parameterized approximation algorithms exist,[13] but it is not known whether matching approximations can be computed in polynomial time.

Clustering is often considered in settings of low-dimensional data, and thus a practically relevant parameterization is by the dimension of the underlying metric. In Euclidean space, the k-Median and k-Means problems admit an EPAS parameterized by the dimension [math]\displaystyle{ d }[/math],[14][15] and also an EPAS parameterized by [math]\displaystyle{ k }[/math].[16][17] The former was generalized to an EPAS for the parameterization by the doubling dimension.[18] For the loosely related highway dimension parameter, only an approximation scheme with XP runtime is known to date.[19]

k-center

For the metric k-center problem a 2-approximation can be computed in polynomial time. However, when parameterizing by either the number [math]\displaystyle{ k }[/math] of centers,[20] the doubling dimension (in fact the dimension of a Manhattan metric),[21] or the highway dimension,[20] no parameterized [math]\displaystyle{ (2-\varepsilon) }[/math]-approximation algorithm exists, under standard complexity assumptions. Furthermore, the k-Center problem is W[1]-hard even on planar graphs when simultaneously parameterizing it by the number [math]\displaystyle{ k }[/math] of centers, the doubling dimension, the highway dimension, and the pathwidth.[22] However, when combining [math]\displaystyle{ k }[/math] with the doubling dimension an EPAS exists,[22] and the same is true when combining [math]\displaystyle{ k }[/math] with the highway dimension.[23] For the more general version with vertex capacities, an EPAS exists for the parameterization by k and the doubling dimension, but not when using k and the highway dimension as the parameter.[24] Regarding the pathwidth, k-Center admits an EPAS even for the more general treewidth parameter, and also for cliquewidth.[25]
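
One classical way to obtain the polynomial-time 2-approximation mentioned above (independent of the cited parameterized results) is Gonzalez' farthest-first traversal. A minimal Python sketch, assuming dist is a metric on the given points:

    # Farthest-first traversal (Gonzalez' greedy heuristic): a classical
    # polynomial-time 2-approximation for metric k-Center.
    def k_center_greedy(points, k, dist):
        """Returns k centers whose covering radius is at most twice the optimum."""
        centers = [points[0]]                      # start from an arbitrary point
        while len(centers) < k:
            # Pick the point farthest from the current set of centers.
            farthest = max(points, key=lambda p: min(dist(p, c) for c in centers))
            centers.append(farthest)
        return centers

    # Example with five points on a line and the absolute-difference metric.
    print(k_center_greedy([0, 1, 2, 8, 9], 2, lambda a, b: abs(a - b)))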

Densest subgraph

An optimization variant of the k-Clique problem is the Densest k-Subgraph problem (which is a 2-ary Constraint Satisfaction problem), where the task is to find a subgraph on [math]\displaystyle{ k }[/math] vertices with the maximum number of edges. It is not hard to obtain a [math]\displaystyle{ (k-1) }[/math]-approximation by just picking a matching of size [math]\displaystyle{ k/2 }[/math] in the given input graph: the [math]\displaystyle{ k }[/math] matched vertices span at least [math]\displaystyle{ k/2 }[/math] edges, while the maximum number of edges on [math]\displaystyle{ k }[/math] vertices is always at most [math]\displaystyle{ {k \choose 2}= k(k-1)/2 }[/math]. This is also asymptotically optimal, since under Gap-ETH no [math]\displaystyle{ k^{1-o(1)} }[/math]-approximation can be computed in FPT time parameterized by [math]\displaystyle{ k }[/math].[26]
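
A minimal Python sketch of this matching-based argument (it uses a simple greedy maximal matching for brevity; a maximum matching computation would be used to guarantee [math]\displaystyle{ \lfloor k/2\rfloor }[/math] disjoint edges whenever they exist, as the argument above assumes):

    # Greedy sketch of the matching-based approximation for Densest k-Subgraph:
    # pick up to floor(k/2) vertex-disjoint edges and return their endpoints,
    # padded with arbitrary further vertices. The chosen k vertices span at least
    # as many edges as were picked, while any k vertices span at most k(k-1)/2.
    def densest_k_subgraph_via_matching(vertices, edges, k):
        chosen, used = [], set()
        for u, v in edges:
            if len(chosen) + 2 > k:
                break                      # no room for another full edge
            if u not in used and v not in used:
                used.update((u, v))
                chosen.extend((u, v))
        for v in vertices:                 # pad up to exactly k vertices
            if len(chosen) >= k:
                break
            if v not in used:
                used.add(v)
                chosen.append(v)
        return chosen

    # Example: on a 5-cycle with k = 4, the returned vertices span at least 2 edges,
    # while the best 4 vertices span 3 edges.
    print(densest_k_subgraph_via_matching(range(5), [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)], 4))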

Dominating set

For the Dominating set problem it is W[1]-hard to compute any [math]\displaystyle{ g(k) }[/math]-approximation in [math]\displaystyle{ f(k)n^{O(1)} }[/math] time for any functions [math]\displaystyle{ g }[/math] and [math]\displaystyle{ f }[/math].[27]

Approximate kernelization

Kernelization is a technique used in fixed-parameter tractability to pre-process an instance of an NP-hard problem in order to remove "easy parts" and reveal the NP-hard core of the instance. A kernelization algorithm takes an instance [math]\displaystyle{ I }[/math] and a parameter [math]\displaystyle{ k }[/math], and returns a new instance [math]\displaystyle{ I' }[/math] with parameter [math]\displaystyle{ k' }[/math] such that the size of [math]\displaystyle{ I' }[/math] and [math]\displaystyle{ k' }[/math] is bounded as a function of the input parameter [math]\displaystyle{ k }[/math], and the algorithm runs in polynomial time. An [math]\displaystyle{ \alpha }[/math]-approximate kernelization algorithm is a variation of this technique that is used in parameterized approximation algorithms. It returns a kernel [math]\displaystyle{ I' }[/math] such that any [math]\displaystyle{ \beta }[/math]-approximation in [math]\displaystyle{ I' }[/math] can be converted into an [math]\displaystyle{ \alpha\beta }[/math]-approximation to the input instance [math]\displaystyle{ I }[/math] in polynomial time. This notion was introduced by Lokshtanov et al.,[28] but there are other related notions in the literature such as Turing kernels[29] and [math]\displaystyle{ \alpha }[/math]-fidelity kernelization.[30]
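
Schematically, an [math]\displaystyle{ \alpha }[/math]-approximate kernelization is used as in the following Python sketch, where reduce, approximate, and lift are hypothetical callables standing for the polynomial-time reduction of the kernelization, an arbitrary [math]\displaystyle{ \beta }[/math]-approximation algorithm run on the kernel, and the polynomial-time solution-lifting step, respectively:

    # Schematic composition of an alpha-approximate kernelization with a
    # beta-approximation on the kernel (all three callables are hypothetical).
    def solve_via_approximate_kernel(instance, k, reduce, approximate, lift):
        kernel, k2 = reduce(instance, k)           # polynomial time; kernel size
                                                   # is bounded in terms of k
        kernel_solution = approximate(kernel, k2)  # any beta-approximation on the kernel
        # The lifting step runs in polynomial time and turns the beta-approximate
        # kernel solution into an (alpha * beta)-approximate solution for 'instance'.
        return lift(instance, k, kernel, k2, kernel_solution)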

As for regular (non-approximate) kernels, a problem admits an [math]\displaystyle{ \alpha }[/math]-approximate kernelization algorithm if and only if it has a parameterized [math]\displaystyle{ \alpha }[/math]-approximation algorithm. The proof of this fact is very similar to the one for regular kernels.[28] However, the guaranteed approximate kernel might be of exponential size (or worse) in the input parameter. Hence it becomes interesting to find problems that admit polynomial-sized approximate kernels. Furthermore, a polynomial-sized approximate kernelization scheme (PSAKS) is an [math]\displaystyle{ \alpha }[/math]-approximate kernelization algorithm that computes a polynomial-sized kernel and for which [math]\displaystyle{ \alpha }[/math] can be set to [math]\displaystyle{ 1+\varepsilon }[/math] for any [math]\displaystyle{ \varepsilon\gt 0 }[/math].

For example, while the Connected Vertex Cover problem is FPT parameterized by the solution size, it does not admit a (regular) polynomial-sized kernel (unless [math]\displaystyle{ NP\subseteq coNP/poly }[/math]), but a PSAKS exists.[28] Similarly, the Steiner Tree problem is FPT parameterized by the number of terminals and does not admit a polynomial-sized kernel (unless [math]\displaystyle{ NP\subseteq coNP/poly }[/math]), but a PSAKS exists.[28] When parameterizing Steiner Tree by the number of non-terminals in the optimum solution, the problem is W[2]-hard (and thus admits no exact kernel at all, unless FPT=W[2]), but still admits a PSAKS.[8]

References

  1. Marx, Daniel (2008). "Parameterized Complexity and Approximation Algorithms". The Computer Journal 51 (1): 60–78. doi:10.1093/comjnl/bxm048. https://doi.org/10.1093/comjnl/bxm048. 
  2. Feldmann, Andreas Emil; Karthik C. S; Lee, Euiwoong; Manurangsi, Pasin (2020). "A Survey on Approximation in Parameterized Complexity: Hardness and Algorithms" (in en). Algorithms 13 (6): 146. doi:10.3390/a13060146. ISSN 1999-4893.  This article incorporates text from this source, which is available under the CC BY 4.0 license.
  3. Manurangsi, Pasin (2018). "Inapproximability of Maximum Biclique Problems, Minimum k-Cut and Densest At-Least-k-Subgraph from the Small Set Expansion Hypothesis" (in en). Algorithms 11 (1): 10. doi:10.3390/a11010010. ISSN 1999-4893. 
  4. Downey, Rodney G.; Estivill-Castro, Vladimir; Fellows, Michael; Prieto, Elena; Rosamond, Frances A. (2003-04-01). "Cutting Up Is Hard To Do: The Parameterised Complexity of k-Cut and Related Problems" (in en). Electronic Notes in Theoretical Computer Science. CATS'03, Computing: the Australasian Theory Symposium 78: 209–222. doi:10.1016/S1571-0661(04)81014-4. ISSN 1571-0661. 
  5. Lokshtanov, Daniel; Saurabh, Saket; Surianarayanan, Vaishali (2022-04-25). "A Parameterized Approximation Scheme for Min k-Cut". SIAM Journal on Computing: FOCS20–205. doi:10.1137/20M1383197. ISSN 0097-5397. https://epubs.siam.org/doi/10.1137/20M1383197. 
  6. Dreyfus, S. E.; Wagner, R. A. (1971). "The steiner problem in graphs" (in en). Networks 1 (3): 195–207. doi:10.1002/net.3230010302. https://onlinelibrary.wiley.com/doi/10.1002/net.3230010302. 
  7. Chlebík, Miroslav; Chlebíková, Janka (2008-10-31). "The Steiner tree problem on graphs: Inapproximability results" (in en). Theoretical Computer Science. Algorithmic Aspects of Global Computing 406 (3): 207–214. doi:10.1016/j.tcs.2008.06.046. ISSN 0304-3975. https://www.sciencedirect.com/science/article/pii/S0304397508004660. 
  8. Dvořák, Pavel; Feldmann, Andreas E.; Knop, Dušan; Masařík, Tomáš; Toufar, Tomáš; Veselý, Pavel (2021-01-01). "Parameterized Approximation Schemes for Steiner Trees with Small Number of Steiner Vertices". SIAM Journal on Discrete Mathematics 35 (1): 546–574. doi:10.1137/18M1209489. ISSN 0895-4801. https://epubs.siam.org/doi/10.1137/18M1209489. 
  9. Guo, Jiong; Niedermeier, Rolf; Suchý, Ondřej (2011-01-01). "Parameterized Complexity of Arc-Weighted Directed Steiner Problems". SIAM Journal on Discrete Mathematics 25 (2): 583–599. doi:10.1137/100794560. ISSN 0895-4801. https://epubs.siam.org/doi/10.1137/100794560. 
  10. Halperin, Eran; Krauthgamer, Robert (2003-06-09). "Polylogarithmic inapproximability". Proceedings of the thirty-fifth annual ACM symposium on Theory of computing. STOC '03. New York, NY, USA: Association for Computing Machinery. pp. 585–594. doi:10.1145/780542.780628. ISBN 978-1-58113-674-6. https://doi.org/10.1145/780542.780628. 
  11. Chitnis, Rajesh; Hajiaghayi, MohammadTaghi; Kortsarz, Guy (2013). Gutin, Gregory; Szeider, Stefan. eds (in en). Fixed-Parameter and Approximation Algorithms: A New Look. Lecture Notes in Computer Science. 8246. Cham: Springer International Publishing. 110–122. doi:10.1007/978-3-319-03898-8_11. ISBN 978-3-319-03898-8. https://link.springer.com/chapter/10.1007/978-3-319-03898-8_11. 
  12. Chitnis, Rajesh; Feldmann, Andreas Emil; Manurangsi, Pasin (2021-04-19). "Parameterized Approximation Algorithms for Bidirected Steiner Network Problems". ACM Transactions on Algorithms 17 (2): 12:1–12:68. doi:10.1145/3447584. ISSN 1549-6325. https://doi.org/10.1145/3447584. 
  13. Cohen-Addad, Vincent; Gupta, Anupam; Kumar, Amit; Lee, Euiwoong; Li, Jason (2019). Baier, Christel; Chatzigiannakis, Ioannis; Flocchini, Paola et al.. eds. "Tight FPT Approximations for k-Median and k-Means". 46th International Colloquium on Automata, Languages, and Programming (ICALP 2019). Leibniz International Proceedings in Informatics (LIPIcs) (Dagstuhl, Germany: Schloss Dagstuhl–Leibniz-Zentrum fuer Informatik) 132: 42:1–42:14. doi:10.4230/LIPIcs.ICALP.2019.42. ISBN 978-3-95977-109-2. http://drops.dagstuhl.de/opus/volltexte/2019/10618. 
  14. Kolliopoulos, Stavros G.; Rao, Satish (1999), Nešetřil, Jaroslav, ed., "A Nearly Linear-Time Approximation Scheme for the Euclidean k-median Problem", Algorithms - ESA’ 99 (Berlin, Heidelberg: Springer Berlin Heidelberg) 1643: pp. 378–389, doi:10.1007/3-540-48481-7_33, ISBN 978-3-540-66251-8, http://link.springer.com/10.1007/3-540-48481-7_33, retrieved 2023-01-24 
  15. Cohen-Addad, Vincent (2018-01-01), "A Fast Approximation Scheme for Low-Dimensional k-Means", Proceedings of the 2018 Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), Proceedings (Society for Industrial and Applied Mathematics): pp. 430–440, doi:10.1137/1.9781611975031.29, ISBN 978-1-61197-503-1, https://epubs.siam.org/doi/10.1137/1.9781611975031.29, retrieved 2023-01-24 
  16. Feldman, Dan; Monemizadeh, Morteza; Sohler, Christian (2007-06-06). "A PTAS for k-means clustering based on weak coresets". Proceedings of the twenty-third annual symposium on Computational geometry - SCG '07. New York, NY, USA: Association for Computing Machinery. pp. 11–18. doi:10.1145/1247069.1247072. ISBN 978-1-59593-705-6. https://doi.org/10.1145/1247069.1247072. 
  17. Feldman, Dan; Langberg, Michael (2011-06-06). "A unified framework for approximating and clustering data". Proceedings of the forty-third annual ACM symposium on Theory of computing. STOC '11. New York, NY, USA: Association for Computing Machinery. pp. 569–578. doi:10.1145/1993636.1993712. ISBN 978-1-4503-0691-1. https://doi.org/10.1145/1993636.1993712. 
  18. Cohen-Addad, Vincent; Feldmann, Andreas Emil; Saulpic, David (2021-10-31). "Near-linear Time Approximation Schemes for Clustering in Doubling Metrics". Journal of the ACM 68 (6): 44:1–44:34. doi:10.1145/3477541. ISSN 0004-5411. https://doi.org/10.1145/3477541. 
  19. Feldmann, Andreas Emil; Saulpic, David (2021-12-01). "Polynomial time approximation schemes for clustering in low highway dimension graphs" (in en). Journal of Computer and System Sciences 122: 72–93. doi:10.1016/j.jcss.2021.06.002. ISSN 0022-0000. https://www.sciencedirect.com/science/article/pii/S0022000021000647. 
  20. Feldmann, Andreas Emil (2019-03-01). "Fixed-Parameter Approximations for k-Center Problems in Low Highway Dimension Graphs" (in en). Algorithmica 81 (3): 1031–1052. doi:10.1007/s00453-018-0455-0. ISSN 1432-0541. https://doi.org/10.1007/s00453-018-0455-0. 
  21. Feder, Tomás; Greene, Daniel (1988-01-01). "Optimal algorithms for approximate clustering". Proceedings of the twentieth annual ACM symposium on Theory of computing - STOC '88. New York, NY, USA: Association for Computing Machinery. pp. 434–444. doi:10.1145/62212.62255. ISBN 978-0-89791-264-8. https://doi.org/10.1145/62212.62255. 
  22. Feldmann, Andreas Emil; Marx, Dániel (2020-07-01). "The Parameterized Hardness of the k-Center Problem in Transportation Networks" (in en). Algorithmica 82 (7): 1989–2005. doi:10.1007/s00453-020-00683-w. ISSN 1432-0541. https://doi.org/10.1007/s00453-020-00683-w. 
  23. Becker, Amariah; Klein, Philip N.; Saulpic, David (2018). Azar, Yossi; Bast, Hannah; Herman, Grzegorz. eds. "Polynomial-Time Approximation Schemes for k-center, k-median, and Capacitated Vehicle Routing in Bounded Highway Dimension". 26th Annual European Symposium on Algorithms (ESA 2018). Leibniz International Proceedings in Informatics (LIPIcs) (Dagstuhl, Germany: Schloss Dagstuhl–Leibniz-Zentrum fuer Informatik) 112: 8:1–8:15. doi:10.4230/LIPIcs.ESA.2018.8. ISBN 978-3-95977-081-1. http://drops.dagstuhl.de/opus/volltexte/2018/9471. 
  24. Feldmann, Andreas Emil; Vu, Tung Anh (2022). Bekos, Michael A.; Kaufmann, Michael. eds (in en). Generalized k-Center: Distinguishing Doubling and Highway Dimension. Lecture Notes in Computer Science. 13453. Cham: Springer International Publishing. 215–229. doi:10.1007/978-3-031-15914-5_16. ISBN 978-3-031-15914-5. https://link.springer.com/chapter/10.1007/978-3-031-15914-5_16. 
  25. Katsikarelis, Ioannis; Lampis, Michael; Paschos, Vangelis Th. (2019-07-15). "Structural parameters, tight bounds, and approximation for (k,r)-center" (in en). Discrete Applied Mathematics. Combinatorial Optimization: between Practice and Theory 264: 90–117. doi:10.1016/j.dam.2018.11.002. ISSN 0166-218X. https://www.sciencedirect.com/science/article/pii/S0166218X18306024. 
  26. Dinur, Irit; Manurangsi, Pasin (2018). Karlin, Anna R.. ed. "ETH-Hardness of Approximating 2-CSPs and Directed Steiner Network". 9th Innovations in Theoretical Computer Science Conference (ITCS 2018). Leibniz International Proceedings in Informatics (LIPIcs) (Dagstuhl, Germany: Schloss Dagstuhl–Leibniz-Zentrum fuer Informatik) 94: 36:1–36:20. doi:10.4230/LIPIcs.ITCS.2018.36. ISBN 978-3-95977-060-6. http://drops.dagstuhl.de/opus/volltexte/2018/8367. 
  27. Karthik C. S.; Laekhanukit, Bundit; Manurangsi, Pasin (2018-06-20). "On the parameterized complexity of approximating dominating set". Proceedings of the 50th Annual ACM SIGACT Symposium on Theory of Computing. STOC 2018. New York, NY, USA: Association for Computing Machinery. pp. 1283–1296. doi:10.1145/3188745.3188896. ISBN 978-1-4503-5559-9. https://doi.org/10.1145/3188745.3188896. 
  28. Lokshtanov, Daniel; Panolan, Fahad; Ramanujan, M. S.; Saurabh, Saket (2017-06-19). "Lossy kernelization". Proceedings of the 49th Annual ACM SIGACT Symposium on Theory of Computing. STOC 2017. New York, NY, USA: Association for Computing Machinery. pp. 224–237. doi:10.1145/3055399.3055456. ISBN 978-1-4503-4528-6. https://doi.org/10.1145/3055399.3055456. 
  29. Hermelin, Danny; Kratsch, Stefan; Sołtys, Karolina; Wahlström, Magnus; Wu, Xi (2015-03-01). "A Completeness Theory for Polynomial (Turing) Kernelization" (in en). Algorithmica 71 (3): 702–730. doi:10.1007/s00453-014-9910-8. ISSN 1432-0541. https://doi.org/10.1007/s00453-014-9910-8. 
  30. Fellows, Michael R.; Kulik, Ariel; Rosamond, Frances; Shachnai, Hadas (2018-05-01). "Parameterized approximation via fidelity preserving transformations" (in en). Journal of Computer and System Sciences 93: 30–40. doi:10.1016/j.jcss.2017.11.001. ISSN 0022-0000. https://www.sciencedirect.com/science/article/pii/S0022000017302222.