Mk Landscapes, NK Landscapes, MAX-kSAT: A Proof that the Only Challenging Problems are Deceptive

This paper investigates Gray Box optimization for pseudo-Boolean optimization problems composed of M subfunctions, where each subfunction accepts at most k variables. We refer to these as Mk Landscapes. In Gray Box optimization, the optimizer is given access to the set of M subfunctions. If the set of subfunctions is k-bounded and separable, the Gray Box optimizer is guaranteed to return the global optimum after a single evaluation. A problem is said to be order-k deceptive if the average values of hyperplanes over combinations of k variables cannot be used to infer a globally optimal solution; these hyperplane averages are always efficiently computable for Mk Landscapes. If a problem is not deceptive, the Gray Box optimizer likewise returns the global optimum after a single evaluation. Finally, these concepts are used to understand the nonlinearity of problems in the complexity class P, such as Adjacent NK Landscapes, and to characterize the structure of NP-hard problems such as MAX-kSAT and general Mk Landscapes. In general, NP-hard problems are profoundly deceptive.
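
To make the setup concrete, below is a minimal Python sketch, assuming a lookup-table representation of the M k-bounded subfunctions. All names here (make_random_mk_landscape, evaluate, hyperplane_average) are hypothetical illustrations, not the paper's code. The sketch shows the two facts the abstract relies on: evaluating f(x) costs one pass over the M subfunctions, and a hyperplane average decomposes over the subfunctions, so it costs O(M * 2^k) rather than O(2^n).

```python
import itertools
import random

def make_random_mk_landscape(n, m, k, seed=0):
    """Return a list of (mask, table) pairs: mask is the tuple of k
    variable indices a subfunction reads; table maps each assignment
    of those variables to a real value."""
    rng = random.Random(seed)
    subfunctions = []
    for _ in range(m):
        mask = tuple(sorted(rng.sample(range(n), k)))
        table = {bits: rng.random()
                 for bits in itertools.product((0, 1), repeat=k)}
        subfunctions.append((mask, table))
    return subfunctions

def evaluate(subfunctions, x):
    """f(x) = sum of the M k-bounded subfunctions (one evaluation)."""
    return sum(table[tuple(x[i] for i in mask)]
               for mask, table in subfunctions)

def hyperplane_average(subfunctions, fixed):
    """Average of f over the hyperplane fixing the variables in
    `fixed` (a dict mapping index -> bit). The expectation of a sum
    is the sum of expectations, and each subfunction's expectation
    depends only on the fixed bits inside its own mask, so each
    contributes an O(2^k) sum over its free variables."""
    total = 0.0
    for mask, table in subfunctions:
        free = [i for i in mask if i not in fixed]
        acc = 0.0
        for bits in itertools.product((0, 1), repeat=len(free)):
            assign = dict(zip(free, bits))
            assign.update({i: fixed[i] for i in mask if i in fixed})
            acc += table[tuple(assign[i] for i in mask)]
        total += acc / (2 ** len(free))
    return total

if __name__ == "__main__":
    n, m, k = 10, 15, 3
    f = make_random_mk_landscape(n, m, k)
    rng = random.Random(1)
    x = [rng.randint(0, 1) for _ in range(n)]
    print("f(x) =", evaluate(f, x))
    # Average of f over the hyperplane x0 = 1, x3 = 0:
    print("hyperplane avg =", hyperplane_average(f, {0: 1, 3: 0}))
```

The decomposition in hyperplane_average is the key point: because f is a sum of k-bounded subfunctions, the conditional average splits into M independent O(2^k) sums, which is why hyperplane averages over combinations of k variables are always efficiently computable in the Gray Box setting, even when the landscape itself is NP-hard to optimize.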
