On the non-convergence of differential evolution: some generalized adversarial conditions and a remedy

In this paper, we analyze the convergence behavior of Differential Evolution (DE) and theoretically prove that, under certain adversarial conditions, the generic DE algorithm may fail to converge to the global optimum even on apparently simple fitness landscapes. We characterize these function classes and initialization conditions theoretically and provide mathematical support for the non-convergence behavior of DE. To overcome these adversarial conditions, we propose a slightly modified variant of DE called Differential Evolution with Noisy Mutation (DENM), which incorporates a noise term in the mutation step. We analytically show that DENM can converge to the global optimum within a finite budget of function evaluations.
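To make the modification concrete, the following is a minimal illustrative sketch of a DE/rand/1 mutation step with an additive noise term in the spirit of DENM. The abstract does not specify the noise distribution, its scale parameter, or other details of the proposed variant, so the Gaussian noise with standard deviation `sigma` below is purely an assumption for illustration, not the paper's actual method.

```python
import numpy as np

def noisy_mutation(pop, F=0.8, sigma=0.1, rng=None):
    """One DE/rand/1 mutation sweep with an additive noise term.

    This is a sketch of the DENM idea described in the abstract:
    the standard donor vector x_r1 + F*(x_r2 - x_r3) is perturbed
    by a noise vector. The Gaussian choice and the `sigma` scale
    are illustrative assumptions, not taken from the paper.

    pop : (NP, D) array of candidate solutions.
    Returns an (NP, D) array of donor (mutant) vectors.
    """
    rng = np.random.default_rng() if rng is None else rng
    NP, _ = pop.shape
    donors = np.empty_like(pop)
    for i in range(NP):
        # Pick three distinct indices r1, r2, r3, all different from i.
        candidates = [j for j in range(NP) if j != i]
        r1, r2, r3 = rng.choice(candidates, size=3, replace=False)
        # Standard DE/rand/1 donor plus the extra (assumed Gaussian) noise term.
        noise = rng.normal(0.0, sigma, size=pop.shape[1])
        donors[i] = pop[r1] + F * (pop[r2] - pop[r3]) + noise
    return donors
```

Intuitively, the noise term keeps the difference vectors from collapsing when the population loses diversity, which is the mechanism one would expect to defeat the stagnation exhibited by generic DE under the adversarial conditions studied here.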