Open access
Author
Date
2023
Type
Doctoral Thesis
ETH Bibliography
yes
Altmetrics
Abstract
The human brain and the evolutionary process (that created the human brain) are two powerful general problem solvers from nature. The attempt to understand and design algorithms based on them has led to the fields of neuroscience and evolutionary computing. This thesis contributes to these two areas by analyzing evolutionary algorithms and bio-inspired neural networks from a theoretical and empirical perspective.
Concretely, we contribute to the understanding of these two bio-inspired methods through three separate projects. First, we study the effect of the population size of a mutation-based evolutionary algorithm when optimizing pseudo-Boolean functions. We find that for some monotone functions, increasing the population size by even a constant can have devastating effects on performance. This is in stark contrast to many other benchmark functions, on which a larger population size has either positive or neutral effects. Moreover, we show that large population sizes only cause trouble far away from the optimum, which is counter-intuitive, since optimization usually becomes harder as the optimum is approached.
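As a rough illustration of the setting (not the specific algorithm or monotone functions analysed in the thesis), a (μ+1) evolutionary algorithm with standard bit mutation on the classic OneMax benchmark might be sketched as follows; all names and parameter values here are illustrative:

```python
import random

def onemax(x):
    """Classic pseudo-Boolean benchmark: number of ones in the bit string."""
    return sum(x)

def mu_plus_one_ea(f, n, mu, max_evals=50_000, seed=0):
    """(mu+1) EA sketch: keep a population of mu bit strings; each step,
    pick a uniform parent, flip each of its bits independently with
    probability 1/n, and let the child replace a worst individual if it
    is at least as fit.  Returns the number of fitness evaluations used."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(mu)]
    evals = mu
    while evals < max_evals:
        parent = rng.choice(pop)
        # standard bit mutation: flip each bit with probability 1/n
        child = [b ^ (rng.random() < 1 / n) for b in parent]
        evals += 1
        worst = min(range(mu), key=lambda i: f(pop[i]))
        if f(child) >= f(pop[worst]):
            pop[worst] = child
        if any(f(x) == n for x in pop):
            return evals
    return evals

# On OneMax a larger mu is harmless, whereas the thesis shows that on
# certain monotone functions even a constant increase in mu can make
# the runtime explode.
```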
Next, we consider a dynamic parameter control mechanism for adjusting the offspring population size of a non-elitist evolutionary algorithm. Previous work has shown that this mechanism can run into problems when the fitness landscape is too easy, and it was conjectured that the easiest benchmark function, OneMax, suffers the most from this issue. However, we show that there are other functions for which the problem is more severe than for OneMax, thus disproving the conjecture.
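One common success-based control mechanism of this kind shrinks the offspring population after an improving generation and grows it otherwise, combined with non-elitist comma selection. The following is a minimal sketch of that flavour of self-adjusting (1,λ) EA; the update factor, the cap on λ, and all names are assumptions for illustration, not the thesis's exact setup:

```python
import random

def self_adjusting_one_comma_lambda(f, n, F=1.5, max_evals=200_000, seed=1):
    """Sketch of a self-adjusting (1,lambda) EA: the best of lambda
    offspring always replaces the parent (non-elitist comma selection);
    lambda is divided by F after an improving generation and multiplied
    by F**(1/4) otherwise (capped at n here, an assumption of this
    sketch).  Returns the number of fitness evaluations used."""
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n)]
    lam, evals = 1.0, 0
    while f(parent) < n and evals < max_evals:
        k = max(1, round(lam))
        offspring = [
            [b ^ (rng.random() < 1 / n) for b in parent] for _ in range(k)
        ]
        evals += k
        best = max(offspring, key=f)
        if f(best) > f(parent):
            lam = max(1.0, lam / F)            # success: fewer offspring
        else:
            lam = min(lam * F ** 0.25, n)      # failure: more offspring
        parent = best                          # comma selection: always accept
    return evals
```

On very easy landscapes the mechanism keeps λ small, so the non-elitist acceptance can cost fitness; that tension is the failure mode the thesis quantifies, on functions beyond OneMax.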
Lastly, we model a sparse neural network with unstructured connections in the olfactory system of Drosophila. We show that despite the constraints imposed by the biological system, the network, equipped with a bio-plausible mechanism of global inhibition, can closely approximate a powerful machine learning model. We further investigate the effect of sparsity and find that sparse connections let the network strike a good balance between learning performance and noise resistance.
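The two ingredients named above, sparse unstructured wiring and global inhibition, can be sketched as a fly-olfaction-style expansion layer in which each output unit sums a few randomly chosen inputs and a winner-take-all step silences all but the most active units. Every parameter value and name below is an illustrative assumption, not the model from the thesis:

```python
import random

def fly_projection(x, n_kc=200, k_active=10, fan_in=6, seed=0):
    """Sketch of a sparse expansion with global inhibition: each of the
    n_kc Kenyon-cell-like units sums fan_in randomly chosen inputs
    (sparse, unstructured connections); a global winner-take-all step
    then keeps only the k_active most active units on."""
    rng = random.Random(seed)
    d = len(x)
    # sparse unstructured wiring: each output unit samples fan_in inputs
    wiring = [rng.sample(range(d), fan_in) for _ in range(n_kc)]
    acts = [sum(x[i] for i in idx) for idx in wiring]
    # global inhibition: only the top k_active activations survive
    threshold = sorted(acts, reverse=True)[k_active - 1]
    return [1 if a >= threshold else 0 for a in acts]
```

The sparsity knob here is `fan_in`: fewer connections per unit make the code more noise-robust but carry less information per unit, the trade-off the thesis investigates.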
Persistent link
https://doi.org/10.3929/ethz-b-000614107
Publication status
published
External links
Search for a print copy via the ETH Library
Publisher
ETH Zurich
Subject
Evolutionary algorithm; Monotone functions; Runtime analysis; Parameter control; Computational neuroscience; Neural networks
Organisational unit
03672 - Steger, Angelika / Steger, Angelika