MSc project proposal 2024
Title of the MSc project: Evolutionary Unconstrained Networks (EVOLUNET)
University: Université Claude Bernard Lyon 1, INSA Lyon, INRIA
Laboratories and teams: CREATIS (MYRIAD team), LIRIS (BEAGLE team)
Supervisors: Guillaume BESLON, BEAGLE team (LIRIS); Emmanuel ROUX, MYRIAD team (CREATIS); Jonathan ROUZAUD-CORNABAS, BEAGLE team (LIRIS)
Keywords: Neural Architecture Search, Machine Learning, Computational Evolution, Digital Evolution |
Scientific field and context:
One of the main motivations of this project is to explore new neural network architectures (Neural Architecture Search) [White et al., 2023] through a computational evolution algorithm.
Despite the current predominance of approaches in which fixed, deep architectures are trained by error-gradient backpropagation, the exploration of network architectures by (gradient-free) evolutionary algorithms represents a competitive and relatively unexplored alternative [Conti et al., 2018].
Yet the neural architecture search domain is rooted in computational evolution [Gaier and Ha, 2019], hence our motivation to rely on original evolutionary principles [Banzhaf et al., 2006]. Indeed, we want to "learn" the neural network architecture while taking advantage of the frugality of the resulting networks (evolution starts from a principle of minimal complexity) and of their intrinsic capabilities [Gaier and Ha, 2019], which are sometimes surprising [Lehman et al., 2020].
In particular, the aim is to exploit a library of varied mutational operators inspired by evolutionary genomics. In biology, genomes are known to be subject to a wide variety of sequence-modifying mechanisms (substitutions, InDels, chromosomal rearrangements, crossover, etc.) during their evolution, whereas "classic" evolutionary algorithms are generally based on a single mutation operator (or two at most, typically substitution and crossover).
However, a growing number of results show that "alternative" operators (in particular rearrangements) can facilitate exploration of the search space [Trujillo et al., 2022]. Indeed, rearrangements enable gene duplication and deletion, mechanisms thought to underlie the molecular complexification of organisms over the course of phylogeny. This is why we propose to integrate these operators into an innovative artificial evolution mechanism that iteratively builds a neural network.
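As a purely illustrative sketch (all names are hypothetical, not an existing EVOLUNET codebase), such an operator library could act on a linear genome, here a plain list of real-valued genes:

```python
import random

# Hypothetical linear genome: a flat list of real-valued "genes".
Genome = list

def substitution(g: Genome) -> Genome:
    """Point mutation: perturb one gene."""
    g = g.copy()
    i = random.randrange(len(g))
    g[i] += random.gauss(0.0, 0.1)
    return g

def indel(g: Genome) -> Genome:
    """Insert or delete a single gene, changing genome length."""
    g = g.copy()
    i = random.randrange(len(g))
    if random.random() < 0.5:
        g.insert(i, random.gauss(0.0, 1.0))
    elif len(g) > 1:
        del g[i]
    return g

def segmental_duplication(g: Genome) -> Genome:
    """Copy a contiguous segment elsewhere (gene duplication)."""
    i, j = sorted(random.sample(range(len(g) + 1), 2))
    k = random.randrange(len(g) + 1)
    return g[:k] + g[i:j] + g[k:]

def segmental_deletion(g: Genome) -> Genome:
    """Remove a contiguous segment (gene loss)."""
    if len(g) < 2:
        return g.copy()
    i, j = sorted(random.sample(range(len(g)), 2))
    return g[:i] + g[j:]

def inversion(g: Genome) -> Genome:
    """Reverse a contiguous segment (chromosomal rearrangement)."""
    i, j = sorted(random.sample(range(len(g) + 1), 2))
    return g[:i] + g[i:j][::-1] + g[j:]

OPERATORS = [substitution, indel, segmental_duplication,
             segmental_deletion, inversion]

def mutate(g: Genome) -> Genome:
    """Draw one operator from the library and apply it."""
    return random.choice(OPERATORS)(g)

genome = [random.gauss(0.0, 1.0) for _ in range(10)]
child = mutate(genome)  # the child's length may differ from the parent's
```

Unlike a fixed-length genetic algorithm, several of these operators change genome size, which is precisely what lets the evolved network grow or shrink.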
Expected innovative contributions:
Bioinspired mechanisms to regulate the complexity of neural networks according to the complexity of the task to be solved and the resources available to solve it.
Explore the possibility of letting the network population continue to adapt over time (Life-Long Learning) without requiring excessive external computational resources (e.g., no computing clusters).
Research program and proposed scientific approach:
Explore innovative computational bio-genomic methods to learn/individualize frugal neural network architectures.
Test chromosomal rearrangement-type mutational operators (duplication/deletion) likely to modify network complexity during evolution (see the decoding sketch after this list).
Test simplification approaches (evolutionary pruning) to constrain network complexity [Liard et al., 2020].
Test mixed-precision approaches to synaptic weights, in which the evolutionary algorithm adapts both the structure of the network (via the chromosomal rearrangement mechanisms mentioned above) and the precision of the weights, via "InDel" mechanisms that adapt gene length (a bit-string sketch follows the list).
Compare, on simple tasks (low-dimensional classification), the low-energy neural networks obtained (the evolutionary algorithm intrinsically limits network complexity) with state-of-the-art architectures whose complexity is predefined and which consume very large amounts of computing resources (a minimal evolutionary loop is sketched after this list).
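To make the rearrangement point concrete, here is one possible, purely illustrative decoding scheme (the block structure and all names are assumptions, not the project's actual encoding) in which each fixed-length gene block encodes one connection, so segmental duplications and deletions directly grow or shrink the network:

```python
import numpy as np

# Hypothetical decoding: the genome is read in blocks of 3 genes,
# each block (src, dst, weight) encoding one network connection.
GENE_LEN = 3

def decode(genome, n_in, n_out):
    """Build a weight matrix whose density follows the genome length,
    itself set by past duplications and deletions."""
    w = np.zeros((n_in, n_out))
    for k in range(len(genome) // GENE_LEN):
        src, dst, weight = genome[k * GENE_LEN:(k + 1) * GENE_LEN]
        i = int(abs(src)) % n_in      # map real-valued genes onto
        j = int(abs(dst)) % n_out     # valid neuron indices
        w[i, j] += weight             # duplicated genes reinforce links
    return w

genome = [2.0, 0.0, 0.7, 1.0, 0.0, -0.3]                 # two connections
print(decode(genome, n_in=3, n_out=1))
print(decode(genome + genome[:GENE_LEN], n_in=3, n_out=1))  # after one duplication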
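For the mixed-precision point, one can imagine (again purely as a sketch, with a made-up encoding) each weight being stored as a variable-length bit string, so that InDels on the gene change its number of bits, i.e., the precision of that single weight:

```python
def bits_to_weight(bits):
    """Decode a variable-length bit string into a weight in [-1, 1).
    The first bit is the sign; each extra bit adds one bit of precision,
    so InDels that lengthen or shorten the gene tune that weight's
    resolution without touching the rest of the genome."""
    if not bits:
        return 0.0
    sign = -1.0 if bits[0] else 1.0
    frac = sum(b / 2 ** (k + 1) for k, b in enumerate(bits[1:]))
    return sign * frac

print(bits_to_weight([0, 1, 1]))                    # 0.75, coarse (2 bits)
print(bits_to_weight([1, 1, 0, 1, 1, 0, 1, 0, 1]))  # finer resolution (8 bits)
```

Under such an encoding, precision becomes an evolvable, per-weight trait rather than a global hyperparameter.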
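Finally, the comparison on low-dimensional tasks could rest on a loop as simple as the following minimal (mu+lambda)-style sketch, which reuses `mutate`, `decode`, and `GENE_LEN` from the previous sketches; `LAMBDA` is a made-up complexity-penalty knob, not a project parameter:

```python
import random
import numpy as np

# Toy low-dimensional task: classify 2-D points by a noisy linear rule.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

LAMBDA = 0.001  # hypothetical penalty per connection

def fitness(genome):
    """Accuracy on the toy task minus a complexity penalty, so that
    selection favours small (frugal) networks."""
    w = decode(genome, n_in=2, n_out=1)
    pred = (X @ w >= 0).ravel().astype(int)
    return (pred == y).mean() - LAMBDA * (len(genome) // GENE_LEN)

# Start from minimal genomes (3 connections) and evolve.
pop = [[rng.normal() for _ in range(3 * GENE_LEN)] for _ in range(50)]
for gen in range(100):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                       # truncation selection
    pop = parents + [mutate(random.choice(parents)) for _ in range(40)]
```

The complexity term in the fitness is one simple way to express the "intrinsic limitation of network complexity" mentioned above; evolutionary pruning or resource-based constraints are obvious alternatives.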
Expected candidate profile (prerequisites): machine learning, programming. Interest in the biological, bio-genomic, and/or biomedical field(s).
Skills that will be developed during the project: strong experience in applied machine learning, integration of bio-inspired evolution mechanisms for neural network architecture design, collaborative and version-controlled programming. Ability to write a research article.
Contacts:
guillaume.beslon@inria.fr, emmanuel.roux@creatis.insa-lyon.fr
REFERENCES (in order of first citation):
White, C., et al. (2023). Neural architecture search: Insights from 1000 papers. arXiv preprint arXiv:2301.08727. https://deepai.org/publication/neural-architecture-search-insights-from-1000-papers
Conti, E., Madhavan, V., Petroski Such, F., Lehman, J., Stanley, K., & Clune, J. (2018). Improving exploration in evolution strategies for deep reinforcement learning via a population of novelty-seeking agents. In Advances in Neural Information Processing Systems. Curran Associates, Inc. https://proceedings.neurips.cc/paper_files/paper/2018/hash/b1301141feffabac455e1f90a7de2054-Abstract.html
Gaier, A., & Ha, D. (2019). Weight agnostic neural networks. In Advances in Neural Information Processing Systems 32 (pp. 5364–5378). Curran Associates, Inc. http://papers.nips.cc/paper/8777-weight-agnostic-neural-networks.pdf
Banzhaf, W., Beslon, G., Christensen, S., Foster, J. A., Képès, F., Lefort, V., ... & Ramsden, J. J. (2006). From artificial evolution to computational evolution: A research agenda. Nature Reviews Genetics, 7(9), 729–735.
Lehman, J., et al. (2020). The surprising creativity of digital evolution: A collection of anecdotes from the evolutionary computation and artificial life research communities. Artificial Life, 26(2), 274–306. doi:10.1162/artl_a_00319.
Trujillo, L., Banse, P., & Beslon, G. (2022). Getting higher on rugged landscapes: Inversion mutations open access to fitter adaptive peaks in NK fitness landscapes. PLoS Computational Biology, 18(10), e1010647.
Liard, V., Parsons, D. P., Rouzaud-Cornabas, J., & Beslon, G. (2020). The complexity ratchet: Stronger than selection, stronger than evolvability, weaker than robustness. Artificial Life, 26(1), 38–57.