Asynchronous Multi-fidelity Hyperparameter Optimization Of Spiking Neural Networks
Conference paper, 2024


Thomas Firmin
Pierre Boulet
El-Ghazali Talbi

Abstract

Spiking Neural Networks (SNNs) are peculiar networks based on the dynamics of timed spikes exchanged between fully asynchronous neurons. Their design is complex and differs from that of usual artificial neural networks, as they are highly sensitive to their hyperparameters. Some SNNs are unable to emit enough spikes at their outputs, making the learning task more challenging or even impossible. Such networks are called silent networks. By considering mistuned hyperparameters and architectures, this concept generalizes the signal-loss problem. In this work, to accelerate the hyperparameter optimization of SNNs trained by surrogate gradient, we propose to leverage silent networks and multi-fidelity evaluations. We designed an asynchronous, black-box, constrained, and cost-aware Bayesian optimization algorithm to handle high-dimensional search spaces containing many silent networks, which are treated as infeasible solutions. Large-scale experiments were run on a multi-node, multi-GPU environment. By accounting for the cost of evaluations, we were able to quickly obtain acceptable results for SNNs trained on a small proportion of the training dataset. We can rapidly stabilize the inherently high sensitivity of the SNNs' hyperparameters before computing expensive, more precise evaluations. We extended our methodology to search spaces containing 21 and up to 46 layer-wise hyperparameters. Despite the increased difficulty of the higher-dimensional space, our results are competitive with, and sometimes better than, their baselines. Finally, while up to 70% of sampled solutions were silent networks, their impact on the budget was less than 4%. The effect of silent networks on the available resources thus becomes almost negligible, allowing higher-dimensional, more general, and more flexible search spaces to be defined.
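The core idea of the abstract can be illustrated with a toy sketch: silent networks are detected early by a cheap spike-count check and charged only a negligible cost, while feasible candidates are evaluated at increasing fidelity (fraction of the training data). This is a minimal, hypothetical illustration only: plain random search stands in for the authors' asynchronous Bayesian optimization, and `spike_count`, `evaluate`, and the toy objective are invented stand-ins for a real SNN training run, not the paper's actual method.

```python
import random

def spike_count(hparams, fidelity):
    # Toy stand-in for training an SNN and counting output spikes.
    # A real evaluation would train on a `fidelity` fraction of the dataset.
    threshold, lr = hparams
    return max(0.0, (1.0 - threshold) * lr * 100.0 * fidelity)

def evaluate(hparams, fidelity, min_spikes=1.0):
    """Cost-aware evaluation: a network emitting too few output spikes is
    flagged as silent (infeasible) and charged only a small abort cost."""
    if spike_count(hparams, fidelity) < min_spikes:
        return None, 0.04 * fidelity      # silent network: tiny cost, no loss
    loss = abs(0.5 - hparams[0]) + abs(0.01 - hparams[1])  # toy objective
    return loss, fidelity                 # cost proportional to data fraction

def optimize(n_trials=200, seed=0):
    rng = random.Random(seed)
    best, spent, silent = None, 0.0, 0
    for t in range(n_trials):
        h = (rng.uniform(0.0, 1.0), rng.uniform(1e-4, 0.1))
        # Multi-fidelity schedule: cheap low-fidelity runs first,
        # full-dataset evaluations later.
        fidelity = 0.1 if t < n_trials // 2 else 1.0
        loss, cost = evaluate(h, fidelity)
        spent += cost
        if loss is None:
            silent += 1                   # infeasible sample, almost free
            continue
        if best is None or loss < best[0]:
            best = (loss, h)
    return best, spent, silent
```

Because silent samples abort before full training, even a high silent-network rate consumes only a small share of the total budget, mirroring the 70%-of-samples / under-4%-of-budget observation reported above.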
Main file

snn_multifidelity_icons2024.pdf (6.09 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04781629, version 1 (18-11-2024)

Identifiers

  • HAL Id: hal-04781629, version 1

Cite

Thomas Firmin, Pierre Boulet, El-Ghazali Talbi. Asynchronous Multi-fidelity Hyperparameter Optimization Of Spiking Neural Networks. International Conference on Neuromorphic Systems (ICONS 2024), Jul 2024, Washington, United States. ⟨hal-04781629⟩