
Here it is: my first paper, published in Frontiers in Computational Neuroscience, in collaboration with Jean Rouat and Bertrand Reulet. It was a long path, but I’m proud of the result.

Here is a brief summary of my contribution to the great scientific enterprise:

Nowadays, the most powerful machine learning models require tens of thousands of GPUs and enormous amounts of energy to train billions of parameters. As reservoir computers emerge as a time- and cost-efficient alternative to traditional learning methods, a systematic approach to network design becomes crucial.

Previous studies have established the importance of critical regimes, or the “edge of chaos,” in optimizing the performance of spiking and binary neural networks. However, much less attention has been paid to how dynamics and performance vary across reservoirs that share the same connectivity statistics. Our work focuses on Random Boolean Networks (RBNs) and investigates how specific connectivity parameters influence the dynamics near critical points.
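For readers who have never met the model: a Random Boolean Network is simply a set of binary neurons, each updated by a fixed random Boolean function of a few randomly chosen inputs. Here is a minimal illustrative sketch, not the code used in the paper; the size `N`, in-degree `K`, and seed are arbitrary choices for the example:

```python
import numpy as np

# Minimal Random Boolean Network sketch (illustrative only).
# N binary neurons; each listens to K random inputs and applies a fixed
# random Boolean function, stored as a lookup table over the 2**K patterns.

rng = np.random.default_rng(0)
N, K = 100, 2                                     # assumed size and in-degree

inputs = rng.integers(0, N, size=(N, K))          # who each neuron listens to
tables = rng.integers(0, 2, size=(N, 2 ** K))     # random Boolean functions

def step(state):
    """Synchronously update all neurons from the current binary state."""
    # Encode each neuron's K input bits as an index into its lookup table.
    idx = np.zeros(N, dtype=int)
    for k in range(K):
        idx = (idx << 1) | state[inputs[:, k]]
    return tables[np.arange(N), idx]

state = rng.integers(0, 2, size=N)                # random initial condition
for _ in range(50):
    state = step(state)
```

Tuning knobs like the in-degree and the bias of the Boolean functions is what moves such a network toward or away from the critical regime.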

We identify distinct dynamical attractors and quantify their statistics, revealing that most reservoirs exhibit a dominant attractor. Our findings demonstrate that a positive excitatory balance in RBNs leads to a critical point with enhanced memory performance, while a negative inhibitory balance results in another critical point with improved prediction performance.
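To make “identifying attractors” concrete: a Boolean network has finitely many states, so iterating it from any initial condition must eventually revisit a state, and the repeating segment is the attractor. Below is a hedged sketch of that standard detection loop, reusing the illustrative `step` function above; `find_attractor` is a hypothetical name, not the paper’s code:

```python
def find_attractor(state, step, max_steps=10_000):
    """Iterate until a previously seen state recurs; return (transient, period)."""
    seen = {}
    for t in range(max_steps):
        key = state.tobytes()                 # hashable encoding of the binary state
        if key in seen:
            return seen[key], t - seen[key]   # transient length, attractor period
        seen[key] = t
        state = step(state)
    return None, None                         # no recurrence found within max_steps
```

Sampling many random initial conditions and tallying which cycle each one falls into is one way to estimate the kind of attractor statistics described above.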

Interestingly, we observe that the intrinsic attractor dynamics have minimal influence on performance but act as a hallmark of criticality. This allows us to give specific recommendations for reducing the computational burden of randomly generating reservoirs.

In the coming weeks, I am going to submit a paper closely related to this one.

So stay tuned!


Emmanuel Calvet

Engineer, researcher, and entrepreneur in artificial intelligence, blockchain, and quantum technology.
