
Brainstorming the research: the future of brain studies



One of the least known places, still far from being explored and fully understood, is not (only) the deep universe: it is our brain. That is why the United States and the European Union have, in recent years, launched two separate but parallel initiatives to map the human brain. In March 2014 it was publicly announced that the two projects will merge. Or will they clash?

The Human Brain Project

The first attempt to simulate parts of the structure of a brain dates back to 2005, when IBM, in collaboration with the École Polytechnique Fédérale de Lausanne in Switzerland, started a research programme called the Blue Brain Project. After just one year, researchers were able to create a virtual but biologically realistic model of a single neuron, thanks to a supercomputer called Blue Gene. In 2008, the Blue Brain Project recreated a neocortical column of 10,000 cells and finally, in 2011, a virtual model of about a million neurons was built.

Given these results, in 2013 the European Union decided to scale up the effort and map the entire human brain. Together with the Graphene Flagship, the Human Brain Project (HBP) was selected for the funding programme called Future and Emerging Technologies (FET). This gave researchers access to over one billion euro over ten years. The HBP is still based in Lausanne (even after the recent tensions between Switzerland and the EU) and involves 135 partner institutions from 26 countries.

Briefly, the HBP is supposed to create a virtual but realistic model of the entire human brain. This will not only provide a manageable model for studying still poorly understood brain diseases such as Alzheimer's but, through neuroinformatics, it is also expected to have implications for neurorobotics and high-performance computing.

The BRAIN Initiative

The Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative differs in part, though not fundamentally, from the European HBP, both in aims and structure. Announced on April 2nd, 2013 by President Obama (who strongly endorsed it), BRAIN aims to map the activity of every single neuron in our brain. The goal is to provide a “new dynamic picture of the brain that, for the first time, shows how individual cells and complex neural circuits interact in both time and space” (as explained in the interim report, which lays out the six high-priority research areas). The initiative is funded by the National Institutes of Health, the Defense Advanced Research Projects Agency (DARPA) and the National Science Foundation (NSF), with a budget of about $100 million for fiscal year 2014 (roughly $40 million of which comes from the NIH). For the following year, President Obama has already proposed to double the funding, from $100 million to $200 million.

Synergies and difficulties

The two programmes have approximately the same timelines and the same budget (about one billion euro each), and many scientists from the two projects have already begun to collaborate informally, as happens, for instance, at the Allen Institute for Brain Science. As reported by Nature, US government officials have already announced the merging process: HBP and BRAIN will work together. Meanwhile, after settling its political clash with the EU (at least for the moment), Israel too has become involved in the HBP/BRAIN partnership, through the HBP.

Many scientists have called it a natural marriage since, basically, the two programmes deal with the same object (the brain) from different perspectives (briefly: the HBP is more focused on creating a “virtual brain”, BRAIN on uncovering the interactions between all of the brain's cells). But many other analysts fear that something could go very wrong: sceptics base their criticism on at least three aspects.

First of all, many scientists argue that the paradigm underlying the two projects is by now outdated or simply wrong. In other words, they claim that the plasticity of brain activity cannot be “photographed”, even with an incredibly complex virtual model. Others warn of a so-called “data tsunami”: it has been estimated that a complete virtual brain could generate about 300,000 petabytes a year, that is, some 300 billion gigabytes of data every year. That is almost the same data traffic as the entire internet generated in 2013 (which some analysts put at 32,000 petabytes per month, roughly 384,000 petabytes a year). For comparison, the Large Hadron Collider, once fully operational, will generate “just” 15 petabytes a year. Finally, on the ethical side, shared principles still have to be agreed upon, which is never easy when dealing with the study of parts of the human body.
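To see how these orders of magnitude compare, here is a minimal back-of-the-envelope sketch in Python, using only the figures quoted above (the 32,000 petabytes per month and the 15 petabytes per year are the analysts' estimates cited in this article, not independently verified numbers):

```python
# Back-of-the-envelope comparison of the data volumes mentioned above (illustrative only).
PETABYTE = 10**15  # bytes

virtual_brain_per_year = 300_000 * PETABYTE        # estimated yearly output of a complete virtual brain
internet_2013_per_year = 32_000 * PETABYTE * 12    # ~32,000 PB per month of global internet traffic in 2013
lhc_per_year = 15 * PETABYTE                       # LHC output per year once fully operational

print(f"Virtual brain:   {virtual_brain_per_year / PETABYTE:>9,.0f} PB/year")
print(f"Internet (2013): {internet_2013_per_year / PETABYTE:>9,.0f} PB/year")
print(f"LHC:             {lhc_per_year / PETABYTE:>9,.0f} PB/year")
print(f"Virtual brain vs internet: {virtual_brain_per_year / internet_2013_per_year:.0%}")
```

Under these assumptions, a complete virtual brain would produce roughly three quarters of the data volume of the whole 2013 internet, and about 20,000 times the LHC's yearly output.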



