Articles

Using GPUs to discover the connectivity of the human brain

by Jacqueline Ruck, Writer and Blogger

A new GPU-based machine learning algorithm developed by researchers at the Indian Institute of Science (IISc) can help scientists better understand and predict connectivity between different brain regions. The algorithm, called Regularized, Accelerated, Linear Fascicle Evaluation, or ReAL-LiFE, can quickly analyze the huge amounts of data generated from diffusion magnetic resonance imaging (dMRI) of the human brain.

Using ReAL-LiFE, the team was able to evaluate dMRI data more than 150 times faster than existing state-of-the-art algorithms.

"Tasks that previously took hours or days can be completed in seconds or minutes," says Devarajan Sridharan, Associate Professor, Center for Neuroscience (CNS), IISc, and corresponding author of the study published in the journal  Nature Computational Science  .

Neurons and brain connectivity

Millions of neurons fire in the brain every second, generating electrical pulses that travel through neural networks from one point in the brain to another via connecting wires, or "axons." These connections are essential for the calculations that the brain performs.

 "Understanding brain connectivity is critical to uncovering relationships between the brain and behavior at scale," says Varsha Sreenivasan, CNS PhD student and first author of the study.

However, conventional approaches to studying brain connectivity often use animal models and are invasive. dMRI scans, on the other hand, provide a non-invasive method to study brain connectivity in humans.

The cables (axons) that connect different areas of the brain are its information highways. Because the axon bundles are shaped like tubes, water molecules move through them along their length in a directed manner. dMRI allows scientists to track this movement to create a complete map of the network of fibers in the brain, called a connectome.

Unfortunately, it is not easy to identify these connectomes. The data obtained from the scans only provides the net flux of water molecules at each point in the brain. “Imagine that the water molecules are cars. The information obtained is the direction and speed of the vehicles at each point in space and time, without information about the roads. Our task is similar to inferring road networks by looking at these traffic patterns,” the researchers explain.
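To make the analogy concrete, here is a minimal sketch (not the authors' code, and using a made-up direction field) of the basic idea behind tractography: starting from a seed point, repeatedly step along the local water-diffusion direction to trace out a candidate fiber path.

```python
# Minimal sketch (not the authors' code): deterministic streamline tracking
# over a synthetic per-voxel direction field -- the basic idea behind dMRI
# tractography. All names and data here are illustrative.
import numpy as np

def track_streamline(directions, seed, step=0.5, max_steps=200):
    """Follow the local diffusion direction from a seed point.

    directions: array of shape (X, Y, Z, 3), one unit vector per voxel
    seed:       starting coordinate (x, y, z) in voxel space
    """
    point = np.asarray(seed, dtype=float)
    path = [point.copy()]
    for _ in range(max_steps):
        voxel = tuple(np.round(point).astype(int))
        # Stop if the path leaves the imaged volume
        if any(v < 0 or v >= directions.shape[i] for i, v in enumerate(voxel)):
            break
        d = directions[voxel]
        norm = np.linalg.norm(d)
        if norm < 1e-6:              # no coherent fiber direction here
            break
        point = point + step * d / norm
        path.append(point.copy())
    return np.array(path)

# Toy volume in which every voxel's fiber direction points along the x-axis
field = np.zeros((10, 10, 10, 3))
field[..., 0] = 1.0
print(track_streamline(field, seed=(1, 5, 5)).shape)  # (n_points, 3) path
```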

The image shows the connections between the midbrain and various regions of the neocortex. The connections to each region are shown in a different color and were all estimated with diffusion MRI and tractography in the living human brain (Credits: Varsha Sreenivasan and Devarajan Sridharan)

Algorithms to identify neural networks

To identify these networks accurately, conventional algorithms adjust the inferred connectome so that its predicted dMRI signal closely matches the observed dMRI signal. Scientists had previously developed an algorithm called LiFE (Linear Fascicle Evaluation) to carry out this optimization, but one of its challenges was that it ran on traditional central processing units (CPUs), making the calculation time-consuming.
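Conceptually, this kind of evaluation can be cast as a non-negative least-squares problem: find fiber weights w ≥ 0 such that the predicted signal M w best matches the observed signal y. The toy sketch below, with random stand-in data, shows what that optimization looks like; it is an illustration of the idea, not the published implementation.

```python
# Illustrative sketch only: LiFE-style evaluation framed as non-negative
# least squares, y ~= M w with w >= 0, where each column of M is the dMRI
# signal predicted for one candidate fiber and w holds the fiber weights.
# M and y below are random toy data, not real dMRI measurements.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_measurements, n_fibers = 500, 50

M = rng.normal(size=(n_measurements, n_fibers))          # predicted signal per fiber
w_true = np.maximum(rng.normal(size=n_fibers), 0)        # sparse-ish "true" weights
y = M @ w_true + 0.01 * rng.normal(size=n_measurements)  # simulated observed signal

w_hat, residual = nnls(M, y)       # non-negative weights that best explain y
kept = int(np.sum(w_hat > 1e-6))   # fibers the optimization actually retains
print(f"kept {kept} of {n_fibers} candidate fibers, residual {residual:.3f}")
```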

In the new study, Sridharan's team modified their algorithm to reduce the computational effort involved in several ways, including by removing redundant connections, thereby significantly improving on LiFE's performance. To speed up the algorithm further, the team also redesigned it to run on specialized electronic chips, the kind found in high-end gaming computers, called Graphics Processing Units (GPUs), which helped them analyze data at speeds 100-150 times faster than previous approaches.
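As a rough illustration of why GPUs help, the sketch below runs a simple projected-gradient version of the same kind of non-negative, sparsity-encouraging optimization in PyTorch; the large matrix operations that dominate the cost are exactly what GPUs parallelize well. The penalty and update rule here are assumptions for illustration, not ReAL-LiFE's actual implementation.

```python
# Rough sketch of how such an optimization maps onto a GPU, using PyTorch.
# This is NOT the authors' implementation; the penalty and update rule below
# are stand-ins. Projected gradient descent keeps the fiber weights
# non-negative, an L1 penalty prunes redundant fibers, and the same code
# runs unchanged on CPU or GPU.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

M = torch.randn(500, 50, device=device)        # toy predicted-signal matrix
y = M @ torch.rand(50, device=device)          # toy observed signal
w = torch.zeros(50, device=device)             # fiber weights to estimate

lr, lam = 1e-3, 0.1                            # step size, sparsity penalty
for _ in range(2000):
    grad = M.T @ (M @ w - y) + lam             # grad of 0.5*||Mw - y||^2 + lam*sum(w)
    w = torch.clamp(w - lr * grad, min=0.0)    # project back onto w >= 0

print("fibers retained:", int((w > 1e-4).sum()))
```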

This improved algorithm, ReAL-LiFE, was also able to predict how a human test subject would behave or perform a specific task.

In other words, using the connection strengths estimated by the algorithm for each individual, the team was able to explain variations in cognitive and behavioral test scores in a group of 200 participants.
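The sketch below illustrates that general approach with synthetic data: treat each participant's estimated connection strengths as a feature vector and test, under cross-validation, how well they predict a behavioral score. The data and model choice here are placeholders, not the study's actual analysis.

```python
# Hedged illustration with synthetic data (hypothetical features, not the
# study's analysis): use each participant's estimated connection strengths
# as predictors of a behavioral score and evaluate under cross-validation.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_subjects, n_connections = 200, 300

X = rng.random((n_subjects, n_connections))     # connection strengths per subject
# Toy behavioral score that depends on a handful of connections plus noise
score = X[:, :10].sum(axis=1) + 0.5 * rng.normal(size=n_subjects)

model = RidgeCV(alphas=np.logspace(-2, 3, 20))
r2 = cross_val_score(model, X, score, cv=5, scoring="r2")
print("cross-validated R^2:", r2.mean().round(3))
```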

Such analysis may also have medical applications. “Large-scale data processing is increasingly necessary for big data neuroscience applications, especially to understand healthy brain function and brain pathology,” says Sreenivasan.

For example, using the connectomes obtained, the team hopes to be able to identify early signs of aging or of deteriorating brain function before they manifest in behavior, for instance in Alzheimer's patients.

"In another study, we found that an earlier version of ReAL-LiFE could perform better than other competing algorithms in distinguishing Alzheimer's disease patients from healthy controls," says Sridharan. He adds that its GPU-based implementation is very general and can also be used to address optimization problems in many other fields.

 

