‘Mind-Reading’ Technology Translates Brainwaves into Photos


Researchers are developing “mind-reading” technology that can translate a person’s brainwaves into photographic images. 

In an article published in Nature, researchers at Radboud University in the Netherlands revealed the results of an experiment in which they showed photos of faces to two volunteers inside a powerful brain-reading functional magnetic resonance imaging (fMRI) scanner. 

An fMRI scanner is a type of noninvasive brain imaging technology that detects brain activity by measuring changes in blood flow.

As the volunteers looked at the images of faces, the fMRI scanned the activity of neurons in the areas of their brain responsible for vision. 

The researchers then fed this information into an artificial intelligence (AI) algorithm, which built an accurate image of each face from the fMRI data. 
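The article doesn't detail the algorithm, but the general technique of learning a mapping from voxel activity to image content is well established. A minimal sketch on made-up toy data, using ridge regression as a stand-in for the study's actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: 100 "scans" of 50 voxels, each paired with a 16-pixel image.
true_map = rng.normal(size=(50, 16))
voxels = rng.normal(size=(100, 50))
images = voxels @ true_map + 0.1 * rng.normal(size=(100, 16))

# Ridge regression: learn a linear map from voxel activity to pixels.
lam = 1.0
W = np.linalg.solve(voxels.T @ voxels + lam * np.eye(50), voxels.T @ images)

# "Reconstruct" an image from a new scan.
reconstruction = rng.normal(size=(1, 50)) @ W
```

Real decoders map voxels to the latent features of a generative image model rather than raw pixels, but the training setup is the same shape: paired scans and images.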

Continue reading… “‘Mind-Reading’ Technology Translates Brainwaves into Photos”

‘Artificial synapse’ could make neural networks work more like brains

Networks of nanoscale resistors that work in a similar way to nerve cells in the body could offer advantages over digital machine learning

By Alex Wilkins

A resistor that works in a similar way to nerve cells in the body could be used to build neural networks for machine learning.

Many large machine learning models rely on increasing amounts of processing power to achieve their results, but this has vast energy costs and produces large amounts of heat.

One proposed solution is analogue machine learning, which works like a brain by using electronic devices similar to neurons to act as the parts of the model. However, these devices have so far not been fast, small or efficient enough to provide advantages over digital machine learning.

Murat Onen at the Massachusetts Institute of Technology and his colleagues have created a nanoscale resistor that transmits protons from one terminal to another. This functions a bit like a synapse, a connection between two neurons, where ions flow in one direction to transmit information. But these “artificial synapses” are 1,000 times smaller and 10,000 times faster than their biological counterparts.

Just as a human brain learns by remodelling the connections between millions of interconnected neurons, so too could machine learning models run on networks of these nanoresistors.
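One way to see why resistor networks suit machine learning is a toy analog crossbar: conductances store the weights, and Ohm's and Kirchhoff's laws perform the vector-matrix multiply physically. The numbers and the update rule below are purely illustrative, not a model of Onen's device:

```python
import numpy as np

# Toy analog crossbar: each cross-point's conductance stores one weight.
# Row voltages are the input vector; by Ohm's and Kirchhoff's laws the
# column currents are I = V . G, i.e. a vector-matrix multiply "for free".
conductances = np.array([[0.2, 0.5],
                         [0.8, 0.1],
                         [0.3, 0.9]])   # 3 inputs x 2 outputs (siemens)
voltages = np.array([1.0, 0.5, 0.2])    # input vector (volts)

currents = voltages @ conductances      # the two column currents

# A learning step is then a small programmed conductance change, e.g. a
# proton pulse nudging each cross-point (hypothetical outer-product rule):
learning_rate = 0.01
error_signal = np.array([0.1, -0.2])
conductances += learning_rate * np.outer(voltages, error_signal)
```

The appeal is that the multiply-accumulate, the dominant cost of digital machine learning, happens in the physics of the array rather than in logic gates.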

“We are doing somewhat similar things [to biology], like ion transport, but we are now doing it so fast, whereas biology couldn’t,” says Onen, whose device is a million times faster than previous proton-transporting devices.

Continue reading… “‘Artificial synapse’ could make neural networks work more like brains”




Scientists have finally decoded the bizarre behaviors of brain cells — and recreated them in tiny computer chips.

The tiny neurons could change the way we build medical devices because they replicate healthy biological activity but require only a billionth of the energy needed by microprocessors, according to the University of Bath.

Neurons behave similarly to electrical circuits within the body, but their behavior is less predictable — especially when it comes to parsing the relationship between their input and output electrical impulses. But these new artificial brain cells successfully mimic the behavior of rat neurons from two specific regions of the brain, according to research published Tuesday in Nature Communications.

“Until now neurons have been like black boxes, but we have managed to open the black box and peer inside,” University of Bath physicist Alain Nogaret said in the release. “Our work is paradigm changing because it provides a robust method to reproduce the electrical properties of real neurons in minute detail.”


Scientists taught a petri dish of brain cells to play pong faster than an AI

By Hope Corrigan

Move over Alder Lake, this is a new kind of hybrid chip.

As a lover of tough single player games, I’m quite accustomed to getting my butt handed to me by AI, and usually not even a real one. I also happen to be the owner of a full-sized human brain. Though it’s not without its problems, the human brain’s ability to learn and change is usually why I eventually overcome those difficult in-game challenges.

So when I read about a few human brain cells in a petri dish that are already performing much better at a videogame than AI can, it’s concerning to me and my gaming future. New Scientist reports that a team in Australia has been growing these small puddles of brain and now one has learnt to play Pong in fairly impressive time.

Cortical Labs is a company working on integrating biological neurons with your more traditional silicon-based computing hardware. They grow brain cells on microelectrode arrays, so the cells can be stimulated. These hybrid chips are said to be able to learn and restructure themselves to get past problems, like stopping a sneaky ball that wants in your goal.
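The reported setup closes a loop between stimulation and readout: electrodes encode the ball's position, the culture's activity moves the paddle, and outcomes feed back as the next stimulation. A toy version, with a stand-in function faking the culture as a mostly-correct responder (the real cells adapt over time):

```python
import random

random.seed(0)

def read_culture(ball_y, paddle_y):
    """Stand-in for stimulating the electrode array and decoding the
    culture's response. Real cells adapt; here we fake a noisy tracker
    that moves the paddle toward the ball 80% of the time."""
    toward = 1 if ball_y > paddle_y else -1
    return toward if random.random() < 0.8 else -toward

# Closed loop: stimulate, read activity, move paddle, repeat.
ball_y, paddle_y, hits = 8, 0, 0
for _ in range(200):
    paddle_y += read_culture(ball_y, paddle_y)
    if abs(ball_y - paddle_y) <= 1:        # paddle intercepts the ball
        hits += 1
        ball_y = random.randint(0, 10)     # serve a new rally
```

Even an imperfect, noisy controller racks up interceptions as long as its responses are biased in the right direction, which is roughly what the learning culture achieves.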

Continue reading… “Scientists taught a petri dish of brain cells to play pong faster than an AI”

Human Brain Project: Researchers design artificial cerebellum that can learn to control a robot’s movement

The Robot used by the Applied Computational Neuroscience research group of the University of Granada.

Researchers at Human Brain Project partner University of Granada in Spain have designed a new artificial neural network that mimics the structure of the cerebellum, one of the evolutionarily older parts of the brain, which plays an important role in motor coordination. When linked to a robotic arm, their system learned to perform precise movements and interact with humans in different circumstances, surpassing the performance of previous AI-based robotic steering systems. The results have been published in the journal Science Robotics.

It is the most biologically realistic and detailed model of the cerebellum to date that is capable of working in real time, and it replicates not only aspects of the cerebellum’s structure, but also its adaptability and capacity to learn. By taking inspiration from the brain in this way, the scientists were able to solve one of the common technological challenges in robotics: their cerebellar spiking neural network enables the robot to deal with so-called latency, or time delays, a central real-world problem for computational systems in robotics, especially during wireless or remote steering.

The research could also help to control new bio-inspired robots, which are equipped with elastic and flexible components that replicate the muscles and tendons of the human body. Such “co-bots” are safer for human interaction, but their flexibility makes it difficult to use classical control techniques. 

Continue reading… “Human Brain Project: Researchers design artificial cerebellum that can learn to control a robot’s movement”

A system to control robotic arms based on augmented reality and a brain-computer interface

By Ingrid Fadelli

For people with motor impairments or physical disabilities, completing daily tasks and house chores can be incredibly challenging. Recent advancements in robotics, such as brain-controlled robotic limbs, have the potential to significantly improve their quality of life.

Researchers at Hebei University of Technology and other institutes in China have developed an innovative system for controlling robotic arms that is based on augmented reality (AR) and a brain-computer interface. This system, presented in a paper published in the Journal of Neural Engineering, could enable the development of bionic or prosthetic arms that are easier for users to control.

“In recent years, with the development of robotic arms, brain science and information decoding technology, brain-controlled robotic arms have attained increasing achievements,” Zhiguo Luo, one of the researchers who carried out the study, told TechXplore. “However, disadvantages like poor flexibility restrict their widespread application. We aim to promote the lightweight and practicality of brain-controlled robotic arms.”

The system developed by Luo and his colleagues integrates AR technology, which allows users to view an enhanced version of their surroundings that includes digital elements, and a brain-controlled interface, with a conventional method for controlling robotic limbs known as asynchronous control. This ultimately allows users to achieve greater control over robotic arms, enhancing the accuracy and efficiency of the resulting movements.
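Asynchronous control means the decoder runs continuously and must decide on its own when the user intends a command at all, rather than waiting for fixed cue windows. A generic sketch of that idea (the threshold and dwell values are invented, not taken from the paper):

```python
from collections import deque

# A command is issued only when decoder confidence stays above threshold
# for a full dwell window; otherwise the user is treated as idle.
THRESHOLD, DWELL = 0.7, 3

def async_decoder(confidence_stream):
    window = deque(maxlen=DWELL)
    commands = []
    for t, conf in enumerate(confidence_stream):
        window.append(conf)
        if len(window) == DWELL and min(window) >= THRESHOLD:
            commands.append(t)   # issue "move arm" at time t
            window.clear()       # refractory: start over after a command
    return commands

stream = [0.2, 0.8, 0.9, 0.75, 0.3, 0.85, 0.9, 0.95, 0.1]
issued = async_decoder(stream)
```

The dwell requirement is what keeps brief noise spikes from triggering unintended arm movements, the main failure mode asynchronous systems have to manage.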

Continue reading… “A system to control robotic arms based on augmented reality and a brain-computer interface”

First Personalized Deep Brain Stimulation Treats Depression

Severely depressed patient treated via a personalized neural biomarker.

By Kaja Perina

Personalized medicine tailors treatment or medication to information about a patient’s genetics, lifestyle, and environment. A new study published in Nature Medicine shows how the combination of precision medicine and deep brain stimulation (DBS) successfully treated a patient with severe treatment-resistant depression.

Led by Andrew Krystal, PhD, professor of psychiatry and member of the University of California San Francisco (UCSF) Weill Institute for Neurosciences, the study opens the possibility of using a precision-medicine approach combined with deep brain stimulation for the treatment of psychiatric disorders.

Researchers discovered a neural biomarker by finding the pattern of brain activity associated with the onset of symptoms, then used that data to personalize a DBS device to activate when the pattern is spotted. Specifically, the device delivered a 1 mA electrical stimulation for six seconds whenever it detected the neural biomarker.
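The closed-loop logic can be pictured as a simple watch-and-trigger loop. Only the 1 mA and six-second parameters below come from the study; the detection threshold and biomarker scores are invented for illustration:

```python
# Toy closed-loop DBS: watch a biomarker score, and whenever it crosses
# the detection threshold, log a 1 mA, six-second stimulation.
DETECT_THRESHOLD = 0.8           # invented for this sketch
STIM_MA, STIM_SECONDS = 1.0, 6.0  # the figures reported in the study

def run_device(biomarker_scores):
    stim_log = []
    for t, score in enumerate(biomarker_scores):
        if score >= DETECT_THRESHOLD:
            stim_log.append({"t": t, "mA": STIM_MA, "seconds": STIM_SECONDS})
    return stim_log

log = run_device([0.3, 0.5, 0.9, 0.4, 0.85])
```

The contrast with conventional DBS is the conditional: stimulation fires only when the personalized biomarker appears, rather than continuously.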

Continue reading… “First Personalized Deep Brain Stimulation Treats Depression”

World’s Smallest Brain-Inspired Computer – So Small That It Can Harvest Its Energy Itself

The energy consumption of the device will be so small that it can harvest its energy itself, directly from its surroundings. The project has received funding from the Villum Experiment program.


Artificial intelligence (AI) has seen explosive growth in recent years, but despite major progress, the power required to run AI algorithms continues to increase.

In stark contrast to this, the human brain requires only around 20 W to perform more than 10 quadrillion (10,000,000,000,000,000) operations per second. This is 12 orders of magnitude better than modern supercomputer technologies.
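A quick back-of-envelope check of the brain figure quoted above:

```python
# 20 W for ~10^16 operations per second works out to:
brain_watts = 20
brain_ops_per_second = 1e16

brain_ops_per_joule = brain_ops_per_second / brain_watts
print(f"{brain_ops_per_joule:.0e} ops per joule")
```

That is, roughly 5 × 10^14 operations per joule, the efficiency target that brain-inspired computing projects like this one are chasing.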

“That’s why we’re conducting intensive research into developing new hardware that mimics the structure of the human brain, with neurons, synapses, and neural networks, known as brain-inspired computing (BICs). But even though we’ve managed to drastically reduce the energy consumption of AI algorithms, there’s still a long way to go before BICs are as efficient as the human brain when it comes to size and energy efficiency,” says Hooman Farkhani, an assistant professor at the Department of Electrical and Computer Engineering at Aarhus University.

Continue reading… “World’s Smallest Brain-Inspired Computer – So Small That It Can Harvest Its Energy Itself”

Intel’s Loihi 2 speeds effort to make neuromorphic chips like human brains

Intel’s Loihi 2 neuromorphic chip measures 30 square millimeters.

Stephen Shankland

The chip also is a key product in Intel’s plan to reclaim its processor manufacturing prowess.

Intel unveiled its Loihi 2 chip on Thursday, the second generation of a processor family that marries conventional electronics with the architecture of human brains to try to inject some new progress into the computing industry. On top of that, the chip also helps Intel advance its own manufacturing technology.

Loihi 2, an example of a technology called neuromorphic computing, is about 10 times faster than its predecessor, according to Intel. The speed improvement is the result of an eightfold increase in the number of digital neurons, the chip’s equivalent of human brain cells, which mimic the way the brain handles information. The chip is also more programmable, helping researchers tackle a wider range of computing tasks.

The chip is built with a preproduction version of the Intel 4 manufacturing process, too, an advanced method Intel plans to use to build mainstream Intel chips arriving in 2023. The Intel 4 process can etch electronics more densely on a chip, a crucial advantage for Intel’s need to pack a million digital neurons on a chip measuring 30 square millimeters.

Loihi chips are particularly good at rapidly spotting sensory input like gestures, sounds and even smells, says Mike Davies, leader of the Intel Labs group that developed Loihi. Some experiments have focused on artificial skin that could give robots a better sense of touch. “We can detect slippage if a robot hand is picking up a cup,” Davies said.

Neuromorphic computing differs from artificial intelligence, a revolutionary computer technology based more loosely on how brains learn and respond, because it focuses more on the physical characteristics of human gray matter.

Continue reading… “Intel’s Loihi 2 speeds effort to make neuromorphic chips like human brains”

Samsung wants to reverse engineer human brain and replicate it on 3D chip

By Asif S.

Samsung has announced a new way to reverse engineer the human brain and mimic it with semiconductor chips. The world’s biggest memory chip maker has collaborated with Harvard University researchers to share a new approach that takes the world one step closer to making neuromorphic chips.

Harvard scholars and Samsung engineers have published a new perspective paper titled ‘Neuromorphic electronics based on copying and pasting the brain’ in Nature Electronics.

Continue reading… “Samsung wants to reverse engineer human brain and replicate it on 3D chip”

New Brain Implant Restores Sense of Touch on Fingertips

A chiropractor performing a nerve conduction velocity (NCV) test on a patient.

By Fabienne Lang

The little electrode brain implant has the potential to help millions of people living with paralysis and neuropathy.

Imagine not being able to feel the warmth of a hand holding yours, or the buttons of your shirt as you try and do it up.

Millions of people live with paralysis and peripheral neuropathy — when nerves in the body’s extremities, such as hands and feet, are damaged — and aren’t able to feel sensations through their fingertips and toes. 

But that might all be about to change.

Researchers at The Feinstein Institutes for Medical Research managed to evoke the sense of touch in fingers using a minimally invasive electrode brain implant. The first-in-human study offers the potential to change the lives of millions of people around the world.

The details were published in the journal Brain Stimulation.

Continue reading… “New Brain Implant Restores Sense of Touch on Fingertips”

Tiny “Neurograins” Could Power Next Generation of Brain-Computer Interfaces

Tiny chips called neurograins are able to sense electrical activity in the brain and transmit that data wirelessly. Credit: Jihun Lee/Brown University

Brain-computer interfaces (BCIs) are emerging assistive devices that may one day help people with brain or spinal injuries to move or communicate. BCI systems depend on implantable sensors that record electrical signals in the brain and use those signals to drive external devices like computers or robotic prosthetics.

Most current BCI systems use one or two sensors to sample up to a few hundred neurons, but neuroscientists are interested in systems that are able to gather data from much larger groups of brain cells.

Now, a team of researchers has taken a key step toward a new concept for a future BCI system — one that employs a coordinated network of independent, wireless microscale neural sensors, each about the size of a grain of salt, to record and stimulate brain activity. The sensors, dubbed “neurograins,” independently record the electrical pulses made by firing neurons and send the signals wirelessly to a central hub, which coordinates and processes the signals.
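Conceptually, one of the hub's jobs is merging many independent, timestamped event streams into a single coordinated record. A toy illustration (grain IDs and timestamps invented):

```python
import heapq

# Each "neurograin" reports its own (timestamp, grain_id) spike events;
# the hub merges the per-grain streams into one time-ordered stream.
grain_a = [(0.001, "A"), (0.004, "A"), (0.009, "A")]
grain_b = [(0.002, "B"), (0.003, "B"), (0.008, "B")]
grain_c = [(0.005, "C")]

merged = list(heapq.merge(grain_a, grain_b, grain_c))
order = [gid for _, gid in merged]
```

The real system also has to share wireless bandwidth and power across the grains, but time-aligning independently recorded events is the core of what "coordinates and processes the signals" means.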

Continue reading… “Tiny “Neurograins” Could Power Next Generation of Brain-Computer Interfaces”