Western Sydney Uni and Intel to Jointly Develop Brain-Inspired Computing System
Researchers at the International Centre for Neuromorphic Systems (ICNS) at Western Sydney University (WSU) have teamed up with Intel to create a proof of concept for a scalable, open-source, and configurable neuromorphic computing system, with the aim of learning more about how the brain works and how to build better AI.
Neuromorphic computing aims to develop AI hardware that is more flexible and better able to emulate functions of the human brain, including contextual interpretation, sensory processing, and autonomous adaptation.
See: What is neuromorphic computing? Everything you need to know about how it is changing the future of IT
“We don’t really know how the brain picks up signals from our bodily sensors, processes them, and makes sense of the world around it. One of the reasons is that we can’t simulate the brain on ordinary computers – it’s just too slow; simulating even a cubic millimeter of the brain for a few seconds takes weeks – and that hinders our understanding of how the brain works,” ICNS director André van Schaik told ZDNet.
“Therefore, we need to build a machine that can mimic the brain rather than simulate it, the difference being that it’s more of a hardware implementation, where these things run faster and in parallel.”
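To make the simulation-versus-hardware distinction concrete, here is a minimal leaky integrate-and-fire (LIF) neuron in plain Python — the kind of unit a brain simulation must update millions of times per time step, which is what makes software simulation so slow and motivates parallel hardware. This is an illustrative sketch only, not ICNS or Intel code, and all parameter values are made-up defaults:

```python
def simulate_lif(input_current, n_steps, dt=0.1, tau=10.0, v_rest=-65.0,
                 v_thresh=-50.0, v_reset=-70.0, r_m=10.0):
    """Simulate one leaky integrate-and-fire neuron with Euler integration.

    Units and values are illustrative (ms, mV): not taken from any real
    neuromorphic platform. Returns the time steps at which spikes occurred.
    """
    v = v_rest
    spikes = []
    for step in range(n_steps):
        # Membrane potential leaks toward rest while integrating input current.
        dv = (-(v - v_rest) + r_m * input_current) / tau
        v += dv * dt
        if v >= v_thresh:
            # Threshold crossed: record a spike and reset the membrane.
            spikes.append(step)
            v = v_reset
    return spikes

# A constant driving current makes the neuron spike repeatedly.
spike_times = simulate_lif(input_current=2.0, n_steps=1000)
print(f"{len(spike_times)} spikes in 100 ms of simulated time")
```

Sequentially stepping billions of such neurons (and far more synapses) is what takes weeks on ordinary computers; a neuromorphic machine instead updates them all in parallel in hardware.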
He added that understanding the brain is one of the “last frontiers of science”.
“You can’t just study the human brain in humans at the right level of detail and scale… you can do an EEG, where you get brain waves, but you don’t get any resolution of what individual neurons in someone’s brain are doing. With this system, you can. I hope we can find out how brains work and then evolve, but also how they fail,” van Schaik said.
At the same time, van Schaik believes the solution could improve the way AI systems are built, describing current methods used to train AI models as “very brute force methods.”
“They really learn from many examples… [but] brain learning works very differently from what we now call AI. Again, we are not sure how it works, and again we cannot simulate it on current computers at any scale,” he said.
According to van Schaik, the team envisions that the proof-of-concept setup would look much like a modern data center: three computing racks in a cooled environment, incorporating field-programmable gate arrays (FPGAs) compatible with Intel’s Configurable Network Protocol Accelerator (COPA), connected by a high-performance computing (HPC) network fabric. The system would then be fed information from fields such as computational neuroscience, neuroanatomy, and neurophysiology.
The system grows out of work by Intel’s Neuromorphic Research Community (INRC) and its Loihi neuromorphic research chip.
Van Schaik said that while the Loihi chip is very energy efficient, it is less flexible: as a custom-designed chip it is not configurable, unlike FPGAs, which can be configured and reconfigured in software.
“We want to offer this more flexible, albeit more energy-intensive, system as a separate avenue for this community,” he said.
“We are currently able to simulate much larger networks than they can on this platform.”
The research also has a sustainability angle, with van Schaik explaining that the system to be built would be able to process more data with less energy. The projected thermal power of the system is 38.8 kW at full load.
“[With] the advent of AI, machine learning, and smart devices… we collect so much data… when that data goes to the cloud, it consumes electricity… and we’re actually on a trajectory… [where] data uses as much electricity as anything else in the world,” he said.
“If we look at the data centers currently processing that data… they consume massive amounts of electricity. The human brain runs on around 25 watts… we hope that by building AI and data processing to work more like brains, we can do it with a lot less power.”