Brain-Inspired Cognitive Architecture Is Now Solving Computational Challenges Faced By AI


With the development of artificial intelligence intensifying across the globe, IT companies are looking for ways to revamp their architectures to make them more robust. Increasingly, researchers are turning to brain-inspired architectures with co-located memory and processing, which have yielded systems up to 200 times faster than conventional computers on certain tasks. Such is the excitement around AI hardware that this phase has been dubbed a “renaissance of hardware”, with vendors rushing to build domain-specific or workload-specific architectures that can significantly scale and improve computational efficiency.





And as we move further into the mobile era, workloads are going to look very different, because the requirements of computing are changing. Businesses will have to rely on different architectures, each suited to a particular workload. This is where vendors are shifting away from the Von Neumann computing architecture and striving to improve computing performance with multi-core CPU architectures.

Now, rapid gains in neuroscience have also spurred researchers across the globe to propose brain-inspired computing architectures for highly advanced cognitive systems. IBM researchers are working on a new computer architecture that can process data more efficiently for AI workloads. What’s remarkable is that this architecture is inspired by the brain and features co-located processing and memory units.





The IBM team notes that traditional computers were built on the Von Neumann architecture, developed in the 1940s, which features a central processor that executes logic and arithmetic, a memory unit, storage, and input and output devices. But current industry requirements have necessitated a move from homogeneous to heterogeneous computing architectures, which has led to increased research in applied materials and neuroscience.

Why Does AI Workload Require New Computing Architecture?

So what kind of changes are required in computing architecture for AI workloads? According to researchers, sustained breakthroughs in materials science and neuroscience are needed to advance AI processing. Let’s look at the two key requirements of AI workloads that have aroused interest in brain-inspired computer architecture. Technologists emphasise that multi-core CPUs have reached their performance and efficiency limits and are adding to architectural challenges.

  1. Memory requirement: AI workloads depend on huge amounts of data, so they require fast access to memory. Traditional CPUs have a multi-level cache architecture that is not well suited to this access pattern.
  2. Parallel processing requirement: Parallel computing is on the rise, and architectures have to be designed to execute AI workloads with parallelism at scale.
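A toy sketch can make the parallelism point concrete. The NumPy example below (illustrative only, not IBM's design) computes the same matrix-vector product two ways: one multiply-accumulate at a time, as a sequential fetch-execute machine would, and as a single bulk operation that parallel or in-memory hardware can execute concurrently.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 256))   # hypothetical weight matrix
x = rng.standard_normal(256)          # hypothetical input vector

# Sequential view: one multiply-accumulate at a time, each step
# fetching operands from memory before computing.
y_loop = np.zeros(256)
for i in range(256):
    for j in range(256):
        y_loop[i] += W[i, j] * x[j]

# Parallel view: the same computation expressed as one bulk
# operation, which hardware can execute concurrently.
y_vec = W @ x

# Both views produce the same result; only the execution model differs.
assert np.allclose(y_loop, y_vec)
```

The two forms are mathematically identical; the difference is that the bulk form exposes all 65,536 multiply-accumulates at once, which is exactly the kind of parallelism AI-oriented architectures are designed to exploit.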

Brain-Inspired Computing Architecture

There has been a rise in the development of cognitive architectures, which researchers assert involves emulating the brain’s architecture, designing a cognitive architecture, fostering it (by making it learn in a certain environment), and applying it to products. IBM’s new paper discusses three layers of inspiration from the human brain.

Firstly, the team took inspiration from the brain’s co-located memory and processing, using a memory device to perform computational tasks in the memory itself. The second feature was inspired by the brain’s synaptic network structures: arrays of phase-change memory (PCM) devices are used to speed up the training of deep neural networks. And finally, the researchers drew on the stochastic nature of neurons and synapses to develop a powerful computational substrate for spiking neural networks.
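To see why a memory array can compute, consider the standard analog crossbar idea behind in-memory computing: weights are stored as device conductances, inputs are applied as voltages, and Ohm’s and Kirchhoff’s laws make the bit-line currents equal a matrix-vector product. The sketch below is a simplified software analogue under assumed parameters (the `G_MAX` conductance limit and the noise level are illustrative, not IBM device figures).

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical weight matrix to be "programmed" into the array.
W = rng.uniform(0.0, 1.0, size=(4, 3))

# Map weights to device conductances (siemens); G_MAX is an assumed
# device limit. Add read noise to mimic analog imprecision.
G_MAX = 1e-4
G = W * G_MAX
G_noisy = G + rng.normal(0.0, 0.01 * G_MAX, size=G.shape)

# Input vector applied as voltages on the word lines.
v = rng.uniform(0.0, 0.2, size=3)

# Kirchhoff's current law: each bit line sums I = G * V, so the
# read-out currents form the matrix-vector product (scaled by G_MAX).
i_out = G_noisy @ v
y = i_out / G_MAX  # rescale currents back to weight units

# The analog result approximates the exact digital product.
assert np.allclose(y, W @ v, atol=0.05)
```

The computation happens where the data lives, so no weights are shuttled between a separate memory and processor; the price is the small analog error modelled by the noise term.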

Speaking about testing the systems, Abu Sebastian from IBM said, “These systems are expected to be better than conventional computing systems in some tasks, and they also surpass traditional systems in terms of efficiency.” In an experiment where the researchers ran an unsupervised machine learning algorithm on the computational memory platform, they found the brain-inspired platform to be 200 times faster than conventional computing systems.
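The flavour of such an unsupervised task can be illustrated in software. The sketch below is a toy analogue (not IBM's algorithm or data): given several binary event streams, a few of which are driven by a shared hidden source, a per-stream score is accumulated whenever streams fire together, crudely mimicking the way device conductances accumulate evidence of correlation. All stream counts, rates, and the scoring rule are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
T, n = 2000, 10

# Hypothetical event streams: the first 3 are mutually correlated
# (driven by a shared source), the rest fire independently.
shared = rng.random(T) < 0.1
streams = rng.random((T, n)) < 0.05
for k in range(3):
    streams[:, k] |= shared

# Accumulate a score per stream: whenever two or more streams fire
# in the same time step, every active stream's score grows with the
# size of the coincidence (a crude analogue of conductance updates).
score = np.zeros(n)
for t in range(T):
    active = streams[t]
    if active.sum() >= 2:
        score[active] += active.sum()

# The correlated streams end up with clearly higher scores.
assert score[:3].min() > score[3:].max()
```

No labels are used anywhere; the correlated group emerges purely from coincidence statistics, which is what makes this kind of task a natural fit for accumulate-in-place memory devices.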

Conclusion

Given the pace at which knowledge of neuroscience is increasing, thanks to frenetic research, there have been attempts to correlate AI with the brain’s cognitive architecture. Researchers note that the construction of AI systems rests on the hypothesis that it is possible to build a general-purpose intelligent machine that can replicate human-level intelligence.

A key aspect of the brain actively researched by neuroscientists is how deep learning appears, in a way, to replicate the cerebral neocortex, which plays an important role in cognition. There is also research on the connectome, which forms the cognitive architecture of the brain and has driven several breakthroughs in neuroscience. Experts note that the neuron model now widely used in artificial neural networks supports many functions despite its simple internal structure.




