Intel Has Unveiled the Largest Neuromorphic Computer Inspired by the Brain

Janani R | April 29, 2024, 10:30 AM | Technology

Intel has developed the world's largest neuromorphic computer, aiming to replicate the brain's data processing and storage methods. This advancement could enhance the efficiency and capabilities of artificial intelligence models. However, experts caution that engineering challenges must be addressed before the device can surpass current standards in AI performance.

Neuromorphic computers diverge from traditional machines as they utilize artificial neurons for both computation and data storage, mirroring the human brain's function. This integrated approach eliminates the need to transfer data between distinct components, potentially alleviating a bottleneck present in current computer systems.
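To make the neuron-centric model concrete, here is a minimal sketch of a leaky integrate-and-fire neuron, the kind of spiking unit that neuromorphic chips such as Loihi implement in silicon. The weight, leak, and threshold values are illustrative, not Loihi's actual parameters:

```python
# Minimal leaky integrate-and-fire (LIF) neuron, illustrative only.
# Neuromorphic chips implement huge numbers of such units in hardware;
# the weight, leak, and threshold here are arbitrary example values.

def simulate_lif(input_spikes, weight=0.6, leak=0.9, threshold=1.0):
    """Return the output spike train for a binary input spike train.

    The membrane potential is both the computation and the local state:
    there is no separate memory to shuttle data to and from, which is
    the bottleneck neuromorphic designs aim to avoid.
    """
    potential = 0.0
    output = []
    for spike in input_spikes:
        potential = potential * leak + weight * spike  # integrate with leak
        if potential >= threshold:                     # fire and reset
            output.append(1)
            potential = 0.0
        else:
            output.append(0)
    return output

print(simulate_lif([1, 1, 1, 0, 1, 1, 0, 0]))  # → [0, 1, 0, 0, 1, 0, 0, 0]
```

The neuron fires only when enough recent input has accumulated, so computation is event-driven and local, in contrast to the layer-by-layer matrix multiplications of conventional AI accelerators.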

Figure 1. Intel's Loihi 2 chips drive the Hala Point neuromorphic computer.

Figure 1 shows Intel's Loihi 2 chips, which drive the Hala Point neuromorphic computer. Hala Point, Intel's newly developed system, claims significant gains in energy efficiency, using 100 times less energy than conventional machines on optimization tasks. The architecture comprises 1.15 billion artificial neurons spread across 1,152 Loihi 2 chips and can execute 380 trillion synaptic operations per second. [1] Unlike conventional AI models, which push inputs mechanically through each layer of artificial neurons, Hala Point mimics the brain's style of information processing, potentially changing how AI models are trained and run. Despite its power, the system occupies just six racks in a standard server case, a footprint comparable to a microwave oven's. Intel's Mike Davies says larger machines are feasible, as the current design was not constrained by any specific technical challenge.
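A quick back-of-the-envelope check of the published figures shows what they imply per chip and per neuron (the derived numbers below are simple arithmetic on the article's stated specifications, not additional Intel data):

```python
# Back-of-the-envelope check of Hala Point's published figures.
neurons = 1.15e9      # artificial neurons (stated)
chips = 1152          # Loihi 2 chips (stated)
ops_per_sec = 380e12  # synaptic operations per second (stated)

neurons_per_chip = neurons / chips      # roughly one million per chip
ops_per_neuron = ops_per_sec / neurons  # sustained rate per neuron

print(f"{neurons_per_chip:,.0f} neurons per chip")
print(f"{ops_per_neuron:,.0f} synaptic ops per neuron per second")
```

This works out to roughly a million neurons per Loihi 2 chip and a few hundred thousand synaptic operations per neuron per second.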

Hala Point is unrivaled in scale among existing machines: DeepSouth, another neuromorphic computer due for completion soon, trails with a claimed 228 trillion synaptic operations per second. The Loihi 2 chips that power Hala Point are, however, still prototypes produced by Intel in limited quantities. According to Davies, the primary bottleneck is not the hardware but the layers of software needed to translate real-world problems into a format neuromorphic hardware can process. That software remains in its early stages, reflecting the nascent state of neuromorphic computing, and Davies suggests that building even larger machines may not be practical until it matures.

Intel envisions that a machine like Hala Point could enable AI models capable of continuous learning, eliminating the need to retrain from scratch for each new task, a departure from current models. However, James Knight at the University of Sussex dismisses this notion as "hype." Knight points out that current models such as ChatGPT are trained in parallel on graphics cards, with many chips contributing to training simultaneously. [2] Neuromorphic computers, by contrast, operate on a single input stream and cannot be trained in parallel. Knight argues that merely training a model as complex as ChatGPT on such hardware, let alone achieving continuous learning, could take decades.

Davies acknowledges that current neuromorphic hardware isn't suitable for training large AI models from scratch. Instead, he envisions a future where these systems enable continual learning: taking pre-trained models and adapting them to new tasks over time. He believes large-scale neuromorphic systems like Hala Point could address such continual-learning problems efficiently, though the methods are still in the research phase.
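The general idea Davies describes, adapting a fixed pre-trained model to new tasks rather than retraining everything, can be sketched in conventional code. This is a toy "frozen backbone plus per-task head" illustration, not Intel's method; the feature extractor, data, and task definitions are invented for the example:

```python
# Toy sketch of continual learning on top of a frozen model: a shared
# "pre-trained" feature extractor is fixed, and each new task only fits
# a tiny linear head, so nothing is ever retrained from scratch.
# All functions and data here are invented for illustration.

def pretrained_features(x):
    """Stand-in for a frozen pre-trained model (illustrative)."""
    return [x, x * x]

def solve2(a, b, c, d, y1, y2):
    """Solve the 2x2 linear system [[a, b], [c, d]] @ w = [y1, y2]."""
    det = a * d - b * c
    return [(d * y1 - b * y2) / det, (a * y2 - c * y1) / det]

def train_head(examples):
    """Fit a per-task linear head on frozen features (exact least
    squares for this tiny two-example, two-feature case)."""
    (x1, y1), (x2, y2) = examples
    f1, f2 = pretrained_features(x1), pretrained_features(x2)
    return solve2(f1[0], f1[1], f2[0], f2[1], y1, y2)

def predict(head, x):
    return sum(w * f for w, f in zip(head, pretrained_features(x)))

# Two tasks learned sequentially; the shared extractor is never touched,
# so learning task B cannot disturb task A.
task_a = train_head([(1.0, 2.0), (2.0, 4.0)])  # behaves like y = 2x
task_b = train_head([(1.0, 1.0), (2.0, 4.0)])  # behaves like y = x^2
```

Because each task owns only its small head, new tasks are cheap to add and old tasks are never forgotten; the open research question is doing something analogous directly in spiking neuromorphic hardware.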

Knight shares optimism about the broader impact of neuromorphic computers in computer science problem-solving, anticipating increased efficiency once the necessary developer tools for writing software compatible with this unique hardware become more mature.

In addition, neuromorphic computing holds promise as a pathway toward achieving artificial general intelligence (AGI), often considered human-level intelligence. This approach contrasts with the current reliance on large language models like ChatGPT, which many AI experts believe may not lead to AGI. Knight remarks that the idea of using neuromorphic computing to create brain-like models is gaining traction and may represent the future direction of AI research.

References:

  1. https://www.cryptopolitan.com/intel-develops-worlds-largest-neuromorphic-computer-to-train-ai-inspired-by-human-brain/
  2. https://www.newscientist.com/article/2426523-intel-reveals-worlds-biggest-brain-inspired-neuromorphic-computer/

Cite this article:

Janani R (2024), Intel Has Unveiled the Largest Neuromorphic Computer Inspired by the Brain, AnaTechmaz, pp. 391.
