Dr. Natalie Rotermund and Jakob Mertes, Artificial Intelligence Center Hamburg (ARIC) e.V., Hamburg, September 2025
In line with technical progress, including in the development of artificial intelligence (AI), the demands on the underlying computing architectures are increasing. Alternative information processing concepts offer new possibilities for the efficient handling of different types of data. Neuromorphic computing enables new ways of energy-optimized processing of time-dependent information and sensor-based, low-latency interaction between the environment, humans and machines. This article is part 1 of a four-part text. It provides an introduction from the technical basics through to application and aims to encourage people to consider the potential of neuromorphic computing in product development and process innovation.
- Part 1 – Introduction to neuromorphic computing
- Part 2 – Neuromorphic hardware (tba)
- Part 3 – Neuromorphic algorithms and models (tba)
- Part 4 – Application areas of neuromorphic computing (tba)
Definition and classification
Neuromorphic computing (NMC) is a concept of information processing that is based on the structure and functioning of neuronal networks of biological organisms. It includes the development of hardware and software that simulate neuronal and synaptic structures and functions.
Due to the rapidly growing demands on data processing systems and the potential for innovation through technological advances, alternative computing architectures, summarized under the term future computing, are currently attracting interest. In addition to NMC, this also includes quantum computing, general analog and optical computing. These approaches pursue the common goal of overcoming the existing limits of conventional information processing and optimizing specific types of processing tasks in a targeted manner.
NMC systems are artificial neural networks for information processing that are both physically realized in the form of special hardware and implemented in the form of software/algorithms, which define the mathematical rules of information processing in addition to the physical foundations of the network. One perspective on this field is the use of the technology as an implementation platform for artificial intelligence.
As the technology matures, NMC is expected not only to systematically complement existing AI systems, but also to create new opportunities for adaptive, energy-efficient and context-aware intelligence. Current analyses rank NMC among the emerging key technologies and attribute to it great potential for the development of the next generation of artificial intelligence [4]. At the same time, there are challenges in harnessing the technology, particularly with regard to the availability of hardware, the trainability of neuromorphic AI models and a limited talent pool. Furthermore, no widely accepted benchmarks and standards have yet been established for either the hardware or the software of neuromorphic systems, which complicates their development and integration.
Although media coverage of the technology is still limited, interest is growing in specialist circles around the world. Individual market estimates forecast a market potential for NMC comparable to that of quantum computing [5-8]. China is investing strategically in NMC: in summer 2025, for example, the commissioning of the largest neuromorphic computer to date, the Darwin Monkey system with 2 billion neurons, was announced. With this, China overtakes the Hala Point neuromorphic cluster in the USA (1.15 billion neurons) in terms of neuron count [9].
Germany and Europe have a strong basic research base in neuromorphic software and hardware as well as neuroscience. The transfer of this know-how from research into the value chains and the accompanying development of a low-threshold, accessible full-stack ecosystem are building blocks for the long-term competitiveness of European locations and, together with other approaches to future computing, offer the opportunity to shape IT sovereignty through independent technology innovation [10].
Our understanding of the term (delimitation)
To date, no generally accepted understanding of the term neuromorphic computing has been established. In addition, various terms in connection with NMC, such as the term neuron, have different meanings depending on the context. This makes the topic difficult to grasp. At the beginning of the in-depth look at the content, a distinction is therefore made between NMC and other implementations of artificial neural learning or conceptually borrowed specialist areas.
In the understanding used here, NMC differs fundamentally both in its hardware structure and in its algorithmic information processing from the artificial neural networks (ANNs) and the classic Von Neumann architecture that are in widespread use today:
In contrast, in NMC data is processed and stored on the hardware side within a single functional unit. In addition, there is a tendency toward a functional interdependence and fusion of hardware and software, as is also found in biological intelligence.
Sometimes a sole hardware or sensor implementation of the principles mentioned is also understood as neuromorphic. This text refers to the combination of neuromorphic software and hardware.
Instead of continuous activations and weighted sums – the basis of ANNs – neuromorphic systems work with spiking neural networks (SNNs), which process information in a time-dependent, event-based manner in the form of discrete impulses, so-called spikes. In the traditional classification of deep neural networks, SNNs can also be regarded as the third generation of neural networks.
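The contrast to weighted-sum processing can be illustrated with a leaky integrate-and-fire (LIF) neuron, the basic unit of most SNNs. The following minimal sketch uses purely illustrative parameter values and a discretized time axis; it is not tied to any particular neuromorphic framework.

```python
# Minimal sketch of a leaky integrate-and-fire (LIF) neuron.
# All parameter values (leak factor, threshold) are illustrative.

def lif_neuron(input_current, tau=0.9, threshold=1.0):
    """Integrate input over discrete time steps; emit a spike (1) when
    the membrane potential crosses the threshold, then reset."""
    v = 0.0                     # membrane potential
    output = []
    for x in input_current:     # one input value per time step
        v = tau * v + x         # leak the old potential, add new input
        if v >= threshold:      # all-or-nothing firing
            output.append(1)
            v = 0.0             # reset after the spike
        else:
            output.append(0)
    return output

# Sparse, event-based output: spikes occur only when enough input
# accumulates in time; the timing itself carries information.
print(lif_neuron([0, 0.6, 0.6, 0, 0, 0.6, 0.6, 0.6]))
# → [0, 0, 1, 0, 0, 0, 1, 0]
```

Unlike a classic activation function, the output depends on when inputs arrive, not only on their summed magnitude — two sub-threshold inputs in quick succession fire the neuron, while the same inputs spread far apart do not.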
An in-depth introduction to the properties and functionality of neuromorphic hardware and software (SNNs) will be provided in Parts 2 and 3 of this topic paper.

In connection with neural networks and deep learning in the classic sense, terms such as neuron or neural network are used that are based on biological models. In these systems, however, information processing differs fundamentally from the biological model: the classic artificial neural network is not physical; it is merely represented by the algorithm. Information processing is time-independent, continuous and based on weighted matrix operations. This stands in contrast to the event-based, time-coded processing of neuromorphic and biological systems.
NMC should also be distinguished from so-called neural processing units (NPUs, neural hardware), i.e. certain chip modules that are already found in many cell phones. These are not neuromorphic systems: although they accelerate AI calculations through an adapted chip architecture and a high degree of parallelization, they are based on classic hardware concepts and are optimized for operations within classic ANNs.
Biological computing (also known as wetware) is an experimental field in which biological neurons or organoids are cultivated on chips to form a hybrid organic computer system. In some cases, it pursues completely new hardware and software concepts that can use elements of NMC. Depending on whether and to what extent SNNs are used in signal processing and how the integrated hardware components are designed, biological computing may or may not fit the inclusion criteria for NMC used here. For this reason, and due to its very early stage of technological maturity, this approach is not addressed in detail below.
Brain-computer interfaces (BCIs), i.e. technical systems that provide a direct interface between the brain and a computer and can be either invasive (e.g. implants) or non-invasive (e.g. EEG caps), can sometimes use neuromorphic software or hardware, but often do not currently do so and cannot be classified as NMC per se. Interfaces between the central and peripheral nervous system and computers are nevertheless likely to be an important area of application for neuromorphic technology in the future.
How do classical and neuromorphic AI systems differ in their basic processing principles? The next section takes a comparative look at this question.
Neural networks as the basis of information processing – comparison of information processing using the example of visual information
Neural networks form the basis of biological information processing and today’s implementations of artificial intelligence. In order to introduce essential differences and similarities between biological systems, artificial neural networks in classical AI architectures and neural networks for NMC, the basic principles of the first stages of visual information acquisition and information processing are compared below by way of example. The focus here is on clarifying and contextualizing the principles of time-dependent, asynchronous and event-based processing.
The basics of neurophysiology, such as the structure of biological neurons, synapses and the mechanisms for the formation of action potentials, are also helpful for understanding the concepts of the NMC; at the end of the text there is a short excursus on some terms, which can be consulted for the following section if necessary.
Biological system – visual stimulus perception in the human eye
Visual stimuli reach the eye in the form of photons, pass through the cornea, lens and vitreous body and reach the retina, where photoreceptors with the light-sensitive molecule rhodopsin convert light into electrical signals. Photoisomerization triggers a signal cascade that leads to hyperpolarization of the cell; this influences the release of neurotransmitters and ultimately generates action potentials in the so-called ganglion cells, which are transmitted to the brain via the optic nerve.
Due to retinal pre-processing mechanisms, such as receptive fields and lateral inhibition, it is primarily changes in the incoming image signal that are transmitted. Static stimuli that do not change over time therefore lead to less neuronal activity, while dynamic stimuli (e.g. movement, changes in brightness) trigger a stronger reaction. This information coding has an inherent temporal component: new or relevant information is forwarded asynchronously and on an event basis, while constant stimuli are suppressed; stimulus forwarding can thus be described as selective or 'event-based'.
Assignments of meaning, for example object classifications such as the recognition ‘this is an apple’ and further processing of visual information take place after the retinal pre-processed stimuli have been passed on to higher brain regions, such as the visual cortex [11].
Visual information acquisition in classical artificial neural networks (ANNs)
In ANNs, information is recorded using digital image data, which is fed into the network as numerical matrices (e.g. RGB values). This data usually comes from standard cameras, which convert photons into pixel values via sensor arrays (e.g. CMOS-based).
The pixel values of the three color channels (R, G, B) are fed into the input layer as numerical inputs (see Figure 2), typically normalized to the range [0, 1] or [-1, 1]. In a flat vector representation, this corresponds to one input neuron per pixel value per color channel; in convolutional networks, the spatial structure is preserved.
Processing then takes place in the deep layers (hidden layers) of the network, which transform the input data into more abstract representations. With increasing layer depth, the feature abstraction usually becomes more fine-grained. The output value can, for example, be the probability of a certain object category.
The transfer of the numerical values to the input layer and their transmission to the deeper layers of the network follow a clocked, synchronized sequence and are not event-based. On each retrieval, the numerical values of all pixels or neurons are read out and processed further. The signal coding has no inherent temporal component [12].
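This frame-based input stage can be sketched in a few lines. The tiny 2x2 "image" and the placeholder weights below are invented for illustration, not a trained model; the point is that every pixel of every channel is read out on each tick, regardless of whether anything changed.

```python
# Sketch of the synchronous, frame-based input stage of a classic ANN.
# A tiny 2x2 RGB "image" stands in for real camera data.

image = [                        # pixel values per channel, 0..255
    [[255, 0, 0], [0, 255, 0]],
    [[0, 0, 255], [128, 128, 128]],
]

# All pixel values of all channels are read out together and
# normalized to [0, 1] -- no event selection, no temporal coding.
inputs = [c / 255.0 for row in image for pixel in row for c in pixel]
assert len(inputs) == 2 * 2 * 3  # one input neuron per pixel and channel

# Weighted sum of one hypothetical hidden neuron (placeholder weights).
weights = [0.1] * len(inputs)
activation = sum(w * x for w, x in zip(weights, inputs))
print(round(activation, 3))      # → 0.451
```

The result is a dense, time-independent computation: the same output is produced whether the scene is static or moving, in contrast to the event-driven coding described in the next section.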
Visual information acquisition in neuromorphic systems (NMC, SNNs)
In the context of neuromorphic visual systems, photons can be detected either by conventional cameras or by specialized neuromorphic sensors. In the case of conventional cameras, the image data must first be converted into a compatible format for neuromorphic systems, which is an inefficient and therefore rarely used approach.
Neuromorphic visual sensors, on the other hand, are functionally modeled on the biological retina: they record visual stimuli not as continuous image data, but via event-based sensor technology. One example is the dynamic vision sensor, a neuromorphic camera whose pixels react to local changes in brightness independently of each other. As soon as the light intensity at a pixel exceeds a defined threshold value, a so-called event is triggered, independently of a global clock or fixed frame rates.
This sensor technology works asynchronously: each pixel acts autonomously and only sends a signal when a relevant change is detected. This results in an energy-efficient, dynamic data stream that is already pre-processed at sensor level and consists exclusively of changes in status. The information is forwarded in the form of time-coded spikes [13, 14].
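A strongly simplified software sketch conveys the idea of this event stream. Real sensors compare log intensity per pixel in continuous time; here, invented brightness frames and a made-up threshold stand in, and each event is encoded as a (timestamp, x, y, polarity) tuple, a common event-camera convention.

```python
# Illustrative sketch of event generation in a dynamic vision sensor:
# each pixel independently emits an event when its brightness change
# exceeds a threshold. Frames and threshold are made-up example values.

def dvs_events(frames, threshold=30):
    """Compare brightness per pixel against its last reference value
    and emit (t, x, y, polarity) events only where the change is large."""
    events = []
    last = [row[:] for row in frames[0]]   # reference brightness per pixel
    for t, frame in enumerate(frames[1:], start=1):
        for y, row in enumerate(frame):
            for x, value in enumerate(row):
                diff = value - last[y][x]
                if abs(diff) >= threshold:
                    polarity = 1 if diff > 0 else -1   # ON / OFF event
                    events.append((t, x, y, polarity))
                    last[y][x] = value     # update reference on event only
    return events

# A static background produces no events; only the changing pixel does.
frames = [
    [[100, 100], [100, 100]],
    [[100, 100], [100, 100]],   # no change -> no events
    [[100, 150], [100, 100]],   # pixel (1, 0) brightens -> ON event
]
print(dvs_events(frames))       # → [(2, 1, 0, 1)]
```

The output stream is sparse by construction: its volume scales with scene dynamics rather than with resolution times frame rate, which is the source of the efficiency gains described above.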
The further processing and meaning assignment of these incoming events takes place in the downstream layers of preferably locally integrated SNNs, which react to the temporal dynamics of the input signals in the same way as biological information processing (see Figure 2).
Summary: Comparison of visual information processing
Visual information processing differs fundamentally between biological systems, classical artificial neural networks (ANNs) and neuromorphic systems (NMC). While biological and neuromorphic systems process stimuli asynchronously and in an event-based manner, i.e. neurons only become active when relevant changes occur, classic ANNs work synchronously and continuously, usually with complete (dense) activations. The temporal dynamics of visual signals are inherent to the data coding and processing of SNNs and the brain.
Even this brief comparison shows that the different networks have distinct working principles that harbor different limitations and potentials.
Parts 2 and 3 of the topic paper deepen the introduction to the principles of neuromorphic hardware and the structure, algorithms and training of SNNs. Part 4 is dedicated to the current status of the application of neuromorphic computing and provides an outlook on future developments.

EXKURS – Basic neurobiological terms
For more detailed information, the sources of this excursus can be consulted, which provide a good introduction to basic concepts and principles of neurophysiology [11, 15-17]:
Neurons and glial cells – cell types of the brain (there are others)
Neurons consist of three main components: dendrites, the soma (cell body) and the axon. Dendrites receive signals from other cells and transmit them to the soma. The soma contains the cell nucleus and the organelles, which are responsible for metabolism and protein synthesis. The axon, which transmits electrical signals (action potentials) over long distances, originates from the soma. At the end of the axon are synapses that release chemical messengers (neurotransmitters) to transmit signals to other cells. Glial cells such as astrocytes, oligodendrocytes and microglia provide structural support to neurons, supply them with nutrients, insulate axons (myelin sheaths), perform immunological functions and are involved in information processing. There are hundreds of different types of neurons with distinct electrophysiological properties and excitation characteristics.
The membrane potential
The membrane potential is the electrical voltage difference between the inside of the cell and the extracellular space that is present in all neurons. It is the basis of electrical excitability and thus of information transmission. The membrane potential is created by the uneven distribution of ions (especially Na⁺, K⁺, Cl-) and the selective permeability of the neuronal cell membrane. Potassium ions (K⁺) diffuse out of the cell through leaking potassium channels, which leads to a negative charge inside the cell. The sodium-potassium pump (Na⁺/K⁺-ATPase) actively contributes to maintaining this gradient. Other ion channels also play a role. The typical resting potential is around -70 mV.
The action potential (‘spike’)
An action potential is a brief reversal of the membrane potential, triggered by the opening of voltage-gated sodium channels. As soon as a threshold potential of approx. -55 mV is reached, Na⁺ flows into the cell, which leads to depolarization. Potassium channels then open, K⁺ flows out of the cell and the membrane potential returns to rest (repolarization). This rapid, directed signal transmission along the axon is the basis of neuronal communication. The action potential follows the all-or-nothing principle; if the threshold potential is exceeded, an action potential or spike is triggered.
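The threshold mechanism described above can be made concrete in a deliberately simplified numerical sketch. The resting and threshold potentials are the textbook values from this excursus; the leak factor and input amplitudes are invented, and the full ion-channel dynamics are reduced to a single update rule.

```python
# Simplified sketch of the all-or-nothing principle: the membrane
# potential integrates inputs above its resting value and fires an
# action potential once the threshold is crossed. Resting/threshold
# values are from the text; leak and inputs are illustrative.

REST = -70.0        # resting potential in mV
THRESHOLD = -55.0   # spike threshold in mV

def simulate(currents, leak=0.8):
    """Return the time steps at which an action potential occurs."""
    v = REST
    spikes = []
    for t, i in enumerate(currents):
        # decay toward rest (leak), then add input-driven depolarization
        v = REST + leak * (v - REST) + i
        if v >= THRESHOLD:
            spikes.append(t)   # all-or-nothing: spike, then repolarize
            v = REST
    return spikes

print(simulate([8, 0, 0, 0, 0]))   # single weak pulse decays: []
print(simulate([8, 8, 8, 8, 8]))   # temporal summation fires: [2]
```

The same sub-threshold input never fires in isolation, but repeated pulses in quick succession sum up to cross the threshold (temporal summation), after which the potential repolarizes to rest.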
The brain – biological neuronal networks
The human brain consists of around 86 billion neurons that are connected to each other via one hundred trillion synapses. These networks are highly non-linear, dynamic and context-dependent. The neuronal transmission of information can be chemical, via synapses and neurotransmitters, or electrical (via gap junctions). The strength and number of these connections is plastic; it changes through experience and learning. This synaptic plasticity is the basis for memory formation and learning processes. Information processing is parallel, distributed and often stochastic. Despite intensive research, many aspects of the brain are not yet fully understood, particularly with regard to emergent phenomena such as consciousness or creativity.
Sources
- [1] Mehonic et al, 2024, Roadmap to neuromorphic computing with emerging technologies. APL Materials
- [2] Li et al, 2023, Brain inspired Computing: A Systematic Survey and Future Trends. TechRxiv by IEEE
- [3] Bitkom, 2023, Future Computing: Overview of technological Landscape.
- [4] Gartner Impact Radar, 2024: https://www.gartner.de/de/artikel/30-neue-technologien-beeinflussen-geschaeftsentscheidungen
- [5] Global Market Insights, Neuromorphic Computing Market, 2024: https://www.gminsights.com/industry-analysis/neuromorphic-computing-market
- [6] Precedence Research, Neuromorphic Computing Market Size and Forecast 2025 to 2034, 2025: https://www.precedenceresearch.com/neuromorphic-computing-market
- [7] Statista, Forecast on the development of the market potential for quantum computing in the years 2024 to 2040, 2025: https://de.statista.com/statistik/daten/studie/1198523/umfrage/entwicklung-des-marktpotenzials-fuer-quantencomputer/
- [8] Precedence Research, Quantum Computing Market Size and Forecast 2025 to 2034, 2025: https://www.precedenceresearch.com/quantum-computing-market
- [9] Tech in Asia, China debuts brain-like computer with 2 billion artificial neurons, 2025: https://www.techinasia.com/news/china-debuts-brain-like-computer-with-2-billion-artificial-neurons
- [10] Kudithipudi et al, 2025, Neuromorphic computing at scale. Nature
- [11] Bear et al, 2018, Neuroscience. Springer
- [12] Goodfellow et al, 2016, Deep Learning. MIT Press book: https://www.deeplearningbook.org/
- [13] Gallego et al, 2020, Event-based Vision: A Survey. IEEE: https://arxiv.org/pdf/1904.08405
- [14] Prophesee company website: https://www.prophesee.ai/
- [15] Purves, D. et al. (2018). Neuroscience (6th ed.), Oxford University Press
- [16] https://www.mpg.de/gehirn
- [17] www.dasgehirn.info
Authors
For publications, interviews, podcasts etc. on the topic of neuromorphic computing, please contact the authors. Further information about the ARIC can be found in our media kit.

Dr. Natalie Rotermund holds a doctorate in neuroscience. As a scientific advisor at ARIC, she is responsible for the key topics of quantum technologies and AI in medicine and life sciences and has already published a paper on the use of health data.

Jakob Mertes is a Machine Learning Engineer at ARIC and advises companies on the responsible use of AI. He recently reported on our blog about the presentation of the European AI platform AIOD.EU in Paris.
This format is offered as part of the EDIH Hamburg with the support of the European Union and the Hamburgische Investitions- und Förderbank.


