
Complex-Valued Neurons
Complex-Valued Neural Networks
The primary CIL research area is Complex-Valued Neural Networks (CVNNs),
mainly Multi-Valued Neurons and the neural networks based on them.
Complex-Valued Neural Networks are becoming increasingly popular. The use of
complex-valued inputs/outputs, weights, and activation functions makes it possible
to increase the functionality of a single neuron and of a neural network, to
improve their performance, and to reduce the training time.
The history of complex numbers shows that although it took a long time for
them to be accepted (almost 300 years from the first reference to "imaginary
numbers" by Girolamo Cardano in 1545 to Leonhard Euler's and Carl Friedrich
Gauss' works published in 1748 and 1831, respectively), they have become an
integral part of engineering and mathematics. It is difficult to imagine today
how signal processing, aerodynamics, hydrodynamics, energy science, quantum
mechanics, circuit analysis, and many other areas of engineering and science
could have developed without complex numbers. It is a fundamental mathematical fact
that complex numbers are a necessary and absolutely natural part of the numerical
world. Their necessity follows clearly from the Fundamental Theorem of Algebra,
which states that every nonconstant single-variable polynomial of degree n with
complex coefficients has exactly n complex roots, counted with multiplicity.
Answering a question frequently asked by some "conservative" people (what does one
gain by using complex-valued neural networks, given "twice as many" parameters, more
computations, etc.?), we may say that one gains the same advantage that the Fourier
transform offers over the Walsh transform in signal processing. There are many
engineering problems in the modern world where complex-valued signals and
functions of complex variables are involved and unavoidable.
Thus, to employ neural networks for their analysis, approximation, etc., the use
of complex-valued neural networks is natural. However, even in the analysis of
real-valued signals (for example, images or audio signals), one of the most
frequently used approaches is frequency-domain analysis, which immediately leads
us to the complex domain. Indeed, analyzing signal properties in the frequency
domain, we see that each signal is characterized by magnitude and phase, which
carry different information about the signal. This fundamental fact was thoroughly
investigated by A.V. Oppenheim and J.S. Lim in their paper "The Importance of
Phase in Signals", Proceedings of the IEEE, vol. 69, no. 5, 1981, pp. 529-541.
They showed that the phase in the Fourier spectrum of a signal
is much more informative than the magnitude: in particular, in
the Fourier spectrum of an image, the phase alone carries the
information about the shapes, edges, and orientation of all objects.
This property can be illustrated by the following example. Let us consider two
popular test images, "Lena" and "Bridge".
Let us take their Fourier transforms and then swap the magnitude and phase
of their Fourier spectra, combining the phase of "Lena" with the magnitude of
"Bridge" and vice versa. After taking the inverse Fourier transforms, we clearly
see that each restored image is the one whose phase was used, not the one that
contributed the magnitude:

[Figures: image restored from "Lena" phase + "Bridge" magnitude; image restored from "Bridge" phase + "Lena" magnitude.]
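This magnitude/phase swap is easy to reproduce. The following sketch (assuming NumPy; image loading is omitted, so random arrays stand in for "Lena" and "Bridge", and the function name is illustrative) combines the Fourier phase of one image with the Fourier magnitude of another:

```python
import numpy as np

def swap_phase_magnitude(phase_img, magnitude_img):
    """Reconstruct an image from the Fourier phase of phase_img
    combined with the Fourier magnitude of magnitude_img."""
    F_phase = np.fft.fft2(phase_img)     # spectrum supplying the phase
    F_mag = np.fft.fft2(magnitude_img)   # spectrum supplying the magnitude
    hybrid = np.abs(F_mag) * np.exp(1j * np.angle(F_phase))
    return np.real(np.fft.ifft2(hybrid))

# Stand-ins for "Lena" and "Bridge" (real images would be loaded here):
lena = np.random.rand(64, 64)
bridge = np.random.rand(64, 64)

restored_from_lena_phase = swap_phase_magnitude(lena, bridge)
restored_from_bridge_phase = swap_phase_magnitude(bridge, lena)
```

As a quick sanity check, combining an image's phase with its own magnitude returns the original image exactly.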
Thus, in fact, the phase contains the information about what is represented by the
corresponding signal. To use this information properly, the most appropriate
solution is to move to the complex domain. Hence, one of the most important
characteristics of Complex-Valued Neural Networks is the proper treatment of
amplitude and phase information, e.g., the treatment of wave-related phenomena
such as electromagnetism, light waves, quantum waves, and oscillatory
phenomena.
There are different specific types of complex-valued neurons and
complex-valued activation functions. But it is important to mention that all
Complex-Valued Neurons and Complex-Valued Neural Networks have two very
important advantages over their real-valued counterparts. The first is their
much higher functionality. The second is their better plasticity and
flexibility: they learn faster and generalize better. Higher functionality
means, first of all, the ability of a single neuron to learn input/output
mappings that are nonlinearly separable in the real domain, that is, to
learn them in the original input space without creating higher-degree inputs and
without moving to a higher-dimensional space. To illustrate this
ability of a complex-valued neuron, let us consider how it easily solves the
XOR problem, a classical nonlinearly separable
problem. Let us take the Universal Binary Neuron (UBN), which is comprehensively
described by I. Aizenberg et al. in the monograph Multi-Valued and Universal
Binary Neurons: Theory, Learning, Applications, Kluwer Academic Publishers,
Boston/Dordrecht/London, 2000. The UBN is a neuron with binary inputs and output and
with complex-valued weights. Let us consider a UBN with the activation
function P_B, which divides the complex plane into m equal sectors (here m = 4)
and is defined as follows:

P_B(z) = (-1)^j, if 2*pi*j/m <= arg(z) < 2*pi*(j+1)/m, for j = 0, 1, ..., m-1.
It is very easy to check that the weighting vector (0, 1, i), where
i is the imaginary unit, implements the XOR function on a single UBN (with
Boolean values encoded as false = 1 and true = -1).
This is illustrated by the following table:

x1    x2    z = x1 + i*x2    arg(z)    P_B(z) = XOR(x1, x2)
 1     1        1 + i         pi/4              1
 1    -1        1 - i        7*pi/4            -1
-1     1       -1 + i        3*pi/4            -1
-1    -1       -1 - i        5*pi/4             1
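The check above can be replayed in a few lines of code. This sketch (assuming NumPy; the function names are illustrative) implements the sector activation P_B with m = 4 and evaluates the UBN with the weighting vector (0, 1, i) on all four binary inputs:

```python
import numpy as np

def p_b(z, m=4):
    """Binary activation P_B: the complex plane is split into m equal
    sectors counted counterclockwise from the positive real axis;
    sector j maps to (-1)**j."""
    angle = np.angle(z) % (2 * np.pi)   # bring arg(z) into [0, 2*pi)
    j = int(angle // (2 * np.pi / m))   # index of the sector containing z
    return (-1) ** j

def ubn(weights, inputs):
    """Universal Binary Neuron: complex weighted sum followed by P_B."""
    z = weights[0] + sum(w * x for w, x in zip(weights[1:], inputs))
    return p_b(z)

weights = (0, 1, 1j)                    # the weighting vector (0, 1, i)
for x1 in (1, -1):
    for x2 in (1, -1):
        print(f"x1={x1:+d} x2={x2:+d} -> {ubn(weights, (x1, x2)):+d}")
```

With Boolean values encoded as false = 1 and true = -1, the printed outputs coincide with XOR(x1, x2): the output is -1 (true) exactly when the two inputs differ.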
This example clearly shows that a complex-valued neuron has higher functionality
than any traditional real-valued neuron. Accordingly, Complex-Valued Neural
Networks are more functional than their real-valued counterparts.
Complex-Valued Neural Networks is a rapidly growing area. Its popularity is
confirmed by the successful organization of a number of special sessions at the most
representative international conferences in the area over the last 7-8 years (ICONIP
2002, Singapore; ICANN/ICONIP 2003, Istanbul; ICONIP 2004, Calcutta; WCCI-IJCNN
2006, Vancouver; "Fuzzy Days 2006", Dortmund; ICANN 2007, Porto; WCCI-IJCNN
2008, Hong Kong; IJCNN 2009, Atlanta; WCCI-IJCNN 2010, Barcelona). These
sessions have everywhere drawn large and continuously growing audiences, with
many interesting presentations and very productive discussions. In 2010, the
CVNN Task Force was established by the IEEE Computational Intelligence
Society Technical Committee on Neural Networks.
There are several new directions in CVNN development, from the formal
generalization of commonly used algorithms to the complex-valued case to the
use of original complex-valued activation functions that can significantly
increase the functionality of a neuron and of a network. There are also many
interesting applications of CVNNs in pattern recognition and classification,
image processing, time series prediction, bioinformatics, robotics,
etc.
