Abstract—One of the main, long-term objectives of artificial intelligence is the creation of thinking machines. To that end, substantial effort has been placed into designing cognitive systems, i.e. systems that can manipulate semantic-level information. A substantial part of that effort is oriented towards designing the mathematical machinery underlying cognition in a way that is very efficiently implementable in hardware. In this work we propose a ‘semi-holographic’ representation system that can be implemented in hardware using only multiplexing and addition operations, thus avoiding the need for expensive multiplication. The resulting architecture can be readily constructed by recycling standard microprocessor elements and <Something about hardware performance>. Our proposed ‘cognitive processing unit’ (CoPU) is intended as just one (albeit crucial) part of much larger cognitive systems, where artificial neural networks of all kinds and associative memories work in concert to give rise to intelligence.

I. INTRODUCTION

The explosive scale of research output and investment in the field of artificial intelligence (AI) and machine learning (ML) testifies to the tremendous impact of the field on the world. Thus far this has manifested itself as a mass-scale proliferation of artificial neural network-based (ANN) algorithms for data classification. This covers multiple data modalities, most prominently images [1] and speech/sound [2], and relies on a number of standard, popular ANN architectures, most notably multi-layer perceptrons [3], recurrent NNs (in particular, LSTM [4] and GRU [5]) and convolutional NNs [6], amongst many others [7], [8].

Thus far, the vast majority of market-relevant ANN-based systems belong to the domain of statistical learning, i.e. they perform tasks which can generally be reduced to some sort of pattern recognition and interpolation (in time, space, etc.). This, though demonstrably useful, is akin to memorising every answer to every question, plus some ability to cope with uncertainty. In contrast, higher-level intelligence must be able to support fluid reasoning and syntactic generalisation, i.e. applying previous knowledge/experience to solve novel problems. This requires the packaging of classified information generated by traditional ANNs into higher-level variables (which we may call ‘semantic objects’), which can then be fluently manipulated at that higher level of abstraction. A number of cognitive architectures have been proposed to perform such post-processing, most notably the ACT-R architecture [9] and the semantic pointer architecture (SPA) [10], which is an effort to manipulate symbols using neuron-based implementations.

Handling the complex interactions/operations between semantic objects requires both orderly semantic object representations and machinery to carry out useful object manipulation operations. Hyperdimensional vector-based representation systems [11] have emerged as the de facto standard approach and are employed in both the SPA and ACT-R. Their mathematical machinery typically includes generalised vector addition (combine two vectors in such a way that the result is as similar to both operands as possible), vector binding (combine two vectors in such a way that the result is as dissimilar to both operands as possible) and normalisation (scale vector elements so that the overall vector magnitude remains constant). These operations may be instantiated in holographic (all operands and results have a fixed, common length) or non-holographic manners. Non-holographic systems have employed convolution [12] or tensor products [13] as binding. Holographic approaches have used circular convolution [11] and element-wise XOR [14]. Meanwhile, element-wise addition tends to remain the vector addition operation of choice across the board.

Finally, whichever computational methodology is adopted for cognitive computing must be implementable in hardware with extremely high power efficiency in order to realise its full potential for practical impact. This is the objective pursued by a number of accelerator architectures, spanning from limited-precision analogue neuron-based circuits [15], through analogue/digital mixtures [16], to fully analogue chips seeking to emulate the diffusive kinetics of real synapses [17]. More recently, memristor-based architectures have also emerged [18].

In this work, we summarise an existing, abstract mathematical structure for carrying out semantic object manipulation computations and propose an alternative, hardware-friendly instantiation. Our approach uses vector concatenation and modular addition as its fundamental operations (in contrast to the more typical element-wise vector addition and matrix-vector multiplication, respectively). Crucially, the chosen set of operations no longer forms a holographic representation system. This trades away some ‘expressivity’ (the ability to form semantic object expressions within limited resources) in exchange for compression: unlike in holographic representations, a semantic object’s vector length depends on its information content. Furthermore, the proposed system avoids the use of multiplication completely, thus allowing for both fast and efficient processing in hardware (avoiding both expensive multipliers and relatively slow spiking systems). Finally, we illustrate how the proposed system can be easily mapped onto a simple vector processing unit and provide some preliminary, expected performance metrics based on simulations in Cadence on a 65nm technology.

[Figure: ‘Chain’ — a semantic object vector formed by concatenating sub-vectors: |a1 a2 a3 ...|b1 b2 b3 ...|...|x1 x2 ...|]
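To make the standard holographic operations concrete, the following sketch instantiates them over dense binary hypervectors, with element-wise XOR as binding (as in [14]) and element-wise majority as the vector addition. The dimensionality, tie-breaking rule and all names here are our own illustrative assumptions, not part of any of the cited systems:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality (assumed for illustration)

def random_hv():
    """Random binary hypervector in {0, 1}^D."""
    return rng.integers(0, 2, size=D, dtype=np.uint8)

def bundle(*vs):
    """Generalised vector addition: element-wise majority vote, so the
    result stays as similar as possible to every operand (ties -> 0)."""
    return (2 * np.sum(vs, axis=0) > len(vs)).astype(np.uint8)

def bind(a, b):
    """Vector binding: element-wise XOR, so the result is maximally
    dissimilar to both operands -- and exactly invertible."""
    return np.bitwise_xor(a, b)

def similarity(a, b):
    """Fraction of matching elements; ~0.5 for unrelated hypervectors."""
    return float(np.mean(a == b))

a, b, c = random_hv(), random_hv(), random_hv()
s = bundle(a, b, c)   # similar to each operand (similarity ~0.75)
p = bind(a, b)        # dissimilar to both operands (similarity ~0.5)
assert np.array_equal(bind(p, b), a)  # unbinding recovers the operand
```

Note the holographic property: `s` and `p` have the same fixed length `D` as their operands, regardless of how much information they encode.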
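To contrast, the two proposed primitives might be sketched as follows. This is purely our illustrative reading of "vector concatenation plus modular addition"; the modulus, the key-tiling scheme and all function names are assumptions for the sketch, not the specification developed later in the paper:

```python
import numpy as np

M = 256  # modulus for element values (an assumed parameter)

def combine(a, b):
    """'Vector addition' analogue: concatenation. Both operands survive
    verbatim, so vector length grows with information content --
    the semi-holographic trade of expressivity for compression."""
    return np.concatenate([a, b])

def bind(v, key):
    """'Binding' analogue: element-wise addition modulo M against a key
    tiled to the operand's length. Needs only adders and multiplexers,
    never a multiplier."""
    return (v + np.resize(key, v.shape)) % M

def unbind(v, key):
    """Modular subtraction inverts the binding exactly."""
    return (v - np.resize(key, v.shape)) % M

a = np.array([5, 17, 200])
b = np.array([42, 99])
key = np.array([7, 11])

chain = combine(a, b)        # length 5 = 3 + 2: size tracks content
bound = bind(chain, key)
assert np.array_equal(unbind(bound, key), chain)
```

Unlike the holographic case above, `chain` is longer than either operand, which is exactly the length-for-content trade-off described in the text.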