Brain-Inspired Artificial Intelligence
ABSTRACT
The field of brain science (or, more broadly, neuroscience) has long inspired researchers in artificial intelligence (AI). Findings such as Hebb's rule had a profound effect on early AI models, which have since developed into today's state-of-the-art artificial neural networks. However, the recent progress in AI led by deep learning architectures owes more to elaborate mathematical methods and the rapid growth of computing power than to neuroscientific inspiration, and major limitations such as opacity, lack of common sense, narrowness, and brittleness remain unresolved. To address these problems, many AI researchers are turning their attention back to neuroscience for insight and inspiration. Biologically plausible neural networks, spiking neural networks, and connectome-based networks exemplify such neuroscience-inspired approaches. In addition, the more recent field of brain network analysis is unveiling complex brain mechanisms by modeling the brain as a dynamic graph. We argue that progress toward human-level AI, the ultimate goal of the field, can be accelerated by leveraging novel findings about the human brain network.
KEYWORDS Artificial intelligence, artificial neural network, deep neural network, neuroscience, brain network, biological plausibility, spiking neural network, reservoir computing, connectome
Ⅰ. Introduction

The brain can be described at several scales, as illustrated in Figure 1: as a neuronal network of individual neurons and the interactions between them (Figure 1(a)), as a regional network of brain areas and the interactions between them (Figure 1(b)), as an anatomical model (Figure 1(c)), or as a functional model (Figure 1(d)). Meanwhile, despite its rapid progress, today's AI is approaching computational limits and still suffers from opacity, lack of common sense, narrowness, and brittleness [3,4]. These unresolved problems are leading researchers back to neuroscience for inspiration.

Figure 1. Models of the brain at different scales: (a) neuronal network (neurons and neuron-to-neuron interactions), (b) regional network (regions and region-to-region interactions, with parcellations of 66 and 1,000 subregions), (c) anatomical model, (d) functional model. Source: neuronal network [5] CC BY-SA 4.0, regional network [6] CC BY 3.0, anatomical model [7] CC BY-SA 4.0, functional model [8] CC BY-NC 2.0, adapted.

Ⅱ. Deep Neural Networks (DNN)

1. Early Models

A. The First Neuron Model

In 1943, McCulloch and Pitts proposed the first mathematical model of the neuron [9]: a unit that computes a weighted sum of its binary inputs and fires only when the sum reaches a threshold [10].
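As a concrete illustration, here is a minimal sketch of a McCulloch-Pitts-style threshold unit; the weights and threshold are illustrative choices, not values from the original text.

```python
import numpy as np

def mcculloch_pitts(inputs: np.ndarray, weights: np.ndarray, threshold: float) -> int:
    """Fire (return 1) only if the weighted sum of binary inputs reaches the threshold."""
    return int(inputs @ weights >= threshold)

# A two-input unit behaving as logical AND: both inputs must be active.
w = np.array([1.0, 1.0])
for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
    print(x, "->", mcculloch_pitts(np.array(x), w, threshold=2.0))
```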
B. The Neuron's Learning Model

Hebb's rule gave the first account of how neurons learn: when one neuron repeatedly takes part in firing another, the connection between the two is strengthened.

C. Perceptron

In 1958, Frank Rosenblatt proposed the perceptron [18], which extended the McCulloch-Pitts unit with weights that could be learned from examples (a minimal sketch of the learning rule appears at the end of this subsection).

In 1984, Geoffrey Hinton and colleagues introduced the Boltzmann machine, a stochastic recurrent network. In the 2000s, Hinton showed that its simplified variant, the restricted Boltzmann machine (RBM), could be stacked and trained layer by layer to form a deep belief network [19].
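A minimal sketch of Rosenblatt's perceptron learning rule on a toy linearly separable task (the OR function); the data, learning rate, and epoch count are illustrative assumptions.

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=20):
    """Rosenblatt's rule: nudge weights toward misclassified examples."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            w += lr * (yi - pred) * xi   # no update when the prediction is correct
            b += lr * (yi - pred)
    return w, b

# Toy data: the OR function, which is linearly separable.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 1])
w, b = train_perceptron(X, y)
print([int(xi @ w + b > 0) for xi in X])  # [0, 1, 1, 1]
```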
2. Biological Plausibility of DNNs
A. Deep Neural Networks

The modern DNN stacks many layers of simple units trained end to end by error backpropagation; several aspects of this recipe, however, are hard to reconcile with what is known about the brain.

1) The Weight Transport Problem
Backpropagation propagates errors backward through the transpose of the forward weights, so the feedback pathway must mirror the feedforward pathway exactly [25]. No such symmetric feedback connections are known in the brain, a difficulty for DNNs known as the weight transport problem [26].
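To make the problem concrete, the sketch below computes the hidden-layer error signal of a two-layer network two ways: backpropagation needs the transpose of the forward weights W2, while feedback alignment, named here as one well-known biologically motivated workaround (an assumption; the surviving fragments do not say which alternatives the original surveyed), substitutes a fixed random matrix B.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(1, 4))
B = rng.normal(size=(4, 1))           # fixed random feedback weights

x = rng.normal(size=3)
h = np.tanh(W1 @ x)                   # forward pass
y = W2 @ h
err = y - 1.0                         # error against a dummy target

delta_bp = (W2.T @ err) * (1 - h**2)  # backprop: needs W2.T (weight transport)
delta_fa = (B @ err) * (1 - h**2)     # feedback alignment: fixed random B instead

print("backprop delta:          ", delta_bp)
print("feedback alignment delta:", delta_fa)
```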
3) Linear Operations

Each DNN unit performs a linear weighted sum followed by a simple pointwise nonlinearity, whereas biological neurons integrate their inputs nonlinearly in their dendrites.
3. Biologically Plausible Approaches

A. Local Learning Algorithms

Backpropagation updates every weight using global information carried back from the output layer, whereas synaptic plasticity in the brain depends only on information locally available at each synapse. Local learning algorithms therefore replace the global error signal with locally computable quantities [27].

B. Quantized Neural Networks

Conventional DNNs compute with 32- or 64-bit floating-point values, while neurons communicate with all-or-none spikes. Quantized neural networks narrow this gap, and reduce cost, by training and running networks at low precision, down to full 8-bit integers [34] or even a single bit [32,33]; a minimal sketch of the idea follows at the end of this section.

C. Hybrid Architectures

Hybrid approaches graft biological circuit structure onto deep learning. A representative example is the neural circuit policy, a network of a few dozen neurons modeled on the C. elegans nervous system that achieved auditable autonomous lane keeping [35].
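Picking up subsection B, here is a minimal sketch of uniform symmetric quantization of float32 weights to 8-bit integers; the training scheme of [34] is considerably more elaborate, so treat this only as the core idea.

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Uniform symmetric quantization of float32 weights to int8."""
    scale = np.abs(w).max() / 127.0                      # map the largest magnitude to 127
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

w = np.random.default_rng(0).normal(size=5).astype(np.float32)
q, scale = quantize_int8(w)
print(q)                # int8 weights used at inference time
print(q * scale - w)    # small reconstruction (quantization) error
```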
Ⅲ. Spiking Neural Networks (SNN)

1. Information Encoding

Biological neurons communicate through sequences of discrete spikes, or spike trains [36]. How a spike train carries information is the encoding problem, and several schemes have been compared for robust neuromorphic systems [38]: rate coding, which represents a value by firing frequency; temporal coding, notably time-to-first-spike (TTFS) coding, which uses the latency of the first spike and is highly efficient [39]; burst coding; and phase coding.
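A minimal sketch of rate coding, the first scheme above: a value in [0, 1] becomes a Poisson-like spike train whose firing rate is proportional to the value (the rate, duration, and time step are illustrative).

```python
import numpy as np

def rate_encode(value: float, max_rate: float = 100.0,
                duration: float = 0.5, dt: float = 0.001) -> np.ndarray:
    """Encode a value in [0, 1] as a binary spike train with rate value*max_rate (Hz)."""
    p_spike = value * max_rate * dt                  # spike probability per time step
    rng = np.random.default_rng(0)
    return (rng.random(int(duration / dt)) < p_spike).astype(int)

train = rate_encode(0.8)
print(train.sum(), "spikes in 0.5 s")                # roughly 0.8 * 100 Hz * 0.5 s = 40
```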
2. Learning in Spiking Neural Networks

Spike-timing-dependent plasticity (STDP) [40] refines Hebb's rule using the relative timing of spikes: when a presynaptic spike precedes the postsynaptic spike, the synapse is strengthened (long-term potentiation), and when the order is reversed, it is weakened [37]. STDP alone can find the start of repeating patterns embedded in continuous spike trains [41]. For supervised learning, SpikeProp adapts error backpropagation to networks of spiking neurons [42]. A minimal STDP sketch follows at the end of this section.

3. Outlook for Spiking Neural Networks

Because they compute with sparse, event-driven signals, SNNs promise efficient processing of spatio-temporal data streams [43].
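A minimal sketch of pair-based STDP with the standard exponential window, as described in subsection 2 above; the amplitudes and time constant are illustrative, not values from [40].

```python
import numpy as np

def stdp_dw(t_pre: float, t_post: float,
            a_plus=0.01, a_minus=0.012, tau=0.020) -> float:
    """Weight change for one pre/post spike pair (times in seconds)."""
    dt = t_post - t_pre
    if dt > 0:    # pre before post: long-term potentiation
        return a_plus * np.exp(-dt / tau)
    else:         # post before (or with) pre: depression
        return -a_minus * np.exp(dt / tau)

print(stdp_dw(0.010, 0.015))   # pre leads by 5 ms -> positive weight change
print(stdp_dw(0.015, 0.010))   # post leads by 5 ms -> negative weight change
```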
Ⅳ. Brain Network Analysis

Network neuroscience [44] studies the brain as a graph. Structural connections are measured with diffusion tensor imaging (DTI), while functional connections are estimated from signals such as the electroencephalogram (EEG) using measures like the Pearson correlation or independent component analysis (ICA) [45]. The resulting connectivity matrices are typically thresholded, and networks from many subjects can be merged into a consensus network for brain network analysis.

1. Brain Network Models

A brain network model represents brain regions as nodes and their structural or functional connections as edges, allowing the dynamics and control of the brain to be studied with the tools of graph theory. A minimal construction sketch follows.
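The sketch below illustrates the pipeline just described: pairwise Pearson correlation between simulated region signals, then thresholding into a binary adjacency matrix (the data and threshold are placeholders, not values from the original text).

```python
import numpy as np

rng = np.random.default_rng(0)
t = rng.normal(size=200)                      # shared driver for one module
signals = rng.normal(size=(8, 200))           # 8 regions x 200 time points (e.g., EEG)
signals[:4] += t                              # regions 0-3 form a correlated module

fc = np.corrcoef(signals)                     # Pearson functional-connectivity matrix
np.fill_diagonal(fc, 0.0)                     # ignore self-connections

adjacency = (np.abs(fc) > 0.3).astype(int)    # threshold into a binary adjacency matrix
print(adjacency.sum() // 2, "edges")          # the module's 6 within-module pairs survive
```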
2. Reservoir Computing

A. Overview

Reservoir computing (RC) is a framework derived from the recurrent neural network (RNN) [46]. As shown in Figure 4, the input enters through fixed weights W_in into the reservoir, a recurrent network whose internal weights W are also fixed, and only the linear readout W_out is trained, optionally with error feedback. The two representative RC models are the echo state network (ESN) and the liquid state machine (LSM) [49]; a minimal worked example appears at the end of this subsection.

Figure 4. Structure of a reservoir computing model: fixed input weights W_in, fixed recurrent reservoir weights W, trained readout W_out, and error feedback.

B. ESN

The ESN was proposed by Herbert Jaeger in 2001 [47], and echo state networks have since been shown to be universal approximators [48].

C. LSM

The LSM grew out of computational neuroscience and differs from the ESN in that its reservoir is an SNN composed of spiking neurons [49].
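A minimal worked ESN in the spirit of the overview above: fixed W_in and W, with W rescaled to spectral radius 0.9 for the echo state property, and a readout W_out fitted by ridge regression on a toy one-step memory task (all sizes and the task are illustrative).

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, n_steps = 100, 500

W_in = rng.uniform(-0.5, 0.5, size=n_res)             # fixed input weights
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))       # spectral radius 0.9 < 1

u = np.sin(np.linspace(0, 20 * np.pi, n_steps))       # toy input signal
target = np.roll(u, 1)                                # task: recall the previous input

X = np.zeros((n_steps, n_res))
x = np.zeros(n_res)
for t in range(n_steps):
    x = np.tanh(W @ x + W_in * u[t])                  # reservoir state update
    X[t] = x

# Only the readout is trained (ridge regression); W_in and W stay fixed.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ target)
print("train MSE:", np.mean((X @ W_out - target) ** 2))
```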
Going a step further, the random reservoir can be replaced by an empirical connectome, such as the 302-neuron connectome of C. elegans, the human connectome [50], or the mesoscale mouse connectome [51], encoded as an adjacency matrix. Reservoirs wired according to brain connectivity have been reported to achieve high memory capacity at low wiring cost compared with randomly wired ones [55,56], suggesting that AI can benefit directly from brain network structure. A sketch of the substitution follows.
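Following the idea of [55,56], the random W in the ESN sketch above can be swapped for a connectome. The adjacency matrix below is a random stand-in; an actual study would load empirical connectivity (e.g., derived from DTI or tract tracing).

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for an empirical connectome adjacency matrix (hypothetical data).
n = 100
A = (rng.random((n, n)) < 0.1).astype(float)      # sparse binary connectivity

W = A * rng.normal(size=(n, n))                   # give connectome edges random strengths
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # rescale for the echo state property

# W can now replace the random reservoir in the ESN sketch above; [55,56] report
# that such brain-wired reservoirs trade memory capacity against wiring cost
# differently from random ones.
print("spectral radius:", np.max(np.abs(np.linalg.eigvals(W))))
```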
Ⅴ. Characteristics of Brain Networks

1. Synchronized Oscillation

Populations of neurons across the brain fire in synchronized oscillations [58], rhythms thought to orchestrate memory [57] and to support communication through coherence [59].

2. Motifs

Brain networks contain small recurring connection patterns called motifs [61]; motif signatures even serve as markers of the loss and recovery of consciousness [60].
3. Dynamic Reconfiguration

Brain networks are not static: their modular organization is dynamically reconfigured during learning [62], and the rate at which a brain switches between network configurations predicts performance [63].

4. Generative Models

Generative models of network neuroscience aim to reproduce the topology of brain networks from simple wiring rules, offering AI another window into the principles of brain organization [64].

Ⅵ. Conclusion

From Hebb's rule to today's deep architectures, neuroscience has repeatedly seeded progress in AI. Current DNN- and SNN-based systems excel at pattern recognition yet remain far from cognition, held back by opacity, lack of common sense, narrowness, and brittleness. We expect that biologically plausible learning, spiking neural networks, and models informed by brain network analysis will together accelerate progress toward human-level AI.
Abbreviations

AI   Artificial Intelligence
CNN  Convolutional Neural Network
DNN  Deep Neural Network
DTI  Diffusion Tensor Imaging
[33] …, in Proc. ECCV 2016, vol. 9908, Springer, Cham, Switzerland, 2016, pp. 525–542.
[34] Y. Yang et al., “Training high-performance and large-scale deep neural networks with full 8-bit integers,” Neural Netw., vol. 125, May 2020, pp. 70–82.
[35] M. Lechner et al., “Neural circuit policies enabling auditable autonomy,” Nat. Mach. Intell., vol. 2, Oct. 2020, pp. 642–652.
[36] E.D. Adrian et al., “The impulses produced by sensory nerve endings,” J. Physiol., vol. 61, no. 4, 1926, pp. 465–483.
[37] G.Q. Bi et al., “Synaptic modifications in cultured hippocampal neurons: Dependence on spike timing, synaptic strength, and postsynaptic cell type,” J. Neurosci., vol. 18, no. 24, 1998, pp. 10464–10472.
[38] W. Guo et al., “Neural coding in spiking neural networks: A comparative study for robust neuromorphic systems,” Front. Neurosci., vol. 15, 2021.
[39] S.J. Thorpe, “Spike arrival times: A highly efficient coding scheme for neural networks,” in Parallel Processing in Neural Systems, 1990, pp. 91–94.
[40] D.E. Feldman, “The spike-timing dependence of plasticity,” Neuron, vol. 75, no. 4, 2012, pp. 556–571.
[41] T. Masquelier et al., “Spike timing dependent plasticity finds the start of repeating patterns in continuous spike trains,” PLoS ONE, vol. 3, no. 1, 2008, e1377.
[42] S.M. Bohte et al., “SpikeProp: Backpropagation for networks of spiking neurons,” in Proc. ESANN, Bruges, Belgium, Apr. 2000, pp. 419–424.
[43] A. Kugele et al., “Efficient processing of spatio-temporal data streams with spiking neural networks,” Front. Neurosci., vol. 14, 2020.
[44] D.S. Bassett et al., “Network neuroscience,” Nat. Neurosci., Mar. 2017.
[45] A. Fornito and E.T. Bullmore, “Connectomic intermediate phenotypes for psychiatric disorders,” Front. Psychiatry, Apr. 2012.
[46] M. Lukoševičius, H. Jaeger, and B. Schrauwen, “Reservoir computing trends,” KI - Künstliche Intelligenz, vol. 26, May 2012, pp. 365–371.
[47] H. Jaeger, “The ‘echo state’ approach to analysing and training recurrent neural networks - with an erratum note,” Bonn, Germany, GMD Tech. Rep. 148, Jan. 2010.
[48] L. Grigoryeva et al., “Echo state networks are universal,” Neural Netw., vol. 108, Dec. 2018, pp. 495–508.
[49] H. Jaeger, W. Maass, and J. Principe, “Special issue on echo state networks and liquid state machines,” Neural Netw., vol. 20, no. 3, Apr. 2007, pp. 287–289.
[50] O. Sporns, “The human connectome: Origins and challenges,” NeuroImage, vol. 80, Oct. 2013, pp. 53–61.
[51] S.W. Oh et al., “A mesoscale connectome of the mouse brain,” Nature, vol. 508, no. 7495, Apr. 2014.
[52] D. Meunier et al., “Hierarchical modularity in human brain functional networks,” Front. Neuroinform., vol. 3, 2009.
[53] N.T. Markov et al., “A weighted and directed interareal connectivity matrix for macaque cerebral cortex,” Cereb. Cortex, vol. 24, 2014.
[54] R.F. Betzel and D.S. Bassett, “Specificity and robustness of long-distance connections in weighted, interareal connectomes,” PNAS, vol. 115, no. 2, May 2018.
[55] F. Damicelli et al., “Brain connectivity meets reservoir computing,” bioRxiv preprint, Jan. 2021.
[56] L.E. Suarez et al., “Learning function from structure in neuromorphic networks,” bioRxiv preprint, Nov. 2020, doi: 10.1101/2020.11.10.350876.
[57] W. Luo and J.-S. Guan, “Do brain oscillations orchestrate memory?,” Brain Sci. Adv., vol. 4, no. 1, Oct. 2018, pp. 16–33.
[58] R. Guevara Erra et al., “Neural synchronization from the perspective of non-linear dynamics,” Front. Comput. Neurosci., Oct. 2017.
[59] P. Fries, “Rhythms for cognition: Communication through coherence,” Neuron, vol. 88, no. 1, Oct. 2015.
[60] C. Duclos et al., “Brain network motifs are markers of loss and recovery of consciousness,” Sci. Rep., vol. 11, 2021.
[61] O. Sporns et al., “Motifs in brain networks,” PLoS Biol., vol. 2, Nov. 2004.
[62] D.S. Bassett et al., “Dynamic reconfiguration of human brain networks during learning,” PNAS, May 2011, pp. 7641–7646.
[63] M. Pedersen et al., “Multilayer network switching rate predicts brain performance,” PNAS, vol. 115, Dec. 2018.
[64] R.F. Betzel et al., “Generative models for network neuroscience: Prospects and promise,” J. R. Soc. Interface, vol. 14, no. 136, Jun. 2017.