Slides - Graph Signal Processing and Applications in Neuroscience

Graph signal processing

Concepts, tools and applications in neuroscience

Xiaowen Dong
Oxford-Man Institute
Department of Engineering Science

NISOx, Big Data Institute, February 2019


Outline
• Motivation

• Graph signal processing (GSP): Basic concepts

• Spectral filtering: Basic tools of GSP

• Connection with literature

• Applications in neuroscience

2/47
Data are often structured

Temperature data Traffic data

Social network data Neuroimaging data

We need to take into account the structure behind the data

3/47
Graphs are appealing tools
• Efficient representations for pairwise relations between entities

The Königsberg Bridge Problem


[Leonhard Euler, 1736]

4/47
Graphs are appealing tools
• Efficient representations for pairwise relations between entities

v1
v3 v4
v2 v5
v7 v6
v8
v9

5/47
Graphs are appealing tools
• Efficient representations for pairwise relations between entities
• Structured data can be represented by graph signals f : V → R^N

Takes into account both structure (edges) and data (values at vertices)

5/47
Graph signals are pervasive

• Vertices:
- 9000 grid cells in London
• Edges:
- geographical proximity of grid
cells
• Signal:
- # Flickr users who have taken
photos in two and a half years

6/47
Graph signals are pervasive

• Vertices:
- 1000 Twitter users
• Edges:
- following relationship among
users
• Signal:
- # Apple-related hashtags they
have posted in six weeks

7/47
Graph signals are pervasive

• Vertices:
- brain regions
• Edges:
- structural connectivity between
brain regions
• Signal:
- blood-oxygen-level-dependent
(BOLD) time series

8/47
Research challenges

f : V → R^N

How to generalise classical signal processing tools on irregular domains such as graphs?

9/47
Graph signal processing
• Graph signals provide a nice compact format to encode structure within data

• Generalisation of classical signal processing tools can greatly benefit analysis of such data

• Numerous applications: transportation, biomedical, social network analysis, etc.

• An increasingly rich literature:
- classical signal processing
- algebraic and spectral graph theory
- computational harmonic analysis
- machine learning

10/47
Outline
• Motivation

• Graph signal processing (GSP): Basic concepts

• Spectral filtering: Basic tools of GSP

• Connection with literature

• Applications in neuroscience

11/47
Two paradigms
• The main approaches can be categorised into two families:
- vertex (spatial) domain designs
- frequency (graph spectral) domain designs

Important for analysis of signal properties

12/47
Need for frequency
• Classical Fourier transform provides the frequency domain representation of the signals

Source: http://www.physik.uni-kl.de

A notion of frequency for graph signals: we need the graph Laplacian matrix

13/47
Graph Laplacian

Weighted and undirected graph: G = {V, E}

Degree matrix: D = diag(d(v_1), ..., d(v_N)); for the example graph on v_1, ..., v_8:

D = diag(1, 3, 4, 2, 2, 4, 3, 1)

Graph Laplacian: L = D − W (equivalent to G!)

Normalised Laplacian: L_norm = D^{−1/2} (D − W) D^{−1/2}

Properties of L:
• Symmetric
• Off-diagonal entries non-positive
• Rows sum up to zero

14/47
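The matrices above can be assembled directly from a weight matrix. A minimal NumPy sketch, using a small hypothetical 4-vertex graph (not the slide's 8-vertex example):

```python
import numpy as np

# Hypothetical symmetric weight matrix W of a weighted undirected graph
W = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 2.],
              [0., 0., 2., 0.]])

d = W.sum(axis=1)                       # degrees d(v_i)
D = np.diag(d)                          # degree matrix
L = D - W                               # combinatorial graph Laplacian
D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
L_norm = D_inv_sqrt @ L @ D_inv_sqrt    # normalised Laplacian

# Properties from the slide: symmetric, rows summing to zero
assert np.allclose(L, L.T)
assert np.allclose(L.sum(axis=1), 0.0)
```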
Graph Laplacian

Why graph Laplacian?

- approximation of the Laplace operator: for a vertex v_i with grid neighbours v_{j_1}, v_{j_2}, v_{j_3}, v_{j_4},

  (Lf)(i) = 4 f(i) − [f(j_1) + f(j_2) + f(j_3) + f(j_4)]

  the standard 5-point stencil for approximating −∇²f

- converges to the Laplace-Beltrami operator (given certain conditions)

- provides a notion of "frequency" on graphs

15/47
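The stencil identity can be checked numerically: on a grid graph with unit weights, applying the graph Laplacian at an interior vertex reproduces 4f(i) minus the sum over its four neighbours. A sketch with a hypothetical 5×5 grid:

```python
import numpy as np

# Build a 5x5 grid graph with unit edge weights
n = 5
N = n * n
W = np.zeros((N, N))
for r in range(n):
    for c in range(n):
        i = r * n + c
        if c + 1 < n:                    # horizontal edge
            W[i, i + 1] = W[i + 1, i] = 1.0
        if r + 1 < n:                    # vertical edge
            W[i, i + n] = W[i + n, i] = 1.0
L = np.diag(W.sum(axis=1)) - W

f = np.random.rand(N)
i = 2 * n + 2                            # an interior vertex (degree 4)
neighbours = [i - 1, i + 1, i - n, i + n]
stencil = 4 * f[i] - sum(f[j] for j in neighbours)
assert np.isclose((L @ f)[i], stencil)   # (Lf)(i) matches the 5-point stencil
```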
Graph Laplacian

Graph signal f : V → R^N, with values f(1), ..., f(8) on the vertices

Lf(i) = Σ_{j=1}^N W_ij (f(i) − f(j))

f^T L f = (1/2) Σ_{i,j=1}^N W_ij (f(i) − f(j))²

A measure of "smoothness"

Zhou and Schölkopf, "A regularization framework for learning from graph data," ICML Workshop, 2004. 16/47
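The quadratic form can be verified against the sum-over-edges identity, and used to compare signals. A sketch on a hypothetical 8-vertex path graph:

```python
import numpy as np

# Unit-weight path graph v1 - v2 - ... - v8
N = 8
W = np.zeros((N, N))
for i in range(N - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0
L = np.diag(W.sum(axis=1)) - W

smooth = np.linspace(0.0, 1.0, N)                     # slowly varying signal
rough = np.array([0., 1., 0., 1., 0., 1., 0., 1.])    # rapidly varying signal

def quad(f, L):
    return f @ L @ f

# Identity from the slide: f^T L f = 1/2 * sum_ij W_ij (f(i) - f(j))^2
direct = 0.5 * sum(W[i, j] * (smooth[i] - smooth[j]) ** 2
                   for i in range(N) for j in range(N))
assert np.isclose(quad(smooth, L), direct)
assert quad(smooth, L) < quad(rough, L)   # smoother signal scores lower
```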
Graph Laplacian

[figure: two signals on the same 9-vertex graph]

f^T L f = 1 (slowly varying signal)        f^T L f = 21 (rapidly varying signal)

17/47
Graph Laplacian
• L has a complete set of orthonormal eigenvectors: L = Φ Λ Φ^T, with Φ = [φ_0, ..., φ_{N−1}] and Λ = diag(λ_0, ..., λ_{N−1})

• Eigenvalues are usually sorted increasingly: 0 = λ_0 < λ_1 ≤ ... ≤ λ_{N−1}

18/47
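These properties are easy to confirm numerically; `np.linalg.eigh` already returns eigenvalues in ascending order. A sketch on a hypothetical 4-cycle:

```python
import numpy as np

# Adjacency matrix of a 4-cycle (unit weights)
W = np.array([[0., 1., 0., 1.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [1., 0., 1., 0.]])
L = np.diag(W.sum(axis=1)) - W

lam, Phi = np.linalg.eigh(L)             # ascending eigenvalues, orthonormal Phi
assert np.isclose(lam[0], 0.0)                        # lambda_0 = 0
assert np.all(np.diff(lam) >= -1e-12)                 # sorted increasingly
assert np.allclose(Phi @ np.diag(lam) @ Phi.T, L)     # L = Phi Lambda Phi^T
assert np.allclose(Phi.T @ Phi, np.eye(4))            # orthonormal eigenvectors
```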
Graph Fourier transform

[figure: eigenvectors φ_0, φ_1 and φ_50 plotted on the graph, from low to high frequency]

L = Φ Λ Φ^T,   φ_0^T L φ_0 = λ_0 = 0,   φ_50^T L φ_50 = λ_50

• Eigenvectors associated with smaller eigenvalues have values that vary less rapidly along the edges

Graph Fourier transform [Hammond11]: f̂(ℓ) = ⟨φ_ℓ, f⟩, i.e., f̂ = Φ^T f

λ_0 ≤ λ_1 ≤ λ_2 ≤ ... ≤ λ_{N−1}: from low frequency to high frequency

Shuman et al., "The emerging field of signal processing on graphs," IEEE SPM, 2013. 19/47
Graph Fourier transform
• The Laplacian L admits the following eigendecomposition: L φ_ℓ = λ_ℓ φ_ℓ

Classical setting:
- one-dimensional Laplace operator: −∇²
- eigenfunctions: e^{jωx}
- Classical FT: f̂(ω) = ∫ (e^{jωx})* f(x) dx
- inverse: f(x) = (1/2π) ∫ f̂(ω) e^{jωx} dω

Graph setting:
- graph Laplacian: L
- eigenvectors: φ_ℓ
- Graph FT: f̂(ℓ) = ⟨φ_ℓ, f⟩ = Σ_{i=1}^N φ_ℓ*(i) f(i)
- inverse: f(i) = Σ_{ℓ=0}^{N−1} f̂(ℓ) φ_ℓ(i)

20/47
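The analysis/synthesis pair amounts to projecting onto the eigenvector basis and back. A sketch with a hypothetical 6-vertex ring graph:

```python
import numpy as np

# Ring graph with N = 6 vertices and unit weights
N = 6
W = np.zeros((N, N))
for i in range(N):
    W[i, (i + 1) % N] = W[(i + 1) % N, i] = 1.0
L = np.diag(W.sum(axis=1)) - W
lam, Phi = np.linalg.eigh(L)

f = np.random.rand(N)
fhat = Phi.T @ f                 # GFT: inner products with the eigenvectors
f_rec = Phi @ fhat               # inverse GFT: perfect reconstruction
assert np.allclose(f_rec, f)
```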
Two special cases

Vandergheynst and Shuman, “Wavelets on graphs, an introduction,” Université de Provence, 2011. 21/47
Two special cases

Vandergheynst and Shuman, “Wavelets on graphs, an introduction,” Université de Provence, 2011. 22/47
Outline
• Motivation

• Graph signal processing (GSP): Basic concepts

• Spectral filtering: Basic tools of GSP

• Connection with literature

• Applications in neuroscience

23/47
Classical frequency filtering

Classical FT: f̂(ω) = ∫ (e^{jωx})* f(x) dx,   f(x) = (1/2π) ∫ f̂(ω) e^{jωx} dω

Apply a filter with transfer function ĝ(·) to a signal f:

f → (FT) → f̂(ω) → ĝ(ω) f̂(ω) → (IFT) → f ∗ g

24/47
Graph spectral filtering

GFT: f̂(ℓ) = ⟨φ_ℓ, f⟩ = Σ_{i=1}^N φ_ℓ*(i) f(i),   f(i) = Σ_{ℓ=0}^{N−1} f̂(ℓ) φ_ℓ(i)

Apply a filter with transfer function ĝ(·) to a graph signal f : V → R^N:

f → (GFT) → f̂(ℓ) → ĝ(λ_ℓ) f̂(ℓ) → (IGFT) → f_out(i) = Σ_{ℓ=0}^{N−1} ĝ(λ_ℓ) f̂(ℓ) φ_ℓ(i)

25/47
Graph spectral filtering

In matrix form:

f → (GFT) → Φ^T f → ĝ(Λ) Φ^T f → (IGFT) → Φ ĝ(Λ) Φ^T f

where ĝ(Λ) = diag(ĝ(λ_0), ..., ĝ(λ_{N−1}))

25/47
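The matrix-form pipeline is three matrix products. A sketch with a hypothetical ideal low-pass ĝ on a 6-vertex ring graph (the cutoff value is illustrative):

```python
import numpy as np

# Ring graph with N = 6 vertices and unit weights
N = 6
W = np.zeros((N, N))
for i in range(N):
    W[i, (i + 1) % N] = W[(i + 1) % N, i] = 1.0
L = np.diag(W.sum(axis=1)) - W
lam, Phi = np.linalg.eigh(L)

ghat = (lam < 2.0).astype(float)    # ideal low-pass: keep low graph frequencies
H = Phi @ np.diag(ghat) @ Phi.T     # filtering operator Phi ghat(Lambda) Phi^T

f = np.random.rand(N)
f_filt = H @ f
# the filtered signal has no energy on the suppressed graph frequencies
assert np.allclose((Phi.T @ f_filt)[lam >= 2.0], 0.0)
```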
Graph Laplacian revisited

The Laplacian L is a difference operator: Lf = Φ Λ Φ^T f

f → (GFT) → Φ^T f → Λ Φ^T f → (IGFT) → Φ Λ Φ^T f,   i.e., ĝ(Λ) = Λ

The Laplacian operator filters the signal in the spectral domain by its eigenvalues!

The Laplacian quadratic form: f^T L f = ||L^{1/2} f||_2^2 = ||Λ^{1/2} Φ^T f||_2^2

26/47
Graph transform/dictionary design
• Transforms and dictionaries can be designed through graph spectral filtering: functions ĝ(L) of the graph Laplacian!

• Important properties can be achieved by properly defining ĝ(L), such as localisation of atoms

• Closely related to kernels and regularisation on graphs

Smola and Kondor, "Kernels and regularization on graphs", COLT, 2003. 27/47
A simple example

Problem: We observe a noisy graph signal f = y_0 + η and wish to recover y_0

y* = arg min_y { ||y − f||_2^2 + y^T L y }

(data fitting term + "smoothness" assumption)

y* = (I + L)^{−1} f: Laplacian (Tikhonov) regularisation is equivalent to low-pass filtering with ĝ(L) = (I + L)^{−1} in the graph spectral domain!

28/47
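The closed-form solution and its spectral interpretation can be checked against each other. A sketch, with a hypothetical path graph and noise level:

```python
import numpy as np

# Unit-weight path graph
rng = np.random.default_rng(0)
N = 20
W = np.zeros((N, N))
for i in range(N - 1):
    W[i, i + 1] = W[i + 1, i] = 1.0
L = np.diag(W.sum(axis=1)) - W

y0 = np.sin(np.linspace(0, np.pi, N))        # smooth ground-truth signal
f = y0 + 0.3 * rng.standard_normal(N)        # noisy observation f = y0 + eta

y_star = np.linalg.solve(np.eye(N) + L, f)   # closed-form minimiser (I + L)^{-1} f

# same result via the spectral domain: ghat(lam) = 1 / (1 + lam)
lam, Phi = np.linalg.eigh(L)
y_spec = Phi @ ((Phi.T @ f) / (1.0 + lam))
assert np.allclose(y_star, y_spec)
```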
A simple example
• noisy image as the observed noisy graph signal
• regular grid graph (weights inversely proportional to pixel value difference):

w_ij = 1 / |f(i) − f(j)|

Shuman et al., "The emerging field of signal processing on graphs," IEEE SPM, 2013. 29/47
Example designs

Low-pass filters: ĝ(L) = (I + L)^{−1} = Φ (I + Λ)^{−1} Φ^T

Window kernel: windowed graph Fourier transform

Shifted and dilated band-pass filters: spectral graph wavelets ĝ(sL)

Adapted kernels: learn values of ĝ(L) directly from data

Parametric polynomials: ĝ_s(L) = Σ_{k=0}^K α_{sk} L^k = Φ (Σ_{k=0}^K α_{sk} Λ^k) Φ^T

Shuman et al., "Dictionary design for graph signal processing," GSP Workshop, 2016. 30/47
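A practical appeal of the polynomial design is that ĝ_s(L)f needs only repeated matrix-vector products with L, with no eigendecomposition. A sketch with arbitrary illustrative coefficients:

```python
import numpy as np

# Ring graph with N = 6 vertices and unit weights
N = 6
W = np.zeros((N, N))
for i in range(N):
    W[i, (i + 1) % N] = W[(i + 1) % N, i] = 1.0
L = np.diag(W.sum(axis=1)) - W

alpha = [1.0, -0.5, 0.05]          # hypothetical polynomial coefficients alpha_k
f = np.random.rand(N)

# accumulate out = sum_k alpha_k L^k f using only mat-vec products
out, Lkf = np.zeros(N), f.copy()
for a in alpha:
    out += a * Lkf
    Lkf = L @ Lkf

# agrees with the spectral definition Phi (sum_k alpha_k Lambda^k) Phi^T f
lam, Phi = np.linalg.eigh(L)
g = sum(a * lam ** k for k, a in enumerate(alpha))
assert np.allclose(out, Phi @ (g * (Phi.T @ f)))
```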
Outline
• Motivation

• Graph signal processing (GSP): Basic concepts

• Spectral filtering: Basic tools of GSP

• Connection with literature

• Applications in neuroscience

31/47
GSP and the literature

There is a rich literature about data analysis and learning on graphs:

- network science
- diffusion on graphs
- unsupervised learning (dimensionality reduction, clustering)
- semi-supervised learning

32/47
Network centrality

eigenvector centrality: Wx = λ_max x
degree centrality: d = [d(v1), ..., d(vN)]

- Google’s PageRank is a variant of eigenvector centrality

- eigenvectors of W can also be used to provide a frequency
interpretation for graph signals

PageRank: http://www.ams.org/publicoutreach/feature-column/fcarc-pagerank
Sandryhaila and Moura, “Discrete signal processing on graphs”, IEEE TSP, 2013. 33/47
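Both centrality notions above can be computed in a few lines; below is a minimal numpy sketch on an assumed 4-node toy graph (the graph and the iteration count are illustrative choices, not from the slides), with eigenvector centrality obtained by power iteration on W:

```python
import numpy as np

# Assumed toy undirected graph on 4 nodes (adjacency matrix W).
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)

# Degree centrality: d = [d(v1), ..., d(vN)], i.e. the row sums of W.
d = W.sum(axis=1)

# Eigenvector centrality: the leading eigenvector of W x = lambda_max x,
# found here by power iteration (W is nonnegative, so this converges).
x = np.ones(W.shape[0])
for _ in range(200):
    x = W @ x
    x /= np.linalg.norm(x)

print(d)   # node 2 (index 1) has the highest degree
print(x)   # and also the highest eigenvector centrality
```

On this toy graph the two rankings agree at the top, but in general eigenvector centrality weights a node by the centrality of its neighbours rather than by raw edge count.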
Diffusion on graphs

(figure: heat diffusion of an initial signal over a graph on nodes v1–v9)

∂x/∂τ = −Lx,  x(v, 0) = x0(v)  ⟹  x(v, τ) = e^{−τL} x0(v)

- heat diffusion on graphs is a typical physical process on graphs

- other possibilities exist (e.g., random walk on graphs)
- many have an interpretation of filtering on graphs

Smola and Kondor, “Kernels and regularization on graphs”, COLT, 2003. 34/47
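The closed-form solution x(τ) = e^{−τL} x0 can be evaluated by filtering in the Laplacian eigenbasis, since L is symmetric. A hedged sketch on an assumed 4-node toy graph (the graph, the helper name `diffuse`, and the chosen τ values are illustrative):

```python
import numpy as np

# Assumed toy undirected graph and its combinatorial Laplacian L = D - W.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)
L = np.diag(W.sum(axis=1)) - W

# Eigendecomposition of the symmetric L gives the heat kernel cheaply.
lam, U = np.linalg.eigh(L)

def diffuse(x0, tau):
    """Solve dx/dtau = -L x with x(0) = x0, i.e. x(tau) = e^{-tau L} x0."""
    return U @ (np.exp(-tau * lam) * (U.T @ x0))

x0 = np.array([1.0, 0.0, 0.0, 0.0])   # a unit of heat placed on node 1
x_t = diffuse(x0, tau=5.0)
print(x_t, x_t.sum())   # total heat is conserved
```

Because e^{−τλ} decays with the graph frequency λ, diffusion acts as a low-pass graph filter: as τ grows the signal converges to its average over the (connected) graph.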
Graph clustering (community detection)

(figure: a weighted graph with edge weights w_ij partitioned into clusters A1, A2, A3)

NCut(A1, ..., Ak) = (1/2) Σ_{i=1}^{k} W(Ai, Āi) / vol(Ai)

- first k eigenvectors of graph Laplacian minimise the graph cut

- eigenvectors of graph Laplacian enable a Fourier-like analysis
for graph signals

Ulrike von Luxburg, “A tutorial on spectral clustering”, Statistics and Computing, 2007. 35/47
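The spectral relaxation mentioned above is easy to illustrate for k = 2: the sign pattern of the second Laplacian eigenvector (the Fiedler vector) gives the relaxed minimum-cut bipartition. A sketch on an assumed synthetic graph made of two triangles joined by a single bridge edge:

```python
import numpy as np

# Assumed toy graph: triangle {0,1,2} and triangle {3,4,5} joined by edge (2,3).
W = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    W[i, j] = W[j, i] = 1.0

L = np.diag(W.sum(axis=1)) - W
lam, U = np.linalg.eigh(L)   # eigenvalues in ascending order; lam[0] = 0

# For k = 2, thresholding the Fiedler vector at zero recovers the partition
# that cuts only the weak bridge edge.
fiedler = U[:, 1]
labels = (fiedler > 0).astype(int)
print(labels)   # nodes {0,1,2} land in one cluster, {3,4,5} in the other
```

For general k one would run k-means on the rows of the first k eigenvectors, as in von Luxburg's tutorial; this two-cluster case avoids the extra dependency.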
Semi-supervised learning

y: observed (partial, noisy) labels viewed as a graph signal

min_{x ∈ R^N} ||y − x||₂² + α xᵀLx

- learning by assuming smoothness of predicted labels

- this is equivalent to a denoising problem for graph signal y

Zhu, “Semi-supervised learning with graphs”, Ph.D. dissertation, CMU, 2005. 36/47
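The optimisation above is quadratic and has the closed form x* = (I + αL)^{-1} y, which is exactly a low-pass graph filter applied to y. A minimal sketch, assuming a synthetic two-community graph and noisy labels (the graph, y, and α are illustrative choices):

```python
import numpy as np

# Assumed toy graph: two triangles joined by one bridge edge (2,3).
W = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    W[i, j] = W[j, i] = 1.0
L = np.diag(W.sum(axis=1)) - W

# Noisy labels: roughly +1 on one community, -1 on the other, plus noise.
y = np.array([1.0, 1.0, 0.8, -0.9, -1.0, -1.0]) + 0.3 * np.array([1, -1, 1, -1, 1, -1])

# Minimiser of ||y - x||^2 + alpha * x^T L x  =>  x* = (I + alpha L)^{-1} y.
alpha = 2.0
x_star = np.linalg.solve(np.eye(6) + alpha * L, y)

print(np.sign(x_star))   # the denoised signal still separates the communities
```

Larger α enforces more smoothness with respect to the graph; α = 0 returns y unchanged.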
GSP and the literature
centrality, diffused information, class membership, node labels (and
node-level features in general) can ALL be viewed as graph signals

(figure: two example graphs on nodes v1–v9)

- network science
- network diffusion
- unsupervised learning (dimensionality reduction, clustering)
- semi-supervised learning

37/47
Outline
• Motivation

• Graph signal processing (GSP): Basic concepts

• Spectral filtering: Basic tools of GSP

• Connection with literature

• Applications in neuroscience

38/47
A typical analysis framework

Huang et al., “A graph signal processing perspective on functional brain imaging”, Proc. IEEE, 2018. 39/47
Application I: Understanding brain functioning

liberality (large high-freq. components) is associated with high switching cost

Medaglia et al., “Functional alignment with anatomical networks is associated with cognitive flexibility”, Nature Human
Behaviour, 2018. 40/47
Application I: Understanding brain functioning

(figure: brain signals decomposed into low-freq., mid-freq. and high-freq. components)

- record BOLD signals while responding to sequentially presented stimuli

- it favours learning to have
  smooth, spread signals (low-freq.) when facing unfamiliar task
  varied, spiking signals (high-freq.) when task becomes familiar

Huang et al., “Graph frequency analysis of brain signals”, IEEE JSTSP, 2016. 41/47
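The band decomposition used in this kind of analysis can be reproduced on any graph by splitting the graph Fourier spectrum into low, mid and high bands and reconstructing each band-limited component. An illustrative sketch with a synthetic 6-node graph standing in for a brain network (the graph, the signal, and the equal 2-frequency bands are all assumptions for the example):

```python
import numpy as np

# Assumed toy graph: two triangles joined by one bridge edge.
W = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    W[i, j] = W[j, i] = 1.0
L = np.diag(W.sum(axis=1)) - W
lam, U = np.linalg.eigh(L)   # graph Fourier basis, frequencies in ascending order

x = np.array([0.9, 1.1, 1.0, -0.2, 0.1, -0.3])   # a synthetic graph signal
x_hat = U.T @ x                                  # graph Fourier transform

# Split the spectrum into low / mid / high bands (two frequencies each)
# and reconstruct the corresponding band-limited components.
bands = {"low": slice(0, 2), "mid": slice(2, 4), "high": slice(4, 6)}
parts = {name: U[:, idx] @ x_hat[idx] for name, idx in bands.items()}

recon = sum(parts.values())
print(np.round(recon - x, 10))   # the three components sum back to x exactly
```

The energy in each band (the squared GFT coefficients) is what studies like Huang et al.'s track over time as a measure of alignment versus liberality of brain signals.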
Application II: Disease classification

(normal control) (Alzheimer’s disease)

Hu et al., “Matched signal detection on graphs”, NeuroImage, 2016. 42/47


Application II: Disease classification
ADNI (structural MRI): volumes of brain structures
ABIDE (fMRI): off-diagonal of functional connectivity

similarity in phenotypic data

Parisot et al., “Disease prediction using graph convolutional networks”, Medical Image Analysis, 2018. 43/47
Application III: Gender classification

Arslan et al., “Graph saliency maps through spectral convolutional networks”, GRAIL, 2018. 44/47
Application IV: Inferring brain connectivity

(figure: inferred brain connectivity for Alzheimer’s disease vs. normal control)

Hu et al., “A spectral graph regression model for learning brain connectivity of Alzheimer’s disease”, PLOS ONE, 2015.
Shen et al., “Nonlinear structural vector autoregressive models for inferring effective brain network connectivity”, 2016. 45/47
Future of GSP
• Mathematical models for graph signals
- global and local smoothness / regularity
- underlying physical processes

• Graph construction
- how to infer topologies given observed data?

• Fast implementation
- fast graph Fourier transform
- distributed processing

• Connection to / combination with other fields
- statistical machine learning
- deep learning on graphs and manifolds

• Key applications

Bronstein et al., “Geometric deep learning”, IEEE SPM, 2017.


Wu et al., “A comprehensive survey on graph neural networks”, arXiv, 2019. 46/47
Resources
• Three tutorial/overview papers:

• More available at: http://web.media.mit.edu/~xdong/resource.html

47/47
