Random Walks as a Stable Analogue of Eigenvectors
(with Applications to Nearly-Linear-Time Graph Partitioning)

Lorenzo Orecchia, MIT Math
Based on joint works with Michael Mahoney (Stanford), Sushant Sachdeva (Yale), and Nisheeth Vishnoi (MSR India).
Why Spectral Algorithms for Graph Problems …

… in practice?
• Simple to implement
• Can exploit very efficient linear-algebra routines
• Perform well in practice for many problems

… in theory?
• Connections between spectral and combinatorial objects
• Connections to Markov chains and probability theory
• Intuitive geometric viewpoint

RECENT ADVANCES: fast algorithms for fundamental combinatorial problems rely on spectral and optimization ideas.
Spectral Algorithms for Graph Partitioning

Spectral algorithms are widely used in many graph-partitioning applications: clustering, image segmentation, community detection, etc.

CLASSICAL VIEW:
- Based on Cheeger's inequality
- Sweep cuts of eigenvectors reveal sparse cuts in the graph

NEW TREND:
- Random-walk vectors replace eigenvectors in:
• Fast algorithms for graph partitioning
• Local graph partitioning
• Real-network analysis
- Different random walks: PageRank, Heat-Kernel, etc.
Why Random Walks? A Practitioner's View

Advantages of random walks:
1) Quick approximation to the eigenvector in massive graphs

$A$ = adjacency matrix, $D$ = diagonal degree matrix,
$W = AD^{-1}$ = natural random-walk matrix, $L = D - A$ = Laplacian matrix.

The second eigenvector of the Laplacian can be computed by iterating $W$:
for a random $y_0$ s.t. $y_0^T D^{-1}\mathbf{1} = 0$, compute $D^{-1}W^t y_0$.
In the limit,
$x_2(L) = \lim_{t\to\infty} \frac{D^{-1}W^t y_0}{\|W^t y_0\|_{D^{-1}}}$.

Heuristic: for massive graphs, pick $t$ as large as computationally affordable.
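This heuristic can be sketched in NumPy (a minimal illustration, not the talk's implementation; the function name and the small test graph are ours, and the stationary component is projected out at every step for numerical robustness):

```python
import numpy as np

def walk_eigenvector(A, t):
    """Approximate the second Laplacian eigenvector of an undirected graph
    by iterating the natural random-walk matrix W = A D^{-1}."""
    d = A.sum(axis=1).astype(float)          # degrees
    rng = np.random.default_rng(0)
    y = rng.standard_normal(len(d))
    for _ in range(t):
        y -= d * (y.sum() / d.sum())         # project out the stationary direction
        y = A @ (y / d)                      # one walk step: y <- W y
        y /= np.linalg.norm(y)               # keep the iterate well-scaled
    x = y / d                                # D^{-1} W^t y_0
    return x / np.linalg.norm(x)

# Two triangles joined by a single edge: the approximate eigenvector
# separates the two sides by sign.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
A = np.zeros((6, 6))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
x = walk_eigenvector(A, 50)
```

On this barbell-like example the sign pattern of `x` recovers the sparse cut between the two triangles.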
Why Random Walks? A Practitioner's View

Advantages of random walks:
1) Quick approximation to the eigenvector in massive graphs
2) Statistical robustness

Real-world graphs are noisy: the INPUT GRAPH is a NOISY MEASUREMENT of a GROUND-TRUTH GRAPH.

GOAL: estimate the eigenvector of the ground-truth graph.

OBSERVATION: the eigenvector of the input graph can have very large variance, as it can be very sensitive to noise.

RANDOM-WALK VECTORS provide better, more stable estimates.
This Talk

QUESTION: why random-walk vectors in the design of fast algorithms?
ANSWER: they are a stable, regularized version of the eigenvector.

GOALS OF THIS TALK:
- Show an optimization perspective on why random walks arise
- Application to nearly-linear-time balanced graph partitioning
Random Walks as Regularized Eigenvectors
What is Regularization?

Regularization is a fundamental technique in optimization: adding a regularizer $F$ with parameter $\lambda > 0$ turns an OPTIMIZATION PROBLEM into a WELL-BEHAVED OPTIMIZATION PROBLEM, with
• a stable optimum,
• a unique optimal solution,
• smoothness conditions,
…

Benefits of regularization in learning and statistics:
• Increases stability
• Decreases sensitivity to random noise
• Prevents overfitting
Instability of the Eigenvector

[Figure: an expander joined to the rest of the graph by edges of weight 1 and $\epsilon$. The current eigenvector concentrates on one side; after a small perturbation of the $\epsilon$-weight edges, the eigenvector changes completely.]
The Laplacian Eigenvalue Problem

For simplicity, take $G$ to be $d$-regular.

Quadratic formulation:
$\min \tfrac{1}{d}\, x^T L x$
s.t. $\|x\|^2 = 1$, $x^T \mathbf{1} = 0$.

SDP formulation:
$\min \tfrac{1}{d}\, L \bullet X$
s.t. $I \bullet X = 1$, $\mathbf{1}\mathbf{1}^T \bullet X = 0$, $X \succeq 0$.

The two programs have the same optimum. Take the optimal solution $X^* = x^*(x^*)^T$.
Instability of Linear Optimization

Consider a convex set $S \subseteq \mathbb{R}^n$ and a linear optimization problem:
$f(c) = \arg\min_{x\in S} c^T x$.

The optimal solution $f(c)$ may be very unstable under a perturbation of $c$:
$\|c' - c\| \le \epsilon$, and yet $\|f(c') - f(c)\| \gg \delta$.
Regularization Helps Stability

Consider a convex set $S \subseteq \mathbb{R}^n$ and a regularized linear optimization problem
$f(c) = \arg\min_{x\in S} c^T x + F(x)$,
where $F$ is $\sigma$-strongly convex.

Then $\|c' - c\| \le \epsilon$ implies $\|f(c) - f(c')\| \le \epsilon/\sigma$.
Regularized Spectral Optimization

SDP formulation:
$\min \tfrac{1}{d}\, L \bullet X$
s.t. $I \bullet X = 1$, $J \bullet X = 0$, $X \succeq 0$   ($X$ is a density matrix).

Eigenvector decomposition of $X$:
$X = \sum_i p_i v_i v_i^T$, with $p_i \ge 0$ for all $i$, $\sum_i p_i = 1$, and $v_i^T\mathbf{1} = 0$ for all $i$.
The eigenvalues of $X$ define a probability distribution. The unregularized optimum $X^* = x^*(x^*)^T$ puts all its mass on $x^*$: a TRIVIAL DISTRIBUTION.

Regularized version, with regularizer $F$ and parameter $\eta$:
$\min \tfrac{1}{d}\, L \bullet X + \eta \cdot F(X)$
s.t. $I \bullet X = 1$, $\mathbf{1}\mathbf{1}^T \bullet X = 0$, $X \succeq 0$.

The regularizer $F$ forces the distribution of eigenvalues of $X$ to be non-trivial: $X^* = x^*(x^*)^T$ (eigenvalues $1, 0, \dots, 0$) becomes, after REGULARIZATION, $X^* = \sum_i p_i v_i v_i^T$ (eigenvalues $1-\epsilon, \epsilon_1, \epsilon_2, \dots$).
Regularizers

Our regularizers are SDP versions of common regularizers:

• von Neumann entropy:
$F_H(X) = \mathrm{Tr}(X \log X) = \sum_i p_i \log p_i$

• $p$-norm, $p > 1$:
$F_p(X) = \tfrac{1}{p}\|X\|_p^p = \tfrac{1}{p}\mathrm{Tr}(X^p) = \tfrac{1}{p}\sum_i p_i^p$

• And more, e.g. log-determinant.
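On a concrete density matrix, both regularizers reduce to the stated functions of the eigenvalues $p_i$; a small NumPy check (the function names are ours):

```python
import numpy as np

def von_neumann_entropy(X):
    """F_H(X) = Tr(X log X) = sum_i p_i log p_i over the eigenvalues of X."""
    p = np.linalg.eigvalsh(X)
    p = p[p > 1e-12]                 # 0 log 0 = 0 by convention
    return float(np.sum(p * np.log(p)))

def p_norm_regularizer(X, p=2):
    """F_p(X) = (1/p) Tr(X^p) = (1/p) sum_i p_i^p."""
    ev = np.linalg.eigvalsh(X)
    return float(np.sum(ev ** p) / p)

# Maximally spread two-state density matrix: entropy -log 2, 2-norm value 1/4.
X = np.diag([0.5, 0.5])
```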
Our Main Result

Regularized SDP:
$\min \tfrac{1}{d}\, L \bullet X + \eta \cdot F(X)$
s.t. $I \bullet X = 1$, $J \bullet X = 0$, $X \succeq 0$.

RESULT: an explicit correspondence between regularizers and random walks.

REGULARIZER → OPTIMAL SOLUTION OF REGULARIZED PROGRAM
• $F = F_H$ (entropy) → $X^\star \propto H_G^t$, where $t$ depends on $\eta$: the HEAT-KERNEL walk.
• $F = F_p$ ($p$-norm) → $X^\star \propto (qI + (1-q)W)^{\frac{1}{p-1}}$, where $q$ depends on $\eta$: the LAZY RANDOM WALK.
Background: Heat-Kernel Random Walk

For simplicity, take $G$ to be $d$-regular.

• The Heat-Kernel Random Walk is a continuous-time Markov chain over $V$, modeling the diffusion of heat along the edges of $G$.

• Transitions take place in continuous time $t$, with an exponential distribution:
$\frac{\partial p(t)}{\partial t} = -\frac{L}{d}\, p(t), \qquad p(t) = e^{-\frac{t}{d}L}\, p(0) =: H_G^t\, p(0)$  (notation).

• The Heat Kernel can be interpreted as a Poisson distribution over the number of steps of the natural random walk $W = AD^{-1}$:
$e^{-\frac{t}{d}L} = e^{-t}\sum_{k=0}^{\infty} \frac{t^k}{k!}\, W^k$.
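The Poisson identity above can be checked numerically on a small regular graph (a minimal sketch; the function names are ours, and the matrix exponential is computed by eigendecomposition):

```python
import numpy as np

def heat_kernel(A, t):
    """H^t = exp(-(t/d) L) for a d-regular graph with Laplacian L = D - A."""
    d = A.sum(axis=1)
    L = np.diag(d) - A
    w, U = np.linalg.eigh(L)                 # L is symmetric
    return (U * np.exp(-(t / d[0]) * w)) @ U.T

def poisson_walk(A, t, kmax=80):
    """Truncated Poisson average e^{-t} sum_k (t^k / k!) W^k with W = A / d."""
    W = A / A.sum(axis=1)[0]
    term = np.exp(-t) * np.eye(len(A))       # k = 0 term
    total = term.copy()
    for k in range(1, kmax):
        term = term @ W * (t / k)            # multiply previous term by tW/k
        total = total + term
    return total

# K4 is 3-regular; both expressions give the same matrix.
A = np.ones((4, 4)) - np.eye(4)
```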
Heat Kernel Walk: Stability Analysis

Recall: for a convex set $S \subseteq \mathbb{R}^n$ and a regularized linear optimization problem $f(c) = \arg\min_{x\in S} c^T x + F(x)$ with $F$ $\sigma$-strongly convex, $\|c' - c\| \le \epsilon$ implies $\|f(c) - f(c')\| \le \epsilon/\sigma$.

Analogous statement for the Heat Kernel:
$\|G' - G\|_1 \le \epsilon$ implies $\left\| \frac{H_{G'}^\tau}{I \bullet H_{G'}^\tau} - \frac{H_G^\tau}{I \bullet H_G^\tau} \right\| \le \tau \cdot \epsilon$.
Applications to Graph Partitioning:
Nearly-Linear-Time Balanced Cut
Partitioning Graphs - Conductance

Undirected, unweighted $G = (V, E)$, $|V| = n$, $|E| = m$.

Conductance of $S \subseteq V$:
$\phi(S) = \frac{|E(S, \bar{S})|}{\min\{\mathrm{vol}(S), \mathrm{vol}(\bar{S})\}}$.
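As a concrete reference point, the conductance of a vertex set can be computed directly from the adjacency matrix (a minimal sketch; the function name and test graph are ours):

```python
import numpy as np

def conductance(A, S):
    """phi(S) = |E(S, S-bar)| / min(vol(S), vol(S-bar)) for a 0/1 adjacency matrix."""
    in_S = np.zeros(len(A), dtype=bool)
    in_S[list(S)] = True
    cut = A[np.ix_(in_S, ~in_S)].sum()                # edges crossing the cut
    vol_S, vol_rest = A[in_S].sum(), A[~in_S].sum()   # sums of degrees
    return cut / min(vol_S, vol_rest)

# Two triangles joined by one edge: phi({0,1,2}) = 1/7 (cut 1, volumes 7 and 7).
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
A = np.zeros((6, 6))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
```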
Partitioning Graphs - Balanced Cut

NP-HARD DECISION PROBLEM:
Does $G$ have a $b$-balanced cut of conductance $< \gamma$?
That is, is there $S$ with $\phi(S) < \gamma$ and $\tfrac{1}{2}\mathrm{vol}(V) \ge \mathrm{vol}(S) \ge b\cdot\mathrm{vol}(V)$?

• Important primitive for many recursive algorithms.
• Applications to clustering and graph decomposition.
Spectral Approximation Algorithms

Does $G$ have a $b$-balanced cut of conductance $< \gamma$?

Algorithm                         | Method             | Distinguishes $\ge \gamma$ and | Running Time
Recursive Eigenvector             | Spectral           | $O(\sqrt{\gamma})$             | $\tilde{O}(mn)$
[Spielman, Teng '04]              | Local Random Walks | $O(\sqrt{\gamma \log^3 n})$    | $\tilde{O}(m/\gamma^2)$
[Andersen, Chung, Lang '07]       | Local Random Walks | $O(\sqrt{\gamma \log n})$      | $\tilde{O}(m/\gamma)$
[Andersen, Peres '09]             | Local Random Walks | $O(\sqrt{\gamma \log n})$      | $\tilde{O}(m/\sqrt{\gamma})$
[Orecchia, Sachdeva, Vishnoi '12] | Random Walks       | $O(\sqrt{\gamma})$             | $\tilde{O}(m)$
Recursive Eigenvector Algorithm

INPUT: $(G, b, \gamma)$. DECISION: does there exist a $b$-balanced $S$ with $\phi(S) < \gamma$?

• Compute the eigenvector of $G$ and the corresponding Laplacian eigenvalue $\lambda_2$.
• If $\lambda_2 \ge \gamma$, output NO. Otherwise, sweep the eigenvector to find $S_1$ such that $\phi(S_1) \le O(\sqrt{\gamma})$.
• If $S_1$ is $(b/2)$-balanced, output $S_1$. Otherwise, consider the graph $G_1$ induced by $G$ on $V - S_1$, with self-loops replacing the edges going to $S_1$.
• Recurse on $G_1$.

The recursion removes cuts $S_1, S_2, S_3, \dots$ until the remaining induced graph is an expander, e.g. $\lambda_2(G_5) \ge \gamma$: a LARGE INDUCED EXPANDER is a NO-CERTIFICATE.

RUNNING TIME: $\tilde{O}(m)$ per iteration, $O(n)$ iterations. Total: $\tilde{O}(mn)$.
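The sweep step above can be sketched as follows (a minimal illustration; the function name and test graph are ours): sort vertices by eigenvector value and return the best threshold cut.

```python
import numpy as np

def sweep_cut(A, x):
    """Return the minimum-conductance threshold cut of the ordering induced by x."""
    order = np.argsort(x)
    deg = A.sum(axis=1)
    total_vol = deg.sum()
    in_S = np.zeros(len(x), dtype=bool)
    vol = cut = 0.0
    best_phi, best_k = np.inf, 0
    for k, v in enumerate(order[:-1]):
        in_S[v] = True
        vol += deg[v]
        # adding v: its edges into S become internal, its other edges now cross
        cut += A[v, ~in_S].sum() - A[v, in_S].sum()
        phi = cut / min(vol, total_vol - vol)
        if phi < best_phi:
            best_phi, best_k = phi, k
    return sorted(order[:best_k + 1]), best_phi

# Two triangles joined by one edge; a vector separating them by sign
# sweeps out the bridge cut {0, 1, 2} with conductance 1/7.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
A = np.zeros((6, 6))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
S, phi = sweep_cut(A, np.array([-1.0, -0.9, -0.8, 0.8, 0.9, 1.0]))
```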
Recursive Eigenvector: The Worst Case

Worst-case instance: an EXPANDER attached to $\Omega(n)$ nearly-disconnected components of varying conductance.

NB: Recursive Eigenvector eliminates one component per iteration, so $\Omega(n)$ iterations are necessary; each iteration requires $\Omega(m)$ time.

GOAL: eliminate unbalanced low-conductance cuts faster.

STABILITY VIEW:
• Ideally, we would like to enforce progress: $\lambda_2(G_{t+1}) \gg \lambda_2(G_t)$.
• But the eigenvector may change completely at every iteration, so it is impossible to enforce any non-trivial relation between $\lambda_2(G_{t+1})$ and $\lambda_2(G_t)$.
Our Algorithm: Contributions

Algorithm             | Method       | Distinguishes $\ge\gamma$ and | Time
Recursive Eigenvector | Eigenvector  | $O(\sqrt{\gamma})$            | $\tilde{O}(mn)$
OUR ALGORITHM         | Random Walks | $O(\sqrt{\gamma})$            | $\tilde{O}(m)$

MAIN FEATURES:
• Compute $O(\log n)$ global heat-kernel random-walk vectors at each iteration
• Unbalanced cuts are removed in $O(\log n)$ iterations
• A method to compute heat-kernel vectors in nearly-linear time

TECHNICAL COMPONENTS:
1) A new iterative algorithm with a simple random-walk interpretation
2) A novel analysis of Lanczos methods for computing heat-kernel vectors
Eliminating Unbalanced Cuts

• The graph eigenvector may be correlated with only one sparse unbalanced cut.

• Consider the heat-kernel random-walk matrix $H_G^\tau$ for $\tau = \log n/\gamma$.
$H_G^\tau e_i$ is the probability vector of the random walk started at vertex $i$; long vectors correspond to slow-mixing random walks, i.e. to unbalanced cuts of conductance $\lesssim \sqrt{\gamma}$.

• SINGLE VECTOR → SINGLE CUT: after cut removal, the eigenvector can change completely.
• VECTOR EMBEDDING → MULTIPLE CUTS: the vectors do not change a lot.
Our Algorithm for Balanced Cut

IDEA: replace the eigenvector in the recursive eigenvector algorithm with the heat-kernel random walk $H_G^\tau$ for $\tau = \log n/\gamma$, chosen to emphasize cuts of conductance $\approx \gamma$.

Consider the embedding $\{v_i\}$ given by $H_G^\tau$:
$v_i = H_G^\tau e_i$.
The stationary distribution is the uniform vector $\vec{1}/n$, as $G$ is regular.

MIXING: define the total deviation from stationarity of a set $S \subseteq V$ for the walk:
$\Psi(H_G^\tau, S) = \sum_{i\in S} \|v_i - \vec{1}/n\|^2$.
This is the FUNDAMENTAL QUANTITY we use to understand cuts in $G$.
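The deviation $\Psi$ is straightforward to evaluate given the walk matrix (a minimal sketch; the function name is ours):

```python
import numpy as np

def total_deviation(H, S):
    """Psi(H, S) = sum over i in S of || H e_i - uniform ||^2."""
    n = H.shape[0]
    u = np.full(n, 1.0 / n)
    return float(sum(np.sum((H[:, i] - u) ** 2) for i in S))
```

If the walk has not moved at all ($H = I$), every vertex is maximally far from stationarity; if it has fully mixed ($H = J/n$), the deviation is zero.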
Our Algorithm: Case Analysis

Recall: $\tau = \log n/\gamma$, $v_i = H_G^\tau e_i$, and $\Psi(H_G^\tau, S) = \sum_{i\in S}\|H_G^\tau e_i - \vec{1}/n\|^2$.

CASE 1: the random walks have mixed, i.e. ALL VECTORS ARE SHORT:
$\Psi(H_G^\tau, V) \le \frac{1}{\mathrm{poly}(n)}$.
By the definition of $\tau$, this implies $\lambda_2 \ge \Omega(\gamma)$, and hence $\phi_G \ge \Omega(\gamma)$.
Our Algorithm: Case Analysis (continued)

CASE 2: the random walks have not mixed:
$\Psi(H_G^\tau, V) > \frac{1}{\mathrm{poly}(n)}$.

By RANDOM PROJECTION + SWEEP CUT we can either find an $\Omega(b)$-balanced cut of conductance $O(\sqrt{\gamma})$, OR, by BALL ROUNDING, a ball cut yields $S_1$ such that $\phi(S_1) \le O(\sqrt{\gamma})$ and $\Psi(H_G^\tau, S_1) \ge \tfrac{1}{2}\Psi(H_G^\tau, V)$.
Our Algorithm: Modifying G

CASE 2: we found an unbalanced cut $S_1$ with $\phi(S_1) \le O(\sqrt{\gamma})$ and $\Psi(H_G^\tau, S_1) \ge \tfrac{1}{2}\Psi(H_G^\tau, V)$.

Modify $G = G^{(1)}$ by adding edges across $(S_1, \bar{S}_1)$ to construct $G^{(2)}$; this is analogous to removing the unbalanced cut $S_1$ in the Recursive Eigenvector algorithm. In general:
$G^{(t+1)} = G^{(t)} + \gamma \sum_{i\in S_t} \mathrm{Star}_i$,
where $\mathrm{Star}_i$ is the star graph rooted at vertex $i$.

The random walk can now escape $S_1$ more easily.
Our Algorithm: Iteration

CASE 2: we found an unbalanced cut $S_t$ with $\phi(S_t) \le O(\sqrt{\gamma})$ and $\Psi(H_{G^{(t)}}^\tau, S_t) \ge \tfrac{1}{2}\Psi(H_{G^{(t)}}^\tau, V)$, and modified $G^{(t)}$ by adding edges across $(S_t, \bar{S}_t)$ to construct $G^{(t+1)}$.

POTENTIAL REDUCTION:
$\Psi(H_{G^{(t)}}^\tau, S_t) \ge \tfrac{1}{2}\Psi(H_{G^{(t)}}^\tau, V)$ implies $\Psi(H_{G^{(t+1)}}^\tau, V) \le \tfrac{3}{4}\Psi(H_{G^{(t)}}^\tau, V)$.

This is the CRUCIAL APPLICATION OF THE STABILITY OF THE RANDOM WALK.
Summary and Potential Analysis

At every step $t$ of the recursion, we either:
1. Produce an $\Omega(b)$-balanced cut of the required conductance, OR
2. Find that $\Psi(H_{G^{(t)}}^\tau, V) \le \frac{1}{\mathrm{poly}(n)}$, OR
3. Find an unbalanced cut $S_t$ of the required conductance, such that for the graph $G^{(t+1)}$, modified to have increased edges from $S_t$:
$\Psi(H_{G^{(t+1)}}^\tau, V) \le \tfrac{3}{4}\Psi(H_{G^{(t)}}^\tau, V)$.

After $T = O(\log n)$ iterations, if no balanced cut is found:
$\Psi(H_{G^{(T)}}^\tau, V) \le \frac{1}{\mathrm{poly}(n)}$.
From this guarantee, using the definition of $G^{(T)}$, we derive an SDP certificate that no $b$-balanced cut of conductance $O(\gamma)$ exists in $G$.

NB: only $O(\log n)$ iterations are needed to remove unbalanced cuts.
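The $O(\log n)$ iteration bound is just geometric decay: the potential starts at most $\mathrm{poly}(n)$ and shrinks by a factor $3/4$ per iteration. A small check of the arithmetic (the exponent `c` is illustrative):

```python
import math

def iterations_needed(n, c=2):
    """Smallest T with n**c * (3/4)**T <= n**(-c), i.e. T = O(log n)."""
    return math.ceil(2 * c * math.log(n) / math.log(4 / 3))

T = iterations_needed(1000)
```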
Heat-Kernel and Certificates

• If no balanced cut of conductance $\gamma$ is found, our potential analysis yields
$\Psi(H_{G^{(T)}}^\tau, V) \le \frac{1}{\mathrm{poly}(n)}$, i.e.
$L + \gamma \sum_{j=1}^{T-1}\sum_{i\in S_j} L(\mathrm{Star}_i) \succeq \gamma\, L(K_V)$:
the modified graph has $\lambda_2 \ge \gamma$.

CLAIM: this is a certificate that no balanced cut of conductance $< \gamma$ existed in $G$.
For a balanced cut $T$, evaluating the quadratic form yields
$\phi(T) \ge \gamma - \gamma\,\frac{|\cup S_j|}{|T|} \ge \gamma - \gamma\,\frac{b/2}{b} \ge \gamma/2$.
Comparison with Recursive Eigenvector

RECURSIVE EIGENVECTOR: we can only bound the number of iterations by the volume of graph removed; $\Omega(n)$ iterations are possible.

OUR ALGORITHM: use the variance of the random walk as a potential; only $O(\log n)$ iterations are necessary:
$\Psi(P, V) = \sum_{i\in V}\|P e_i - \vec{1}/n\|^2$
is a STABLE SPECTRAL NOTION OF POTENTIAL.
Running Time

• Our algorithm runs in $O(\log n)$ iterations.
• In one iteration, we perform some simple computation (projection, sweep cut) on the vector embedding $H_{G^{(t)}}^\tau$. This takes time $\tilde{O}(md)$, where $d$ is the dimension of the embedding.
• We can use Johnson-Lindenstrauss to obtain $d = O(\log n)$.
• Hence, we only need to compute $O(\log^2 n)$ matrix-vector products $H_{G^{(t)}}^\tau u$.
• We show how to perform one such product in time $\tilde{O}(m)$ for all $\tau$.
• OBSTACLE: $\tau$, the mean number of steps of the heat-kernel random walk, is $\Omega(n^2)$ for the path graph.
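The dimension-reduction step can be sketched with a Gaussian random projection (a minimal illustration, not the paper's implementation; the function name and sizes are ours). Pairwise distances are preserved up to small distortion with high probability:

```python
import numpy as np

def jl_project(V, k, seed=0):
    """Map the columns of V (one embedding vector per vertex) from R^n to R^k
    with a scaled Gaussian matrix, approximately preserving pairwise distances."""
    rng = np.random.default_rng(seed)
    G = rng.standard_normal((k, V.shape[0])) / np.sqrt(k)
    return G @ V

rng = np.random.default_rng(1)
V = rng.standard_normal((500, 40))      # 40 embedding vectors in R^500
P = jl_project(V, 200)                  # compressed to R^200
```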
Conclusion

NOVEL ALGORITHMIC CONTRIBUTIONS
• A balanced-cut algorithm using random walks, running in time $\tilde{O}(m)$.

MAIN IDEA
Random walks provide a very useful stable analogue of the graph eigenvector, via regularization.

OPEN QUESTION
More applications of this idea? Applications beyond the design of fast algorithms?
A Different Interpretation

THEOREM: Suppose the eigenvector $x$ (normalized so that $\sum_i d_i x_i = 0$) yields an unbalanced cut $S$ of low conductance and no balanced cut of the required conductance. Then
$\sum_{i\in S} d_i x_i^2 \ge \frac{1}{2}\sum_{i\in V} d_i x_i^2$.
In words, $S$ contains most of the variance of the eigenvector.

QUESTION: does this mean that the graph induced by $G$ on $V - S$ is much closer to having conductance at least $\gamma$?

ANSWER: NO. $x$ may contain little or no information about $G$ on $V - S$; the next eigenvalue may be only infinitesimally larger.

CONCLUSION: to make significant progress, we need an analogue of the eigenvector that captures sparse cuts in all of $G$, not just the single cut $S$.
Theorems for Our Algorithm

THEOREM 1 (WALKS HAVE NOT MIXED):
If $\Psi(P^{(t)}, V) > \frac{1}{\mathrm{poly}(n)}$, we can find a cut of conductance $O(\sqrt{\gamma})$.

Proof: recall that $\Psi(P, V) = \sum_{i\in V}\|P e_i - \vec{1}/n\|^2$, $\tau = \log n/\gamma$, and $P^{(t)} = e^{-\tau Q^{(t)}}$.
Using the definition of $\tau$, the spectrum of $P^{(t)}$ implies that
$\sum_{ij\in E}\|P^{(t)} e_i - P^{(t)} e_j\|^2 \le O(\gamma)\cdot\Psi(P^{(t)}, V)$
(total EDGE LENGTH is at most $O(\gamma)$ times TOTAL VARIANCE).
Hence, by a random projection of the embedding $\{P e_i\}$, followed by a sweep cut, we can recover the required cut: a standard SDP rounding technique.
THEOREM 2 (WALKS HAVE MIXED):
If $\Psi(P^{(t)}, V) \le \frac{1}{\mathrm{poly}(n)}$, there is no $\Omega(b)$-balanced cut of conductance $O(\gamma)$.

Proof: consider $S = \cup S_i$, and notice that $S$ is unbalanced. The assumption is equivalent to
$L(K_V) \bullet e^{-\tau L - O(\log n)\sum_{i\in S} L(\mathrm{Star}_i)} \le \frac{1}{\mathrm{poly}(n)}$.
By taking logs,
$L + O(\gamma)\sum_{i\in S} L(\mathrm{Star}_i) \succeq \Omega(\gamma)\, L(K_V)$   (SDP DUAL CERTIFICATE).
This is a certificate that no $\Omega(1)$-balanced cut of conductance $O(\gamma)$ exists: evaluating the quadratic form for a vector representing a balanced cut $U$ yields
$\phi(U) \ge \Omega(\gamma) - O(\gamma)\,\frac{\mathrm{vol}(S)}{\mathrm{vol}(U)} \ge \Omega(\gamma)$,
as long as $S$ is sufficiently unbalanced.
SDP Interpretation

$\mathbb{E}_{\{i,j\}\in E_G}\ \|v_i - v_j\|^2 \le \gamma$   (SHORT EDGES)
$\mathbb{E}_{\{i,j\}\in V\times V}\ \|v_i - v_j\|^2 = \frac{1}{2m}$   (FIXED VARIANCE)
$\forall i \in V:\ \mathbb{E}_{j\in V}\ \|v_i - v_j\|^2 \le \frac{1}{b}\cdot\frac{1}{2m}$   (SHORT RADIUS: length of star edges)
Background: Heat-Kernel Random Walk

For simplicity, take $G$ to be $d$-regular.

• The Heat-Kernel Random Walk is a continuous-time Markov chain over $V$, modeling the diffusion of heat along the edges of $G$.
• Transitions take place in continuous time $t$, with an exponential distribution:
$\frac{\partial p(t)}{\partial t} = -\frac{L}{d}\, p(t), \qquad p(t) = e^{-\frac{t}{d}L}\, p(0) =: H_G^t\, p(0)$.
• The Heat Kernel can be interpreted as a Poisson distribution over the number of steps of the natural random walk $W = AD^{-1}$:
$e^{-\frac{t}{d}L} = e^{-t}\sum_{k=0}^{\infty}\frac{t^k}{k!}\, W^k$.
• In practice, the Heat Kernel can be replaced with the natural random walk $W^t$.