
>> Yuval Peres: Okay. Welcome everyone, as well as the viewers at home [laughter]. We're
happy to have Eviatar Procaccia tell us about the Cheeger constant of supercritical percolation.
>> Eviatar Procaccia: Okay. Thank you all for coming, or attending at home, as I now discover. Thank you for having me here. The title is quite long, but the talk is intended for quite a wide audience, so I will try to define everything. Besides what an asymptotic is and what the square lattice is, all of the rest of the words will be defined.
>>: Actually… [laughter] you can get to the core. You can skip…
>> Eviatar Procaccia: Okay. You can tell me to run ahead. So -- what the parameter p is, everybody knows the basic facts, so I can skip this, even the nice fact about [laughter], even the nice fact about the first reference to isoperimetry. [laughter]
>>: [inaudible] carpet, maybe we could read the previous slide.
>> Yuval Peres: What I meant is you don't have to define percolation.
>> Eviatar Procaccia: Okay. Isoperimetry is a general class of problems where the idea is to find sets with a given volume and minimal boundary measure. These questions were originally asked in continuous space by the Greeks. They actually knew the correct answer, and as this slide shows, one reference survives from Roman times, from around 20 BC, in the story of Princess Dido. She was a Phoenician princess who had to run away because her brother killed her husband and tried to kill her, and she ran to northern Africa, where she made a pact with the local king: she paid him some money and in return was promised as much land as she could enclose with a bull's hide. Princess Dido was a smart mathematician. What she did was take the bull's hide, cut it into very, very thin strips, as you can see here, then take the seashore as one side and lay out a perfect half circle as the other side, which of course solves the isoperimetric problem, and there she founded the kingdom of Carthage, [inaudible]. This is all recounted in Virgil's poem.
So in discrete settings there is an isoperimetric problem that is given by the Cheeger constant, and I will just give a quick definition. If you have a finite graph G, the Cheeger constant is defined as the minimum, over all sets of size at most one half the size of the graph, of the ratio between the edge boundary -- all of the edges such that one endpoint is in the set and the other is in the complement -- and the volume of the set.
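For concreteness, here is the definition just described written out; the edge-boundary notation is mine, and the denominator is taken to be the number of vertices, which is the usual reading of 'volume' here (on a bounded-degree graph the two conventions differ only by constants).

\[
\Phi(G) \;=\; \min_{\substack{S \subset V(G) \\ 0 < |S| \le \frac{1}{2}|V(G)|}} \frac{|\partial_E S|}{|S|},
\qquad
\partial_E S \;=\; \bigl\{\{x,y\} \in E(G) \;:\; x \in S,\ y \notin S\bigr\}.
\]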
I'll give you one example of an application. Take a big torus, n by n by n, in d dimensions, choose a point uniformly at random, and run a random walk for a time proportional to the volume -- so for some u larger than zero, run it for time t larger than u times n to the d. Then, together with Shellef, using a paper of Yuval and Morris, you can obtain, using isoperimetric inequalities, that the mixing time of the random walk on this graph is the same as on the torus up to an iterated-log factor, where the probability that it is larger than the iterated log goes to zero as n goes to infinity. This is one application; it is not the topic of the talk. So skip percolation, skip critical percolation [laughter]
right, okay. So we fix some p which is larger than the critical value, and you know that with very high probability there is a giant component inside the box such that all other components are polylogarithmic in size.
>>: [inaudible] [laughter]
>> Eviatar Procaccia: [inaudible] I skip this part. You told me to skip this part. [laughter] So denote the giant component by C_{d,n}; this is now going to be our graph, and denote by Phi_n the Cheeger constant of the giant component. Now, in several works by Benjamini and Mossel, Mathieu and Remy, Berger, Biskup, Hoffman and Kozma, and a note by Pete in '08, it was proved that n times the Cheeger constant is bounded between two constants, a little c and a big C, with probability tending rapidly to 1. This led Itai Benjamini to formulate the following conjecture: that the limit actually exists almost surely. The correct asymptotics was known, but not that the limit exists. With this conjecture in mind, together with Rosenthal in 2011 we showed that the variance of n times Phi_n is smaller than some constant times n to the two minus d, which shows that, at least in high dimensions, the Cheeger constant is concentrated around its mean, so it's enough to prove convergence of the mean; but this is still open, essentially, so in high dimensions we don't even know this.
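In symbols, the statements just quoted read roughly as follows; Phi_n is shorthand for the Cheeger constant of the giant component C_{d,n}, and the constants c, C depend on p and d.

\begin{align*}
&\text{Known bounds (Benjamini--Mossel, Mathieu--Remy, Berger--Biskup--Hoffman--Kozma, Pete):} && c \;\le\; n\,\Phi_n \;\le\; C \quad \text{with probability tending rapidly to } 1,\\
&\text{Benjamini's conjecture:} && \lim_{n\to\infty} n\,\Phi_n \ \text{ exists almost surely},\\
&\text{Procaccia--Rosenthal (2011):} && \operatorname{Var}\!\bigl(n\,\Phi_n\bigr) \;\le\; C\,n^{2-d}.
\end{align*}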
>>: I'm sorry, what's open? I missed something.
>> Eviatar Procaccia: So even the weaker conjecture, in expectation, is still open. The topic of the talk today is the paper with Biskup, Louidor and Rosenthal, which proves the conjecture in two dimensions. So fix d equal to two, and let Phi_n be the Cheeger constant of
[laughter]
>>: [inaudible]
>> Eviatar Procaccia: [laughter] so the theorem is that the limit, in two-dimensional supercritical percolation, exists almost surely.
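Written out, with C_{2,n} my rendering of the giant open cluster in the two-dimensional box of side n, the theorem of Biskup, Louidor, Procaccia and Rosenthal as stated here is:

\[
\lim_{n\to\infty} n\,\Phi_n \quad\text{exists almost surely}, \qquad \Phi_n = \Phi\bigl(\mathcal{C}_{2,n}\bigr).
\]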
Now, the way the proof is constructed: first we will have to describe the Wulff construction, which is a general notion from physics. The Wulff construction is built from some norm, so we are going to define the norm that the shape is constructed with. There are some discrete geometrical objects that we had to invent, which interpolate between discrete curves and continuous curves, and we'll have to talk about those, and then the proof of the main theorems. So what is the Wulff construction? Let beta_p be some norm; for now the p is just for later use, for now it's just a norm on R2. And let phi_p be the solution of the isoperimetric question with respect to the norm beta_p, so it's the infimum, over all simple closed curves in R2 with Lebesgue measure of the interior equal to one, of the length with respect to beta_p.
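A rendering of the isoperimetric problem just described; phi_p is my name for the value (the recording garbles the symbol), and the line-integral form of the beta_p length is the standard anisotropic length, assumed rather than read off the slide.

\[
\varphi_p \;=\; \inf\Bigl\{ \operatorname{len}_{\beta_p}(\gamma) \;:\; \gamma \text{ a simple rectifiable closed curve in } \mathbb{R}^2,\ \operatorname{Leb}\bigl(\operatorname{int}(\gamma)\bigr) = 1 \Bigr\},
\qquad
\operatorname{len}_{\beta_p}(\gamma) \;=\; \int_0^1 \beta_p\bigl(\gamma'(t)\bigr)\,dt .
\]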
Now, Wulff, a physicist at the beginning of the last century, conjectured that such a minimizing curve can be constructed, and actually conjectured the correct shape: W_p is the ball of the dual norm of beta_p -- so if beta_p is the L1 norm then W_p will be the L infinity ball, et cetera -- and W_p hat is just a normalization of this such that it has Lebesgue area one. So this conjecture was made 100 years ago, but it was actually proved by Taylor in '74 that this shape is indeed a minimizing set.
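In formulas, under the reading just given (the dual-ball description is standard; the hat denotes the area-one normalization):

\[
W_p \;=\; \bigl\{ x \in \mathbb{R}^2 \;:\; \langle x, y\rangle \le \beta_p(y)\ \text{ for all } y \in \mathbb{R}^2 \bigr\}
\;=\; \text{unit ball of the dual norm } \beta_p^{\,*},
\qquad
\widehat{W}_p \;=\; \frac{W_p}{\sqrt{\operatorname{Leb}(W_p)}} .
\]

For example, if beta_p were the L1 norm, W_p would be the L infinity unit ball and W_p hat the square of area one.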
>>: [inaudible]
>> Eviatar Procaccia: I'll check and I'll tell you later -- it's not that Taylor, it's the other Taylor. [laughter]
>>: [inaudible] conjecture. It was really stated as a conjecture, was it? [inaudible] realized it
was…
>> Eviatar Procaccia: By Wulff.
>>: [inaudible].
>>: Well, you know, he's a physicist. It was stated as a claim [laughter], but for us it was stated as a conjecture since it wasn't proved. In '75 Taylor proved that the minimizer is unique up to shift, so the Wulff construction is the unique answer to the isoperimetric question. Now denote by d_H the Hausdorff metric -- the L infinity Hausdorff metric, for instance. Then a very useful quantitative version that we use in our paper, by Dobrushin, Kotecky and Shlosman, says that for any simple rectifiable curve gamma enclosing a region of unit Lebesgue area, if the beta_p length of the curve is close to phi_p, then the curve itself is also close to the Wulff shape in the Hausdorff metric.
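As a formula, the stability statement presumably takes the following shape; the square root is what the question below is about, and the exact constants are not recoverable from the recording, so treat this as a sketch of the form of the bound rather than the precise statement.

\[
\inf_{x \in \mathbb{R}^2} d_H\bigl(\gamma,\ x + \widehat{W}_p\bigr)
\;\le\; C\,\sqrt{\operatorname{len}_{\beta_p}(\gamma) - \varphi_p}
\qquad\text{for every simple rectifiable } \gamma \text{ with } \operatorname{Leb}\bigl(\operatorname{int}(\gamma)\bigr) = 1 .
\]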
Using this claim, which also strengthens the conjecture of Benjamini, we prove a shape theorem. So for d equal to two -- and here I stated that there exists a norm; in a few minutes we will construct this norm -- and so…
>>: Quick question on the previous slide: is this square root essentially tight? Is that the right R?
>> Eviatar Procaccia: So, can it be strengthened? I don't know… So denote by U_n hat the family of sets which achieve the Cheeger constant. We show that every set which achieves the Cheeger constant, after the correct normalization, converges to the Wulff shape with this specific norm, in the Hausdorff metric; so this is a shape theorem.
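Schematically, the shape theorem reads as below; U_n hat is the family of Cheeger optimizers, and lambda stands for the deterministic dilation that gives the limiting region the right area -- the talk does not spell out its value, so it is left unnamed here.

\[
\max_{U \in \widehat{\mathcal{U}}_n}\ \inf_{x \in \mathbb{R}^2}
d_H\!\Bigl(\tfrac{1}{n}\,U,\ x + \lambda\,\widehat{W}_p\Bigr)
\;\xrightarrow[n\to\infty]{}\; 0 \qquad\text{almost surely.}
\]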
Okay. So now, for several slides, forget about the [inaudible]. We're going to talk about Z2 geometry and topology. For the Wulff construction, what you need is some sort of surface tension. It was invented for physical questions -- for instance the shape of a water drop, where you have the surface tension of water -- but our tension has to do with the boundary, so essentially we need to find some norm which gives us information about the boundary. So there is a set of definitions now. The right boundary edges are the following. You can look at the definition on top, or at the picture, which is clearer. So you look at three steps of the path. If the path goes up and then down, then these are the right boundary edges of this vertex; essentially the rule is, as you go from x_{i+1} to x_{i-1} clockwise, all of the edges in between are the right boundary edges. So in this case there is only one, and in this case there are no right boundary edges. And the right boundary of a path, which is denoted like this, is the set of its right boundary edges. Is this definition clear? Now, a rightmost path is a path which is simple -- not simple in the original sense, but in the oriented sense, so each edge can be crossed at most once in each orientation, okay? -- and it is rightmost if it is simple and doesn't contain any of its own right boundary edges. If you want to look at this picture, this is a rightmost path; it goes like this, okay?
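A hedged transcription of the two definitions, using only what was said above; the symbol for the right boundary is mine, not necessarily the one on the slide. For a path gamma = (x_0, ..., x_k) in Z2, the right boundary edges at x_i are the edges incident to x_i that lie strictly between the edge to x_{i+1} and the edge to x_{i-1} when rotating clockwise, and the right boundary is their union:

\[
\partial^{+}\gamma \;=\; \bigcup_{i} \bigl\{\, e \ni x_i \;:\; e \text{ lies strictly between } (x_i,x_{i+1}) \text{ and } (x_i,x_{i-1}) \text{ clockwise} \,\bigr\}.
\]

A path is rightmost if it uses each edge at most once in each orientation and contains no edge of its own right boundary, that is, if gamma and its right boundary are disjoint as edge sets.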
>>: [inaudible] right boundary?
>> Eviatar Procaccia: The right most path, okay?
>>: It doesn't contain any of its own [inaudible]?
>> Eviatar Procaccia: Right. These striped edges are the right boundary, and you can see that the path does not contain any one of them. Notice also this curly path, which lives on the medial graph -- the graph whose vertices are the edges, connected if they share a face and a vertex -- and what you can see, which is a claim that we proved, is that for every rightmost path this interface in the medial graph is simple as a path in R2. Okay.
So let R(x, y) be the set of rightmost paths from x to y. For every realization omega of the percolation and every rightmost path gamma, let b(gamma) be the number of open right boundary edges of gamma. And if x and y are connected inside the percolation cluster, let b(x, y) be the infimum of b(gamma) over all open rightmost paths between x and y. Okay? So this is on the way to defining the norm. One more small definition: for every vector x in R2, let the bracket of x denote the vertex in C infinity that is nearest to x, using -- so it shouldn't be the lexicographic order anymore, but some kind of [inaudible] invariant order. Theorem (the boundary norm): for every p which is supercritical and every x in R2, the limit of b(bracket zero, bracket nx) over n is an almost sure constant, denoted beta_p of x. It is bounded away from infinity and, for x different from zero, bounded away from zero. The limit also exists in L1, and the convergence is uniform on the unit circle, okay, which is essential for our…
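Put together, the theorem as stated defines the norm; the display below is a paraphrase in symbols, with bracket x denoting the point of the infinite cluster nearest to x, as above, b(gamma) the number of open right boundary edges of gamma, and b(x, y) the infimum of b(gamma) over open rightmost paths from x to y.

\[
\beta_p(x) \;:=\; \lim_{n\to\infty} \frac{b\bigl([0],[nx]\bigr)}{n}
\quad\text{exists almost surely and in } L^1,
\qquad
0 \;<\; \inf_{|x|=1}\beta_p(x) \;\le\; \sup_{|x|=1}\beta_p(x) \;<\; \infty,
\]

with the convergence uniform over x on the unit circle.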
>>: [inaudible]
>> Eviatar Procaccia: So b(x, y): given two vertices, look at all of the rightmost paths between them and their right boundaries; then b(x, y) is the infimum of the number of open right boundary edges over such paths, and beta_p of x is defined as the limit of b(0, nx) over n. So the claim is that it is homogeneous and obeys a triangle inequality; in particular, it is a norm. Okay. So for every p, beta_p is the norm according to which we are going to do the Wulff construction. This theorem is quite technical, but I want to talk about two conceptual points. The uniformity essentially comes from this very nice paper of Kesten from '93, called The Speed of Convergence in First Passage Percolation, where he invents a nice idea. The trajectories are not bounded, because you have edges that you cannot cross, so you don't know how far out they go; but there is this nice idea of giving some very large weight to closed edges, working with the new weight which sums up all of these weights -- this one is bounded -- proving the speed of convergence for that, and then showing that it is the same. This didn't transfer immediately, but this is the main idea.
And the second idea is sub-additivity. For first passage percolation the sub-additivity is trivial, but here it's only semi-trivial. The idea is that if you have two rightmost paths, you can concatenate them in the following way: travel along the first one until the first time you hit the second one, and then continue along the second one from the last time it visits this point to its end. Okay? So it's not very hard to see that this concatenation is a rightmost path and that its right boundary differs by at most two, so the constant difference does not affect the sub-additivity, because you can take b plus two and show that that is sub-additive. So these are the main ideas.
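In symbols, the concatenation bound and the resulting sub-additivity read roughly as follows; the 'plus two' is the constant just mentioned, and with it a subadditive-ergodic-theorem argument presumably gives the limit defining beta_p.

\[
b(\gamma_1 \circ \gamma_2) \;\le\; b(\gamma_1) + b(\gamma_2) + 2
\quad\Longrightarrow\quad
b\bigl([0],[(m+n)x]\bigr) + 2 \;\le\; \bigl(b([0],[mx]) + 2\bigr) + \bigl(b([mx],[(m+n)x]) + 2\bigr),
\]

so the quantity b plus two is subadditive and converges after division by n.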
Now, we call gamma a rightmost circuit if it is a rightmost path which is closed, and as we said earlier it is always possible to find a closed curve, the interface, which is close to the rightmost path, so the interior of this envelope is well-defined. So define the volume of the rightmost circuit to be the intersection of the interior of the continuous interface with C_{2,n}. Is this clear?
>>: So the volume is…
>> Eviatar Procaccia: This is finite; it's essentially all of the vertices of C_{2,n} which are enclosed inside…
>>: [inaudible]?
>> Eviatar Procaccia: Hm?
>>: [inaudible]
>> Eviatar Procaccia: The set.
>>: The volume is the set?
>> Eviatar Procaccia: No, it should be the size of the [inaudible]. Now, what we need to do is, given a set in C_{2,n}, to approximate its boundary by the norm.
>>: [inaudible] should be open [inaudible] C_{2,n}?
>> Eviatar Procaccia: It's a [inaudible] component.
>>: Oh okay, but before you used [inaudible] infinity?
>> Eviatar Procaccia: No, not C infinity now; here I put C_{2,n}.
>>: [inaudible] infinity [inaudible]
>> Eviatar Procaccia: Okay. So I want to talk about how you approximate a set inside C_{2,n} by a continuous set such that the beta_p length of the curve is close to this. So the first thing you do is a polygonal approximation. Okay, you choose some fixed r, an integer, and then jump between vertices which are at distance at least r. Now, the polygon that you get in this process doesn't have to be simple, but you can make it simple just by taking vertices which are very close [laughter] and creating a polygon with two more vertices; this terminates in finite time, since each time you do this procedure the number of components reduces by one, so it will end at some point. And from the convergence theorem you know that once these edges are long enough, the right boundary distance b between their endpoints is very close to the norm, okay? Is this clear? Good. So this is a theorem which states what I just said. We'll get to it in a second. Okay. So let's give…
>>: [inaudible]
>> Eviatar Procaccia: Let's look at it. So for every epsilon there exists a constant c such that, for every r bigger than zero, with probability going stretched-exponentially to 1 the following holds: if gamma is any rightmost open circuit inside the box of size r whose length is at least r to the one over five -- which certainly holds for a set which achieves the Cheeger constant -- and the size of the path is at least the volume to the power of two over three, then there exists a simple closed curve lambda such that it is very close to the original one in the Hausdorff metric. Okay. The one is just a…
>>: Volume gamma [inaudible] set?
>> Eviatar Procaccia: Yeah, so volume gamma is a set, sorry. So volume gamma is what was defined earlier: it's the set that is the intersection of the interior of the interface with C_{2,n}. Okay? The one here is just the distance between the discrete and the continuous paths, and the beta_p length of lambda is smaller, up to some small approximation, than the right boundary distance -- the number of open right boundary edges -- of gamma. Okay? I see that you are a little bit confused. Maybe it's because I confused you. Any questions?
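Schematically, the two conclusions of the approximation theorem, as far as they can be made out from the recording (the exact form of the Hausdorff-distance error is not audible, so it is only indicated qualitatively):

\[
d_H(\gamma,\lambda) \;\le\; \text{(an error negligible on the scale of } \gamma\text{)},
\qquad
\operatorname{len}_{\beta_p}(\lambda) \;\le\; (1+\varepsilon)\, b(\gamma),
\]

where b(gamma) is the number of open right boundary edges of gamma and lambda is the simple closed curve produced by the polygonal approximation.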
>>: [inaudible]
>> Eviatar Procaccia: Hm?
>>: It's the first time in 25 years I've seen volume [inaudible] [laughter] [inaudible]
>> Eviatar Procaccia: Yes.
>>: [inaudible]
>> Eviatar Procaccia: So let me note that I didn't like this 'simple', for the metrics are not simple [laughter], but I will not say more than that. So okay, let's give a sketch of the main theorem. I want to prove that the limit exists. We do it with a lower bound and an upper bound. For the lower bound, let's start with a set A_n achieving the Cheeger constant, and we can assume that its size is of order n squared, okay? So by definition, Phi_n is the ratio between the [inaudible] [laughter]
>>: [inaudible].
>> Eviatar Procaccia: So by definition Phi_n is the ratio between the boundary and the size of A_n. Now, this is a restatement of the previous theorem: we find a closed curve lambda such that it is close to the original envelope in the Hausdorff metric and its beta_p length is smaller than b of gamma. Then, since the polygons were chosen to include vertices on the envelope of A_n, you know that the b-distance between consecutive polygon vertices is at most the boundary of the original envelope between the two vertices, so this gives you this inequality. The volume bound you get from the previous theorem, and you know by density arguments -- theta_p is the probability of a vertex to be in the infinite component -- that the size of A_n is smaller than theta_p plus epsilon times n squared over two, with high probability. This gives you the lower bound on the Cheeger constant, where phi_p is the Wulff value. So this inequality says that, since this is a curve with the correct volume, after normalization it is bigger than the Wulff value, and so this is the easier side.
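The chain of inequalities behind the lower bound, as I reconstruct it from the sketch above (the exact constant in the last step depends on normalization conventions I cannot verify from the recording, so take it as a heuristic):

\[
n\,\Phi_n \;=\; \frac{n\,|\partial A_n|}{|A_n|}
\;\ge\; \frac{n\,(1-\varepsilon)\operatorname{len}_{\beta_p}(\lambda)}{|A_n|}
\;\ge\; \frac{n\,(1-\varepsilon)\,\varphi_p\sqrt{\operatorname{Leb}(\operatorname{int}\lambda)}}{|A_n|}
\;\ge\; \frac{n\,(1-\varepsilon)\,\varphi_p}{\sqrt{(\theta_p+\varepsilon)\,|A_n|}}
\;\ge\; (1-\varepsilon')\,\frac{\sqrt{2}\,\varphi_p}{\theta_p+\varepsilon},
\]

using the isoperimetric scaling (a curve enclosing area A has beta_p length at least phi_p times the square root of A), the density bound Leb(int lambda) at least |A_n|/(theta_p + epsilon), and |A_n| at most (theta_p + epsilon) n squared over two.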
For the upper bound, we want to start with the Wulff shape, but a priori nobody tells us that the Wulff shape is contained inside a box; it's just the solution of some isoperimetric question. So we need -- first of all we need to blow it up to the correct size, and we need to show that it is contained in the box, so let's do this. So the red thing is supposed to be the Wulff shape; it's not how it really looks, but it will do for this proof. So let alpha be the maximal point on the x axis which is still included inside W_p.
>>: Don't you know [inaudible]?
>> Eviatar Procaccia: Yes. [laughter] So we are going to use convexity, but I want to show -- yes, but how? This is a [inaudible] idea. How can you picture a convex set that leaves [laughter], which contains these vertices and leaves this?
>>: What do you mean, that it is in [inaudible]?
>> Eviatar Procaccia: Yeah, but you want to bound it explicitly, because you want to approximate it by a set that can achieve the Cheeger constant, so it has to have the correct size inside the correct boxes; so essentially you need to give a very good bound on the L infinity box which contains it. Okay? So by symmetry and convexity, you know that if this point is in W_p, then so are this point, this point and this point, so the L1 ball of radius alpha is contained inside W_p. Now, this is why the picture looks ugly -- because of the argument by contradiction. So assume that there is some point x which is to the right of alpha, so it leaves the L infinity ball of radius alpha. Then by symmetry you know that the reflection of x, which is x bar, is also in W_p, and x plus the reflection has length smaller than two times beta_p of… okay, so this means that x plus x bar over two is inside W_p, but this is a contradiction, okay? So we get this trivial bound, and by normalizing correctly we know that the normalized Wulff shape contains the L1 ball of radius one half and is contained in the L infinity ball of radius one over square root of two. This is not hard, but it is essential for [inaudible].
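For the record, the containment just derived, in symbols (B_1 and B_infinity denote the closed unit balls of the L1 and L infinity norms; the notation is mine):

\[
\tfrac12\,B_1 \;\subseteq\; \widehat{W}_p \;\subseteq\; \tfrac{1}{\sqrt2}\,B_\infty,
\]

which follows because alpha B_1 contained in W_p contained in alpha B_infinity forces 2 alpha squared <= Leb(W_p) <= 4 alpha squared, and W_p hat is W_p divided by the square root of Leb(W_p). Presumably this is why the bound is essential: the blow-up of W_p hat by n over square root of two then fits inside a box of side n.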
Okay. So what you want to do is start with the Wulff shape, and you know by definition that phi_p is the beta_p length of the boundary of the Wulff shape. Now again, what you do is approximate it by some polygon which is close enough to the original shape -- that you can always do. And now, between the vertices of the polygon, what we show is that you can find paths that are almost optimal -- optimal up to some small epsilon correction -- which stay as close as we want to the straight segments. So if you take these paths you don't get too far away from the polygon, and thus from the original Wulff shape, in the Hausdorff metric.
>>: [inaudible]
>> Eviatar Procaccia: Good. Again, you do the [laughter] -- don't tell anybody in this building, but they never configured anything of Microsoft on my computer. So this is how you find a path inside [laughter]
>>: This will be a little difficult now you've got [inaudible] [laughter]
>> Eviatar Procaccia: So this is how we find it. These paths are inside C_{2,n}, okay? They are close to the original set and their right boundary is small. So: choose the polygon, blow it up to the correct size, take these paths and create a simple circuit with small right boundary, and now take all the vertices in the interior of this envelope. Its size is close to theta_p over two times n squared, so we get the upper bound as well.
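The matching chain for the upper bound, reconstructed in the same heuristic spirit as the lower bound above (again the precise constant depends on conventions I cannot check against the slides): blowing W_p hat up by n over square root of two gives a region of area n squared over two, hence about theta_p n squared over two vertices of the cluster, and a boundary of beta_p length about n phi_p over square root of two, so the circuit built from near-optimal rightmost paths gives a candidate set U with

\[
n\,\Phi_n \;\le\; \frac{n\,|\partial U|}{|U|}
\;\le\; \frac{(1+\varepsilon)\, n \cdot n\,\varphi_p/\sqrt2}{(1-\varepsilon)\,\theta_p\,n^2/2}
\;=\; \frac{1+\varepsilon}{1-\varepsilon}\cdot\frac{\sqrt2\,\varphi_p}{\theta_p},
\]

matching the lower bound up to the epsilons.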
If this is clear -- so each move was a bit of a lie, so we have to show that everything happens with high probability. We use the uniform convergence; so where is uniformity important? In the lower bound we started from an arbitrary set and we did this polygonization, so we need uniformity over all directions and every starting point; in the other part of the proof the uniformity is not as important, but… So are there any questions so far, or do we move to the open questions? So this essentially is the proof of the limit in the Benjamini conjecture in two dimensions, so the limit exists, and this is already very close to the shape theorem, which I will not discuss today. Are there any questions on this proof before we go to the open questions? Okay. So some open questions.
When p equals one, what is our boundary norm? Essentially all of the edges are open, right? So it is just the L1 norm, and the Wulff shape is the L infinity ball, which is what we know, right? We just take the torus, and the sets achieving the Cheeger constant are squares. Now, a square is not smooth because it has corners, but the [inaudible] conjecture is that for p strictly between p_c and one the shape already becomes smooth. This we find very interesting, and at the moment we don't have any idea how to prove it. Now, another interesting conjecture concerns the limiting shape as p goes to one half.
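In symbols, the endpoint case and the two conjectures just mentioned (p_c = 1/2 is the bond-percolation critical point on Z2; the rest is as stated in the talk):

\[
\beta_1(x) = \|x\|_1, \qquad \widehat{W}_1 = \tfrac12\,B_\infty \ \ (\text{the unit-area square});
\]

conjecturally, the boundary of W_p hat is smooth for every p strictly between p_c and 1, and W_p hat converges to the Euclidean disc of unit area as p decreases to p_c = 1/2.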
>>: [inaudible]
>> Eviatar Procaccia: So essentially nothing, because it's given -- so it's deterministic; we know that it is some deterministic shape, but it's given by some norm. Once you know the norm you know what the shape looks like. When p is very close to one, it's easy to show that the norm is close to L1 and then the shape is close to the square. It's also not very hard to show that…
>>: [inaudible]
>> Eviatar Procaccia: All we know how to do now is a union bound: take p equal to one minus epsilon with epsilon very small, and then essentially you cannot find a path with…
>>: Fixed?
>> Eviatar Procaccia: Yes, fixed -- just bounding the number of trajectories; this is what I mean.
Okay? But essentially we don't even know how to prove, for instance, this very natural thing: that as p goes to one half the limiting shape converges to the Euclidean ball. This relates to another open question, about the graph distance in percolation, which is also not known: whether the graph distance converges to the Euclidean distance as p goes to one half. Another natural question is higher dimensions. There were some works on the Wulff construction for the Ising model, where it was generalized to higher dimensions by Cerf, Bodineau, Ioffe and Velenik.
>>: [inaudible]
>> Eviatar Procaccia: So these generalizations are quite technical, and it's not certain that we can actually use them, because there is a big difference between droplets and subtle geometric questions like whether something is a [inaudible] set. But we are thinking about it. For instance, even defining the norm is a question, right? In two dimensions we defined the norm via rightmost paths, which is essentially the way we found out you could characterize the envelopes of sets in Z2. But in higher dimensions you don't have that, right? You have all of these weird sets, like the Klein bottle or Alexander's horned sphere, so it's not certain how, given a path, you know what orientation of the edges you are supposed to take. So one way, for instance, to generalize it is to take a double infimum: take two points x, y; for every set which contains x and y on its inner boundary, look at all the paths on the boundary of the set -- the set's boundary gives you the correct orientation for the path's boundary -- and now take the infimum over all of the sets and over all of the paths. So we are thinking about it. It's not easy to show sub-additivity, that it converges; certainly uniform convergence will not be easy, but we hope that we will succeed. Okay? Questions? [applause]
>>: So I still don't quite understand what you know and what you don't know about the shapes.
Usually it's fairly standard to argue things about [inaudible], and if you knew how to do that then probably you wouldn't have posed that open problem [inaudible], so what?
>> Eviatar Procaccia: Again, we get the norm by the sub-additive ergodic theorem. Basically it is not very constructive, right? So we know that there is some limiting shape, and we have a characterization of it.
>>: So this is the [inaudible] supposed to be about the surface tension?
>> Eviatar Procaccia: Yes, exactly. The norm is the surface tension. This is a restatement of…
>>: I don't understand the second [inaudible] of it. That's the open part? You can carry out the computation. Usually like something [inaudible] is to know the second [inaudible] exists [inaudible] between positive and zero, something like that…
>> Eviatar Procaccia: So probabilistically the norm is not very hard to understand -- given a path here, its boundary, and so on -- so it's not hard to do the calculations and use the norm even though you don't know what it is. All we need is this bound: if the points are far away, then the normalized b-distance is close to the norm. It's not immediate, and we need all of this technology of Kesten, but it's possible without knowing what the norm is -- just like you have all of these supercritical results even when you don't know what the critical point is. So many things are possible to do without knowing what it actually is. We know that there exists something deterministic, but we don't have any…
>>: [inaudible]
>> Eviatar Procaccia: Hm, sure, sure, sure. Yes. Yeah, so it is the dual ball of some norm, so it's nice, but which norm? That is the question. And how do you pass from a square to a circle; what are the fluctuations? These things are interesting and are very open.
>> Yuval Peres: Thank you Eviatar again. [applause]