>> Yuval Peres: We're happy to have Alexandre Stauffer from Berkeley, and he's going to tell us about Poisson Brownian motions and continuum percolation in a mobile environment.
>> Alexandre Stauffer: Thank you. I'm assuming everybody can see above this line.
Okay?
[laughter]
Hope so.
So let me start -- oh. First of all, let me say that this is joint work with Alistair Sinclair. So let's start by defining the model with a simulation.
So we start with a Poisson point process in the plane, and for the simulation we are just going to take a finite square, just to make it more beautiful and appealing. But you can assume it's the whole plane.
And then I'm going to draw a ball of radius, say, one around each point of the point process. And I'm going to see this as a model for a wireless network, where each point of the point process is a node of the network, and we are going to say that two points are neighbors if their balls intersect, like these two points, and not neighbors if the balls don't intersect, like these two here.
So this model is what's called a random geometric graph. Right? So for its closest friends, we just call it RGG. And the Poisson process has intensity lambda over the whole plane.
So intuitively, as we increase the value of lambda, we expect more and more nodes to show up in this picture, and we have more and more edges of the graph. I say there is an edge when the balls intersect.
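As a small illustration of this construction (my own sketch, not part of the talk; the function names and the finite-square restriction are my choices): sampling a Poisson point process of intensity lambda on a finite square and forming the random geometric graph, where two nodes are neighbors exactly when their radius-1 balls intersect, i.e. the centres are within distance 2.

```python
import numpy as np

def sample_poisson_points(lam, L, rng):
    """Poisson point process of intensity lam on the square [0, L]^2:
    a Poisson(lam * L^2) number of points, i.i.d. uniform given the count."""
    n = rng.poisson(lam * L * L)
    return rng.uniform(0.0, L, size=(n, 2))

def rgg_edges(points, radius=1.0):
    """Random geometric graph: balls of the given radius around two centres
    intersect exactly when the centres are within distance 2 * radius."""
    edges = []
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if np.linalg.norm(points[i] - points[j]) <= 2 * radius:
                edges.append((i, j))
    return edges

# Deterministic check: centres at distance 1.5 are neighbours; the far one is isolated.
pts = np.array([[0.0, 0.0], [1.5, 0.0], [5.0, 5.0]])
print(rgg_edges(pts))  # [(0, 1)]
```

The quadratic pair loop is only for clarity; for large lambda one would use a spatial index instead.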
And of course the connected components of this graph will be bigger. So let me add more nodes to the picture; we get a picture like this. And it's known that there exists a critical value lambda_c such that if lambda is larger than lambda_c, then there exists an infinite component of the graph, which we are going to call the giant connected component, GCC.
And just to illustrate this, I'm going to take this example and erase the balls around the points that do not belong to the largest component, just to give you an example in the picture.
So all the balls you see are part of the largest component of this example. We see that many nodes belong to the largest component, and in particular the largest component is spanning the whole region. It spans to the top, and maybe you can find a path going through here.
And so there exists giant components, but the graph is not connected. Right? You see
nodes here that do not belong to this largest component.
And two facts -- that there exists a giant component, and that the graph is not connected -- both occur with probability one in the model on the whole plane. And when there exists a giant component, we're going to look at what happens at the origin of the plane. In particular, in this setting, the probability that the origin -- a node like this -- does not belong to the giant component is some constant eta that's strictly between 0 and 1.
So by belonging or not belonging: the origin belongs to that component if it's in one of these balls I illustrate here, one of the balls of the nodes that are part of the infinite component.
So as a model for networks, the fact that the graph is not connected is a pretty serious drawback. So we're going to consider a particular subset of wireless networks -- in particular, mobile wireless networks, where you can think of the nodes as people carrying their cell phones. Right?
And they are moving around, and every time a person is close to another cell phone, the two cell phones can exchange messages. So, translated into this setting, what we're going to do is create a model of a graph that evolves with time.
So the dynamics here, the mobility: at time 0, I will start with a random geometric graph in the way I just defined in the picture. And each node will move as a Brownian motion.
So we start with this picture, and each node moves as an independent Brownian motion. But to simplify my life I'm going to observe the model at discrete time steps.
So we're going to observe times 0, 1, 2 and so on. I really don't care what happens between 0 and 1; I'm just observing times 0 and 1. So let's see what happens. We start with a random geometric graph, the nodes move as independent Brownian motions, and I'm sampling the nodes at discrete time steps.
As you can see, with Brownian motion in two dimensions the nodes essentially stay around their initial locations -- here, for example. But the edges of the graph change substantially. For example, this node is isolated here. And hopefully soon -- that's it -- it gets connected to someone. And there are some interesting properties. If I let this simulation run forever, and at some moment -- not forever, but at some moment -- I stop it, and I ask you to forget everything you have seen before and just look at this picture now, you are going to see that it looks pretty much like the same random geometric graph.
And that is true because the Brownian motion applied to this process preserves the measure of the random geometric graph. So at any fixed time i, what you observe is exactly a random geometric graph.
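As an aside (a sketch of my own, not the speaker's code): the discrete-time dynamics just described amount to adding, between consecutive observation times, an independent Gaussian displacement with variance equal to the elapsed time to each coordinate of each node. A homogeneous Poisson process on the whole plane is invariant under this map, which is the stationarity being claimed.

```python
import numpy as np

def brownian_step(points, dt, rng):
    """Advance every node by an independent Brownian increment over time dt:
    each coordinate gains an independent N(0, dt) displacement."""
    return points + rng.normal(0.0, np.sqrt(dt), size=points.shape)

# Sanity check on the increment scale: over time dt = 4 the per-coordinate
# standard deviation of the displacement should be sqrt(4) = 2.
rng = np.random.default_rng(0)
start = np.zeros((200_000, 2))
moved = brownian_step(start, 4.0, rng)
print(round(float(moved.std()), 1))  # 2.0
```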
So this model inherits the properties of the random geometric graph: for example, at each fixed time, with probability 1 the graph is not connected, and the origin does not belong to the giant component with some probability eta.
So you can say, well, if the graph is not connected, then this doesn't help much. We had a non-connected graph before and we still have a non-connected graph now. So what's the point, right?
The point is that, for example, at this time this node here is not connected to any other node, but if you wait some steps the nodes will move and at some point it must get connected to someone.
So that's what we're aiming for: we're aiming for connectivity over time here. So to illustrate, I'm going to again erase the balls that do not belong to the giant component, and I'll let the nodes move, showing you the largest component at each step.
So you see that the nodes move very slowly, but the giant component changes significantly from one step to another. So nodes that are initially very isolated, like these ones, may at some time connect to the giant component and be able to communicate with many nodes in the system.
So what I'm going to look at here is: how long does it take for a typical node to connect to the largest component? And that concludes the beautiful part of the talk -- there will be no more simulations.
So can you please send the screen up? Okay. So let me now formalize the question. We assume that lambda is larger than lambda_c. And I define the random variable T_perc to be the minimum time i such that the origin belongs to the giant component at that time i, which I denote C_i.
So we want to look at the probability that this random variable is larger than some value t, which is equivalent to saying, of course, the probability that the origin has not been in the giant component at time i, for all i up to time t minus 1.
So we want to look at this probability. The first thing you can say is that these events -- the origin not belonging to the giant component at time i -- are monotone, in the sense that if the event holds for a certain configuration and I delete nodes, it still holds.
So we can apply the FKG inequality, and the first thing you get is that this probability is larger than the product of the single-time probabilities over the fixed times i, which is eta to the number of steps.
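In symbols (my notation: C_i for the giant component at time i, T_perc for the percolation time of the origin), the FKG step just described reads:

```latex
\Pr[T_{\mathrm{perc}} > t]
  \;=\; \Pr\Bigl[\bigcap_{i=0}^{t-1}\{0 \notin C_i\}\Bigr]
  \;\ge\; \prod_{i=0}^{t-1} \Pr[0 \notin C_i]
  \;=\; \eta^{\,t},
```

since each event {0 ∉ C_i} is monotone under deletion of nodes, so the FKG inequality makes the probability of the intersection at least the product of the marginals.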
And the first question you could ask is whether we can get an upper bound that's also
exponential in T. And the answer to this question is no.
>>: I guess this requires [inaudible] random geometric graphs, because we don't know potentially which is the giant component.
>> Alexandre Stauffer: No, the giant component is unique.
>>: Infinite.
>>: [Inaudible].
>>: Yeah.
>> Alexandre Stauffer: All right. Thank you. So we know by the FKG inequality that this lower bound holds. So the question one may ask is whether we can prove an exponential upper bound, and I said that the answer is no, at least in two dimensions. I'm not going to talk much about that, but you can use work on space coverage for random geometric graphs and Poisson point processes to show that this probability is larger than an exponential of minus some constant times t over log t, for two dimensions. And so now we want to see what's the best upper bound we can get for this probability, and that gives us our theorem. Our theorem says that for two dimensions, the probability that T_perc is larger than t is at most an exponential of minus some constant times the square root of t.
Or, more generally, the theorem applies to higher dimensions, where you get the bound e to the minus c_d times t to the d over d plus 2.
But I'm going to focus now on two dimensions only. So from this point to the end of the talk, my goal is to try to give an idea of how to prove these upper bounds. Any questions so far?
So let's see what we can do to prove this. What I need is to get hold of this probability; in particular, if we fix a time i, we want to get the probability that the origin is not in the infinite cluster at that time, given that it hasn't been before.
So you have to condition on the fact that the origin has never been in the giant component before, and this event is quite hard: it's hard to imagine what the point process looks like if you condition on the origin not being in the infinite component before that time. So the first step in the proof is that we're going to skip delta steps.
So now we're going to observe times 0, delta, 2 delta, and so on. The idea is that if you skip some steps, we allow the nodes to move further away, and you hope for some mixing to be achieved -- which I'll make clear, hopefully, later. And the goal will be to show that the probability that the origin is not in the giant component at time i times delta, conditioned on some sigma-field of the previous steps, is at most some constant eta prime, with eta prime less than 1.
So the idea is that if I skip delta steps, this conditional event will be essentially the same as without the conditioning: the nodes move far away, in such a way that what you are going to see in the process will be roughly a fresh Poisson process.
That's the main goal of our proof. So let's see how we're going to do this. First, before proving the goal: one can check easily that once we have this, the theorem follows immediately, because it gives that the probability that T_perc is larger than t is at most eta prime to the t over delta, since I'm skipping the other steps. And it comes as no surprise that I'm going to set delta to be the square root of t, which gives the bound we claimed.
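Written out (with eta prime < 1 the constant from the goal), the step from the goal to the theorem is:

```latex
\Pr[T_{\mathrm{perc}} > t]
  \;\le\; \Pr\bigl[\,0 \notin C_{i\delta}\ \text{for all } 1 \le i \le t/\delta\,\bigr]
  \;\le\; (\eta')^{t/\delta}
  \;=\; e^{-c\,t/\delta},
```

so choosing delta = sqrt(t) gives the claimed bound e^{-c sqrt(t)} in two dimensions.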
So from now on, let's try to prove this goal. And I'm going to introduce three basic steps. The first step is that instead of looking at the whole plane, which is where this event lives, I'm going to restrict myself to a finite region of size L: a square of side length L. And I'm going to take L sufficiently large so that looking at the event that the origin belongs to the infinite component is the same as looking at the origin belonging to the largest cluster in this region. I'm not going to give you the details of this part. But --
>>: [Inaudible] t?
>> Alexandre Stauffer: No, the L is dependent on t. It suffices for L to be of the order of t.
>>: So eta prime doesn't depend on T?
>> Alexandre Stauffer: Eta prime doesn't depend on T.
>>: Take delta depending on T.
>> Alexandre Stauffer: Delta depends on T.
>>: For what values -- I mean, what you're going to be able to achieve in the goal depends on the delta you take, right?
>> Alexandre Stauffer: No.
>>: Why can't you take delta equals 1 to get something even better?
>> Alexandre Stauffer: But then you cannot get the constant. Essentially what I'm saying is that what I'm going to see after this conditioning is a fresh Poisson process, right? If you don't take delta sufficiently large, this process could be subcritical, for example.
>>: So then --
>>: The time of the --
>>: You take the goal to be [inaudible].
>>: All right.
>>: All right.
>> Alexandre Stauffer: So, okay, we start with this box. I'm going to look at what happens inside this box only. But to prevent interference from the outside, we're going to control a larger region of size 2L, so that whatever happens outside this region doesn't affect what happens in the box, essentially. It will probably become clear later on.
So that's the first step; that's very simple. The second step is that we are going to change the sigma-field. Originally, the sigma-field, you could think, says that the origin has not been in the giant component in the previous steps. Now we're going to add conditions so we can handle this. And the condition I'm going to use is some notion of density. So let me first explain what we do. Take this big box and tessellate it into small squares -- the usual tessellation, by the way, of the whole box. The small squares have side length little ell, which will depend on t. And I'm going to say that a cell -- one of these little boxes -- is dense if it contains at least some number of nodes. This is for a fixed time: you look at a fixed time and a fixed cell, and the cell is dense if it contains at least this number of nodes, which is the expected number of nodes in that cell, lambda ell squared, times 1 minus epsilon, where epsilon will be made sufficiently small.
So a cell is dense if it has more nodes than something slightly smaller than the expectation. Right? And of course, since the number of nodes in the cell is a Poisson random variable, this event has high probability. In particular, the probability that a cell is dense at time i will be something larger than 1 minus an exponential of minus some constant times lambda ell squared.
This is for a fixed cell and a fixed time i. And we can extend this to all cells and all times, so that the probability that not all cells are dense at all times i is at most a union bound: the number of cells times the number of steps times this exponential.
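As a sketch, with constants suppressed, the union bound over the (2L/ell)^2 cells of the 2L-box and the t/delta observed times reads:

```latex
\Pr[\text{some cell fails to be dense at some observed time} \le t]
  \;\le\; \frac{t}{\delta}\,\Bigl(\frac{2L}{\ell}\Bigr)^{2}\, e^{-c\lambda \ell^{2}}.
```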
Okay. So we're going to work with the event that all cells are dense at all times i, and now let me show you how to relate ell to delta. The idea is that we set delta sufficiently large so that after delta steps, a node can walk anywhere inside its cell and even further away.
So essentially we want delta to be some large constant times ell squared, so that a node can move anywhere in its cell.
So, since delta is going to be the square root of t -- set somehow, somewhere.
>>: There.
>> Alexandre Stauffer: Right there. It's here. Yeah. Right. So as set here, we have ell squared also of order square root of t. So this event will fail, say, with probability exponential in minus square root of t, which is of the same order as the theorem we want to get. And that's exactly the limitation of why we only get square root of t: this value must be of the same order as this value, the t over delta.
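The arithmetic being gestured at (constants c, c'' suppressed in the talk):

```latex
\delta = C\ell^{2}
  \;\Longrightarrow\;
  e^{-c\lambda\ell^{2}} = e^{-(c\lambda/C)\,\delta},
\qquad
(\eta')^{t/\delta} = e^{-c''\,t/\delta},
```

and the two exponents, delta and t/delta, balance exactly when delta = sqrt(t); this is why the method cannot give a bound better than e^{-c sqrt(t)}.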
This condition has to be satisfied. So now that's going to be our sigma-field: at the previous steps, not only has the origin not been in the giant component, but also all cells were dense at all the previous steps. And I'm going to use this to show that what we are going to see is itself a Poisson point process.
So that's the plan for the rest of the talk. Let's see how we use the density. Take a fixed cell of side ell, and we know that it's dense. So we know that there are many nodes in this cell. We don't know where they are or how they are distributed -- it doesn't matter. We just know that there are many of them.
So, more than a 1 minus epsilon fraction of the expected number of nodes. And I'm going to put inside this cell a fresh Poisson point process, with intensity a little smaller than this: (1 minus epsilon) squared times lambda. So I'm going to add this Poisson process to this cell and all the other cells inside this big box of size 2L, with intensity smaller than the minimum number of nodes we are seeing. So we have some nodes here; let's draw these as crosses, for example. And with high probability the number of crosses will be smaller than the number of blue points.
So we are able to match each cross to one blue point, and we can do this in an arbitrary manner. It doesn't matter which point we choose; it only matters that each blue point is matched to only one red cross.
And now we're going to let the blue points and the red crosses move as Brownian motions for delta steps. But note that since each pair is in the same cell, their distance must be smaller than the diameter of the cell, ell times square root of 2. But delta is of the order of C ell squared. So after delta steps, both nodes are able to traverse this distance, no matter how I couple them. And that's the main part, because if I take C sufficiently large, it's easy to show that we can couple the motions of these two nodes so that after delta steps they go to the same location with high probability.
So we can say that each red cross and its partner move to the same location with probability 1 minus epsilon, by setting C sufficiently large with respect to epsilon. And once you have this, you know that the red crosses that couple with their partners form themselves a Poisson point process, by thinning this one with probability 1 minus epsilon. So it's a new Poisson point process, with intensity (1 minus epsilon) cubed times lambda.
>>: Can I just ask a question. Why -- do you need that all of them are coupled, just --
>> Alexandre Stauffer: I don't need all of them to be coupled. All the red crosses, not all the blue.
>>: Right. But you don't need all the red crosses to be coupled, or just some of them?
>> Alexandre Stauffer: Coupled to exactly the same location -- just some of them.
>>: So those that are not coupled are maybe special ones, you said [inaudible].
>> Alexandre Stauffer: No no.
>>: You can couple.
>> Alexandre Stauffer: Those --
>>: The trouble --
>> Alexandre Stauffer: Maybe let me explain the coupling; it may be easier.
>>: Yes.
>> Alexandre Stauffer: So the coupling is: you take a pair, and I'm going to choose a position for the red cross first. So say it was this position, and suppose the distance here is, say, z. So I'm going to give you a sub-density that shows how to move X to a position at distance z. The sub-density will be essentially the normal density that is usual for Brownian motion -- 1 over 2 pi delta times an exponential -- but instead of having z squared over 2 delta, I have z plus ell square root of 2, squared.
So instead of the usual Brownian density, where I had z I just have this extra term, ell square root of 2. This is, of course, smaller than the density of the Brownian motion, pointwise. And not only that: if you take the departure point of the blue partner, the density of it moving there is also larger than this sub-density, by the triangle inequality.
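To make the domination concrete (a numerical sketch in my own notation, not from the talk: delta is the elapsed time, ell the cell side, z the displacement chosen for the red cross): the sub-density evaluates the 2-D Brownian transition density at the inflated distance z + ell*sqrt(2), so it sits pointwise below the true density of both the red cross (at distance z) and any blue partner (at distance at most z + ell*sqrt(2), by the triangle inequality, since the density decreases in the distance).

```python
import math

def brownian_density(r, delta):
    """2-D Brownian transition density after time delta, at displacement r."""
    return math.exp(-r * r / (2 * delta)) / (2 * math.pi * delta)

def sub_density(z, delta, ell):
    """Common sub-density: the transition density at the inflated distance
    z + ell*sqrt(2) (z plus the cell diameter). Crucially, this does not
    depend on where the blue partner sits inside the cell."""
    return brownian_density(z + ell * math.sqrt(2), delta)

delta, ell = 100.0, 1.0
for z in [0.0, 0.5, 3.0, 10.0]:
    g = sub_density(z, delta, ell)
    # Dominated by the red cross's own transition density (distance z):
    assert g <= brownian_density(z, delta)
    # Any blue partner is within ell*sqrt(2) of the red cross, so its distance
    # to the target is below z + ell*sqrt(2); the density decreases in the
    # distance, so the blue partner's true density also dominates g:
    for k in range(10):
        zp = (z + ell * math.sqrt(2)) * k / 10.0
        assert g <= brownian_density(zp, delta)
print("sub-density dominated pointwise")
```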
>>: The two move independently?
>> Alexandre Stauffer: No, the main trick is that this sub-density does not depend on the location of the blue point.
>>: I guess I'm confused. If you have two red crosses, coupled, these two --
>>: Red and blue.
>>: Sorry. Like the event that one is coupled and the other is coupled, are these
independent?
>>: Yes.
>>: But it's not too much.
>>: It's not enough.
>>: Because they don't have independent locations.
>>: Yeah.
>>: Okay. [laughter].
>>: [Inaudible] where it doesn't [inaudible].
>>: And the distance -- the coupling between them. The distance between the red and
the blue.
>>: The event depends on the final location.
>> Alexandre Stauffer: Settled? Any more questions so far? Thanks. So that's how we do the coupling, and what we show, to get the 1 minus epsilon, is that integrating over z we get the 1 minus epsilon for some large enough C in the definition of delta.
So that gives the probability that they couple to the same location, and this forms a Poisson process with intensity (1 minus epsilon) cubed times lambda. But that's not the end of the story, because what we have now is a Poisson point process conditioned on the event that they couple -- but we didn't make the move yet. The move after delta steps must be made according to this sub-density, because of the conditioning on the coupling.
So what we have, inside this big box, is a Poisson point process of red crosses, and we are going to let them move according to that particular density.
And it's easy to see that that density would preserve the measure if the Poisson point process were on the whole plane. But it's not; it's only on the 2L by 2L box. So what we need to show, in the last step, is what happens after making the motion: some mass of intensity from here will go out, some mass here will come in, but there is no mass coming from outside here to the inside.
So the intensity will decrease a bit, but not by much, because this distance is very large: this distance is of order t, and the nodes are just moving of order square root of t.
So what we can show is that, being generous here, we can get another Poisson point process with intensity (1 minus epsilon) to the fourth times lambda. Right? And then we just set epsilon so that this is larger than lambda_c -- well, we assumed that lambda is strictly larger than lambda_c, so we can always set such an epsilon.
So by doing this, we know that the red crosses form a supercritical random geometric graph, so in particular we've proved the goal. And once we've proved the goal, we have the result immediately.
And that essentially concludes the proof. Let's put this here. And also the talk. Thanks.
[applause].
Questions?
>>: [Inaudible].
>> Alexandre Stauffer: The real one? For two dimensions, I'm pretty sure that the real answer matches the lower bound. We weren't able to prove it, but we believe that's the right answer.
>>: You should say how the bound is computed -- some probability [inaudible].
>> Alexandre Stauffer: So the lower bound comes from this probability. Here we ask for the probability that the origin belongs to the infinite cluster; instead, you can look at the probability that the origin is adjacent to any node -- in a Poisson process, how long until some ball hits the origin. This is how we get the lower bound.
So we get that the probability that the origin is isolated for t steps decays exponentially in t over log t.
>>: Do you think that the lemma -- the one you have there, next to the goal -- would actually be true with delta equal to C of t, but --
>> Alexandre Stauffer: My hope is that this should be true for delta of order log t, but we need to show better results on the density, because that's the limitation in our proof.
>>: But if what you were interested in was simply whether it has a neighbor, then you know this is the right answer?
>> Alexandre Stauffer: Then we know this is tight. We have an upper and a lower bound. It's known from the space coverage [inaudible] results, and we have a generalization for some cases. So we have an upper and lower bound for this particular problem.
>>: So you don't have a lower bound [inaudible].
>> Alexandre Stauffer: So for d bigger than 2, the lower bound from this setting is exponential in t, so the best we have is as good as the FKG inequality.
>>: The work that's -- it's known that for the [inaudible] probability e to the minus the volume of the Wiener sausage [phonetic] at time t [inaudible] union of balls up to time t; the volume is linear in t, like t over log t, because of the range of the random walk. That's just for coverage.
>> Alexandre Stauffer: Yes.
Any more questions? Thoughts? Solutions? Okay. Thank you again.
[applause]