>> Darko Kirovski: Well, it's a great pleasure for me today to introduce and to
welcome Professor Rehr from the Department of Physics at the University of
Washington. Professor Rehr has been in the field of physics for a long time;
there's a whole bunch of fields he has conquered there since, I think, his Ph.D.
at Cornell.
The way Professor Rehr came here is through an introduction on an aircraft. We
were just sitting next to each other. He was reviewing one of his papers, and I
was so excited by the work that he was doing. We started chatting, and that's
how we are welcoming Professor Rehr here.
I'm very excited to learn about the topic that you'll be presenting today.
>> J.J. Rehr: Thank you.
>> Darko Kirovski: Okay.
>> J.J. Rehr: All right. Well, thank you, Darko. It was a great pleasure for me
actually to make that introduction on the aircraft. That made the trip from
Copenhagen back to Seattle much more enjoyable.
So I'm really happy to be here and to meet all of you, including those who are
possibly looking in virtually as well. What I want to talk about today are some
approaches that I've been developing with my colleagues. They're realtime
approaches, and the idea is to do what quantum mechanics does much more
efficiently and to exploit the advances of high-performance computing to do it.
So this is just a subset of topics that we've been working on, but I think it will
give you a flavor of what we've accomplished.
So the goal is to treat the quantum mechanics of very large systems, but without
the traditional approach of solving the Schrödinger equation and getting
eigenstates. This may seem farfetched, but actually it can be done with fancy
new tricks, and our tricks are to use realtime approaches. They go under the
buzzwords of time-dependent density functional theory and an analogous
technique called density functional theory molecular dynamics, which is a
first-principles approach to molecular dynamics without any adjustable
parameters. The potentials aren't canned; they're predicted from first principles.
So the talk has two parts. I'm going to first talk about the application of this
method to nano-scale systems, on the order of a few hundred atoms, using this
so-called DFTMD, and these calculations can now be checked against
high-precision experiments using x-rays, that is, x-ray absorption spectroscopy.
So I'm going to show an example where you can use this experimental
technique to check the calculations.
The second is a distantly related topic, and that involves photonic molecules:
large molecules which are optically active and which are going to be used in the
next generation of electronic/photonic devices. And please feel free to ask
questions throughout the talk if something is unclear.
So this is our motto. This is our hero. He was very much interested in these
time-dependent techniques and in fact invented some of them, especially the
propagator technique that some of our work is based on. Okay. So this is the
first topic. It has to do with how real catalysts, say those used in petroleum
refining, work. These catalysts typically have a small metal particle like platinum;
there are actually lots of these, dispersed, and this just shows one.
The surface is called gamma-alumina. It's a kind of inert surface that the particle
sits on, and this is actually a time-lapse simulation which I'll discuss in some
detail. It's done at several different temperatures, and the work is published and
forthcoming in Physical Review B; I have some reprints or preprints with me if
anyone is interested.
This is work done with my post-doc Fernando Vila, who is actually here in the
audience today, the principal and first author on this paper.
The second topic, dealing with photonic molecules, is how to describe how these
very large polymeric systems behave when you apply a time-dependent field.
These fields are used to control the activity of these molecules, and the response
is nonlinear. In order to be a good photonic molecule they have to have good
nonlinear properties, so that's what we calculate: the second harmonic
generation and DC rectification.
So I'm going to begin with this description of the behavior of catalytic particles.
As I said, this is a simulated rendering of what such a structure looks like. The
big purple atoms are of course platinum atoms. The rusty ones are those which
are in contact with an oxygen atom on the surface (those are the red ones), and
because of this close contact there's a little bit of charge transfer: these
platinum atoms become oxidized, okay. And that change between metallic-like
platinum and oxidized platinum is really responsible for the very interesting
dynamics and chemistry that gives us a clue as to how they work, okay.
And the others are aluminum atoms, which don't play much of a role. So the
bonding is from the platinum to the oxygens; the aluminums sort of stay out of
the way. And the surface is bouncing up and down, depending on the
temperature, so this is sort of like a mattress of magic fingers, in which you
adjust the temperature by some molecular dynamics technique that controls the
kinetic energy. So this is like a heat bath, and that affects the behavior.
Anyway, these platinum clusters, when you looked at them with x-rays, had a lot
of very unusual properties. Negative thermal expansion: you heat these up and,
unlike bulk platinum, the bonds tend to contract. It's very strange. Also, there is
very large disorder, unusually large compared to just vibrational disorder. The
thing is almost like it's melted.
Anyway, so our goal was to understand these properties. We got a telephone
call one day from a colleague who had been making these. They made a lot of
these materials, then probed them with x-rays and discovered this, and they
hadn't a clue how to explain it. Well, we had various clues, but they were all
wrong, and it took us a while to figure out how to address them. What we came
up with is a real alternative to the conventional way of looking at these
structures. The conventional way is to get some structure out of some database,
find the equilibrium spots on the surface where the particle can sit, and then
kind of look at it, and at several configurations like that.
We wasted a lot of time doing that. It didn't work. So the next approach is
actually to follow the motion of this in realtime. And notice it's a fairly big system:
about 150 atoms, about a dozen atoms of platinum plus this big surface, in order
to get an accurate quantum mechanical simulation.
Okay. Well, these are the experiments. I have three graphs here. The one on
the top shows the behavior of the platinum-platinum distance versus
temperature. There are a lot of curves on this, but the black curve is what
ordinary platinum does. You see, it's this kind of slow, steady increase in bond
distance with temperature. Normal. Linear. Okay? And this is platinum on
carbon black, you know, similar behavior. Okay?
But when you put it on alumina, this particular substrate, look: as you heat it up
the distance gets shorter, and you see it whether it's in a helium or a hydrogen
atmosphere. These conditions are kind of typical of what's actually done in the
industrial world, I'm told.
So what's going on? It's not behaving like ordinary platinum metal. This is the
disorder, sigma squared, the mean-square variation in bond distance; the RMS
variation is basically a tenth of an angstrom, more or less. Okay. Platinum foil
slowly increases from a very low value, but in these materials look at how big it
is. There's this huge kind of intrinsic disorder, very much larger than that in
metals. And when they measure these, they measure with x-ray absorption.
That's how they get the bond distance or the disorder, and that can be obtained
by looking at this so-called x-ray absorption spectrum. They use very hard
x-rays to study these materials. The energy of the x-ray is about 10,000 electron
volts or so, and it comes from kicking out an electron deep in the material and
then looking at its response. So the premise is that if you see a spectrum like
this, it's the unique signature of what the material is made of, and you can find
out what that material -- what that structure is by doing an inverse problem, as
you may all be familiar with. That's what our software is good at: solving this
inverse problem.
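As a cartoon of that inverse problem: in this kind of spectroscopy the bond distance R sets the frequency of the oscillations in the signal, so fitting a model to the measurement recovers R. A toy sketch (the damped-sine model form and all numbers here are illustrative stand-ins; real analysis fits full FEFF calculations with many parameters):

```python
import numpy as np

def model_chi(k, R, sigma2=0.003):
    """Toy EXAFS-like signal: a damped sine whose frequency encodes the
    bond distance R (angstroms), chi(k) ~ sin(2kR) exp(-2 sigma2 k^2)/(k R^2)."""
    return np.sin(2 * k * R) * np.exp(-2 * sigma2 * k**2) / (k * R**2)

def fit_distance(k, chi_obs, R_grid):
    """Inverse problem by brute force: pick the R whose model best matches."""
    residuals = [np.sum((model_chi(k, R) - chi_obs) ** 2) for R in R_grid]
    return R_grid[int(np.argmin(residuals))]

k = np.linspace(3.0, 12.0, 200)      # photoelectron wavenumber (1/angstrom)
chi_obs = model_chi(k, 2.77)         # synthetic "measurement" for a 2.77 A bond
R_fit = fit_distance(k, chi_obs, np.arange(2.5, 3.0, 0.001))
```

The fit recovers the distance that generated the synthetic signal; real software does the same matching against first-principles spectra rather than a closed-form model.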
Okay. Anyway, these are the surprising anomalies that we want to understand:
negative thermal expansion, very anomalous disorder, and this red shift -- the
XAS moves down when you increase the temperature, and the structure
changes dramatically. So that's telling us something's going on.
>>: [inaudible].
>> J.J. Rehr: Yeah, sure.
>>: [inaudible] the absorption being greater than one, does this mean that --
what's the --
>> J.J. Rehr: Okay. If this was an atom, just plain atomic platinum, the
absorption would rise slowly up to one. One in this -- these are arbitrary units
because it's normalized, but the one refers to the absorption of a plain platinum
atom. Okay. The fact that it's bigger than one, that you get that enhancement, is
a reflection of the geometry in the material and the interaction between
neighboring platinums, and it comes from some unoccupied states right at
threshold. So what we're seeing here, as I'll explain later, is that this shift is a
reflection of the oxidation that's going on in the material due to the interaction
with the substrate. And that's changing with temperature. So that's what it's
telling us. It's really telling you about the electronic structure.
Okay. So how do we do it? What we've used is a technique called density
functional theory, and this is a very efficient way of finding out the interactions at
a first-principles quantum mechanical level. Usually this is done in equilibrium at
zero temperature. Okay? What DFTMD does is attach this density functional
theory to molecular dynamics, so at every time step we do another DFT
calculation, and we do it at finite temperature by imposing a fixed kinetic energy
for the whole ensemble. So it's definitely not an equilibrium calculation, okay;
this is dynamic behavior. And this way we can simulate liquids and get a much
more realistic description of what goes on in materials, because the atoms are
moving around all the time.
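A minimal sketch of that loop, with two loud assumptions: a toy harmonic force routine stands in for the per-step self-consistent DFT call, and plain velocity rescaling imposes the fixed ensemble kinetic energy:

```python
import numpy as np

KB = 8.617333e-5  # Boltzmann constant, eV/K

def dft_forces(positions):
    # Placeholder: in real DFTMD this is a full self-consistent DFT
    # calculation at every time step. Here, a toy harmonic restoring force.
    return -1.0 * positions

def rescale_to_temperature(velocities, masses, target_T):
    # Fixed-kinetic-energy constraint: rescale so the ensemble kinetic
    # energy matches (3/2) N kB T, as in the finite-temperature runs.
    ke = 0.5 * np.sum(masses[:, None] * velocities**2)
    target_ke = 1.5 * len(masses) * KB * target_T
    return velocities * np.sqrt(target_ke / ke)

def dftmd(positions, velocities, masses, target_T, dt=3.0, n_steps=2500):
    """Velocity-Verlet MD with a DFT-like force call each step (dt in fs)."""
    forces = dft_forces(positions)
    for _ in range(n_steps):
        velocities += 0.5 * dt * forces / masses[:, None]
        positions += dt * velocities
        forces = dft_forces(positions)
        velocities += 0.5 * dt * forces / masses[:, None]
        velocities = rescale_to_temperature(velocities, masses, target_T)
    return positions, velocities
```

Swapping the placeholder force for an electronic-structure solve is exactly what makes each step so expensive in practice.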
So was that -- okay. Anyway, before we did this, we tried all that equilibrium
stuff and it didn't explain anything; we couldn't get a reasonable description.
Anyway, lo and behold -- and these are Fernando Vila's calculations -- he did
2,500 of these DFT calculations at three-femtosecond steps. Typical vibration
periods are on the order of 10 or 20 femtoseconds, so you can actually see the
individual motions of all of the atoms in this material and the substrate on this
time scale.
Look what happened. This is the mean near-neighbor distance. It fluctuates like
crazy as you go along. However, when we did the simulations at low
temperature and high temperature -- red means hot here, blue means cold -- lo
and behold, the calculations, on average, clearly demonstrated this negative
thermal expansion. So it verified that first observation, but it was on the average,
and there were huge fluctuations about it. Blue is 165 kelvin, red is 573, and 573
is kind of typical of the temperature in an active catalyst. It may even get hotter.
But these calculations are expensive. This was done in 10 to the fourth CPU
hours. That's, as far as I know, about the number of hours in a year, but if you
have a lot of processors you can do them overnight. So I think this is what it
took [inaudible].
>>: [inaudible].
>> J.J. Rehr: Okay.
>>: [inaudible]. Computer science or --
>> J.J. Rehr: No, it was run on a big Department of Energy supercomputer in
Berkeley called NERSC, the National Energy Research Scientific Computing
Center. And we did it for just a 10-atom cluster. So this was a clue we were on
the right track.
Now, notice that the center of gravity of this near-neighbor distance has a very
interesting oscillation, too. It's a reflection of the fact that this whole platinum
cluster is kind of breathing in and out. I'll get back to that as I go along.
Okay. Anyway, this summarizes the computational details. There's a computer
code by the name of VASP, and it's been optimized for this DFTMD. It can
handle maybe 1,000 atoms, something like that. And each of these calculations
took, what, a couple of hours, I guess, each time step.
>>: The time step, no.
>> J.J. Rehr: Each femtosecond step took a couple CPU hours and then you
have 2500 of them.
>>: Something like that.
>> J.J. Rehr: It takes a lot.
We also simulated the x-ray absorption spectra, and that's with a code by the
name of FEFF that's developed by our group at the University of Washington; I'll
describe that later. The experiments were actually done with a distribution of
cluster sizes between two and 25 atoms. Because this took 10 to the fourth
hours, we took a typical one, 10 atoms, just to make sure that it works.
Obviously we'd like to simulate the whole ensemble, but this takes a lot of
computer resources.
Okay. Well, this was just a check. Adding hydrogen to the calculations would
take even longer, but what Fernando did was first just add a few hydrogen
atoms and see what happens. Basically the hydrogen atoms weaken the bonds
because they steal some anti-bonding charge, I guess, and so the bond
obviously gets a little longer. So in the end, we didn't put the hydrogen atoms in;
that would just be extra work. But we assume that would naturally make the
calculations a little bit different. It increases the bond lengths, but we expect that
otherwise the change is only quantitative.
Okay. But here's the negative thermal expansion.
This shows the distribution of all those near-neighbor distances at the two
different temperatures. At the low temperature you get a fairly sharply peaked
distribution, and at high temperature it's more spread out. Notice that the
centroid, though, shifts down as you heat it up, just as expected, and for these
simulations we get a negative expansion of about a hundredth of an angstrom.
It's a little bit smaller -- well, notably smaller -- than the experimental value, but
it's of the same order of magnitude.
And this is only for a 10-atom cluster, while the experiment is for a whole
distribution. So at least we're in the right ballpark.
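The bookkeeping behind those two observations, the centroid shift and the disorder sigma squared, is just the mean and variance of the sampled distances. A sketch with synthetic Gaussian samples standing in for the simulated Pt-Pt distances (the numbers mimic the trend described above, a broader hot distribution whose centroid sits about 0.01 angstrom lower):

```python
import numpy as np

def bond_statistics(distances):
    """Centroid and mean-square disorder sigma^2 of near-neighbor distances."""
    d = np.asarray(distances)
    return d.mean(), d.var()  # (centroid, sigma^2 as measured by XAS)

# Synthetic stand-ins for sampled Pt-Pt distances (angstroms), not real data.
rng = np.random.default_rng(42)
cold = rng.normal(loc=2.76, scale=0.05, size=5000)  # "165 K" ensemble
hot = rng.normal(loc=2.75, scale=0.10, size=5000)   # "573 K" ensemble

c_cold, s2_cold = bond_statistics(cold)
c_hot, s2_hot = bond_statistics(hot)
```

With these inputs the hot centroid comes out lower (negative thermal expansion) and the hot sigma squared larger (the anomalous disorder), which is exactly what is read off the simulated distributions.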
>>: [inaudible].
>> J.J. Rehr: Yeah, sure.
>>: [inaudible] long tail in the experiments.
>> J.J. Rehr: Oh, it's because this is the whole pair distribution function: there
are distances from one platinum to the second neighbor and the third neighbor,
and they're really spread out. So this is the whole pair distribution function from
one platinum to everything else. Most of what you see is just the near neighbors,
about 12 near neighbors to every platinum, and then some neighbors are further
out. So in general, the pair distribution function averages over the whole cluster,
not just the near neighbors.
Okay. And likewise, look: the disorder is extremely large at low temperatures
and even larger at high temperatures. This is the same order of magnitude as
what's observed. It's a little bit smaller, but again, qualitatively in the right
direction.
So what's going on? Here's our physical interpretation, which we think is really
kind of interesting. To get an idea, look at the center of mass; I plotted the
coordinates of the center of mass. The surface, say, is the floor here, and going
away from the surface is the Z direction, okay. Nothing happens in the Z
direction; the mean position of the little platinum cluster is fixed.
But what happens is that there's a lot of variation in the X and Y directions. So
what's happening is that the center of this cluster is kind of planted somewhere,
stuck to the surface by these platinum-oxygen bonds, and then it can kind of
move around, as you would if you were stuck there but on a bouncing floor: all
you can do is move around.
That kind of motion is called librational motion. It's actually a classical term. The
period is about two picoseconds, and the amplitude is about an angstrom, and
this is way, way bigger than the kind of variation that you get from vibrations,
which is hundredths of an angstrom.
So this motion, the interpretation is Brownian motion. Brownian motion was one
of Einstein's explanations in 1905; in fact, he got more citations for his Brownian
motion paper than for relativity. The explanation is that the origin of this
Brownian motion is just statistical mechanics: there's something called the
equipartition theorem, which says that the kinetic energy of a Brownian particle
is proportional to temperature. This is a recovery of that, but for nano-particles.
The smaller they are, the bigger the Brownian motion. And this kind of dynamic
motion turns out to be very large and really changes the dynamics a lot,
because now the whole structure is vibrating back and forth, and when it does
that, it can change the bonding of the platinums to the surface. So I'd like to
illustrate that.
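The size scaling follows directly from equipartition: (1/2) M ⟨v²⟩ = (3/2) kB T for the cluster's center of mass, so the RMS thermal speed grows as the mass shrinks. A quick sketch with illustrative numbers for a hypothetical 10-atom platinum cluster at 573 K:

```python
import math

KB = 1.380649e-23     # Boltzmann constant, J/K
AMU = 1.66053907e-27  # atomic mass unit, kg

def com_thermal_speed(n_atoms, atomic_mass_amu, temperature):
    """RMS center-of-mass speed from equipartition: (3/2) kB T = (1/2) M <v^2>."""
    M = n_atoms * atomic_mass_amu * AMU
    return math.sqrt(3.0 * KB * temperature / M)

v10 = com_thermal_speed(10, 195.08, 573.0)     # small 10-atom Pt cluster
v1000 = com_thermal_speed(1000, 195.08, 573.0)  # 100x heavier cluster
```

A cluster 100 times heavier has a center-of-mass speed 10 times smaller, which is why this Brownian-like jiggling only becomes dramatic at the nano-scale.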
So I have a video; this is again made by Fernando. It simulates about five
picoseconds of the trajectory, and I want you to focus on this platinum atom,
because what we'll see as it goes along is that the bonding changes. Whenever
it gets in contact with an oxygen, there's a little bit of charge transfer, and this
fluctuation happens all the time. So what's happening is that the electronic
structure is varying; you're sometimes breaking bonds or creating them. And
notice, look at the big, wide variation of all the atoms, you know. There's a slow
back and forth and just a lot of dynamic motion.
Sometimes these clusters can actually pick up an oxygen atom for a little while,
and so the prediction is that these will walk across the surface on very long time
scales. This is thought to be one of the mechanisms for degrading catalysts,
because when they glom together they're less effective. They're most effective
when they're mono-dispersed, so called. Yes?
>>: On the previous slide where you had the three plots.
>> J.J. Rehr: Yes.
>>: I noticed a variation I guess in that X direction as opposed to the Y direction.
>> J.J. Rehr: Right.
>>: Is that just based on the arrangements or is that --
>> J.J. Rehr: Actually, there's a kind of -- this gamma-alumina surface has kind
of walls, or troughs, and it's easier to go along these troughs. Is that what you
would say?
>>: [inaudible].
>>: So basically those troughs then [inaudible].
>>: Okay. That's fine.
>> J.J. Rehr: Now, the original description of librational motion was applied to
the variation of the center of mass of the moon. And here you can see it, vastly,
yes, slowed down, but it's very similar. It's a slow swaying motion caused by
these variations of tidal motion and other gravitational interactions. So this is
very similar.
This had been studied for small molecules on a surface, and they called it
fluxional behavior. But notice what happens: as these clusters kind of move
about dynamically and atoms get close to each other, they can create bonds or
not. So there's a kind of transience to the bonding, and that's really important
for catalysis, because the more activity there is, the more degrees of freedom
you have for catalytic reactions. This kind of realtime simulation, I think, is the
first time that this was really understood, at least according to our colleagues in
the catalysis industry.
This librational motion you would never see in an equilibrium study, because
you freeze it out. Anyway, so we have librational motion. The cluster is really
stuck to the surface here, but at high temperatures it can actually walk. And the
simulations show both.
This is the so-called footprint. Again, it's time-lapse, and this shows the
trajectory at low and high temperatures. Notice that at high temperatures the
cluster can walk. This footprint sort of shows what's seen on the surface. Notice
that the footprint covers a very large part of the surface. So the surface itself is
pretty inert; it's just the source of a bunch of oxygen bonds. You know, its
arrangement is not that critical.
Okay. How do we understand the x-ray stuff? I don't know how many of you are
familiar with x-ray absorption theory, but it's one of the fields in which the
University of Washington is preeminent. Anyway, this is a typical x-ray
absorption experiment. What happens -- this is copper -- is that you take a hard
x-ray and you shoot it at this material, and when the energy is big enough to
kick out an electron from an inner orbital, an inner energy level of an atom, the
deepest ones, then you get a very sharp increase in absorption, and then you
see these very funny wiggles. The absorption varies a lot. If this were a free
atom it would just rise to one and then go flat. The fact that it doesn't means
that it's a material.
Anyway, this is what you're doing. You're sending in a photon, which is the
squiggly line, an electromagnetic wave, and kicking out an electron, okay.
And there are two regions: the near-edge region, which is called XANES, x-ray
absorption near-edge structure, and then the extended x-ray absorption fine
structure, which goes on over a very large energy range. We want to
concentrate on this near-edge region; that was the region in platinum that was
interesting. Anyway, understanding this whole process is another Einstein Nobel
Prize: the photoelectric effect. It comes about because light is absorbed in
discrete units; when this photon is absorbed, an electron gets kicked out and
becomes a photoelectron. So that's what we're studying. And the thing is,
although Einstein got the prize a long time ago, it was never really fully
understood, because although Einstein studied photoemission, he completely
ignored -- I shouldn't say ignored; it wasn't relevant then -- the interaction
between the electron and the ion that's produced. These are of opposite sign,
so they interact, and this affects the absorption a lot.
Anyway, the neat thing is that new quantum mechanics and theoretical ideas
can explain this and can be used now to analyze materials. This is actually our
work, and it led to a Reviews of Modern Physics paper with a colleague at Los
Alamos. This shows what a photoelectron does: an electron is a wave in
quantum mechanics, and when it's emitted, it can hit other atoms and then
scatter, so the physics of photoemission is a scattering problem. It's like
throwing a rock into water and then watching the waves bounce around. That's
what's responsible for that wiggly behavior.
The advance in understanding this behavior is not to use wave functions, the
[inaudible] equation, but instead a Green's function technique, or a propagator.
If we were using conventional quantum mechanics, there's something called the
[inaudible] golden rule, which says that the absorption coefficient is given by a
sum over all possible transitions that you can make between an initial and a
final state. So you've got to calculate all these states. It's a simple formula, but
very complicated because you have this sum, and there are millions of states --
too many. Okay. Here's the deal with Green's functions. Instead of doing that
sum, if you have the Green's function, that sum is implicit in the Green's
function. Okay? There's a fundamental theorem that says that the sum over
final states, with this delta function that conserves energy, is equivalent to
knowing the Green's function going from one point to another, and so this
complicated sum can be replaced by a single integral over the Green's function.
One term instead of zillions of terms. Okay. Hugely more efficient, except this is
a non-local operator, so that's the price. But it's the reason why this technique
has been so successful. Does it work? You bet. Here's experiment versus
theory for aluminum, and notice that the theory covers a huge energy range.
We've tested this over thousands of materials, in fact everything in the periodic
table, over the whole energy range, and it works like a charm.
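The equivalence just stated can be checked numerically: the golden-rule sum over final states, with the delta function broadened into a Lorentzian of width eta, equals -(1/pi) times the imaginary part of a single Green's function matrix element. A small sketch, with a random Hermitian matrix standing in for the Hamiltonian and a fixed initial vector standing in for the dipole-weighted initial state:

```python
import numpy as np

def golden_rule_sum(H, psi0, E, eta=0.05):
    """Sum over eigenstates: sum_f |<f|psi0>|^2 * Lorentzian(E - E_f)."""
    evals, evecs = np.linalg.eigh(H)
    overlaps = np.abs(evecs.conj().T @ psi0) ** 2
    lorentz = (eta / np.pi) / ((E - evals) ** 2 + eta ** 2)
    return np.sum(overlaps * lorentz)

def greens_function(H, psi0, E, eta=0.05):
    """Same quantity without eigenstates: -(1/pi) Im <psi0| G(E) |psi0>,
    where G(E) = ((E + i*eta) I - H)^(-1)."""
    n = H.shape[0]
    G = np.linalg.inv((E + 1j * eta) * np.eye(n) - H)
    return -np.imag(psi0.conj() @ G @ psi0) / np.pi

rng = np.random.default_rng(1)
A = rng.normal(size=(8, 8))
H = (A + A.T) / 2                  # toy Hermitian "Hamiltonian"
psi0 = np.zeros(8)
psi0[0] = 1.0                      # stand-in for the initial state
```

Note the design point: the Green's function route never diagonalizes anything; the eigensolve here exists only to verify the identity against the brute-force sum.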
This is just another example, and our code is called FEFF. It's widely used, and
it basically explains the experiment. The data is, I think, the dots, and the
calculation is the solid line; these others are some other calculations that don't
work so well. But anyway, this is our code.
The other neat thing about a Green's function method is that it trivially
parallelizes, because if you look at this formula, the energy is a parameter. You
don't need eigenstates, and you don't have to solve a secular equation. If you
want a spectrum, and your spectrum is discretized at, say, a hundred different
energies, or a thousand, you can give each processor one or more energies.
And if you have a hundred processors and a hundred points, you can push the
button and get the whole spectrum at once. So that's the other advantage of
these computational techniques: they're designed to be highly parallelizable,
and you can just do them all at once.
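The pattern is embarrassingly parallel: each energy point is an independent task. A sketch of the idea, with a cheap stand-in for the per-energy Green's function evaluation and a thread pool in place of the message-passing setup the production code actually uses:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def absorption_at(E):
    # Stand-in for one Green's-function evaluation at a single energy point;
    # in the real code this is the expensive part, independent for each E.
    return 1.0 / ((E - 5.0) ** 2 + 0.25)

def spectrum_parallel(energies, n_workers=4):
    """Hand each worker one or more energies; the whole spectrum
    comes back at once, exactly as described above."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return np.array(list(pool.map(absorption_at, energies)))
```

With one worker per energy point the wall-clock time is roughly that of a single evaluation; an MPI version distributes the points across nodes the same way.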
So we used a bunch of fancy techniques, these message-passing techniques, to
parallelize the code, which makes it really efficient. And you can do in a few
minutes what would ordinarily take a day or so. Okay. Anyway, getting back to
the catalyst, how does it do? This is the experiment. It has this funny red shift;
as you go to high temperatures, it crosses over. This is the theory for a 10-atom
cluster and some subset of the configurations. But look, all the features are
reproduced by the calculation. Again, Fernando did this. Okay.
Well, the interesting thing is that there are huge fluctuations throughout the time
series, and it was only by doing the configurational average that we were able to
get this to work out. If you use a few sample equilibrium geometries, you get
garbage, because that doesn't adequately sample the variation in structures
that we think the full dynamics does. So that's the upshot: if you're doing
finite-temperature experiments, then you have to have a statistical ensemble
appropriate to that temperature.
Okay. Anyway, the features are there. It's semi-quantitative, because we only
have one cluster, the 10-atom cluster, and we need to look at more, obviously.
Okay. And how about the red shift? The red shift comes from these fluctuations
due to the transient bonding. At high temperature you really change the charge
transfer to the surface. It's sort of like the melting of ice: when you melt ice --
you know, ice is lighter than water -- you get a more compact structure with
shorter bonds. It's very analogous to this.
Anyway, the Fermi level, or the threshold, just decreases, and that explains this
shift. So it's the transience in the bonding. You cannot do it with a static
equilibrium structure; you really need the transient bonding to explain this. So
anyway, the conclusion is that this dynamic structure is a general feature of
these very small clusters, because the effect goes like the temperature divided
by the mass, and when the mass is small you get very large effects. And it has
fluctuating bonding and a lot more phase space for reactions. So our catalyst
collaborators think this could be an important advance. We don't know yet, but
it's certainly a new observation.
Okay. Turning to the nonlinear optical response, this is work done by a grad
student of mine who's just finishing, Yoshi Takemoto [phonetic]; he's really a
brilliant computer scientist/physicist. Anyway, we're interested in these large
photonic molecules. They're optically active, and what's important in photonics
is their nonlinear properties: put on an electric field and you don't just get linear
response, you get nonlinear response. So the more nonlinear it is, the more
effective it is as a switch or something like that.
Now, this is a big NSF grant that they have at the University of Washington in
the chemistry department. Okay. So what's our method? Again, there are these
conventional quantum mechanical methods: you have to sum over all the states
and use perturbation theory, and it's very complicated and takes a long time.
Too long; we could never do this with ordinary methods. We use something
called realtime time-dependent density functional theory, okay, and it's similar to
the DFTMD: we have to solve a whole bunch of quantum mechanical time
steps, and at each time step we find a wave function.
But this is a time-dependent wave function, and we have a Hamiltonian which
describes the dynamics -- this is kinetic energy and potential energy -- and these
terms describe the interaction with the external electric field. Anyway, this is
much more efficient than these eigenstate methods. So what do we do? Okay.
Anyway, we put in a term that has an electric field, and the way an electric field
works is through a so-called dipole interaction, which is proportional to the
electric field and the displacement of an atom from equilibrium.
When you displace a charge, you create a dipole moment; it's called a dipole
interaction. Okay. Anyway, at each time step we calculate the whole electron
density by evolving this. So your system doesn't have my fonts. How about
that? Anyway, this is TeXPoint. This should have said psi at T plus delta T
is -- okay. Anyway, the point is you can evolve this by calculating the time
evolution operator using a so-called Crank-Nicolson method. It's unitary, and it
calculates the wave function at T plus delta T from the wave function at time T,
so there's a simple way of evolving it. You do it step by step and follow it along.
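A minimal sketch of one such step on a 1D grid, under stated assumptions: a toy harmonic potential plus the dipole term -E*x, atomic units, and illustrative grid parameters. The Crank-Nicolson form psi(t+dt) = (1 + iH dt/2)^(-1) (1 - iH dt/2) psi(t) is exactly unitary for Hermitian H, which is why the norm is preserved step by step:

```python
import numpy as np

def hamiltonian(n, dx, potential):
    """Finite-difference H = -(1/2) d^2/dx^2 + V on a 1D grid (atomic units)."""
    main = 1.0 / dx**2 + potential
    off = -0.5 / dx**2 * np.ones(n - 1)
    return np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

def crank_nicolson_step(psi, H, dt):
    """psi(t+dt) = (1 + i H dt/2)^(-1) (1 - i H dt/2) psi(t); unitary."""
    I = np.eye(len(psi))
    return np.linalg.solve(I + 0.5j * dt * H, (I - 0.5j * dt * H) @ psi)

# Toy setup: harmonic well plus dipole coupling -E_field * x.
n, L = 200, 20.0
x = np.linspace(-L / 2, L / 2, n)
dx = x[1] - x[0]
E_field = 0.01
H = hamiltonian(n, dx, 0.5 * x**2 - E_field * x)
psi = np.exp(-x**2 / 2).astype(complex)
psi /= np.linalg.norm(psi)          # normalized initial state
psi = crank_nicolson_step(psi, H, dt=0.01)
```

Repeating the step while varying E_field in time gives the driven evolution described in the talk; real TDDFT updates the Hamiltonian self-consistently from the evolving density as well.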
So sorry about the font problem. Weird. Okay. So what happens then, as you
go along, is that the electrons are sloshing back and forth trying to follow this
applied field. Of course they get out of step, and we get this nonlinear response.
So we calculate the polarization. The polarization is the term in Maxwell's
equations -- maybe you know, D, the displacement, is E plus P, and P is the
polarization of the system -- so that oscillates, and if you Fourier transform it,
you get the dielectric response.
Okay. Anyway, to get nonlinear behavior, what we do is apply a
quasi-monochromatic field -- a Gaussian envelope of a fixed-frequency field. We
do that to keep the perturbation kind of short in time, and we then apply various
strengths of the electric field, several different strengths, and this is the
response. Notice how nonlinear it is when you look at these three calculations.
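The TDDFT polarization requires the full machinery above, but the signature being described, a component at twice the drive frequency whose weight grows quadratically with field strength, can be sketched with a classical weakly anharmonic oscillator driven by the same kind of Gaussian-envelope pulse (all parameters illustrative, not from the talk):

```python
import numpy as np

def response(E0, omega=1.0, beta=1.0, t_max=200.0, dt=0.01):
    """Drive x'' + x + beta*x^2 = E(t): a quasi-monochromatic
    Gaussian-envelope pulse at frequency omega, field strength E0."""
    t = np.arange(0.0, t_max, dt)
    field = E0 * np.exp(-((t - 50.0) / 15.0) ** 2) * np.cos(omega * t)
    x, v = 0.0, 0.0
    xs = np.empty_like(t)
    for i, f in enumerate(field):
        v += (-x - beta * x * x + f) * dt  # symplectic Euler step
        x += v * dt
        xs[i] = x
    return t, xs

def harmonic_weight(t, xs, freq, half_width=0.2):
    """Spectral weight of the response near a given angular frequency."""
    dt = t[1] - t[0]
    spec = np.abs(np.fft.rfft(xs * np.hanning(len(xs))))  # Hann vs. leakage
    freqs = 2 * np.pi * np.fft.rfftfreq(len(t), d=dt)
    band = (freqs > freq - half_width) & (freqs < freq + half_width)
    return spec[band].max()
```

Doubling E0 roughly doubles the weight near omega (linear response) but roughly quadruples it near 2*omega, which is the second-harmonic-generation fingerprint the several-field-strength runs are designed to expose.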
So the applied field is just three different levels of E, the electric field, and this is
the nonlinear response. In this way, we directly calculate the response. We
tested this on a simple molecule called PNA -- I think I showed it -- just for test
purposes, because there were a large number of other quantum mechanics
calculations out there. This is the list; there's a whole bunch of names for them,
and there are some experiments. But look, they all more or less agree. Our
method is the red one. And what's this? This is Hartree-Fock; it's a terrible
approach. This is way better than some methods.
Anyway, we get pretty good results and very good agreement with other codes.
This is the so-called beta, the second harmonic term. So it works.
But the interesting thing for the photonics people is that it also works on real
photonic molecules. This one is called FTC. It's a very long, complicated organic
material. This is what it looks like -- at least one part; there are several other
chains. And the interesting thing is that we agreed pretty well with the behavior
in the dominant, long-wavelength regime. This part is off, maybe because of
other side chains and maybe the basis, but overall the method is the best
available yet to calculate these properties from first principles.
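The way the speaker describes getting the non-linear coefficients -- running the same calculation at several field strengths and comparing the responses -- amounts to fitting P(E) = alpha*E + (1/2)*beta*E^2 to the computed data. Here is a minimal sketch with made-up numbers; the real extraction works on the full time-dependent signal, and the function name and values below are illustrative, not from the actual codes.

```python
import numpy as np

def extract_hyperpolarizability(strengths, responses):
    """Least-squares fit of P(E) = alpha*E + 0.5*beta*E**2 (through the
    origin) to responses computed at several field strengths E; returns
    (alpha, beta).  For a purely linear system beta comes out ~0."""
    E = np.asarray(strengths, dtype=float)
    P = np.asarray(responses, dtype=float)
    A = np.column_stack([E, 0.5 * E**2])   # design matrix: columns E, E^2/2
    coeffs, *_ = np.linalg.lstsq(A, P, rcond=None)
    return coeffs[0], coeffs[1]

# Toy data mimicking three runs at increasing field strength
E_levels = [0.001, 0.002, 0.004]
alpha_true, beta_true = 10.0, 500.0
P_levels = [alpha_true * E + 0.5 * beta_true * E**2 for E in E_levels]
alpha, beta = extract_hyperpolarizability(E_levels, P_levels)
```

With three field strengths and two coefficients the fit is overdetermined, which is one reason to run "several different strengths" rather than just two.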
Okay. So some conclusions. We have some novel, very efficient approaches
that describe the quantum behavior of very large scale systems. For the
nano-scale catalyst, this dynamic structure turns out to be crucial, and it explains
all the experimental observations of these platinum nano-clusters. This librational
and Brownian-like motion we think is new, at least we haven't seen it before to
our knowledge, and it's likely to have a lot of ramifications, because it shows that
there's a big stochastic element in bonding and catalytic properties. With this
TDDFT we can explain the linear and non-linear response of photonic molecules.
And there are many other applications that I haven't shown, like to x-ray
absorption. I'd like to acknowledge my collaborators. This is our group at UW,
some external collaborators, and support was from DOE and NSF.
One of our main sponsors is what's called the DOE Computational Materials
Science Network. It's a group of investigators who use advanced computation in
combination with experiments using modern synchrotron x-ray
sources to understand nano-particles. And we were the lead of one of these
groups for several years.
And I think that's it. So thank you very much for your attention.
[applause].
>>: Is it possible to -- you have several structures with minor variations. Is it
possible [inaudible] these methods to get their consensus, some consensus from
[inaudible]?
>> J.J. Rehr: I think so. As you can see, what we have is a kind of statistical
ensemble here, and so the idea is to generate enough structures to capture the
variation of the ensemble. That's all you really need. You don't need all of them,
but you need to capture the first few moments. So in these cases it's the
statistics that are important, not having to do everything. So I hope that
answers the question; it's a statistical approach.
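The "capture the first few moments" criterion can be made concrete: grow the ensemble until the mean and variance of the quantity of interest stop changing. A rough sketch, with hypothetical bond-length samples standing in for the real structural ensemble; the tolerance and sample sizes are invented for illustration.

```python
import numpy as np

def moments_converged(samples, rel_tol=0.1):
    """Check whether the first two moments (mean, variance) of a growing
    ensemble have stabilized, by comparing the moments over the full
    sample with those over the first half."""
    x = np.asarray(samples, dtype=float)
    half = x[: len(x) // 2]
    for full_m, half_m in [(x.mean(), half.mean()), (x.var(), half.var())]:
        if abs(full_m - half_m) > rel_tol * max(abs(full_m), 1e-12):
            return False
    return True

# Hypothetical ensemble of Pt-Pt bond lengths (angstroms) from many snapshots
rng = np.random.default_rng(0)
bond_lengths = rng.normal(2.77, 0.05, size=20000)
ok = moments_converged(bond_lengths)
```

The point is the stopping rule, not the statistics library: you stop generating structures once the moments you care about no longer move.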
>>: Where's the line between where you can use this versus a more
expensive and --
>> J.J. Rehr: You mean like fancy quantum mechanics? Fancy quantum
mechanics works up to what, 10 atoms?
>>: You can't really fancy.
>> J.J. Rehr: A couple of clusters or something like that?
>>: These [inaudible] are actually very fast from a theoretical point of view?
>> J.J. Rehr: Which methods?
>>: The DFT methods that we use. If you want to go to something more
sophisticated, the limit is maybe 20 atoms at the most. At the most. And for
much shorter times.
>>: Do you feel like that's a big difference between the results of the simulation
and -- I'm just wondering like when can you use this approximation, you know,
what kind of systems or what's going on?
>> J.J. Rehr: I guess if you're interested in really, really accurate
energy levels and --
>>: If your requirement -- if you need extreme accuracy in the energies for the
structures, I mean, to get essentially .2 percent error, essentially those
other methods would give you that. But --
>>: Like protein or [inaudible].
>>: You don't need those, you cannot apply --
>> J.J. Rehr: Impossible. In fact, even these methods may be too expensive for
proteins. But they're applicable to little chunks of protein, and so in order to do a
protein, what we think you have to do is do these fancy methods on little pieces
and then piece them together. This gives you the real quantum mechanics of
the interactions in place, adjusted for solvent and temperature, whatever, instead
of using interaction parameters that you get out of some table that God knows
where they came from. So this is how to do the molecular dynamics for
the proteins correctly: you need to construct potentials or interactions this way
and then use them in the longer time scale calculations.
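The "construct potentials from little pieces" idea can be illustrated with the simplest possible case: fitting a harmonic bond potential to quantum-mechanical energies computed at a few geometries, then handing the resulting force constant k and equilibrium distance r0 to a classical MD code. The numbers and the quadratic form here are invented for illustration; real fitted potentials are far richer.

```python
import numpy as np

def fit_bond_potential(r, E):
    """Fit a harmonic pair potential E(r) = E0 + 0.5*k*(r - r0)**2 to
    energy samples near the minimum.  Fitting the quadratic
    a*r**2 + b*r + c and completing the square gives
    k = 2a, r0 = -b/(2a), E0 = c - b**2/(4a)."""
    a, b, c = np.polyfit(np.asarray(r), np.asarray(E), 2)
    k = 2.0 * a
    r0 = -b / (2.0 * a)
    E0 = c - b**2 / (4.0 * a)
    return k, r0, E0

# Toy "QM" energies for a stretched bond (exactly harmonic: k=1.2, r0=2.5)
r = np.linspace(2.3, 2.7, 9)
E = -5.0 + 0.5 * 1.2 * (r - 2.5) ** 2
k, r0, E0 = fit_bond_potential(r, E)
```

The fitted (k, r0) pair is exactly the kind of "interaction parameter" that would otherwise come out of a table; here it is derived from the quantum calculation itself, and could be refit in the presence of solvent.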
I think a lot of people are doing that. There are some sort of hybrid methods
covering multiple scales. So what I've discussed is the quantum mechanical
scale, which is on the order of 10 picoseconds. The next step would be to take
those and go to the next length and time scales. So these are the building
blocks that you can then use to go to the very interesting long time scales.
What's interesting, though, is that they already indicate that bonds are transient.
Those tables of interactions assume that bonds live forever; they use the
same interaction, and that's probably just not the case. It ignores these
fluctuations, which I think are probably key to a lot of processes, and if you build
a model of a protein that doesn't allow them, it's essentially a dead protein that
you have. Real proteins are probably alive, and I think these dynamic effects are
going to be increasingly important in real simulations. People are going to have
to use an ensemble of interactions.
>>: Especially if you want to [inaudible]. What you see in those simulations is
that the really [inaudible] don't fluctuate -- the structure of [inaudible] doesn't
fluctuate like those little chunks of metal, right -- but the solvent that is around
the protein does fluctuate like that, and a lot of the [inaudible] holding comes
from the solvent in which it [inaudible], and so you see how the two things
connect?
>> J.J. Rehr: Yeah, actually one of the things Yoshi's been doing is to add a
few solvent molecules, because that causes a lot of fluctuations
in the activity, and that's part of the problem: finding a good solvent that makes
these photonic molecules stable over a long period of time. Lifetime is a big
issue, especially in photovoltaics. They can make great organic photovoltaics,
and you know what, they found that they get bleached by the sun in 24 hours.
So you really need to work on something that makes them long lasting.
And they've done that with the LEDs, but it took a long time. And again, it's the
choice of matrix that you put it in, and so really we have to calculate not just this,
but this guy in the presence of some attached molecules, which affect all the
interactions, and you've got to take those into account. It's really, really
interesting to do that, you know. You attach some little molecule here and the
thing bends and twists and you get yourself a very different animal.
So it's really important to do this. And although we're interested in the
next step, this long time scale, we've been content -- well, we've had enough
trouble getting it done at this level. Yes?
>>: I have a question on -- well, it's [inaudible] a question on how you
guys operate. In terms of the resources that you need to create one of these
simulations, going from scratch from software to the final results, what are the
things that computer science can provide to you so that you feel more confident
and things are much easier for you to program these, send them out,
deploy them, and get results back? I mean, the framework where you deploy
these simulations, is it simple to use, or how do you guys go about that?
>>: Maybe I can let Ferrick (phonetic) answer. He has a lot of scripts and tools
and structure tools, so maybe --
>>: You have two different problems there. I mean, one is the development of
the code and the other is the actual running of the applications, which are
completely different, completely separate. For us to have a maximum [inaudible],
usually, because that way we can have pretty much every operating system, and
that's what we definitely rely on. And that is the basis program, so it takes a little
while to get it done. But that is completely [inaudible] because the user does it,
I mean, it looks like the program, it gives it the program.
Using the products is usually fairly simple, because we just hide all the
message passing -- we hide it completely -- and so launching the jobs and
running the simulations is fairly straightforward. [inaudible] understanding of
what are the [inaudible].
>>: [inaudible] how do you [inaudible] you divide something you pay for per hour
on this.
>> J.J. Rehr: We have our own little mini-clusters to develop the code.
>>: [inaudible].
>> J.J. Rehr: And you run it on little molecules like PNA and make sure that the
results match what they should look like and you do that enough times to debug
before you run it on the big system.
>>: You still have to -- if you really want to see how it behaves on a large
system, you have to use some of the big machines, and those [inaudible].
>> J.J. Rehr: There are a bunch of tools out there, and we're developing some
Java graphical interface tools to make it simpler to run. This x-ray absorption
code that I've developed has actually been embedded in a lot of analysis
packages for analyzing x-ray absorption. They're actually used around the world.
So that uses a lot of computer programming, higher level stuff. And computer
science is used a lot.
We're just developing these codes, so it takes quanta of graduate student
cycles, which is on the order of three years or so, to develop one additional
stage of code like this. And that takes a lot of time. But our policy has always
been that everything we develop has to be transferable to the next graduate
student along the line. So it has to be sufficiently robust and documented that
it's a kind of stand-alone module, and then it can be reused. This turns out to be
a very powerful policy which we've always adhered to, and for that reason, I think
our codes can be used by others without an infinite number of phone calls.
>>: [inaudible] agreed to do things that nobody thought he could do. Who
[inaudible].
>> J.J. Rehr: Yeah. I mean, people are using this code that I started.
I'm a physicist, but it's used by chemists and biophysicists and
geophysicists, just because the x-ray technique is so powerful. But it's
useless unless you have a tool that can do simulations so that you can compare
theory and experiment. And it has to be accurate. So what do we really
need for the next generation? As the codes evolve, you know, everybody
wants them to be more accurate. Okay. So you have to build that in, and you
have to build it in in a robust way that takes advantage of, say, multicore
systems. And each step has to be efficient.
So there's a lot of almost software engineering that we as physicists have to do,
and we're not comfortable doing that. So interaction with computer people has
been great. I mean, one of my former students actually came to Microsoft
Research. He was a brilliant programmer, and he sort of understood both the
physics and the computer science and had efficient searching things and filters,
heaps and hash keys and all that stuff, but that really is key to making a fast
code that can be used. So anyway, we try to pay attention to computer science
in the algorithms, although it's going very fast, faster than we can go.
>>: Both sides are making huge steps.
>> Darko Kirovski: Any other questions? Okay. Thank you very much,
Professor Rehr.
[applause]