>>: Okay. It's my great pleasure to present an award that is a special award agreed upon by the
Organizing Committee of ECC 2010 to Professor Scott Vanstone.
As I indicated in my talk at one point, Scott's a very unusual mathematician. He has made
major contributions to basic mathematics, in discrete mathematics, the theory of finite fields, and
other areas, and at the same time he's a very practical person. He has good instincts, unlike some of
us.
[laughter]
And he has been a leading force, not only in the mathematics of elliptic curve cryptography, but
more surprisingly, perhaps, on the commercial side and in the whole difficult process of
standardization.
So let me just read the inscription on this award that I'm going to present. It's the Special ECC
2010 Award presented to Scott A. Vanstone for seminal contributions to research, development,
standardization and commercialization of elliptic curve cryptography.
[applause]
>> Scott Vanstone: Thank you. This is a wonderful honor. I'd like to thank the organizing
committee for bestowing this upon me. I'll certainly hang it in a very prominent place.
I've got a few remarks. I'd like to make a short trip down memory lane. So I hopefully won't keep
you too long because I need a glass of wine.
So everybody who has been around from the very beginning of ECC, I noticed when they started
their talks today, said it's hard to believe it's been 25 years.
And I'm the same. It really is hard to believe all the time's gone by, and we've had a lot of fun
doing it. I've learned over the years that when an idea is right, it is often the case it is conceived
by more than one person.
ECC is a prime example of this phenomenon. Neal Koblitz and Victor Miller, working
independently, discovered ECC at essentially the same time in 1985.
I attended the Crypto conference in Santa Barbara, California for many, many years. I was
present at Victor Miller's talk at Crypto '85 when he presented his paper on ECC. I'll return to this
monumental event a bit later in my comments.
I was educated as a combinatorial mathematician specializing in the theory of designs. This area
required a comprehensive knowledge of finite fields, finite geometries, and algebra.
At the risk of offending some, I've never found a serious application for combinatorial
designs. So I'm glad I moved on.
Anyhow, I had the right background to move into public key cryptography. A major turning point
for me was the appearance of a preprint of the RSA paper, A Method for Obtaining Digital
Signatures and Public-Key Cryptosystems.
The paper appeared in 1978, but the preprint circulated at least a year earlier. I found this article
fascinating and a great example as to how elementary number theory could have a practical
application.
I coauthored the first-year algebra book at the University of Waterloo. This book and course
covered subjects such as the greatest common divisor, the Euclidean algorithm, the extended
Euclidean algorithm, modular arithmetic, finite fields, et cetera. I enjoyed teaching the course, but
it lacked any real-world application.
RSA was the answer to this problem. I immediately wrote a new chapter which introduced
cryptography and the RSA algorithm. Approximately six weeks into the course, the first-year
students had enough background they could understand the algorithm and why it worked.
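That first-year material (the greatest common divisor, the extended Euclidean algorithm, modular arithmetic) is exactly what RSA runs on. A minimal textbook-style sketch in Python, with toy parameters chosen purely for illustration:

```python
# Toy RSA at the level of a first-year course.  The tiny primes and the
# message below are illustrative only; real keys use primes of hundreds
# of digits.
from math import gcd

p, q = 61, 53                 # two small primes
n = p * q                     # public modulus
phi = (p - 1) * (q - 1)       # Euler's totient of n

e = 17                        # public exponent, coprime to phi
assert gcd(e, phi) == 1

# Private exponent: the modular inverse of e, found with the extended
# Euclidean algorithm (built into Python 3.8+ as pow with exponent -1).
d = pow(e, -1, phi)

m = 42                        # a message encoded as an integer < n
c = pow(m, e, n)              # encrypt: c = m^e mod n
assert pow(c, d, n) == m      # decrypt: c^d mod n recovers m
```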
This course is still taught at Waterloo where approximately 1500 mathematics majors per year
take it.
At this stage I definitely had a keen interest in cryptography. In 1979, I was doing some contract
work for the Communications Research Centre in Ottawa. I did this work so that I could fund one
of my postdoctoral students, one whom Victor mentioned this morning: Ryoh Fuji-Hara.
He's the one who told Don Coppersmith what we were doing. I've forgiven him already, but...
[Laughter]
At first the project appeared to have nothing to do with cryptography; but as we learned more, it
became clear that a solution to the CRC project we were given was essentially to find an
algorithm to take discrete logarithms in a binary field.
A quick check of the literature revealed that a number of organizations were implementing Diffie-
Hellman key exchange in the multiplicative group of a binary finite field. And Victor, of course,
mentioned that this morning also.
Hewlett-Packard was producing a chip to implement arithmetic in the finite field with 2 to the 127
elements. At this point in time the best people could do was to take logarithms in a field with 2 to
the 32 elements. And it was believed 2 to the 127 was completely infeasible.
We developed an algorithm that would compute logarithms in the field with 2 to the 127 elements
by making use of the extended Euclidean algorithm. We implemented this algorithm and forced
people to abandon binary fields of this size for public key purposes. Later on, as Victor mentioned,
Don Coppersmith, who is now at IDA, improved our algorithm and could find logs in a field with
2 to the 127 elements in a matter of seconds or minutes.
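The heart of that algorithm, published by Blake, Fuji-Hara, Mullin, and Vanstone, is to stop the extended Euclidean algorithm halfway, rewriting a field element as a ratio of two half-degree polynomials, which are far more likely to be smooth. A sketch of that subroutine follows; this is a reconstruction of the idea in Python, not their actual code, and all names are illustrative:

```python
# Polynomials over GF(2) are stored as Python ints: bit i is the
# coefficient of x^i.

def deg(a: int) -> int:
    """Degree of a GF(2)[x] polynomial (deg of 0 is -1 by convention)."""
    return a.bit_length() - 1

def clmul(a: int, b: int) -> int:
    """Carry-less product, i.e. multiplication in GF(2)[x]."""
    prod = 0
    while a:
        if a & 1:
            prod ^= b
        a >>= 1
        b <<= 1
    return prod

def polydivmod(a: int, b: int):
    """Quotient and remainder of a divided by b in GF(2)[x]."""
    q = 0
    while deg(a) >= deg(b) >= 0:
        shift = deg(a) - deg(b)
        q ^= 1 << shift
        a ^= b << shift
    return q, a

def half_gcd_representation(f: int, h: int):
    """Run the extended Euclidean algorithm on (f, h) only until the
    remainder drops to half the degree of f, returning (r, s) with
    h = r/s (mod f) and both r and s of roughly half degree."""
    half = deg(f) // 2
    r0, r1 = f, h            # remainder sequence
    s0, s1 = 0, 1            # invariant: r_i = s_i * h (mod f)
    while deg(r1) > half:
        q, rem = polydivmod(r0, r1)
        r0, r1 = r1, rem
        s0, s1 = s1, s0 ^ clmul(q, s1)
    return r1, s1

# Example in GF(2^7) with the irreducible f = x^7 + x + 1.  Low-degree
# halves are much more likely to factor into small irreducibles, which
# is what makes the relation-collection phase of the attack work.
f = 0b10000011               # x^7 + x + 1
h = 0b1011001                # an element of degree 6
r, s = half_gcd_representation(f, h)
assert polydivmod(clmul(s, h), f)[1] == r    # h = r/s (mod f)
assert deg(r) <= deg(f) // 2
```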
What we learned from this work was that if we wanted to use binary fields in cryptography, we
needed to build devices that could compute efficiently in much larger binary fields. It was unclear
in the early '80s how one could do this.
Further research revealed that normal bases with certain properties could work. We discovered
optimal normal bases and a circuit architecture to implement them. The mathematics behind
these bases was certainly elegant and the subject of numerous research papers by many people,
including Hendrik Lenstra.
The optimal normal basis is a problem that would not even have been defined had it not been for
the fact that we needed to build something; we needed to do mathematics to build it.
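A standard fact behind the hardware appeal of normal bases: squaring is just a cyclic shift of the coefficient vector, because the Frobenius map permutes the basis. A minimal sketch of that one operation (multiplication, where the optimality condition matters, is the hard part and is omitted):

```python
# An element a of GF(2^n) in a normal basis is the bit vector
# (a_0, ..., a_{n-1}) meaning a = sum of a_i * beta^(2^i).  Squaring is
# the Frobenius map: it sends beta^(2^i) to beta^(2^(i+1)), wrapping
# mod n because beta^(2^n) = beta.  So squaring is a one-bit cyclic
# rotation of the coefficient vector: essentially free in hardware.
# (An *optimal* normal basis additionally makes multiplication cheap:
# each output bit needs at most 2n - 1 terms.)

def nb_square(a: int, n: int) -> int:
    """Square a GF(2^n) element in normal-basis representation by
    cyclically rotating its n-bit coefficient vector left by one."""
    mask = (1 << n) - 1
    return ((a << 1) | (a >> (n - 1))) & mask

# Sanity check: squaring four times in GF(2^4) is the identity (x^16 = x).
assert all(nb_square(nb_square(nb_square(nb_square(a, 4), 4), 4), 4) == a
           for a in range(16))
```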
We embarked on building a chip to do arithmetic in the finite field with 2 to the 593 elements.
This field has an optimal normal basis. Based on this work, we started Certicom in 1985.
Originally we called the company Cryptech; three professors, Cryptech. Until we heard from
many people they thought we were in the mortuary business.
[laughter]
So Cryptech became Mobius Encryption Technologies, and finally Certicom in 1995 when the
company went public. This chip was used in PC encryptors and a fax encryptor Certicom
developed. This technology kept the company going through the late '80s and early '90s.
Okay. Let's now return to ECC. As I mentioned earlier, I heard Victor speak about the potential
of ECC in 1985. Coincidentally, this was the same year that we started Certicom, and many
people believe to this day that Certicom was started because of ECC.
This is incorrect. It was started to do Diffie-Hellman in the multiplicative group of a finite field.
Victor started his presentation at Crypto '85 by saying: if you built a chip to do arithmetic in a finite
field with 2 to the 127 elements, don't throw it away. It may be useful for elliptic curve cryptography.
This was interesting and piqued my interest considerably. I left the lecture hall that day saying
to myself: if this cryptosystem is secure, it's technologically superior to anything out there.
Returning to Waterloo, I embarked on a research program that would study the security of ECC
and how to best implement it. Certicom was not promoting it, nor implementing it, in these early
days. The security of ECC was a huge unknown, and I would not have bet the farm on this
technology at such an early stage.
From '85 until '93, my students and I studied the security. It was during this period that we found
the Menezes-Okamoto-Vanstone attack, now referred to as MOV, which showed that
supersingular curves should be avoided.
In 1993, it became clear to me that Certicom was a "me too" company. We had an interesting
technology, but the key sizes of the original technology were about the same as RSA keys.
And we only had a small speed advantage over RSA. In short, there was no compelling reason
for anyone to move from RSA to our system.
By 1993, eight years after its discovery, I was becoming more confident in the security of
ECC, and I decided that Certicom should promote the technology.
This we did through evangelizing it and aggressively pursuing its standardization. We started this
process six or seven years before anyone else had faith in the technology.
Given that we started promoting ECC in 1993, Certicom did not produce a software product
based on ECC until 1997, 12 years after the discovery of ECC.
The first standard including ECC started in January 1994. It was IEEE P1363, Part A. This standard
was not officially approved until 2000. ECC is now in every major standard around the world, I'm
pleased to say.
In the spring of 1997 I was approached by Demetrius Marcaucus [phonetic] of Mondex International.
Mondex was a consortium of banks in the UK. They were producing a payment system using RSA
in a smart card. Mondex liked the efficiency of ECC but felt that it required global scrutiny. In
order to increase the visibility of ECC, Mondex asked me to start a workshop on elliptic curve
cryptography, and they agreed to fund the workshop for three years.
The first three were at Waterloo, and then we moved the next one to Essen, I believe. So now
we're at No. 14 and going strong.
At the first workshop, in November 1997, Certicom introduced a challenge similar to the RSA
challenge. The challenge was developed to increase industry understanding and appreciation for
the difficulty of the elliptic curve discrete logarithm problem and to encourage and stimulate
further research in the security analysis of elliptic curve crypto systems.
The challenge consisted of three levels: exercises, mid-range problems, and commercially deployed
ECC implementations. I did not wish to issue this challenge, as I thought that the problems were far
too difficult.
However, Alfred Menezes convinced me otherwise and introduced the idea of the exercises. The
exercises were challenging but doable. And it took approximately seven years for the exercises
to all be solved.
In 1998, I invited Joseph Silverman to speak at the ECC workshop. Joseph agreed to speak, and
a month before the workshop he called me to tell me that he had found a new approach, the
xedni calculus (which is, of course, index spelled backwards), to take discrete logarithms on
elliptic curves. At that moment I was pretty much convinced that my commercial venture was
going down the tubes.
Fortunately, Neal Koblitz was on sabbatical at the University of Waterloo. He and four
postdoctoral fellows of mine began an extensive study of the xedni calculus.
Although Silverman's idea was extremely clever, three months of work by Neal and the post-docs
showed that this approach is less efficient than exhaustive search. ECC had weathered a serious
storm.
A significant milestone and a huge endorsement for ECC technology came in October 2003 when
the NSA licensed 26 of Certicom's patents.
In February of 2005, at the RSA Conference in San Francisco, the NSA announced their Suite B.
This was the first time that the NSA had endorsed a suite of cryptographic algorithms. The public
key portion of the algorithms is exclusively elliptic curve cryptography. Now ECC is implemented
in devices that we use every day.
It's in TVs. Virtually every flat screen TV out there has ECC in it. It's in Blu-ray players. It's in
smart meters. Even in some automobiles.
Certicom was acquired about a year and a half ago by the BlackBerry folks, and this device is
running the highest strength NIST curve over a prime field: a 521-bit elliptic curve.
There's more security in this than the U.S. government requires for classified information. And it
all works well.
We can do Suite B on this, by just changing the key size. So anyway, ECC is here to stay and
I'm honored and proud to be part of this history. I hope it continues. It's been a wonderful ride,
and I can't thank the organizing committee enough for bestowing this great honor upon me.
Thank you very much. [applause]
>>: Thank you very much, Scott. So now the next item of business is the rump session, and so
Dan Bernstein will take over for that.
>> Dan Bernstein: Thanks. We have a few talks lined up. But the first and most important part
of a rump session is alcohol. Now, I think not everybody has noticed that there are free drinks
outside that way. I think most people notice the food over here. But there's drinks over there. So
we're going to spend the next 15 minutes, that's until 7:10, exploring the drink options back there
and then come back and we'll have some talks set up on the computer here. So see you at 7:10.
Welcome to the second part of the rump session. For the next 20 minutes we're happy to have
as an invited rump session presentation a fight between Neal Koblitz and Shafi Goldwasser
on the merits of provable security. Where are they? Where are the
speakers? In this corner -- since the invited speakers don't seem to be coming up, we'll move on
to the next presentation by Gaetan Bisson, Romain Cosset and Damien Robert on AVIsogenies.
>> Presenter: Hi, everybody. Let's go forward. So the three of us will be presenting a library we
recently wrote that computes isogenies between [inaudible]. But first, a little bit of context. We
come from the Caramel team, which most likely you all know as the [inaudible] team. As you see,
we like tasty things. And I'd like to present the people who came up with these silly jokes of using
tasty things as a group name. Here are the serious people of the team. Half of them are here,
and you'll most probably see all of them at ECC 2011.
And there are young people also in the team. So there's me, Romain here, and Damien. And
there's also a hardware guy who's actually not concerned at all with this talk. So let's strike
him out. [laughter]
So when the serious people write code, it gives this; this is a few characters. And [inaudible]
genus-two curve and it works. It finds Burg [phonetic] in MAGMA, and of course it's reasonably fast.
>> Presenter: It's the logo of the team.
>> Presenter: Now when we write code it looks more like this. If you look closely enough, you
see it does various things at various places. And, yes, really. Closer. Closer. Come closer.
But actually all this code you see here is not even a complete genus-two isogeny
implementation. So it's slow.
[laughter]
>> Presenter: It uses a lot of [inaudible] confusing structures. For instance, one of the ways of
presenting the [inaudible] and the functions.
>> Presenter: Then we also have [inaudible] formulas. Again, ones you're never going to be able
to recompute by hand.
>> Presenter: And for computing torsion we use field extensions, but in a
very, very bad way.
[laughter]
>> Presenter: Yes. And there are a lot of chunks of code everywhere.
>> Presenter: Yes, for your convenience we split them over lots of files. They're all going to be
loaded into MAGMA whenever you try to use the library. So, just like the serious people, we like
[inaudible], so we use inline functions.
>> Presenter: And the [inaudible] is [inaudible] sometimes only in French for convenience.
>> Presenter: And, finally, if you ever try to run this code, which we hope you won't, well, you'll
find many interesting bugs. Not all of them are in our package. Some are in MAGMA.
>> Presenter: So the thing is, if you have a strong desire to compute isogenies in genus a bit more
than one [inaudible] and some [inaudible] cycles and, even more important, lots of CPU cycles to
waste, then just drop us an e-mail and you will get a GPL library for free. Thanks.
[applause]
>>: Okay. I suppose we have time for questions, if there are any quick questions.
>>: How do we get the other code?
[laughter]
>> Presenter: It's a logo on top of the --
>> Presenter: No, you just need a pen and a piece of paper and here you go.
[laughter]
>>: David.
>>: I'm wondering, for English, is that the French word for [inaudible].
[laughter]
>> Presenter: No, it's just the 2.0 of Esperanto.
>>: In the absence of further questions, let's thank the speaker again.
[applause]
Okay. Coming up next we have David Freeman who is going to be telling us about homomorphic
signatures for polynomial functions.
>> Presenter: It has very little to do with elliptic curve cryptography. It uses a little bit of number
theory, and apparently the rump session had some empty slots, so they let me come here. And
this is joint work with Dan Boneh. So a big thing in crypto recently has been homomorphic
encryption, which lets you apply functions to encrypted data, and what we do is look at
applying functions to authenticated data. So what does that mean? We have Alice over here
with her secret key. Here Alice is a professor. She's sending the grades in her class to some
untrusted database. There are all the grades, and with each grade she puts some signature that
authenticates the data.
So the grade of 91, the student that the data belongs to (u_i means which student, the i-th
student), and the fact that it belongs in this grades database. Then later Bob comes along; he's
a student, he knows what his grade is, and he wants to know how well he did in the class
relative to everyone else.
So he wants to know what the class average was. So he sends along the query mean and the
set of -- just the indices of all the students in the class. And the database can compute the mean
and also a signature that authenticates the mean that says this is the average of the grades that
Alice put into the database. And ideally the signature is going to be short, and it will also not let
Bob learn that his girlfriend actually did better than he did in the class.
So more generally we talk about F-homomorphic signatures for some class of functions F. You
just have some data that's all signed with some tag that ties it all together, that says this all came
from the same set of data.
Then for some little f in this set of functions we can compute a signature on the function applied
to all the data. So it's a multivariate function, and the number of variables is the number of data
entries.
So, well, what do we know how to do? In the past we knew how to do linear functions on signed
data. So we can compute the average. We can compute a least squares fit if the X coordinates
are some public parameter and Y is actually the data, because that requires only
a linear computation.
We could do Fourier transforms, Fourier coefficients. But for nonlinear functions, well, until now we
didn't know anything. And nonlinear functions tend to be useful. With nonlinear functions
you can compute standard deviation. You can compute a least squares fit if both the X and Y are
variable, so they both have to be authenticated. Other nonlinear estimators and also some tools from
data mining.
So what we are able to do in this construction is a signature scheme that authenticates
multivariate polynomials applied to data, where the degree of the polynomials is low.
But even with quadratic functions we can already compute standard deviations or least squares
fits: a linear fit with quadratic functions, and higher-degree polynomial fits for not much higher degrees.
So in one slide I'll try to tell you how it works, in case you've seen the somewhat
homomorphic scheme of Gentry; there's also a paper by Smart and Vercauteren that explains this
in some more number-theoretic language. So the public key of the scheme is some prime ideals P
and Q in a number ring. And the secret key is some short basis for the intersection,
which, if they're both prime, is just the product.
And short depends on some embedding of the number ring into R to the N, so that's a detail I'll
leave for later. But basically the data correspond to elements of O mod P. So this is just some
finite field.
Functions correspond, in some way that's determined by the tag, to elements of O mod Q, so
some different finite field. And the signature that authenticates the function applied to the data is
an element in the ring that's the right thing mod P and mod Q: it's the thing you want to
authenticate mod P, and it corresponds in this well-defined way to the right function mod Q.
So if you're given the signature, you can verify that it actually authenticates the data, that is, that
function of the original set. And now, since reduction mod P and reduction mod Q are
homomorphisms, we see that if we add or multiply these signatures, which are elements of the
number ring, then we also add or multiply the data, and we add or multiply the functions as well.
For example, you can build a quadratic polynomial by multiplying two linear polynomials.
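To make the mod P / mod Q idea concrete, here is a toy analogue over the integers. The real scheme works with ideals in a number ring and needs the short-basis trapdoor to keep signatures short; this sketch ignores all of that and only illustrates the two-sided homomorphism:

```python
# Toy analogue of the Boneh-Freeman mod-P/mod-Q idea over the integers.
# All numbers and names here are illustrative; the actual scheme uses
# prime ideals in a number ring, not integer primes.

P, Q = 10007, 10009   # stand-ins for the two prime ideals

def toy_sign(data: int, func_encoding: int) -> int:
    """An element that is `data` mod P and `func_encoding` mod Q (CRT)."""
    return (data * Q * pow(Q, -1, P)
            + func_encoding * P * pow(P, -1, Q)) % (P * Q)

# Signatures on two data items, each paired with a function encoding:
s1 = toy_sign(91, 3)   # e.g. grade 91, encoding of "variable x1"
s2 = toy_sign(85, 5)   # e.g. grade 85, encoding of "variable x2"

# Adding or multiplying signatures adds or multiplies both the data
# (mod P) and the function encodings (mod Q), because reduction mod P
# and mod Q are ring homomorphisms:
assert (s1 + s2) % P == (91 + 85) % P
assert (s1 + s2) % Q == (3 + 5) % Q
assert (s1 * s2) % P == (91 * 85) % P
assert (s1 * s2) % Q == (3 * 5) % Q
```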
So security. Well, basically a forgery is if I give you a signature on some data and some function,
and actually the data is not that function applied to the original dataset. I won't really try to get
into the provable security argument, but we say, under some specific model, that if you can forge
a signature, then you can construct a short element in this ideal Q.
And an interesting thing, and perhaps the only relation to ECC, is that this is something that we
can't appear to do with ECC. If you try to work in a group of order P and a group of order Q in
ECC, you have the Pohlig-Hellman algorithm that says: work with them separately and you can
solve whatever problem you're trying to solve. But in this case, because the shortness uses both the
mod P and mod Q parts, there's sort of no way to separate them.
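For contrast, here is a toy illustration of that Pohlig-Hellman separation, with numbers chosen purely for illustration: in a group of composite order, the discrete log splits into subproblems of prime order that are solved independently and recombined by the CRT.

```python
# Toy Pohlig-Hellman: the group is <g> of order 15 = 3 * 5 inside Z_31*.
r = 31                      # prime modulus; Z_r* has order 30
g = 9                       # g = 3^2 has multiplicative order 15
x_secret = 11
h = pow(g, x_secret, r)     # h = g^x; pretend x is unknown

def subgroup_log(base, target, order):
    """Brute-force DLP in the subgroup of the given (small) order."""
    acc = 1
    for k in range(order):
        if acc == target:
            return k
        acc = acc * base % r
    raise ValueError("target not in subgroup")

# Project into the order-3 and order-5 subgroups and solve separately:
x_mod_3 = subgroup_log(pow(g, 5, r), pow(h, 5, r), 3)   # kills order-5 part
x_mod_5 = subgroup_log(pow(g, 3, r), pow(h, 3, r), 5)   # kills order-3 part

# Recombine with the CRT: x = x_mod_3 (mod 3), x = x_mod_5 (mod 5).
x = (x_mod_3 * 5 * pow(5, -1, 3) + x_mod_5 * 3 * pow(3, -1, 5)) % 15
assert x == x_secret and pow(g, x, r) == h
```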
So the hardness of the system: finding a short element of an ideal, for example, if it's a principal
ideal, finding a short generator, is a classic number-theoretic problem. Lattice basis reduction
algorithms such as LLL don't seem to be good enough.
They give something that's sort of exponentially too long. And we can also connect this with
some of the recent work of Gentry on worst-case lattice problems. That's all.
[applause]
>>: Questions? All right. You're out of here. Let's thank the speaker again. I think we have only
25 more announcements lined up for you in the rump session and after a short day -- oops. I'm
sorry. We only have one more announcement lined up for you. So the last talk of the rump
session will be by Peter Schwabe, who will tell you about the correct use of the negation method
in the Pollard rho method.
>> Presenter: Thank you very much. Yes, the subtitle of the talk says this is going to be an
entirely boring and actually enforced talk.
And it's joint work with Dan Bernstein and Tanja Lange. You might ask: why is it a boring talk?
Everything I'm going to tell you, you already know. So why is it enforced? The
rump session chairs, as you can see, are also the co-authors of this work. The rump session chairs are
also my supervisors, and by the end of this week I have to submit my thesis. Now Dan and Tanja
want me to give this talk, so I think it would be very stupid not to do so.
So let me start by telling you something you already know.
I think pretty much everybody here knows that for the hard elliptic curve discrete logarithm
problems we usually encounter in elliptic curve cryptography, Pollard's rho algorithm is the best
method to solve them. It uses a pseudorandom walk through the group G, with point
P_{i+1} = F(P_i) for some pseudorandom function F. When the walk collides, we can solve the
ECDLP. That's the rough idea. The expected number of steps is the square root of pi times the
group order over 2.
Now, a pretty nice idea: if you define this walk, this F, on equivalence classes modulo efficiently
computable endomorphisms, then it becomes more efficient. And for elliptic curves we have
negation, which is always quite efficiently computable. So we can simply choose the smallest
representative modulo negation, and we get a factor of square root of 2 speedup. This is all very
well known, as I said at the beginning; it's a textbook optimization, so that's all very fine.
Now, the current record for solving ECDLPs was announced in July 2009. It's a 112-bit ECDLP.
It was announced by Bos, Kleinjung, Lenstra, and Montgomery, and they used
a cluster of 200 PlayStation 3 gaming consoles to solve it. And the interesting fact is that they
did not use this square root of 2 speedup from the negation map.
The reason is, well, if you have 200 PlayStations sitting around, you just don't care. [laughter]
Well, it would be a pretty good reason. I don't know what I would do if I had 200 PlayStations
sitting around. But I just made that up; the reason they give is that they did not
use the common negation map, since it requires branching and results in [inaudible] that runs
slower in a SIMD environment.
SIMD is a way of computing that most modern microprocessors use; it stands for single
instruction stream, multiple data streams. And since pretty much every microprocessor uses it,
apparently if you want to solve ECDLPs nowadays you don't use the negation map; that's
pretty much what their statement says for any modern microprocessor. Why is that the case?
Well, the common way to construct the iteration function F is that you precompute some points
T_0 to T_k, you define a hash function H from the group G to the index set {0, ..., k}, and you
define the function F as F(P) = P + T_H(P). So in each iteration you just add some point. If you
do that with the negation map, then you can end up in a fruitless cycle. You have some point P,
and let's say the hash function H yields some index i, so you add T_i. Then you look at which of
P + T_i and -(P + T_i) is the smallest according to some definition of smallest; say it's the minus.
Then you hash that again, and if you again get i as the value of the hash function, you add T_i
again and you end up back at P. So you circle between P and -(P + T_i). That's not a big
problem; there are several techniques to handle these fruitless cycles, but it's apparently highly
annoying in a SIMD environment.
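A minimal sketch of that fruitless 2-cycle, using the additive group Z_n as a stand-in for the curve group, with x -> n - x playing the role of negation; all parameters here are illustrative:

```python
import random

n, k = 1009, 4                     # toy group order and table size
T = [random.randrange(1, n) for _ in range(k)]   # precomputed adders

def canon(x: int) -> int:
    """Smallest representative of the class {x, -x} in Z_n."""
    x %= n
    return min(x, n - x)

def F(x: int) -> int:
    """One step of the walk: P -> canon(P + T[H(P)]), with H(x) = x mod k."""
    return canon(x + T[x % k])

# If canon() flips the sign and the next hash value repeats, then
# F(F(x)) == x and the deterministic walk bounces between two points
# forever.  Each step falls into such a 2-cycle with probability roughly
# 1/(2k), so a few short walks suffice to exhibit one.
for trial in range(50):
    x, prev = random.randrange(1, n), None
    for step in range(200):
        nxt = F(x)
        if nxt == prev:
            print(f"fruitless 2-cycle after {step} steps (trial {trial})")
            break
        prev, x = x, nxt
    else:
        continue
    break
# Real implementations detect such short cycles and escape them, e.g. by
# adding a different precomputed point, which preserves the sqrt(2)
# speedup from halving the effective group size.
```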
Now, the reason why I'm standing here is that, together with Tanja and Dan, we show
that annoying is not the same as impossible. So we came up with an implementation which
shows that you can actually use the negation map for exactly the same ECDLP. And it solves
this ECDLP roughly 1.8 times faster, expected. This speedup of 1.8 comes from the
negation map's factor of square root of 2, obtained using a branchless computation, and the
remaining factor of about 1.3 comes from faster arithmetic.
The paper will be online very soon. Dan announced a few weeks ago that it's going to be online
soon. I think it's going to be very soon. I'm going to say it's at least this week, at least I hope so.
So the conclusion is that the use of the negation map in Pollard's rho algorithm actually speeds up
the algorithm by a factor of square root of 2. And that's pretty much what you knew already.
Thank you anyway for your attention.
[applause]
>>: All right. Before we bring this to a close, let me remind you that things start tomorrow at the
same time. Ridiculously early in the morning. But despite that, you might want to stick around
here for a while. I believe we have the continued use of this room until 9:00, and then there's
some buses for people staying over. So let's thank the speaker and all the speakers again.
[applause]