>> Eric Horvitz: I knew him most recently as the president and CEO of the American
Medical Informatics Association, AMIA. He's now the immediate past president, and he's
presently chairman of the Institute of Medicine's board on healthcare services, the National
Committee on Vital and Health Statistics, and the board of regents of the National
Library of Medicine.
He's been involved in many studies and reports, some of which have done some
foundational work in evaluating and understanding the role of electronic medical records.
The topics that he's been engaged in and has a deep experience in over the years are
topics at the forefront of the minds of our policymakers these days as well as researchers
in academic health care.
Today he'll be talking about "Personal Health Information Among Competing Public
Goods: Can American Public Policy Find a Better Balance Between Individual Freedom,
Health, Privacy & Research?"
It's a great title, great set of topics.
Don.
>> Don Detmer: Thank you, Eric. And a kind introduction. And it's really a pleasure
and, I'd say, an honor to be here. I'm sorry I haven't been here sooner, because I've had
terrific conversations with a lot of people obviously related to Microsoft over the years.
Eric and I were actually panelists together on a Diane Rehm show a few months ago with
Jonathan Weiner talking about personal health records. And Kristin Tull has been a
colleague related to AMIA, so it's not like I haven't had some interfaces.
I sent Eric a set of topics that I'd be happy to talk about, but I thought that this one, in
light of the importance Microsoft has globally as well as nationally, and also at the policy
level, might really be a good topic.
So I hope it won't seem a little too arcane. It's kind of a 30,000-foot, 20,000-foot talk. On
the other hand, believe me, if you've been involved in privacy policy, it's a contact sport,
much like implementing computer-based order entry or any other kind of informatics tour
de force.
So, at any rate, I will start by saying a few words about AMIA. I don't know how many
of you are familiar with AMIA, the American Medical Informatics Association, but I
want to talk a little bit about them.
We've been involved, AMIA, in this issue of public policy and also national health
information infrastructure kinds of discussions for at least 13, 15 years in Washington
and actually are frequently called upon by staffers because we're seen really as sort of a
neutral player.
AMIA has 4,000 informaticians from 53 countries, so it's pretty global as well. I will talk
about the current state of privacy policy and also attitudes relating to it. Obviously all
these things could be an entire semester course. So I'm going to be truncating this.
I do want to talk just very briefly about some of this thinking, because actually things
are changing. We often think that things don't change, but they do.
And then I want to talk about the current status of where we are, and ARRA, a law just
passed, actually has some interesting issues that are still going to be -- are being sorted
out over the next number of months and years.
And then what I really want to do is hopefully make this somewhat controversial and
stimulating: to talk about how we might nudge our way to a better balance among these
competing social goods. Because I think right at present, in our country in particular,
we really do not have, in my view, a healthy balance. That, to my way of thinking, is not
ideal, and maybe if we can talk enough about this we might be able to shift the drift.
So AMIA is an authority regarding the sound application of health-oriented information
and communication technology. I enjoyed talking to Bill Crounse today; both of us are as
interested in the CT side of this as the IT side, and related issues. I think that's where a lot
of the future needs to go.
But we also are leaders in informatics education and policy and research. We have a lot
of meetings. Our journal is the most cited -- JAMIA is the most cited journal in the field,
and I'd urge you to look at the Web site at your convenience.
We also have the American College of Medical Informatics, an honorary group of
thought leaders who have been elected to it.
What is informatics? This is my own definition; it's still sort of an emerging indoor sport.
It's the scientific field that draws upon the information sciences and related technology to
enhance the use of information. It's related to action: how do you use these things to do
what you're trying to accomplish?
So it's not so much that the IT or the CT is the T; it's more like the verbs: how do you
make the nouns work so you get complete sentences and get to where you're trying to go?
And whether it's health care, research, education, management, or policy, we see it as a
very key issue going forward.
We like to think that we're AMIAble informaticians. We describe ourselves as not
A-M-I-A but AMIA. We AMIA to please.
So I think that the idea is we're really interested in transforming health as well as health
care from both personal health care and consumer dimensions of health records and so
forth up to and including public health and down to the molecular level in translational
bioinformatics.
And then the other thing that I think we've done over the last five years is try to make the
transition of informatics from what I would call a serious avocation, or a serious club
sport, to a formally recognized profession. And I think we've pretty much accomplished
that.
We are also very globally interested, and we're the happy beneficiaries of a planning
grant from the Bill & Melinda Gates Foundation to try to set up, with international
colleagues, a global strategy to improve health outcomes and health status
internationally through informatics, again particularly on the human capacity side:
training, education, and the like.
So I won't go through this alphabet soup of organizations we relate to. But happily I
think excellent progress is being made.
To start moving to the topic we're talking about, Richard Norton Smith, who is a
well-known historian, has said "If presidents are governed by any law besides the
Constitution, it's the law of unintended consequences."
The point is that I've done a lot of work in health privacy over the years. I won't go
through all of it, except to say it's 20-some years of a fair amount of engagement at the
policy level on a number of these things.
And the fact of the matter is there's this quote -- I don't know the source of it; maybe
some of you do and you can enlighten me -- but it says "Two things you don't want to
watch being made are laws and sausages." Yes, sir.
>>: I believe it's [inaudible].
>> Don Detmer: Okay. That's great. Because it's a great quote.
>>: People who respect the laws and eat sausage [inaudible].
>> Don Detmer: Right. Yeah. I mean, it's one of those things that would make you
probably most skeptical about a democracy getting through a week.
>>: [inaudible]
>> Don Detmer: Right. Any rate, I think the key thing about it is -- I mean, obviously
you do have -- and I think serious lawmakers do want to create good policy, whether it's
health or energy or anything else.
But if you look at health policy, at least -- this is my own personal list of the desired
things you'd like to see come out of it -- you'd like to have the capacity to better deal
with illnesses among loved ones and family members. You'd like to build altruism
and a better sense of community. You'd like to have better patient-health professional
relationships. That's a good idea. Obviously you do want to improve healthcare privacy
and security. That's a desired thing. You'd like to improve health outcomes, and health
care itself, through evidence-based medicine and by actively engaging patients and
caregivers so that in fact they are part of the team, if you will, in every sense of the word.
I think you'd like to enhance personal choice on important health matters. Yes.
>>: [inaudible] priority?
>> Don Detmer: Pardon? No, they're not. They're just a basket. And that's what we're
talking about is these are all goods. And to some extent in terms of public policy you're
in competition. So it's not like there's an answer. But I will -- you're anticipating a
couple of my slides in a minute.
And obviously you would like to have both healthy individuals as well as healthy society.
And ideally you'd like to have trust built in society.
>>: [inaudible]
>> Don Detmer: Pardon?
>>: Some of these are pretty fundamentally different, some are related.
>> Don Detmer: Yeah. Exactly. In fact, earlier Eric and I were talking about something
familiar to both of us -- in my roots, at least, I'd worked at Wisconsin on multi-attribute
utility modeling. And I'm somewhat puzzled that policymakers aren't using it more. I
think it's really quite robust. So in fact these are sort of just a basket at the moment. But
they could be weighted, and they could obviously be ordered for a specific challenge.
But we've not really done that as a matter of public policy. And as far as I know, GAO
doesn't do that either, or any group that's looking at these things.
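To make the multi-attribute utility idea concrete, here is a minimal sketch of an additive model over competing policy goods. The goods, weights, and per-option scores below are illustrative assumptions, not values from the talk.

```python
# Minimal additive multi-attribute utility sketch; all numbers are
# illustrative assumptions, not values proposed in the talk.

# Each policy option is scored 0-1 on every competing social good.
options = {
    "strict consent":  {"privacy": 0.9, "research": 0.3, "quality": 0.5, "trust": 0.6},
    "opt-out default": {"privacy": 0.6, "research": 0.8, "quality": 0.7, "trust": 0.7},
}

# Weights encode how much a decision maker values each good (they sum
# to 1) and could be re-weighted for a specific policy challenge.
weights = {"privacy": 0.35, "research": 0.25, "quality": 0.25, "trust": 0.15}

def utility(scores: dict) -> float:
    """Weighted sum of attribute scores: the simplest MAU form."""
    return sum(weights[good] * scores[good] for good in weights)

for name, scores in options.items():
    print(f"{name}: utility = {utility(scores):.2f}")
```

Re-weighting is how the same basket of goods gets ordered differently for different challenges or circumstances, which is the point the discussion here is driving at.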
Now, the challenge of course -- and Cyril Chantler, a colleague in the UK, has made this
point -- is that in the past health care didn't matter a whole lot. You could hurt somebody,
but you wouldn't necessarily help them a lot. It used to be simple, ineffective, and
relatively safe. Now it's complex, effective, and potentially dangerous. And so the
potential downside is there, as well as a much higher, rosier potential upside.
But you increasingly need to know what you're doing. And if that was true in the past --
this is a slide from Bill Stead from an Institute of Medicine talk a couple years ago -- as
we move from organs and systems, where clinical cognitive capacity was enough to make
decisions about the clinical phenotype, increasingly to molecular medicine, our capacity
to do any of these things without computer systems is quickly moving beyond the usual
memory-based way of practice.
Now let me shift a little bit to privacy and health. Historically, for 2,000-plus years,
health has been considered both an intrinsic good and an instrumental good. In other
words, there are times when it can be instrumentally useful to be healthy -- you can climb
a mountain if you're healthy -- but it's also a good thing just to be healthy.
Privacy has been considered historically by most philosophers as an instrumental good.
And the Hippocratic tradition, as well as the oath, does speak to confidentiality -- not
sharing patient information among nonprofessionals -- but the professional actually had
an obligation to try to seek care for the patient and share information if he or she thought
that another professional could help that patient when you couldn't.
Similarly, part of this, in the context of do no harm, was that you had an obligation to
improve the art, if you will. So essentially research was considered a piece of this. And
oftentimes it was prognostic, so you would know which cases not to take if the patient
wasn't going to get better. But clearly it was looking at knowledge and at research as an
issue.
So in the context of do no harm, it's not like you're not going to do something. You don't
want to hurt the patient in the process of trying to help them.
So it is an action issue. And if you look at the early days of this country -- you can't be
from the University of Virginia without a Jefferson quote, so this is my obligatory
Jefferson quote. You may not agree with this, but this is what he said:
"Without health, there can be no happiness. An attention to health, then, should come
before all other objects."
If you're from the Northeast, Ralph Waldo Emerson said: "The first wealth is health."
Now I stand in front of you as a health professional. So there you are. Obviously that's
my way of getting out of bed in the morning.
>>: As physicians, isn't it interesting? And I think most people never think much about
their health until they don't have it.
>> Don Detmer: Well, it is true that, again, it comes back to weights. If it's a nice sunny
day and you're at your healthiest, you put a different weight on some of these values, and
these things shift. How you would weight some of them would shift depending on your
health status or other kinds of status.
But at any rate, the point is that historically, at least, health has been considered really
pretty important.
If you look at today, at what's actually happening out in the real world: Utah has a law
where parents can opt out of their children's immunization records being shared with
other providers. If the kid comes into the emergency room, clinicians can download the
record and find out the immunization status -- so you don't overimmunize or
underimmunize kids. Parents can opt their children out of it.
And the last time I checked -- this is probably about four years ago -- less than 4 percent
of parents do. So 96, 97 percent, something like that, actually stay in the system.
The flip side of it -- because one of the things you get into when you talk about these
things is opt-in, opt-out, mandated choice; there are a lot of ways you can go at this -- is
that in Massachusetts they have a system where you have to opt in to having your data
shared. Not as children now; as adults.
And I think John Halamka, the last time I talked to him, said that something like over 96
percent do opt in to have their data shared.
So the point is, if you look at places where people have to actually decide and do decide,
somewhere around 90, 95 percent of people are pretty interested in seeing --
>>: [inaudible]
>> Don Detmer: Well, it's secure -- it's generally secure data. But the point is, I think,
that the fact is they're trusting at the margin that the system will in fact -- will look after
it. Yeah.
>>: Yeah, so just to expand I think more exactly what that means in Massachusetts when
they share, does that mean that they share other data, who they are, their identity to
anybody who is interested in looking?
>> Don Detmer: No, no. This is among -- this is among -- if you go from one
emergency room to a clinic or something, this is official providers.
>>: [inaudible] on demand, right [inaudible]?
>> Don Detmer: And the data are obviously protected, secure. On the other hand, the
clinician does get the data.
>>: But if I said a Harvard-based clinician is going to do a study, could they work with
the system?
>> Don Detmer: That's part of what my talk is about. In other words, a researcher doesn't
have blanket permission to be able to use it.
Now, some of the argument that's gotten into the issue of privacy and patient data is the
issue of trust and that the feeling that if you're not very careful about privacy
considerations, it's erosive to trust.
Of course, that's difficult to quantify. But one of the things that I think is interesting:
David Mechanic, a very distinguished professor of medical sociology at Rutgers, and
co-researchers have actually studied this.
And I think, again, what's sort of fascinating is that when they just go to patients and ask
them what factors are important to trust in your care, the top one that comes out is: can I
relate to this person? Is the clinician actually somebody I can even connect with? And
oftentimes that takes a little time.
But the point is: is this somebody that, you know, really listens, really hears what I'm
trying to talk about, and really meets me there?
Next to that, and fairly close to that, is do they seem to know what they're talking about,
are they confident, which obviously is quite important as well.
And technical competence is a high priority. Oftentimes, though, it's not that closely
examined; it's kind of, well, do they appear to be up-to-date on research and other kinds
of things. It's not as though patients take a very sophisticated approach to this.
The third thing that's important is do they advocate for me. If I need to talk to an insurer
to get some help or something will they kind of go into my corner and fight for me for
things that they think are important for me.
And if it even comes up at all, confidentiality is just generally assumed. And it rarely
even came up as an explicit issue.
So it's mentioned much less frequently than, again, oftentimes the conversations about
this would tend to lead you to believe.
It is interesting, on the relative issue of trust and electronic health records, that there's at
least anecdotal evidence from a lot of these places that do have secure Web sites where
patients are able to interact with their own data, and even have a dialogue with their
clinicians, that this does seem to be quite important to the sense of trust.
So, again, that's a space that I think isn't necessarily lost, if you will, in the electronic
age. It looks like some of the communications technologies -- it's not just information
technology, it's really communication technology -- can be really quite helpful to
patients.
So now let me shift from some of those things to ask: where is privacy in this today? In
contemporary health policy, privacy is very important. And how important, I think, is
really what we're talking about -- how important should it be in the relative mix of the
variety of social goods you're trying to balance? So the question really is: is it more
important than quality, safety, cost, research, social trust, altruism, or should it balance
in some kind of scale with them?
>>: [inaudible] I mean, the people who talk of privacy, they're clearly concerned about
certain things that can happen if privacy is not respected. And those are societal things,
right? I mean, if you're in Europe in the '30s, you were concerned about your religious
[inaudible] because that determined whether you live or die.
In health care, is there any consensus in the U.S. on why is privacy -- what consequences
of lack of privacy are important?
>> Don Detmer: Well, I'll talk about that in some subsequent slides, and I'll get into it a
little bit. But the problem of course with using the word privacy is that there's not even a
common definition. I mean, confidentiality and security you can kind of think of in
operational terms: did somebody divulge something I didn't want them to? That's
breaking a confidence. Is the system not secure? That's something I can sort of test.
Privacy is a [inaudible]. Yes, sir.
>>: Tangential [inaudible] as the economy has recently [inaudible] the U.S. sex offender
and sex offender register rules, how many new affiliates will be created if the government
draws from all the existing medical records?
>> Don Detmer: I don't know the answer to that.
>>: Do you have a guess?
>> Don Detmer: No. At any rate, this is kind of interesting. I went to Bing just as an
interesting kind of proxy, throwing out words. If you throw privacy in there, it was 1.2
billion hits; community, 709 million; health, 687 million; health research, 343 million.
You can read them. Trust, 157 million. You go down to freedom: 109 million. And by
the time you get to altruism, that's just 1.2 million.
>>: Do you think Google is times two for all that?
>> Don Detmer: Pardon?
>>: Do you think Google is times two for --
>> Don Detmer: Yeah, well, I'll leave that to you folks. Just as a footnote against this,
sex outperforms privacy. But at any event, love incidentally doesn't. And death and
taxes don't come close.
So the point is just as a weird sort of social proxy of at least what's circulating out there in
terms of terms, I think it's sort of fascinating.
The point is definitions, to get at your question, definitions of privacy are not static. And,
in fact, as a matter of law, for many years Brandeis had really pretty much defined
privacy in the American context for American law. And I'm not a lawyer, so some of
these things I don't present to you as being something a legal scholar would present to
you.
But it was really defined as a right to be left alone, not to be bothered. And, in fact, the
fact that you can call up a phone number and say I don't want marketers calling me
during dinner is an example of the kind of the right to be left alone sort of thing while
you're having dinner with your family.
When a current Supreme Court justice was being vetted by the Senate, he described
privacy as actually maybe not the best word. He really saw it as trying to get at the
concept of freedom of conscience more than privacy, which, frankly, I personally tend to
resonate with.
But at any rate, on the other hand, I would say there are some people on the more rigorous
end of privacy, who have been described by others as privacy fundamentalists, who would
say: I want the right to remain unknown -- not just to be left alone, but actually to remain
unknown.
And somewhere among all of these things bubbling out there -- and I have another slide
in a minute to talk about it -- I do think that right now privacy in this country is really
defined as both an instrumental and an intrinsic good. I think we have crossed that
boundary. But, again, there's not a way to really know this.
Now, there's been a fair amount of legislation over the years, and it's kind of fascinating
that in the 1970s -- I think it was '76, but I didn't take the time to really look it up -- the
U.S. actually passed a law under which, if you were in the Department of Defense as a
service member or in the VA, you did have your personal health information protected as
a function of law. But because we don't have a national healthcare system, it didn't cover
everybody.
And it was fascinating, because the rest of the world, which did have systems of care,
passed laws like the U.S. law, but we never got around to doing everybody else. And by
the time we started really trying to do that as a function of federal law in '96, '97 --
>>: Was that actually known to be a template for the other countries?
>> Don Detmer: Yeah. Oh, yeah. Absolutely. We were clearly leading the world. In a
kind of ironic way, after the study I chaired on the computer-based patient record in '91, a
lot of countries went off right then and did it, and the VA went off and did it -- as we
know the story on that [inaudible] -- which really then moved the VA from having some
of the worst performance experience to some of the best of breed.
So it's kind of fascinating, the price America has paid by not having a national system, if
you will -- and obviously we're debating that at the national level right now. I think
we've clearly seen, with a balkanized sort of approach to care, that the U.S. has actually
led on some of these things but hasn't necessarily benefited from them across the whole
country.
Anyway, so HIPAA 1 -- I'll call it HIPAA 1 -- was basically done as a privacy rule. It
wasn't really mandated in detail. It was left to rule making, which is a very arcane kind
of process, to decide what it would look like, because Congress couldn't come up with a
clear decision.
Actually, I had met with Senator Bennett on Arbor Day in 1995 and gave him my little
elevator speech, and he decided to sponsor a bill, Senate 1330, which was actually a
confidentiality and security bill that got very close to passing the Senate and might, in
fact, have bypassed that whole process. But it failed, just like a lot of other things. Most
things don't make it through the arduous approach.
So, anyway, went to rule making. And at that point I was chairing the NCVHS, the
National Committee on Vital and Health Statistics, as Eric mentioned, during the
Shalala -- for Secretary Shalala during the Clinton administration when we went through
all the hearings. We heard from 75, 76 people about what these rules should look like
and advised the country on that.
So I sort of have some scars from obviously that process, because this is not something
that everybody is going to line up and agree on. That's just the way that it is.
By 2009 we passed what will essentially qualify as HIPAA 2, and this time Congress did
write some specific kinds of things, and then also said there'd be rule making on a
number of issues.
And the fact of the matter is, I think there's going to be a bit of a struggle to see that care
and medical research -- all kinds of health services research -- can in fact move forward,
because of some of the ways this is framed.
So part of the reason I'm talking about this today is this is dynamic and it's not actually
getting more balanced; if anything, we're running the risk I think of making it even more
of a problem.
Of course the genetic privacy law did pass, and obviously one of the reasons this issue is
so hot in this country is that people can lose health insurance because of somebody
finding out about a prior condition or something. Our absence of nationally mandated
universal access to care clearly torques the privacy debate, as well as an awful lot of
related issues that are out there. So it's not --
>>: The congressional discussion in 2009 is explicitly talking about research interests
and care? That's part of the actual discussion?
>> Don Detmer: Actually, in this country we sort of deal with care and research as the
same bag. And that's not the way Europe, the UK, and Canada have approached it.
They've actually approached them separately: this is care policy, procedure, and regs,
and this is research.
And I think, frankly, what's happened is it's hard to keep both of those balls in the air,
because the care side alone is so complicated.
>>: I'm curious if those terms are actually pulled out in discussion, care and research.
>> Don Detmer: There are elements of that, yeah. And of course, if it's a matter of
public health, as you know, there's a set of law that says, yes, you can get something for
H1N1 or the like. The public good in that public health context is not part of this, but it
clearly is part of the total picture.
Now, if you look at the market segmentation on attitudes toward privacy -- just to get to
your earlier question or comment -- it actually hasn't changed a huge amount, at least
from my ability to look at polls on this over the years.
Since about Harris/Equifax in '95 it may have shifted some. But the fact of the matter is
about 20 percent of the people say, Huh? You know, there are different ways of saying
huh, but it's just not something that they think about or get exercised about.
Somewhere around half are what they'd call privacy pragmatists, in their terminology,
which is: well, privacy as opposed to what? What am I trading off on this thing? If it's
important enough, I might be willing to give quite a bit, whereas, on the other hand, if it's
something trivial, hey, what are you talking about?
Meanwhile, they defined privacy fundamentalists as being about 25 percent. But, as I
said, when you really get down to when people are faced with the choice on the
pragmatic side, it's probably closer to, I don't know, 5 or 10 percent. But that seems to
be the way it's playing out.
At any rate, the point is that in those data, people at that time had the highest confidence
in data being used by researchers -- and that may have shifted a little bit since.
The next highest confidence was in doctors, nurses, and hospital administrators. There
was so-so confidence in its use by insurers and employers. And, in fact, if you look at
HIPAA 2 under ARRA, how other business associates and such can use data seems
increasingly to be tightening down. Yes, sir.
>>: [inaudible] doctors, nurses, administrators in one pot, are they about the same?
>> Don Detmer: Yeah, they're pretty close. No, they're really quite close. And I think
the hospital administrators, mostly they were using it to manage the system. And they
were just seen as just actually part of the, if you will, the legitimate kind of care process.
But if you look at the full spectrum -- yeah.
>>: Go back to the slide. So does this [inaudible] that the general population has a pretty
sophisticated understanding of the issues?
>> Don Detmer: Yeah. I would say as general numbers go, but keep in mind these are
sample data. And it shifts some. So it's not, again, like this is really, you know, science
in a really robust .000, three standard deviations kind of thing. But as such things go.
>>: Is it statistically significant?
>> Don Detmer: Well, I'm not a social scientist in that. You'd need to have Eric answer
that for me. But at any rate --
>>: [inaudible]
>> Don Detmer: I mean, the numbers, they have some confidence in it. Let's put it that
way.
>>: My question would be, you know, what's the definition of -- you hear the word
privacy, on average, what do different cohorts think that means? That would be
interesting.
>> Don Detmer: Yeah.
>>: You said before that people forget about health [inaudible] I heard healthy, and I just
suspect similar stuff happens, especially in the medical privacy.
>> Don Detmer: Well, I mean, there's just no question. Having practiced surgery for 25
years: a patient on one day, as opposed to when they're sick -- their disease can affect
their way of thinking too. I mean, let's face it, it's not like we're separate from what we
are and do. So there are a lot of reasons why some of these attitudes are certainly
changeable.
At any rate, if you look at the full spectrum: Dan Macies [phonetic] has this comment that
in every global village you'll have a global village idiot. And I think part of the problem,
unfortunately, with some of the early days of computer technology is that the Wild West
idea -- everything ought to be free and open, let everything go -- was part of why there
were some concerns about how responsible this was going to be as an industry.
At one end, there's the mindset of, as I said, "I never saw a record I didn't enjoy." At the
total other end of this, you get: privacy versus what? I want to be left alone. I want to
remain left alone.
And then the purist is: I want everybody to be anonymous. I mean, there are actually
some folks who would just as soon be essentially off the scale entirely.
Anyway, the question is: where are we today? My assertion -- and I'll present data that
suggest where we are in the current bake-off of social goods -- is that at the moment
federal privacy legislation and regulation is increasing healthcare costs. It is adding some
greater privacy protection, but I'd say even the privacy people don't think it's adding
enough, and most other folks see where it could be improved. And I think HIPAA 2, or
ARRA, will in fact tighten up some of those things.
It clearly irritates some patients and citizens seeking care, signing forms and so forth. It
worsens the quality of care and safety -- I'll go through this -- while contributing to a loss
of medical privacy. I'll give a key example of that.
And I think the case can be made that it impacts negatively on the volume of all types of
biomedical and health research. And at the philosophical level, there's evidence, and
certainly scholars -- Onora O'Neill probably the most distinguished -- who have written
that it actually threatens community sharing and corrodes social trust and a sense of
community.
In one survey at the time HIPAA 1 was coming through -- a public survey that the
American Hospital Association, which you could say is hardly a neutral bystander, did
during the rule making -- 85 percent of people felt that consent requirements would
adversely affect the care of the elderly, because patients might have a hard time
understanding all of this, might get scared and so forth, and therefore wouldn't share data
that it would frankly be in their interest to share.
Time in doctors' offices would be spent on paperwork instead of care. People felt they'd
be spending time on this when you could spend the money on patient care and get a
better ROI.
I did review a paper for the journal -- the paper didn't get accepted -- but it was the only
thing I've seen that really tried to cost this out. By their analysis it added at least a dollar
per visit, or a dollar a day per hospitalization, to deal with the HIPAA 1 requirements.
People said it would be a hassle to sign the consent form before every visit to a doctor,
hospital, pharmacy, and so forth. And 60 percent didn't think it would actually enhance
their privacy.
So there was a reasonable amount of skepticism going into this. The point, though, is:
does current privacy policy impact privacy, safety, and quality?
And the argument I would make is that the fact that we don't have a unique health
identifier in this country, which almost every developed economy does have, is a
problem for quality as well as for privacy.
As to how that plays out: I was on President Bush's interoperability commission. The
day we presented our report happened to be the day we were also having our national
meeting at AMIA. I think there were five members of the commission who were on that
panel. And we went from the Capitol, where the report was released, to our meeting at
the Hilton and had a panel about what was in it.
And a woman during the discussion period raised her hand and said: I've got a question
that I hope the commission addressed. She said: My name is Mary Smith; maybe you'll
already guess where I'm coming from. But I would love to have the country allow me to
have a unique health identifier, because Mary Smith is not a really uncommon name in
the U.S. And she said: I don't want other Mary Smiths getting my health information,
and I'm absolutely sure I don't want my record to have the wrong Mary Smith's data and
then some doctor I come to thinking it's my data and doing something to me that is not
sensible.
Well, although, again, there's not a lot of science to prove that that happens, it's hardly
what you'd call a strange kind of logic. And I think the chances are that in fact it's real.
Now, people have said, well, you can do all this matching with algorithms, and you don't
really need a unique number. But the fact of the matter is, you do have a drop-off in
accuracy if you don't have more identifiers to use in that algorithm. The Social Security
number, for example -- when I was chairing NCVHS, matching on it in LA County was
supposedly a toss of a coin, because so many people had counterfeit Social Security
numbers.
But if you go to other places -- in Indiana, I gather -- having it in the algorithm gives you
about 15 percent better accuracy.
Well, shouldn't people be able to choose to have a unique identifier? It would reduce the
risk, at least for Mary Smith. Now, if your name is highly unusual, it's probably less of
an issue. But it still is, I think, a safety issue as well as a quality issue. So that is actually
not trivial, and AMIA has been on record for some years pushing it, because, as you
know, part of HIPAA was a set of unique provider, plan, and personal health identifiers.
But the -- pardon?
>>: [inaudible]
>> Don Detmer: Yeah. But it never went anywhere, because there was such an uproar
that Vice President Gore put the kibosh on it, and that's been maintained since. So even
the option to be able to do this is not available. We have a paternalistic view in there that
opts you toward privacy whether in fact you really wanted to go that direction or not,
which is part of my point.
>>: [inaudible]
>> Don Detmer: Pardon?
>>: The uproar, was it a privacy uproar or --
>> Don Detmer: Well, the privacy -- there was enough from the privacy side. I'm a
privacy advocate; I really feel like I am. In fact, as I say, I got the 1330 bill sponsored at
one point.
But the more committed privacy community, whatever you call it, made such an uproar
about it, and The New York Times was involved -- a lot of fear got into the mix of the
thing -- so they just said, we don't need that headache. And so they shut it down.
>>: [inaudible] about the implications for a national health system?
>> Don Detmer: Well, they've got a lot of things on their plate right now. And in fact
that was during the Clinton administration, and I just think they didn't want to mess with
it.
So, anyway, moving then to the question of research, which really has the most
information -- yeah.
>>: Going back to the Mary Smith problem, what happens if you say Mary Smith,
January 1st, 1985 -- name and birth date?
>> Don Detmer: Well, again, if you look across the U.S., there's probably not just one
Mary Smith who was born that day. Or was the birth date really recorded right? And do
you have different versions of these?
So, I mean, the point is that, you know, you probably still will absolutely want to use an
algorithm.
>>: So how about using the Social Security number together with the name?
>> Don Detmer: Well, as I said, all of these things give you some added accuracy. But
the fact is, if you can have another unique number, it does, at the margin, give you even
greater accuracy. And the point I'm trying to make is that right now our federal policy
does not allow us to even choose a number and say, why don't you use my passport
number or my state driver's license number as a unique health identifier; it's fine with me.
Okay. That's what I'm saying. That's not an option right now. And it has privacy
implications, you know, as well as health and safety implications.
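As a hedged illustration of the matching-algorithm point above: patient-matching systems typically compute a weighted agreement score across identifying fields. The sketch below invents the field weights, threshold, and identifier format for illustration; production systems (e.g., Fellegi-Sunter probabilistic linkage) estimate such weights from data.

```python
# Illustrative weighted record-matching score; weights, threshold, and
# identifier formats are invented for this sketch.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Record:
    name: str
    birth_date: str           # "YYYY-MM-DD"
    ssn: Optional[str]        # may be missing or unreliable
    unique_id: Optional[str]  # a voluntary unique health identifier

# Stronger identifiers carry more weight; losing one (like SSN) lowers
# the achievable score, the accuracy "drop-off" described in the talk.
WEIGHTS = {"name": 2.0, "birth_date": 3.0, "ssn": 5.0, "unique_id": 10.0}
THRESHOLD = 6.0  # score at which two records are declared the same person

def match_score(a: Record, b: Record) -> float:
    score = 0.0
    if a.name.lower() == b.name.lower():
        score += WEIGHTS["name"]
    if a.birth_date == b.birth_date:
        score += WEIGHTS["birth_date"]
    if a.ssn and b.ssn and a.ssn == b.ssn:
        score += WEIGHTS["ssn"]
    if a.unique_id and b.unique_id and a.unique_id == b.unique_id:
        score += WEIGHTS["unique_id"]
    return score

# Two Mary Smiths sharing a birth date but no strong identifier stay
# below threshold; a voluntary unique ID would settle the question.
a = Record("Mary Smith", "1985-01-01", None, "VA-DL-12345")
b = Record("Mary Smith", "1985-01-01", None, "VA-DL-99999")
print(match_score(a, b), match_score(a, b) >= THRESHOLD)  # 5.0 False
```

Each additional strong identifier available to the score is what buys the extra matching accuracy the talk describes.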
Okay. Research. Being at Microsoft Research, this is probably where I think the best
evidence exists that it's a problem.
There have been, in the first instance, a number of survey studies -- from the Association
of American Medical Colleges, American Hospital Association, American College of
Cardiology, American Society of Clinical Oncology, NCAB, AHRQ, AcademyHealth --
that have suggested and shown that HIPAA does in fact impact negatively on biomedical
research and health research across the board: whatever kind of research you're engaged
in that relates to person-specific information.
In 2007-09 the Association of Academic Health Centers did as well. But probably the
most distinguished and influential report just came out from the Institute of Medicine.
The point is, none of these studies has produced conflicting evidence saying, oh, no, it's
not so bad. Everybody is pretty much in agreement that this is having an impact.
And in fact there are some people who feel it's much easier nowadays to do research in
other countries, because the costs to do certain kinds of studies are generally not cheap,
and by the time you have to do all of this data management as well, it just becomes, at
the margin, something that's problematic.
Now, the question is: will ARRA -- HIPAA 2 -- make this even worse? And I think there
are a couple of areas that are concerning some of us. One, it's advocating the use of
minimum datasets, and it's in fact changing the rules on when data count as anonymized.
It also brings more business associates into it, with penalties: if the data are misused, the
business associate may pay quite a penalty.
The UK went through this when I was over there at Cambridge, and they sorted it out.
But at the time they were, again, exposing more institutions to greater culpability, and
the institutions were saying: why do I want to run the risk of having more problems
come down on my head? If I don't give data for research, I save myself potential
exposure.
>>: How do you define minimum?
>> Don Detmer: Well, that's part of the rule-making process. So you can go to the Web
and check some of that out. But the fact of the matter is --
>>: But for research tasks it might be a bit challenging, especially when you can
discover not just known variables but hidden variables at work, with datasets that say
just give me these three variables.
>> Don Detmer: Well, and this is where I think, again, it depends on the kind of research
you're doing. But again at the margin, I think it would make things more problematic.
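To ground the minimum-dataset point, here is a hedged sketch of one plausible reading: project records down to only the fields a study requests, then check k-anonymity over quasi-identifiers before release. The field names and the choice of k are assumptions for illustration, not HIPAA's actual de-identification standard.

```python
# Hypothetical minimum-dataset release with a k-anonymity check; field
# names and k are illustrative, not the HIPAA standard.
from collections import Counter

def minimum_dataset(records, fields):
    """Project each record down to only the fields a study asked for."""
    return [{f: r[f] for f in fields} for r in records]

def is_k_anonymous(records, quasi_ids, k=5):
    """True if every quasi-identifier combination occurs at least k times."""
    combos = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(count >= k for count in combos.values())

records = [
    {"zip3": "981", "birth_year": 1950, "sex": "F", "dx": "asthma"},
    {"zip3": "981", "birth_year": 1950, "sex": "M", "dx": "diabetes"},
] * 5  # duplicated rows just to make this toy dataset k-anonymous

study = minimum_dataset(records, ["zip3", "birth_year", "dx"])
print(is_k_anonymous(study, ["zip3", "birth_year"], k=5))  # True
```

The questioner's worry maps directly onto minimum_dataset: a hidden variable you never requested is simply gone from the release, so discovery-oriented research pays the price even when the dataset itself is privacy-safe.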
Now, there's a set of timelines on this, and in the interest of our time I'm not going
through them; I'd also like to leave time for open discussion.
But there's a set of things that relate to all of this. And they're arcane enough, but
important enough, that I think the research community should really show more interest,
at the margin, in tracking some of them than, as far as I'm aware, they're necessarily
doing today.
That's a personal view and, you know, I think there is interest; these things are being
tracked.
Well, what did the IOM say? In "Beyond the HIPAA Privacy Rule," they basically said
two things: privacy isn't as protected as it needs to be with the current way we're doing
it, and we also are impacting negatively on research.
We could probably build a better mousetrap. The report says HIPAA doesn't protect
privacy and security as consistently as it might. I think ARRA will actually help some of
that by broadening some of the coverage.
But it also doesn't facilitate legitimate research, especially database and repository
research. And it's fascinating that if you look at ARRA, it talks about comparative
effectiveness research and using data repositories to look at comparative effectiveness.
So it's kind of fascinating that there's a discordant tension in there, which was part of the
reason why I thought it might be kind of an interesting talk -- to try to summarize where
we maybe are on this. Because even Congress seems in a sense to recognize that it wants
two things, but it isn't necessarily managing the two of them simultaneously.
Yes.
>>: [inaudible] to illustrate that security and privacy are not really protected despite
HIPAA 1?
>> Don Detmer: Well -- the question was, what's the evidence that we've got that big a
problem, if you will. I think a lot of it is the occasional highly publicized situation where
a Veterans Administration computer turns up in somebody's car or is stolen, or thousands
of Social Security numbers -- so, in fact, it's really fascinating.
Keep in mind, with Medicare, since 1965, all health data on everybody over 65 has been
sent to the government in order to get payment. And how many breaches, I mean, you
know, have you heard about?
So, you know, clearly at some level there's a disconnect between the actual performance
and, in some instances, the popular sense of how serious this is. A lot of it, though, is
that if you really are a sophisticated hacker, it's really tough to protect all of this. You
guys are experts at this. So it's not as though you can run -- but it is kind of hard to hide.
So it's both a science issue and a perception issue too. And obviously you live off the
perception issue. Yeah.
>>: [inaudible] unencrypted [inaudible].
>> Don Detmer: Well, I mean, I think the fact is there are problems, and I don't think it
would be realistic to stand here saying there won't be problems. It is an issue. These are
competing goods. They are goods, and they're in competition. You're not going to come
up with an ideal situation.
>>: [inaudible]
>> Don Detmer: Pardon?
>>: The financial investment [inaudible] so, I mean, there's an example we're talking
very high quality, very high value information [inaudible].
>> Don Detmer: [inaudible] we have a [inaudible] financing system. I can go use an
ATM machine in Italy or wherever when traveling, and somehow it seems to basically fly.
Yes.
>>: I would add, every single privacy document I get from a financial institution says we
will not disclose except as required or permitted by law. That means I can do what I
damn please.
>> Don Detmer: Well, this is part of the other issue, because for payment and quality
and all these sorts of things there are a large number of players, again, in our system that
will see your information. And, believe me, fraud and abuse is not a trivial consideration.
I mean, the President the other night was saying, well, we could probably pay for most of
our healthcare problems if we just clean this up.
And that's not totally off the mark. Fraud and abuse in this space is really huge.
Anyway, the IOM said that HHS should really develop a new approach, and that it
should favor de-identified datasets. And it's interesting, as I said, that limited datasets
are included in ARRA; it's not clear how that will come out.
But at the end of the report they're saying we should just exempt all research from the
HIPAA rule. Research is enough of a social good that we don't want to pay that much of
a social price. We need to protect privacy better where we can and should, but we ought
to manage to treat research as something that is obviously important -- I think taking the
sense that, gee, at the start of the genomic era and such, there might be some things that
could really improve patient health.
If research didn't matter, then so what. But increasingly I think it does.
And as I commented earlier, the fact is the U.S. is taking a different approach to this
than, say, Canada. Yes, sir.
>>: [inaudible] de-identified data. When you start talking about genomic information,
it's really hard to get de-identified data at that point in time.
>> Don Detmer: Yeah. You're not just whistling Dixie. In fact, Russ Altman, who's a
professor of bioinformatics and computer science and health and everything at Stanford,
gives an annual year-in-review presentation at our translational bioinformatics meeting.
And on March 15 he talked about how tough it actually is to, if you will, anonymize your
genomic data.
There are a lot of fingerprints in a lot of SNPs. And if you really want to deal with that,
it's not a trivial task, as you pointed out. And I think there was general surprise from
some of this more recent research; it's a little tougher to do than people were thinking
even two years ago.
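A back-of-the-envelope calculation shows why this surprised people, under the simplifying assumptions that each common SNP contributes roughly one bit of identifying information and that the SNPs are independent:

```python
# Rough identifiability estimate; assumes ~1 bit per informative SNP
# and independence between SNPs, both simplifications.
import math

world_population = 8e9

# Smallest panel size k such that 2**k distinct genotype profiles
# exceeds the number of people on Earth:
k = math.ceil(math.log2(world_population))
print(k)  # 33 -- a few dozen SNPs already act like a fingerprint
```

Real SNPs carry less than a full bit each and are correlated, so practical panels are somewhat larger, but the conclusion stands: removing names from rich genomic data does not make it anonymous.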
>>: [inaudible] if that's the target, then you're going to still be in conflict where you
can't really --
>> Don Detmer: Well, and that will get to part of my reasoning at the end: why I think
we need to think about -- not downgrading privacy, but, from the IOM perspective,
finding more creative ways to also let people express themselves as free citizens in a free
nation, to say how they'd choose to have their data used, and to give them some options
to do that.
>>: We can't do that now?
>> Don Detmer: I've said that as a citizen today, the government does not help me have a
unique personal health identifier. Okay. Meanwhile -- but they do --
>>: Can I do that on the side? So if you look at the personal genomic space, right,
[inaudible] so you're getting genomic information, you can fill out those surveys, all this
other stuff --
>> Don Detmer: Yeah, but I'm talking as a matter of government policy. I mean, it's a
free country. You look at PatientsLikeMe or FasterCures, or some of these where people
are giving all their data. So there's no question.
Okay. So the other thing that's true about this: if too many people opt out, the
free-rider problem comes up, and nobody leaves the station. So it's not as though it's
simply a trivial matter that we'll just let people totally choose.
On the other hand, I'd say the evidence to date suggests that you would get cohort sizes
that are pretty darn large. If you're up to around 85, 90 percent, that's pretty good. So it
looks like, frankly, you'd come out okay. But we don't give people that opportunity as a
matter of government policy. Yeah.
>>: The cohorts, or the 85 percent, may be enough to skew the [inaudible]
badly. The recent report on HRT and breast cancer [inaudible] depending on the
[inaudible] one set of [inaudible] was voluntary, the other was not, and the numbers
flipped around [inaudible].
>> Don Detmer: No, it's not a trivial issue. You're absolutely right. In fact, Minnesota
during the '80s or early '90s, I guess, passed a law that you had to have people's
permission to use their data for research. Mayo Clinic had been in operation a hundred
and some years. They never had a complaint from Olmsted County, from anybody, ever,
in their history. And then they had this law.
Well, Mayo Clinic is not a trivial institution in Minnesota, and neither is the University
of Minnesota. And so the legislature came back, rethought it, and said, well, you've got
to make a reasonable old college try to reach them to get an answer, but if you can't
reach them, you can use their data. So they changed the law and, again, it's the same sort
of thing moving along.
But the interesting thing was that during that period, it was disproportionately young
women who were opting out. So if you're talking about breast cancer in a young cohort,
the opt-outs sort of make the decision for everybody. So it's not a totally free kind of
thing. And I think your point is a good one.
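The Minnesota anecdote can be made concrete with a toy simulation; every rate below is invented for illustration. If young women opt out disproportionately, a condition concentrated among them looks rarer in the consenting cohort than it really is:

```python
# Toy simulation of opt-out selection bias; all rates are invented.
import random

random.seed(0)
population = []
for _ in range(100_000):
    young_woman = random.random() < 0.20                           # 20% of people
    disease = random.random() < (0.05 if young_woman else 0.01)    # assumed risk
    opted_out = random.random() < (0.40 if young_woman else 0.05)  # assumed opt-out
    population.append((disease, opted_out))

true_rate = sum(d for d, _ in population) / len(population)
cohort = [d for d, out in population if not out]
observed = sum(cohort) / len(cohort)
print(f"true: {true_rate:.4f}  observed in consenting cohort: {observed:.4f}")
# The high-risk group is underrepresented after opting out, so the
# cohort systematically underestimates the disease burden.
```

So even a 90-percent-plus participation rate does not guarantee an unbiased cohort when the opt-outs cluster in one subgroup, which is exactly the point raised here.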
At any rate, I think it's also true that many research projects absolutely need to
authenticate who they're dealing with. They're not interested in Don Detmer per se, but
they're interested in knowing that it is this particular person. So sometimes you
definitely do need to be able to know that. And of course, Eric, I think some of your
research has tried to deal with that. Some questions you can answer and get pretty good
data on anonymously, but other kinds of findings are more robust if you actually know
who you're dealing with.
Anyway, Madison -- I quoted Jefferson a little bit. Another local, through UVA and
Virginia, James Madison, who was one of the original Founders, said: "A popular
government, without popular information, or the means of acquiring it, is but a prologue
to a farce or a tragedy. Knowledge will forever govern ignorance; and a people who
mean to be their own governors must arm themselves with the power which knowledge
gives."
I mean, again, we've had a tradition -- if you look at our census, it goes back to the
founding of the country -- that has typically argued that data, accurate data, the best data
we can get, is really a social good and something that we should pursue.
I'm running a little short of time. I think the point is that some of us were not as strong
on this as we might have been when we were going through all of this. We were
focusing on it more from the privacy, security, and confidentiality side and didn't really
have a more robust model. I'm partly responsible: I started the working group on the
National Health Information Infrastructure, and that model didn't include either
education or research.
But the fact is we need a more robust model to be thinking about this. I think more
recently I think the Office of the National Coordinator, struggling with a lot of important
things, was continuing to do that as well.
And of course I think there has been enough activity from the stronger privacy advocacy
side and some of the mental health community to offset the fact that the researchers have
not really come out of the woodwork on this. They've got a lot of things they're thinking
about and worrying about, and they really haven't stood up sufficiently to balance it out.
And as Madison also noted: Nothing is so contagious as opinion, especially on questions
which, being susceptible of very different glosses, beget in the mind a distrust of itself.
And there have been concerns where a very egregious lapse in a specific place has been
generalized, and it seemed like this must mean it's all going haywire. Exactly. So that's
hardly new to our time.
So, anyway, the thing that I'm really positing here is that we really do need both policies
and standards that try, in a free society, to allow us to better manage these things, without
saying that privacy isn't really important.
So the question is how we enable this. I think we need a new model on the standards
side, and then I think we also need a few new policies that in some instances could be
federal and in some instances could be state.
These slides will be available, I think -- in fact, there are a number of people, I guess,
who are watching this elsewhere now -- and it will be possible to go down through this.
But the point is today we don't have a single consent process to allow citizens to share
their personal data. If they say, look, I'm interested in the public good, you know, at age
70, what do I got to hide or worry about, particularly, I'm willing to take the gamble.
You can use my data and put my name on it. That's just fine with me. And I'm willing to
take whatever downside there is that might come from that.
If you don't want to do it, you don't have to do it, but you should be allowed to do it.
And rather than having to do it institution by institution and program by program,
actually have ways to facilitate access -- as well as, obviously, ways for those who don't
want to do that not to do it.
So I think the point is that in a lot of these areas it's not like what we've done is wrong; I
think it's just not quite been balanced.
And so for now, though, this is going nowhere. This unique identifier thing is dead in the
water at the federal level.
And I think the point is that my argument would be: why don't you allow the citizen to
choose an alphanumeric number, and have a site where that's done?
And also, as you'll see, related to this, I read this book Nudge -- I don't know if you've
read it. But it makes an interesting point from the cognitive psychology area: if you can
just nudge people a little ways, it doesn't really cheat anybody, and it, frankly, makes
society better, puts everybody in a better situation.
They use the term libertarian paternalism. As I've said, we have paternalistic policy
today, but we don't have the offsetting libertarian policy at the same time.
So, anyway, the reality today is that we don't really have a proposal for action to allow,
you know, informed consent across a broader kind of scale for legitimate research. You
could do it based on IRB-approved research.
And obviously, when you talk about the genome and so forth, this is a time when these
are not trivial kinds of questions for human health and disease.
So I think we have a question here that's real.
And also, it's not as though the citizens aren't putting billions of dollars into the research
as a nation [inaudible].
So the point is there's a lot at stake, with, for example, what, 38 billion going to our own
national health IT infrastructure for electronic records and such.
So we clearly want this, but we don't necessarily have it as balanced as we might.
So what I'm personally advocating -- and I've talked to a few states -- I'd like to see... I
have on my driver's license in Virginia a little heart; I can say I'm willing to have my
organs used for transplant. At my age, my corneas are probably about all they'd be
interested in.
But the point is, I'd like to see a few other icons there: one where, if there's not a slash
through it, you can use my driver's license number as a unique health identifier; another
where, if there's not a slash through it, you can pass my data around among health
professionals -- with a Web site so docs in institutions could check and see where I stand
on that.
Have another one, a little microscope, saying you can use my data for IRB research. And
I'd like to have in some states an option to say: if a protocol comes up that might apply
to me or my family, you'll push something to me so I can maybe sign up for it and
participate.
And then the last one would be a little DNA spiral that would say you can use my genetic
information for that. That would totally change the picture of things, and I think it's
totally consistent with a free society. The state level is probably the best way to do it
instead of federal, frankly, because currently our main ID is our state driver's
license, and the organ donation marker is something the public is certainly familiar
with.
But I'd prefer to see it actually as a mandated choice: you'd have to choose one way or the
other at the time you got your license. But in addition, what I'd like to do is have some
pilots, where --
>>: [inaudible] doctors might get confused with do not resuscitate.
>> Don Detmer: Well, whatever. I'm not here to advertise the icons. I think that's a
good point we'd need to figure out, certainly, along with the after-death [inaudible] and all
that business; we probably ought to talk through those things. But some kind of simple
thing, is what I'm saying, would have some credence. I mean, who's really using a
microscope today, either? But you need some icon that might work.
And that's part of why pilots would be smart too; you'd obviously have user groups that
would look at some of this. So we could look at opt-in, opt-out, mandated choice, and so
forth.
So, anyway, I'm talking about this. I'm on the board of Bob Kahn's Corporation for
National Research Initiatives right now, and I think we've been talking about some ways
to deal with keeping those data secure in that kind of approach.
And obviously it'd be nice if Washington State wanted to be a partner in that. Clearly
they do a lot of research.
Anyway, I probably have had too many slides in this deck, but I really do appreciate the
opportunity to come, and I'll be interested both now as well as later on the feedback you
might have literally in any direction. I've been at it long enough that there's no such thing
as unimportant feedback. And that would be great.
So thanks again for the invitation. I don't know if you're interested, but if you are
interested in our annual symposium, it's middle of November, not too far from here. So,
anyway, thanks very much. Thanks again.
[applause]
>>: A while back, when my credit card was compromised, I got a new number and a new
card maybe two days later. Try that with Social Security numbers. And think what
happens if somebody from Virginia moves to Maryland, God forbid. And until they
[inaudible] as well as a cheap credit card company, I have my doubts.
>> Don Detmer: Well, maybe they'd subcontract Microsoft to do it.
[laughter]
>>: So one of the things that would be interesting, talking about medical records, is: who
owns medical records? Today, I don't know, as I look at my credit report and all
this other stuff, the information that's stored about me is not owned by me. And I have
no control whether they use it for -- the banks use it for research, they have all our credit
card records and they use it to profile me and do lots of stuff.
Has there been any thought that says, you know, we should really make medical records
owned by the individual? If you have a test, whatever it is, you've got the results
available to you, and then you could go and participate in studies and give your
information.
Now, you could do that individually, one at a time, or you could put it in a repository that
then made it available, and things like this.
>> Don Detmer: Well, in fact, the Congress did recently pass something along that line,
and I think it's a terrible law, personally, and I'll tell you why. Just from my
perspective.
It says basically that if you don't want certain data to go into your record, and you pay for
the care yourself rather than the insurer getting the information, then it doesn't need to be
in your record.
What I don't like about that is that, oh, okay, so if I'm poor and I don't have the money to
be able to do that, I have absolutely no choice. I just think it's not good law.
So from my perspective --
>>: [inaudible] define the law differently than I would have done, is that what you're
saying? I have to pay --
>> Don Detmer: Well, that's what the Congress has done on that. I mean, you can opt
out, but you have to pay the freight if you're going to opt out. But the point I'm making
is that it's basically regressive. It's very regressive.
>>: Yes.
>> Don Detmer: And, so, I mean, I'm not saying you're advocating that, but I'm just
saying that's the closest thing we've tried today to have a way of letting people protect
their information.
Now, the other concern, of course, on that is: as a clinician, if I'm subsequently taking
care of a patient who's decided not to have this information in their record, and I
prescribe a drug that would absolutely be contraindicated if I knew that information, is
the government going to protect me from, you know, a malpractice claim, if you will,
given that it's the patient who made that choice?
So I think this is another dimension of social good that's really, really critical. And I
think there's a lot there. We have a chat room at the American College of Medical
Informatics, and we had a lot of buzz on this issue. And our feeling, frankly -- it's not a
formal policy, but it's sort of the sense of the group -- is that we really need to make sure
that if people choose to take that route, they realize exactly what they're exposing
themselves to.
It's much more serious than they might think on the surface. They can't necessarily have
their cake and eat it too, because I don't think, in fact, Congress is likely to give
clinicians that kind of malpractice protection, if you will.
So then the question is, well, if somebody chooses to act that way, would you decide to
just say, look, you need to find another care provider; I'm just not willing to take care of
you under those circumstances. It's too problematic for me as a caregiver.
>>: I think the law you described is a totally worthless thing, and so that's not what I
was saying -- where you can pick and choose what goes into your record and what doesn't.
>> Don Detmer: Well, but that's currently the [inaudible].
>>: I understand that. I appreciate that. I'm proposing an alternative, then, that really
says, you know, fine, you guys, the insurance company, have a record as well. But no
matter what goes into it, I get a copy --
>> Don Detmer: No. Historically, look at the data ownership. Historically an institution
and a clinician have owned the record. And, in fact, when a physician changes
practice, he oftentimes sells his records to the next provider that's coming in.
So it's not a clean "this is mine, if you will, for all time" and so forth. In fact, we just did
a couple of studies, a couple of policy projects, talking about secondary health data.
When we did the IOM study in '91, we basically said that if data were involved in care
itself, they were considered primary health data, and all other uses were considered
secondary.
We recently said, look, we really should avoid this issue of ownership because it, frankly,
leads to just a lot of discussion and not good public policy.
Instead, what we should talk about is what good stewardship principles are for anybody
who's going to have access to person-specific health information, and what, in fact, the
right way to treat these data is -- with a fair amount of moral guidance as well as,
obviously, some legal constraints. And so, in other words, secondary data, as we
ultimately defined them, were only those data being sold to somebody without any
conceivable benefit back to me: I'm not being paid for them, and no good is likely to
come back to me from the transaction. Those, we now say, are what you ought to
consider secondary data.
But otherwise, to run a healthcare system and look at quality -- and, in fact, ARRA --
HIPAA 1, rather -- basically said that business operations, quality, payment, and so forth
are just a part of care. That's basically how they defined it.
And now they're trying to tighten up some of that. And with personal health records,
how do you look at the other kinds of business associates that might see some of those
data, and put some of those protections in place -- whether, in fact, they say, look, we
don't need it, we're good guys and gals, or not.
Yes, sir. I'll be happy to stick around too.
>>: [inaudible] corporation a fake phone call from some charlatan claiming to be a
doctor and saying he has [inaudible] twice a minute in the U.S. It's a big country.
Numbers are rough. Numbers are a little [inaudible].
>> Eric Horvitz: Maybe you guys can take that offline, that discussion. Okay.
[inaudible] did you have anything?
>>: Yes. My question is, it seems like there are two issues that are almost orthogonal, but
in your presentation they were very tightly interleaved: identity and privacy. And at some
level, some might even argue you don't need identity to have privacy, or that one helps
the other. So why is it so critical to address [inaudible].
>> Don Detmer: I think it's just the way the policy dialogue has kind of worked out. I
think you're right. I mean, how some of these things end up in the same bed doesn't
necessarily have an inherent logic, if I'm tracking your question. I don't know.
But if it gets into the policy debate, then it's in the debate and you get to deal with it
whether it kind of makes sense or not.
>>: Do you have evidence around different models that other countries --
>> Don Detmer: Yeah. Oh, yeah. Sure. But obviously I didn't have time to get into it. I
mean, when I was in the UK, I was actually party to some of the discussions. In fact, I
was a member of the Nuffield Trust and we had a policy meeting one night, and Lord
Hunt, who was the undersecretary for health at that time, as we were discussing this, he
says, well, now that you mention it, this is a serious problem. Right now I got a
committee in here that's looking at this.
And so normally we would break for a nice dinner. And instead of breaking for dinner
and letting it go we went back into session and tried to give him some advice. And I'll
leave it at that.
But the point I'm making is yes -- this has certainly been looked at, and I think some
countries have done a much better job. And some of it is actually because of how
important the biomedical research industry is to the nation's economy. It's not like there
aren't economic dimensions to what we're talking about too. And, in fact, that's also an
issue here that I obviously didn't talk about at all.
>>: That's a very interesting point. So I can foresee, say, some smaller countries getting
a comparative advantage from having the right kind of policy [inaudible].
>> Don Detmer: Well, clearly you look at stem cell research. I mean, obviously that's
the most recent example. This is global market. You look at medical tourism to
Bangkok.
And, again, we've priced an awful lot of people into a spot market internationally, and it's
not a trivial issue. The physicians there are all board certified in American medical
specialties, so it's not like they're not well trained. The hospitals are very good.
So, yeah, there's no question we are, in fact, a global kind of market. Unfortunately,
there's also opportunity for abuses in some countries, for exploiting people.
So from my perspective, again, it's part of why I'd like to see us deal with this better: I
think the fact is the world does pay some attention to what we do at the margin. And if
we had better policy, I think everybody could benefit from that as well. So there's a little
bit of selfishness in that context too.
Anyway, thank you very much again for the invitation.
[applause]