Session III Transcripts

November 21st (AM) Session III: Physicochemical
Application of Physicochemical Data to Support Lead Optimization by Discovery
Teams
Li Di (Wyeth Research)
Question: I have a lot of questions regarding PAMPA. Even though in our company
we don't have thousands of compounds to screen, the chemists pushed the ADME
group to develop PAMPA. We tried the PAMPA system , but we had a lot of problems.
First of all, to deliver the 5 µL or 10 µL lipid components based on the protocol, we
created a lot of leakage. The second thing was the bubbles; this tiny 96-well plate, the
bottom has a concave shape, and it's very hard to avoid bubbles. Another issue is how
to look at PAMPA values, because we got used to the Caco-2 data and then all of a
sudden, we switched to PAMPA. The permeation values are so different between
Caco-2 and PAMPA. Also some compounds show pretty good permeability in Caco-2,
but when we switched to PAMPA the rank order was different. So, it's hard to interpret
to the chemists. I wonder how you handle this and if you've ever had these kinds of
problems.
Li Di, Wyeth, Response: I understand that a lot of people are developing their own
PAMPA in-house instead of going to the experts. We went to the experts; they
developed all of the assays and optimized the conditions. We didn't have any leakage
problems or bubble problems, but there are some tricks to doing that. In our company
we also have Caco-2 data and PAMPA data. We did comparisons between the two
assays for the compounds that are transported by passive diffusion. The two assays
showed a very good correlation, with R2 around 0.8. So we have confidence for those
compounds. Although the absolute values of PAMPA and Caco-2 might be different,
the compound ranking is similar. For compounds that are either efflux substrates or
transported by active processes, the two assays show discrepancies, and that's
natural because Caco-2 has other mechanisms; we use Caco-2 as a diagnostic tool.
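As a rough illustration of the kind of comparison described here (not the speakers' actual analysis), the sketch below correlates PAMPA and Caco-2 Papp values for passively transported compounds on a log scale, where absolute values differ but rank order should agree; all numbers are hypothetical.

    # Minimal sketch: compare PAMPA and Caco-2 permeability for passive-diffusion
    # compounds by correlating log-transformed Papp and checking rank order.
    import numpy as np
    from scipy import stats

    # Hypothetical Papp values (10^-6 cm/s) for a handful of passive compounds.
    pampa = np.array([1.2, 4.8, 10.5, 22.0, 35.0])
    caco2 = np.array([2.0, 6.5, 12.0, 30.0, 55.0])

    # Absolute values differ between the assays, so compare on a log scale.
    r, _ = stats.pearsonr(np.log10(pampa), np.log10(caco2))
    rho, _ = stats.spearmanr(pampa, caco2)

    print(f"R^2 (log Papp): {r**2:.2f}")        # ~0.8 was quoted for passive compounds
    print(f"Spearman rho (rank order): {rho:.2f}")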
Question: I was wondering if you're doing anything about polymorphism, because
you're speaking about solubility. Now, I know from personal experience that we have
run into the polymorphism trap more than once. Are you doing something about that,
and how early?
Li Di, Wyeth, Response: In early drug discovery, usually only 10 or 20 mg of
compounds are synthesized, so we don't usually characterize the crystal forms of the
compounds. In discovery, we try to be optimistic about the solubility; usually
compounds that dissolve in DMSO tend to precipitate as amorphous material when they
are added to aqueous buffer. This actually gives you a higher reading of solubility than
crystalline material. So in early drug discovery we don't usually characterize crystal
forms.
Question: You started with really good solubility because of the amorphous material. But
any time you get crystalline material, you don't have a drug because it's not soluble.
Li Di, Wyeth, Response: If it is crystalline, the solubility can be much lower. In that
case, it gives you an indication that you may need to go forward with some kind of solid
dispersion of the amorphous form in development.
Question: Li mentioned that the Caco-2 assay usually operates with a pH gradient;
you have a lower pH on the apical side than on the basolateral side. When you run the
assay in the B → A direction in order to probe for PGP efflux, would it be advantageous
to have the same pH on both sides?
Li Di, Wyeth, Response: Exactly. So if you want to determine PGP, you want to do an
iso-pH instead of a pH gradient.
Question: Are you running the Caco-2 assay routinely under those conditions?
Li Di, Wyeth, Response: Yes, we used to run the pH gradient to mimic the
gastrointestinal conditions. Right now we're running an isocratic pH, 7 to 7. We're also
adding inhibitors to make sure that it's a PGP substrate.
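A minimal sketch, with hypothetical Papp numbers, of how bidirectional Caco-2 data run at iso-pH with and without an inhibitor are commonly reduced to an efflux ratio to flag PGP substrates; this is a generic calculation, not the specific Wyeth protocol.

    # Reduce bidirectional Caco-2 data to an efflux ratio.
    def efflux_ratio(papp_ab: float, papp_ba: float) -> float:
        """Efflux ratio = Papp(B->A) / Papp(A->B); values above ~2-3 suggest efflux."""
        return papp_ba / papp_ab

    # Hypothetical Papp values (10^-6 cm/s), with and without a PGP inhibitor.
    er_control = efflux_ratio(papp_ab=2.0, papp_ba=12.0)
    er_inhibited = efflux_ratio(papp_ab=6.0, papp_ba=7.0)

    print(f"Efflux ratio, control:   {er_control:.1f}")
    print(f"Efflux ratio, inhibitor: {er_inhibited:.1f}")
    # A high control ratio that collapses toward 1 in the presence of the inhibitor
    # is consistent with the compound being a PGP substrate.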
November 21st (AM) Session III: Physicochemical
Computational Methods for Physicochemical Property-based Optimization of
Drug Discovery Leads
Philip Burton (ADMETRx)
Question: With your physiologically based sensitivity modeling, one thing that strikes
me is that if you modify anything to address one parameter, there are about four different
parameters that are going to be impacted. The two that strike me the most are hepatic
clearance—if you do something, say, change the log P, you're obviously going to change
the hepatic clearance end result—and the unbound fraction, which is also going to
change. How do you view using the sensitivity modeling within a drug discovery setting,
taking that into account?
Phil Burton Response: These are sort of first generation models. I think as they
evolve those functional interrelationships can actually be programmed in so you can
sort of make a global change to an input parameter and then have that accommodated
in all the downstream properties. Right now we have to do that manually: you do a
stochastic sensitivity analysis where you simultaneously change a number of
different properties, but you have to do it by hand. Again, it gives you an idea of
the limits one might expect given those sorts of changes in the input
parameters or the structure of the molecule.
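To make the idea of simultaneously varying several input properties concrete, here is a minimal Monte Carlo sketch over a toy one-compartment oral model; the model, parameter ranges, and outputs are hypothetical and are not the speaker's actual software.

    # Toy stochastic sensitivity analysis: sample several inputs at once and look
    # at the spread of the downstream outputs.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    dose = 10.0  # mg, hypothetical

    fa = rng.uniform(0.3, 0.9, n)            # fraction absorbed
    cl = 10 ** rng.uniform(0.0, 1.5, n)      # clearance, L/h (log-uniform)
    fu = rng.uniform(0.01, 0.3, n)           # unbound fraction

    auc = fa * dose / cl                     # toy oral AUC, mg*h/L
    auc_u = fu * auc                         # toy unbound AUC

    for name, x in [("AUC", auc), ("unbound AUC", auc_u)]:
        lo, hi = np.percentile(x, [5, 95])
        print(f"{name}: 5th-95th percentile = {lo:.2f} to {hi:.2f} mg*h/L")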
Question: Has anyone said anything about extending this to PB-PK?
Phil Burton Response: Yes, it has been proposed. Actually, one of the references
that I included in the list was a report from the PB-PK workshop that was sponsored two
years ago or so in Washington, and that was one of the items that was identified as
highly desirable for future development of the capability.
November 21st (AM) Session III: Physicochemical
Application of Statistical Analysis Techniques for Discovery Compound Property
Optimization
John Ellingboe (Wyeth)
Question: In the last data slide you showed, it appeared that the MIC values of your
improved molecule were higher than the solubility values you found. What do you
think about that?
John Ellingboe Response: I don't remember what the conditions were for the MIC
assay, but they're not the same as the conditions of the solubility profiling measurement.
So, at the moment that's the only explanation I have, but it is something we noticed.
Question: Can you tell us what a medicinal chemistry group would need to do as
background to bring these kinds of models in and start these types of multivariate
analyses in their laboratory?
John Ellingboe Response: The software that we use, which is from Umetrics, is very
easy, very straightforward to use, but the key thing is to have some training in what the
results mean. Our department actually went through about two weeks of training on
these two software packages.
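The speakers used commercial software from Umetrics; as a rough open-source stand-in, the sketch below fits a small PLS model on synthetic descriptor data, so the kind of output a chemist would be trained to read (a cross-validated fit and descriptor weights) is concrete. Every value here is illustrative.

    # Minimal PLS sketch on synthetic data (not the Umetrics workflow).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    X = rng.normal(size=(60, 8))   # 8 hypothetical calculated descriptors
    y = X[:, 0] - 0.5 * X[:, 2] + rng.normal(scale=0.3, size=60)  # synthetic response

    pls = PLSRegression(n_components=2)
    q2 = cross_val_score(pls, X, y, cv=5, scoring="r2")
    print(f"Cross-validated R^2 (a stand-in for Q2): {q2.mean():.2f}")

    pls.fit(X, y)
    print("Descriptor weights on component 1:", np.round(pls.x_weights_[:, 0], 2))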
November 21st (AM) Session III: Physicochemical
Panel Discussion
Li Di, Phil Burton, Val Stella, John Ellingboe
Question: This topic came up at the previous workshop. Someone said it seems that
we pick the low-hanging fruit in terms of profiling molecules for their drug-like
properties—I don't mean to trivialize things like solubility or permeability by passive
diffusion or Phase I metabolism. So, the question is: what are the unmet needs? In
other words, what are the properties out there that we really need to assess early in
drug discovery, where maybe the methodology isn't available? I ask this question
because I see the field as kind of stagnant in many respects; I don't see people going
much beyond where we are right now. Who's going to come up with the new
methodology? So maybe some of you could address that question and then maybe
some of the panel from yesterday could also.
Val Stella Response: I think the thing that Phil talked about is the one area
where we have to have a breakthrough: the ability to predict the pharmacokinetic
properties of a drug. You solve the problem if you can design in ideal pharmacokinetic
properties: reasonable clearances, reasonable half-lives, reasonable volumes of
distribution, and properties like that. If you have that—yes, everything else can kill it—
but I think we've yet to reach the point where we can a priori predict pharmacokinetic
properties. That would be a major breakthrough.
Phil Burton Response: To follow up on that, and this is kind of a personal prejudice:
several years ago we began to implement this practice of reductionism, trying to reduce
the problem down to constituent components and understand each one of them
individually. It's been useful, but we've become somewhat myopic; we're now faced with
the problem of understanding the individual components well but not understanding very
well how to put them back together again to predict the whole. Biology tends to be
complex; if you believe in complexity theory, there are emergent properties when you
begin to put things back together again. And I think the real unmet need is how to put
together the elements that we can measure very well in the laboratory, in a biologically
relevant way, to make predictions about performance in vivo.
John Ellingboe Response: I think that the point that Phil made is very interesting.
In the first project that I started on at Wyeth about 20 years ago, the primary screen was a
4-day rat model and, of course, the throughput was very low. But if you put a
compound in there and it was active, you knew it got in. The compounds were dosed
orally so you knew they got in and since it was a 4-day model you knew there was no
acute toxicity. So we had a lot of that information in the beginning. Now through the
reductionist approach we're taking—a high throughput screen, everything's broken
down—we keep having to introduce new assays to try to put the pieces back together
again. Now the profiling assays are done and there are some physical measurements,
but there are some additional cellular assays we could do that might bring us one step
closer to an actual whole-animal model.
Val Stella Comment: For a start, we need a high-throughput rat.
Li Di Response: In the future I'd like to see more types of assays on transporters; right
now they're still not reliable and accurate enough for us to use as screening tools.
Question: Let me try to be a little provocative and ask people like Jeff Silverman or
Dave Rodrigues to comment on this. I think the problem with transporters is that while
we're identifying all these transporters we really don't necessarily know how important
they are in determining drug disposition. It's kind of like cytochrome P450s were 25
years ago. We knew there were various isozymes in those days but we did not know
which ones were really important from a metabolism and a drug-drug interaction
perspective. One of the dilemmas today is that we can generate data on transporter
substrate/inhibitor activity but we do not necessarily know how to interpret these data
from the drug disposition and drug-drug interaction perspectives.
Jeff Silverman, Sunesis Pharmaceuticals, Response: I think we're very PGP
focused right now. There are almost 50 ABC transporters and several
hundred other transporters that have yet to be functionally characterized, and we don't
have a clue which one we're dealing with in an ADME setting. I can speak for one of our
internal programs: we knew that PGP was not the issue and it was likely an organic anion
transporter, but we didn't have a good assay for it. So it killed the program for lack of a
good understanding of its relative importance in ADME. We don't even know what the
specific transporter was. I think, like P450s 10 years ago, we're just in the early days as
far as transporters go.
Ed Kerns, Wyeth, Response: There is a continuing need for improved blood-brain
barrier models, and for the ability to look at actual tissue penetration: getting the drug to
the tissue we're most interested in through barriers such as the blood-brain barrier,
blood-testis barrier, and blood-placenta barrier. We are also trying to keep some drugs
out of these areas. It is certainly an area that needs a lot of work; some estimates are
that only a very small percentage of the diseases we'd like to treat in those tissues can
be treated, because we can't deliver the drug to the tissue through the barriers. Better
models and integration of various data to be able to predict barrier penetration are
certainly a growth area for the future.
Phil Burton Response: I think even in the transporter area the integration and the in
vitro–in vivo relationship are really important, and I think you could indict reductionism
as possibly a short-term barrier. We have a lot of tools now for transfecting individual
transporters into cell-based systems and understanding them in quite some detail, but
when you make predictions in vivo about the importance of any individual transporter
derived from those in vitro experiments, I think Sugiyama and Kim Brouwer and a
couple of others have shown that there must be some complex functional
interrelationships in vivo among those transporters that tend to give results that are at
odds with what you would have predicted from the in vitro situation. So I look at taking
this reductionist approach and reconstructing something that's biologically relevant as
a real unmet challenge.
Jerome Hochman, Merck, Response: I just want to elaborate more on what Phil has
been saying—we have the tools for expressing transporters, but often we don't have
good inhibitors to identify what the specific impact of a specific transporter is. With
the exception of PGP, we don't really have good, well-characterized animal models for
evaluating the impact of the transporters and extrapolating what our in vitro results
mean relative to the in vivo consequences. With regard to characterization of the
animal models, it's really much more extensive than just understanding transport of a
model substrate; it also entails understanding what the impact is on other transporters in
terms of compensatory expression. Similarly, with our in vitro models, we often tend to
look at the best substrates for characterization of the models, so then how do we scale
to a real drug with moderate but pharmacokinetically relevant transport? For example,
with renal transporters we might use PAH, which is extensively transported in the kidney.
Now, how do we extrapolate that to a compound that is moderately transported in the
kidney, and what the impact of that moderate transport is going to be in vivo? The other
aspect is that we need to extend all of this to the point where we can give guidance
to the chemists. If all we end up doing with these models is screening, we really aren't
going to help the drug discovery efforts, we aren't going to help optimize drugs that well
because we aren't providing direction in terms of what types of modifications need to be
made to move toward a drug.
Jeff Silverman Response: I just want to follow up on something that Phil touched on,
in particular, with transporters it's often not just one transporter involved. We often focus
on the efflux transporters, PGP or MRP, but in terms of ADME characteristics we're
dealing with at least two transporters, an influx and an efflux, if not multiple transporters
going in both directions. So to give feedback to the chemists or a drug design team, we
have to understand more than just that compound X is a substrate for an efflux
transporter. Because if it has to be effluxed out, more than likely it has to be influxed
into the cell to be a substrate. And we don't understand which transporter pairs or
multiple transporters are working together to manifest themselves as a particular ADME
characteristic. That's an area for significant growth in this field.
Question: This is a question for Val regarding monitoring the breakdown products
when you make a prodrug and some of the work you did with formaldehyde. One of the
things that I think a lot of people haven't thought about, but I know the FDA has, is that
in attaching these moieties to your compound one of the things that you can alter is the
distribution, so that even though you have equivalent levels or exposure in the
blood, you can get dramatically different distribution patterns. We did some work years
ago looking at that with whole body autoradiography and you can definitely see some
pretty dramatic changes in the distribution of those compounds. We were more
interested in metabolites, but between giving something that is preformed
and having it formed or metabolized in a tissue, you can get very dramatic differences in
distribution.
Val Stella Response: I completely agree. You can see very dramatic differences,
especially when you're doing prodrugs of very polar materials where you can change
dramatically the accessibility to some tissues that were not accessible to the parent
drug. There's no short cut. A prodrug is an NCE. That's why it behooves you, if you're
going to do a prodrug strategy, to do it at the design level, not as hindsight
two years down the road. Because if it's two years down the road that you realize you
need a prodrug approach and you're now going for the Holy Grail to solve the problem,
you really have to go back a long way, including the tox issues that you've sort of raised,
the tissue distribution issues. So that's why it's a question that has to be raised right at
the drug discovery phase. If we're going down this road and we're going to produce
molecules that have these properties, are we going to get ourselves boxed into a corner
where we have to do a prodrug strategy? Therefore, it has to become part of that
design team approach that we've all talked about. If you do it too late, you run into the
problem that you're talking about, that you actually have to go back and address a lot of
the additional tox issues that come in from the promoiety, either from the moieties
themselves or the altered distribution of the drug that can come about from the prodrug
approach.
Question: In reality, I think most prodrug approaches are being undertaken at a later
stage of discovery when there's something found wrong with the molecule. But what
invariably happens is that you will add molecular weight to the molecule. I'm wondering
whether the Lipinski rules should still be applied to prodrugs. For example, if I count the
oxygens and nitrogens in Viread, I arrive at a number of 15 if my counting is correct.
Also, I think the molecular weight is in excess of 500. The second part of the question
is should we still use the routine pharmaceutical profiling assays to characterize these
molecules?
Val Stella Response: We could spend hours on those questions. The molecular
weight one is an issue and that's why the old KISS principle, Keep It Simple Stupid,
applies to a lot of prodrugs. You really don't want to get any more complicated than the
problem already is. I've seen a lot of people say "I'm going to develop a prodrug, and
what I'm going to do is do this so this will happen and when that happens this will
happen and when that happens this will happen and eventually I'm going to have a
drug." And if you try to take that approach, invariably Mother Nature wins. So you
really have to keep it simple. With regard to molecular weight, I know with the Tenofovir
case, the compensation was that yes, you had a lot of hydrogen
bond donors and acceptors and, yes, you had a high molecular weight, but you had great
solubility. And, in that particular case, the solubility overcame a lot of the problems that
would have run counter to the Lipinski rules. By the way, you will notice that in my talk I
didn't call them the Lipinski rules; I changed his word to guidelines, and Chris would
be the first to admit that they're not rules, they're guidelines. The second question was
the issue of whether we should still use the pharmaceutical profiling assays to
characterize these molecules. To some extent, yes. The thing is that since you're at
the optimization stage, you kind of know where you're going to some extent, so you don't
have to do the whole thing. The thing that's most frustrating to me is when I sometimes
see the pharmaceutical industry do a prodrug and throw it into a cell assay, and if
it's not active—well, it's not designed to be active, it's only designed to be activated in
vivo. So I think the answer to your question is that you have to design your assays, think
through your assays, and interpret them.
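Since the question turns on counting nitrogens plus oxygens and checking molecular weight against the Lipinski guidelines, here is a minimal RDKit sketch of that bookkeeping. Aspirin is used purely as a stand-in structure, since the prodrug's structure isn't given here; substitute the SMILES of interest.

    # Compute the Lipinski-guideline quantities discussed above.
    from rdkit import Chem
    from rdkit.Chem import Descriptors, Lipinski

    mol = Chem.MolFromSmiles("CC(=O)OC1=CC=CC=C1C(=O)O")  # aspirin, for illustration

    mw = Descriptors.MolWt(mol)
    n_plus_o = Lipinski.NOCount(mol)    # crude H-bond acceptor count (N + O atoms)
    hbd = Lipinski.NHOHCount(mol)       # crude H-bond donor count (N-H and O-H)

    print(f"MW = {mw:.1f}, N+O = {n_plus_o}, HBD = {hbd}")
    print("Outside MW<=500 / N+O<=10 / HBD<=5?",
          mw > 500 or n_plus_o > 10 or hbd > 5)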
Question: In many organizations studies in small animals are the Holy Grail. However,
in rodents you have very high esterase activity. How should we test prodrugs in
pharmacology models?
Val Stella Response: That's the $64,000 question because we all know we're being
led up the garden path not only in analog development but also in prodrug development
by the animal models. I'm a fairly big advocate of the dog model simply because I think
in most prodrug strategies I've worked on over the years it seems to be a more reliable
model. That doesn't mean that we don't do rat work or rodent work because it's
cheaper and quicker and you can at least get some baseline, but like anything else
when you get into a series of analogs you kind of take a look at a broader animal model
and I think you have to do the same thing with a prodrug strategy. You saw in that one
slide that the bioavailability of Tenofovir varied from 11% in the ferret model to 73% in
the dog, and man happened to be somewhere in between. So, there's no simple animal
model just like we know with analog development there isn't; you just have to use your
smarts. I've heard everyone talking about these efflux models and everything else and
I'm reminded of the W.C. Fields joke where he said, "I took a drink to steady myself and
I got so steady I couldn't move." I sometimes wonder when I listen to my colleagues
talk about solubility and permeability and efflux whether we're not getting to that same
stage. What we've got is so many of these in vitro models that if we believed all the
data we'd never develop a drug. I made a comment that we need to develop a high-throughput rat. It was facetious, but it's not entirely facetious. I do think that we need to go in
vivo as quickly as possible because ultimately that's the test of whether you're going to
have a chance. How you do that quickly when you've got thousands of compounds is
clearly the conundrum that we're in.
Question: What do you see as the role of rabbit PD or surrogate efficacy models in
what you're talking about? What are the issues or the caveats? Also, as a biologist,
I'm trying to understand the solubility and how it's measured. Usually it's measured in
some solution that may or may not track with one of the assays you're doing, but not all
of them, and what you're really trying to ask is what the solubility is in the place where it needs
to get into the animal. So, I'm trying to understand how you actually measure solubility
and what makes a good solubility measurement.
John Ellingboe Response: The solubility profiling assay I described is a kinetic
solubility and it's meant to be high throughput to look at a lot of compounds so it gives
us an idea of trends within series. If you want to understand the solubility of one
compound and how it's behaving in an assay, then you have to go back and do much
more detailed solubility assays: you can look at variations in solubility over
different pH ranges, you can look at solubility within the assay medium, and there are a lot of
sort of secondary things you can do. Or you can look at solubility in simulated intestinal
fluids and things when you're getting even further advanced. It depends on what the
goal is. The profiling assays I described are meant to look at lots of compounds to
identify trends. But if you're getting to the point where you've got something that looks
interesting and you want to put it in an in vivo model, then you have to go back and do
more detailed work.
Li Di, Wyeth Research, Response: What we do is use solubility at pH 7.4, as John just
mentioned. The biologists will use those data as a reference. When they do the assay,
they will check whether the IC50 concentration is higher than the solubility at pH 7.4. When
the IC50 exceeds the solubility at pH 7.4, it serves as a warning to see if the IC50 curve is normal.
If it is not a normal curve, we usually measure solubility in the assay medium to help
explain the data.
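A minimal sketch of the cross-check described here: flagging assay results whose IC50 exceeds the kinetic solubility at pH 7.4 so the dose-response curve can be re-examined. Compound names and numbers are hypothetical.

    # Flag results where the IC50 exceeds kinetic solubility at pH 7.4.
    compounds = [
        # (name, IC50 in uM, kinetic solubility at pH 7.4 in uM)
        ("CPD-001", 0.8, 120.0),
        ("CPD-002", 25.0, 5.0),
        ("CPD-003", 4.0, 3.5),
    ]

    for name, ic50, sol in compounds:
        if ic50 > sol:
            print(f"{name}: IC50 ({ic50} uM) exceeds solubility ({sol} uM) "
                  f"-- check the curve; consider measuring solubility in assay medium")
        else:
            print(f"{name}: OK (IC50 {ic50} uM <= solubility {sol} uM)")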
Val Stella Response: We focus in on solubility with respect to in vivo performance, but
what we also have to remember is that the range of dosage forms and dosing strategies
that you can use is also dictated by solubility. For example, if you want to develop a
matrix dosage form for sustained release, it's driven by the size of your dose, the
potency of the compound and solubility, and the release mechanisms from matrix
devices. If you're looking at transdermal delivery, if you're looking at ophthalmic dosage
form—we focus so much on oral delivery and standard oral delivery, the standard fast
release tablet. But, you know, a lot of these parameters are extremely important as
they relate to the ability to deliver the drug in any sort of sophisticated dosage form. If you
develop a 1 g drug to treat AIDS, your ability to then manipulate the release properties
in a dosage form really becomes limited by the size of the dose and solubility. So
solubility is an ultra-critical variable that affects not only your ability to assay and get
activity of the drug but also affects the sophistication and often the downstream IP
position that you have to develop to protect your gold mine. And on your solubility with
respect to the GI tract, yes, the high-throughput screening that was mentioned with the
DMSO is obviously something that tends to give us a significant overestimation of the
solubility. But I've been a firm believer for a long time that the GI tract is not pH 6.8
isotonic phosphate buffer. You saw that with the compounds that I tested with the
prodrugs where we had the very insoluble compound that performed very well. It
performed well because the environment of the GI tract was conducive to dissolution for
a drug with its physical properties. So, when I do consulting work and people say, "I
have a low solubility drug," I ask them "What is the cause, the etiology of that low
solubility?" If the cause of the low solubility is high crystallinity, very crystalline material,
then your strategies for overcoming the low solubility are more limited. It's basically
smash and mash, make smaller particles or prodrugs. It's almost impossible to develop
a delivery system for a low-solubility compound that's reasonably high dose and highly
crystalline. Your options are limited. If it's a relatively low melting point compound with
the solubility being largely limited by solvation, you have more options. A soft-gel
capsule, perhaps. So, for me, it's not just solubility, it's understanding the cause that is
critical. And that's something that sometimes can be built in, sometimes can be
manipulated by the prodrug approach, but ultimately the GI tract is not pure pH 6.8 isotonic
phosphate buffer. It's a complex environment that involves a fair amount of surfactant
action. So, to me, not necessarily an initial screen but fairly soon the screen really
should look at the effect that environment has on the dissolution process—if you have
low solubility.
Question: In medicinal chemistry we often encounter solubility and permeability
problems. As medicinal chemists we always try to solve the problem by adding an
appropriate group—either a polar group to improve the solubility or a lipophilic group to
increase the permeability. Unfortunately, very often when we add these functional
groups we lose activity. It's not that we don't cooperate with the ADME guys or the
development people, it's just Mother Nature. Often we try to solve this problem first by
improving the potency and so we don't need a very big dose or very much permeability.
Secondly, very often if we are not able to solve the problem we try to use a prodrug
strategy to solve the issue. Whenever we raise that proposal to our development folks,
they often say that whenever you develop a prodrug you have a longer development
time and it also costs more, because every time you do an analysis you not only
measure the parent compound, you also measure the metabolites of the compound. So,
actually, you double the amount of work.
Val Stella Response: Do you want a drug? If you've got something that works, it has
the right pharmacokinetic properties, and a prodrug works, so what if it costs twice as
much? To me, the most fallacious argument I've ever heard is the issue that it takes
more work and costs more money. Close to 20% of the drugs on the market were
prodrugs—they made the effort, they're now billion dollar drugs. So what? You could
spend a lot of time doing analog approaches to solve the problem, which is good, but
that doesn't cost any money? To me it's a chicken and egg sort of thing. I'm not saying
that a prodrug is always the solution—it very rarely is, so don't get me wrong. But
that argument to me just means people are copping out. They've got a job to do, you've
got a job to do; the team has to find the best solution. The fact that it costs more and it's
a little bit more complicated matters far less if you do it early and don't have to do it
retroactively 2 or 3 years down the road: you need to recognize the problems early, you
need to get the solutions early, and you need to work efficiently. If you do that, the
development of prodrugs is not that much different from the development of an NCE, and in that case
you generally know what the metabolites are, the drug and the promoiety, so you've got
a reasonable starting point.
John Ellingboe Response: I've heard that argument from some people in my
company. But you can say that a lot of drugs have active metabolites, so they have to
go through the same process with those drugs as well, so it's not that different from
what would be a normal development program.
Li Di Response: We have a lot of programs like that. The chemists try very
hard to optimize the potency and the compound gets very potent. But, for example, they
can't get it into the brain, and it's a CNS drug, so they eventually have to drop it anyway.
Then they try to go to a different series, but it looks less potent, and the
managers won't let them make less potent drugs, so they have no way of getting out of
this loop of making potent drugs that don't get into the brain. So sometimes it's wise
to optimize properties at the same time you optimize potency, instead of optimizing
potency first and limiting ourselves to a potency space, which might not be optimal for
drug-like properties.
Ed Kerns, Wyeth, Response: To address the first part of your question, there's a lot of
discussion about how to do optimization of properties and at the same time maintain
efficacy, but solutions are few. The solution that John Ellingboe described is a very
creative approach to being able to do property and efficacy optimization at the same
time. We should tap into the creativity of the medicinal chemists for strategies on how
to do this at the same time. There must be additional strategies and tools out there that
can address this need.
Question: This may be a premature question, for Wyeth, maybe for Pfizer, but how
well are we doing with this? It's scientifically satisfying to feel that we're better informed
as we iterate through from wonder drug to best drug and actually into Phase III and
ultimately launching a product, but are our budgets increasing at the discovery phase?
Are we measuring success and feeling good about using this very iterative process
more effectively and more scientifically as we measure our success in the early
milestones of development? Is this a more effective approach?
Phil Burton Response: I'm not sure I have an answer to that, but if you use one
measure of success as the number of INDs and NDAs submitted and approved, that's
been declining steadily over the last couple of years despite the fact that the R&D
budget has been increasing at a fairly hefty rate. So, clearly, we're not doing some
things very well. And I think what was presented here are strategies to try to address
some of the inefficiencies. I think that stems ultimately from implementing some poor
decision-making practices several years ago in order to handle the large amounts of
data that we were generating after implementation of high-throughput screening. That's
a personal opinion, but I think it's probably borne out by the evidence.
Val Stella Response: I did a historical analysis a number of years ago of NDAs and
INDs produced per year and there were two major dips that I saw. One major dip
occurred, and I don't remember the years, but it was about 7 years after we discovered
peptide chemistry. So every drug company in the world was making peptide and
peptide mimetic drugs, and, as we know now, most of those failed because even though
we thought they were metabolically stable we couldn't deliver them. So there's a dip in
INDs that came about 7 years after that technology hit because all of a sudden we could
make a lot of receptor-based compounds that were active, but we couldn't deliver them
in vivo. The other one is the dip that Phil just mentioned that's very recent. I actually
attribute that to the development of combinatorial chemistry: the idea that we could
make drugs en masse and screen them en masse and therefore find things in the box. And
we're now paying the price for that. I think what we're doing now is a lot more
reasonable and that's a balance of quantity and quality. I would hope that our chemists
continue to be very smart and come up with wonderful active compounds, but I think
we're paying the price of some decisions that were made 6 or 7 or 8 years ago, not
necessarily the decisions that are being made today.
Ed Kerns, Wyeth, Response: Anecdotally, as I've worked with discovery teams, it's
my experience that there is an increase in the awareness of the need for property
optimization. You may not be able to quantitate it, but it is now a part of the drug
discovery paradigm for people to be aware of the issues of why compounds succeed as
pharmaceutical agents. It is taking hold and it's something that every team deals with.
Jerome Hochman, Merck, Response: Just to elaborate on that a little bit more. I do
think that one of the problems, at least recently, has been a lack of flexibility in the
decision-making process and basically feeling that each of these parameters is an
absolute and is going to impact a program in the same manner. One of the things that
really struck me as remarkable is the compound ezetimibe. I was recently looking at
some of the data on ezetimibe and its pharmacokinetic parameters, and it's a
compound that, if we had taken a rational approach to designing it, I don't think would have
ever come about. It shows poor bioavailability and it is subject to extensive
glucuronidation and hepatobiliary recycling. At the time it was developed there was not
a pharmacological target for it, so the only thing they had was cholesterol absorption
inhibition in an animal species without knowing the mechanism at the time and no
extrapolation to man in terms of what it was going to mean. Now, call it luck, but it's a
beautiful drug. I think part of having the wisdom to be able to understand when to kill a
drug and when not to kill it and what approach to take to bring a specific drug forward—I
don't think there's anything in a book that can tell you that.