
aff vs brophy tj meadows doubles

Teaching students how to fail at debate since 1100 B.C.
AFF CASE
LD Septober Res 2022
Written by: Aiden Kim, Timothy Lin
Thanks to: Aimee Ge, Jasmine Zhang, Adam Tomasi
——————————————————————————
I affirm:
Resolved, The United States ought to implement a single-payer universal
healthcare system.
“Single-payer” includes universal health insurance coverage of the whole population, with the option to buy supplemental insurance. The federal government would be the single bill-payer that reimburses providers for healthcare, which is free at the point of service.
Hsiao 21 (may/may not read)
William C. Hsiao (Professor of Economics at Harvard University). “Definition of Single Payer and How it Compares with Multi-Payer Models.”
California Health and Human Services. 2021.
https://www.chhs.ca.gov/wp-content/uploads/2021/06/Definition-of-Single-Payer-and-How-it-Compares-with-Multi-Payer-Models.pdf, AT
There is confusion in the literature as to what is a single-payer health care system and the impacts of such a system. Writers/speakers are using different definitions, thus producing different results and criticisms. Writers and commentators using the same words but with diverse meaning muddle the public discussion and inhibit citizens in reaching agreement on the better design of health care systems to achieve universal health coverage. I’d argue that we can NOT do meaningful research or debate [on] the merits and demerits of a single payer health system and its feasibility unless we agree on the definition of single payer systems.
Various definitions of single-payer have appeared in the literature. Some define it as a single source of revenue for a given population (Sherry Glied, 2009), others as a single purchaser of all health care services (Kutzin, 2001), and others define it as a system where an organization pays all health care bills as well as sets prices for the services. I would argue a single-payer health system has three major attributes: 1. Mandatory health insurance coverage of the whole population of a nation (or province) into one single risk pool with financing for insurance via tax revenues and a zero or uniform premium rate. 2. Every citizen in that nation/province covered with one uniform and essential benefit package of health care services, while people can voluntarily purchase supplementary insurance coverages. 3. One single purchaser of healthcare who promulgates one uniform set of rules for providers on the quality and medical necessity of reimbursable health care and uniform payment rates to providers. Under this definition laid out, Canada, South Korea, and Taiwan would be classified as single-payer systems while Germany is more of a hybrid model. In contrast, multi-payer systems operate with many different insurance plans, offering group and individual health insurance as the principal sources of health care financing. Risks are not pooled nationwide, but instead by various population groups, including employers, employment status, occupation, community, age and sex, health status, individual preference for health insurance and willingness or ability to pay, etc. There are numerous benefit packages but no uniform benefit package for everyone. Lastly, the multiple insurance plans each set different rules for the provision of services and for payment to health care providers. Complying with these multiple rules and multiple payment systems vastly increases provider administrative costs. The U.S. is the best example of a multi-payer system.
The plan is for the United States Federal Government to implement the
Canadian single payer system.
Canadian single payer is key to quality, access, and efficiency
Martin 17
Danielle Martin (family physician in Toronto and Vice President of Medical Affairs and Health System Solutions at Women’s College
Hospital). “Canadian Doctor to U.S.: Try Single-Payer Health Care Instead Of Trashing It.” Common Dreams. September 18th, 2017.
https://www.commondreams.org/views/2017/09/18/canadian-doctor-us-try-single-payer-health-care-instead-trashing-it
There’s a joke we sometimes tell in Canada: What’s a Canadian? An apologetic American with health care. It’s funny because we
half-believe it’s true. The United States and Canada are about as similar as two countries get. But Canada has had a publicly funded,
single-payer health care system in each of our provinces and territories since the 1960s. It works. Maybe it can work for you too. I
was in the room on Capitol Hill last week when Sen. Bernie Sanders, I-Vt., introduced new legislation that would seek to enact a
single-payer health care system in the U.S. If Sanders’ bill moves forward, all Americans would have access to Medicare, regardless
of their age or financial situation. Is it that simple? In some ways, yes. In Canada, the notion that access to health care should be
based on need, not ability to pay, is a deeply ingrained value that crosses party lines right and left, and is a source of collective pride.
The single-payer, publicly funded health care systems in Canada cover virtually every resident of our country, at a much lower cost
than the U.S. model. There are plenty of nasty rumors in the American media about the quality of health care in Canada. In fact,
Canadians live longer than Americans, and according to a recent study published in
The Lancet, the Canadian system outperforms the U.S. on quality and access to
care overall. As a practicing physician and a hospital administrator in Canada, I see the cracks in our system just as American
doctors see the cracks in yours. It’s true that sometimes Canadians wait too long for non-urgent or
elective investigations or procedures. That is why across Canada, governments, health
care providers and citizen groups are working hard to improve access for procedures like
hip replacements, cataract surgery or non-urgent advanced imaging. But we don’t believe that the solution lies in dismantling
universal health care and creating a system where some cannot afford the care they need. Most importantly,
Canadians
don’t wait for urgently-needed care. When Canadians get sick they get the care they need, and the care
they get is good. When my patients need to see a doctor, they don’t have to worry about paying for the visit. And when they
are admitted to the hospital, they don’t see a bill. Is it true that Canadians get health care in the United
States? Occasionally. When it happens, it’s often because our senior Snowbirds happen to fall ill while they are enjoying your better
winter weather in Florida or Hawaii. More
rarely — so rare that it’s hard to accurately
measure — people do travel by choice just to buy immediate care. Of course,
Americans travel for health care too, sometimes because they can’t afford it at home. The
U.S. Centers for Disease Control and Prevention has estimated that about 750,000 U.S. patients travel abroad each year for medical
treatment. That’s a larger proportion of the U.S. population than even the highest estimates of Canadians seeking treatment abroad.
Single-payer health care is also much less administratively onerous and expensive. The
billing office in the clinic where I work serves more than 40 primary care doctors, and
employs only two billing clerks — because billing is so simple. Every month my
practice sends one bill to the public insurance plan, and every month we get paid on
time. Compare that to the multiplicity of private insurance plans my American
colleagues deal with. The costs of administering this system are much lower — in
Canada overhead is less than 2% in public plans, compared to the 18% spent by private insurance firms in the
U.S. Overall, health care spending in Canada accounts for 10% of our GDP, compared to 17% in the U.S. Per capita we spent about
$4,644 in U.S. dollars in 2016, compared to $9,892 in the U.S. Yet more than 1 in 10 Americans are uninsured. Like all industrialized
countries, Canada needs to meet the challenges of an aging population while contending with rising costs, advances in technology,
and an imperative to keep people longer in their homes. No nation has figured out the magic formula: There is
much we need to do to improve our system, and even where we know our outcomes are good, it isn’t very Canadian to be boastful.
But Canadians, along with others all over the world, will be watching as the conversation around single-payer health care continues
to unfold in the U.S. As we have long ago discovered, everyone should have equal access to health care when they need it — it is a
basic human right.
Criterion/Value:
I value pragmatism because “ought” in the res implies “can”.
Goldberg 09 - Ought implies can.
Goldberg, D. S. (2009, April). Universal Health Care, American Pragmatism, and the Ethics of Health Policy: Questioning Political Efficacy. University of New Hampshire. Retrieved September 8, 2022, from https://scholars.unh.edu/cgi/viewcontent.cgi?article=1199&context=unh_lr
In a purely moral paradigm, the naturalistic fallacy suggests that an important distinction exists between the descriptive and the normative. The mere fact of a phenomenon does not imply its moral goodness. However, while “is” does not imply “ought,” many moral theorists have suggested that “ought” implies “can.” After all, if one cannot do what one ought to do, in what sense is a moral obligation meaningful? A just social order, then, should endeavor to establish meaningful connections between what a moral agent ought to do and what a moral agent can do.
Bioterror Advantage
Advantage 1 is Bioterror
Only single-payer ensures a quick and coordinated response to inevitable
diseases and bioterror—multi-payer systems are too fragmented and leave
millions uninsured
Kahn 17
Laura Kahn (began as a registered nurse, works on the research staff of Princeton University’s Program on Science and Global
Security. Prior to joining Princeton, she was a managing physician for the New Jersey Department of Health and Senior Services and
a medical officer for the Food and Drug Administration). “Why access to health care is a national security issue.” Bulletin of the
Atomic Scientists. June 5th, 2017. http://thebulletin.org/why-access-health-care-national-security-issue10819
Cutting the Prevention and Public Health Fund, which deals directly with planning for bioterror attacks and pandemics, was only the
most obvious way in which the House bill attempted to undermine American security. Over the long term, there is also a movement
afoot to put basic health care out of reach of many Americans. Simply
making healthcare unaffordable may
seem less dramatic than slashing an emergency-preparedness budget, but doing so also
undermines national security. As the Congressional Budget Office report suggests, the American Health Care
Act would make healthcare essentially unaffordable for people with pre-existing conditions, because it would allow insurance
companies to dramatically increase their premiums. Ten years ago, I wrote about the security impact of the uninsured during the
George W. Bush presidency. In
2005, almost 47 million people (about 16 percent of the total US
population) were uninsured. Thanks to the Affordable Care Act passed under the Obama
administration, that number dropped to a low of 11 percent, according to a Gallup poll
taken during the first quarter of 2016. The Affordable Care Act was a big step in the right
direction, but it didn’t close the gap, and the national security and public health challenges of having a large
fraction of the population uninsured remain as relevant today as they were a decade ago. Uninsured people delay
seeking health care. Once they seek it, often in a busy emergency room, they are
typically given less attention than people with insurance. This failure to get care
becomes a danger not only for the individual but for the public at large when the
problem is a deadly infectious disease. We saw this scenario play out in Dallas during the Ebola crisis of
2014 and 2015. A poor Liberian man, infected with the virus, presented himself to Texas Health Presbyterian Hospital with severe
abdominal pain and a high fever. He was examined and sent home with a bottle of antibiotics. Amazingly, he did not set off an Ebola
outbreak in his community, though the risk that he could have was significant and the wider public shouldn’t count on being so lucky
next time. Before dying, he infected two nurses who had received inadequate training and equipment to protect themselves. During
the anthrax crisis of 2001, in which spores of the deadly disease were sent through the US mail, many people infected were federal
employees with health insurance. If these postal workers hadn’t had easy access to health care, the death toll might have been higher
than only five; 17 more were infected but survived thanks to timely medical attention. Anthrax spores do not spread from person to
person, but it’s no stretch to imagine a different scenario: Suppose a future attack involves smallpox, a highly communicable virus,
and that the initial victims are uninsured childcare workers or food handlers. The initial signs of smallpox include fever, chills, and
headache. Uninsured
victims would likely delay trying to get care, hoping for the symptoms
to pass. By waiting they would certainly expose others to the virus, potentially setting
off a pandemic. Countries like Canada, which has universal health coverage and a
well-funded public health infrastructure, are much better prepared to handle deadly
epidemics. In 2003, Canada confronted Severe Acute Respiratory Syndrome (SARS),
which originated in China. A physician from Guangdong province inadvertently infected a number of tourists with the
SARS virus, setting off a global pandemic after everyone returned to their home countries. Among the infected travelers was an elderly Canadian woman who returned to Toronto after a 10-day vacation in Hong Kong. Over the course of about four months, the Canadian health system worked hard to contain the virus, treating 400 people who became ill and quarantining 25,000 Toronto residents who may have been exposed. Ultimately, 44 people died from the disease in Canada, but the result would have been much worse without a quick and well-organized response. The Canadian government’s response had its glitches—primarily in the form of poor political leadership. Mel Lastman, the mayor of Toronto and a former
furniture salesman, became angry when the World Health Organization (WHO) issued a travel advisory against his city. He railed
against the WHO’s decision on television, revealing his complete lack of knowledge about either the organization or public health in
general. As a result of Lastman’s poor leadership, he was ultimately relegated to a secondary role as the deputy mayor took his place.
Lastman’s credibility and legitimacy never recovered from the SARS outbreak. Likewise, US leaders will be judged by how they
handle a bioterrorist attack or pandemic. Unlike
Canada, America’s piecemeal healthcare and public
health systems are inherently less able to handle such crises. The Affordable Care Act
helped fill in the gaps, but really, the only way to prepare for the
eventuality of pandemics or bioterrorist attacks is with a
single-payer government-run system that covers everyone. The United States might
consider modeling its health care system after the one in Israel, a country that, given
longstanding threats, takes every terrorist risk very seriously. In 1994, it established
universal health coverage for all citizens. The country’s Ministry of Health monitors and
promotes public health, oversees the operations of the nation’s hospitals, and sets
healthcare priorities. As a result, Israel’s public health, emergency response, and
hospital systems are state-of-the-art, highly efficient, and coordinated—a
necessity when responding to terrorist attacks. The preamble to the US Constitution states the goals to
“provide for the common defense” and “promote the general Welfare.” The US government won’t fulfill either of these duties if it
fails to protect its citizens against pandemics and bioterrorism. The mandate requires a robust public health infrastructure and a
universal healthcare system that covers all Americans. The Trump Administration and Congressional Republicans threaten to
undermine this essential function of government, unnecessarily jeopardizing American lives.
Bioterrorism is inevitable—it’s only a question of coordination and
preparedness—this outweighs a nuclear attack
Sun 17
Lena Sun (reporter). “The Trump administration is ill-prepared for a global pandemic.” The Washington Post. April 8th, 2017.
https://www.washingtonpost.com/national/health-science/the-trump-administration-is-ill-prepared-for-a-global-pandemic/2017/
04/08/59605bc6-1a49-11e7-9887-1a5314b56a08_story.html?utm_term=.e9292ef095d6
The Trump administration has failed to fill crucial public health positions across the government, leaving the nation ill-prepared to
face one of its greatest potential threats: a pandemic outbreak of a deadly infectious disease, according to experts in health and
national security. No
one knows where or when the next outbreak will occur, but health
security experts say it is inevitable. Every president since Ronald Reagan has faced threats from infectious
diseases, and the number of outbreaks is on the rise. Over the past three years, the Centers for Disease Control
and Prevention has monitored more than 300 outbreaks in 160 countries, tracking 37 dangerous pathogens in 2016 alone. Infectious
diseases cause about 15 percent of all deaths worldwide. But after 11 weeks in office, the Trump administration has filled few of the
senior positions critical to responding to an outbreak. There is no permanent director at the CDC or at the U.S. Agency for
International Development. At the Department of Health and Human Services, no one has been named to fill sub-Cabinet posts for
health, global affairs, or preparedness and response. It’s also unclear whether the National Security Council will assume the same
leadership on the issue as it did under President Barack Obama, according to public health experts. “We need people in position to
help steer the ship,” said Steve Davis, the chief executive of PATH, a Seattle-based international health technology nonprofit working
with countries to improve their ability to detect disease. “We are actually very concerned.” Residents of the town of Kailahun, Sierra
Leone, gather along a river at dusk in August 2014. The town was deeply affected by that year’s Ebola outbreak. (Pete Muller/Prime
for the Washington Post) In addition to leaving key posts vacant, the Trump administration has displayed little interest in the issue,
health and security experts say. The White House has made few public statements about the importance of preparing for outbreaks,
and it has yet to build the international relationships that are crucial for responding to global health crises. Trump also has proposed
sharp cuts to government agencies working to stop deadly outbreaks at their source. The slow progress on senior-level appointments
— even those, such as the CDC director, that do not require Senate confirmation — is hobbling Cabinet secretaries at agencies across
the government. Temporary “beachhead” teams the White House installed are hitting the end of their appointments. The remaining
civil servants have little authority to make major decisions or mobilize resources. An HHS spokeswoman declined to comment on
personnel decisions. An NSC official, who was not authorized to speak publicly, said the administration recognizes that global health
security is a national security issue and that America’s health depends on the world’s ability to detect threats wherever they occur.
Trump’s NSC does not have a point person for global health security as Obama’s did, but global health security is part of the overall
portfolio of Tom Bossert, Trump’s homeland security adviser, another NSC official said. Global
health experts warn
that a pandemic threat could be as deadly as a nuclear attack — and is
much more probable. A global health crisis “will go from being on no one’s to-do list to being the only thing on
their list,” said Bill Steiger, who headed the HHS office of global health affairs during the George W. Bush administration. He spoke
at a panel on pandemic preparedness in early January. He is now part of Trump’s beachhead team at the State Department. Next
month, the G-20
governments, which traditionally focus on finance and economics, will
convene their health ministers for the first time, in part to test coordination and
preparedness for a pandemic, according to German officials, who are hosting the summit in Berlin. It’s not clear
who will represent the United States. In a speech to a security conference in Munich earlier this year, billionaire Bill Gates said a
pandemic threat needs to be taken as seriously as other national security issues. “Imagine if I told you that somewhere in this world,
there’s a weapon that exists — or that could emerge — capable of killing tens of thousands, or millions of people, bringing economies
to a standstill and throwing nations into chaos,” said Gates, who has spent billions to improve health worldwide. “Whether
it
occurs by a quirk of nature or at the hand of a terrorist, epidemiologists say a
fast-moving airborne pathogen could kill more than 30 million people in less than
a year.” The projected annual cost of a pandemic could reach as high as $570 billion. Last month, Trump met with Gates at the
White House. After the meeting, press secretary Sean Spicer said the two had “a shared commitment to finding and stopping disease outbreaks around the world.” Americans are at greater risk than ever from new infectious diseases, drug-resistant infections and potential bioterrorism organisms, despite advances in medicine and technology, experts say. Not only has the total number of outbreaks increased in the past three decades, but the scale, impact and methods of transmission also have expanded because of climate change, urbanization and globalization. The outbreak of Ebola that erupted in West Africa eventually infected more than 28,000 people and killed more than 11,000. MERS has killed nearly 2,000 people in 27 countries. Health officials around the world are monitoring a strain of
deadly bird flu, H7N9, that is causing China’s largest outbreak on record, killing 40 percent of people with confirmed infections. Of all emerging infectious disease threats, a global influenza outbreak is everyone’s worst fear because it could be highly lethal and highly contagious. A particularly virulent influenza pandemic that started in 1918 killed an estimated 50 million people. Today’s H7N9 strain poses the greatest risk of a pandemic if it evolves to spread easily from human to human, according to U.S. officials.
Bioterror causes extinction—defense doesn’t assume advances in
bioengineering
Baum and Wilson 13
Seth Baum, PhD, Executive Director at the Global Catastrophic Risk Institute, is a Research Scientist at the Blue Marble Space
Institute of Science, recently completed a Post-Doctoral Fellowship in Geography at Pennsylvania State University and a Visiting
Scholar position at the Center for Research on Environmental Decisions at Columbia University, Grant Wilson, J.D. in international
environmental law and emerging technologies from the Lewis & Clark Law School, Deputy Director of the GCRI, “The Ethics of
Global Catastrophic Risk from Dual-Use Bioengineering,” June 28th, 2013. http://sethbaum.com/ac/fc_BioengineeringGCR.pdf
Bioengineering can be a factor in GCR, both to increase and decrease the risk. In this and other regards it is a dual-use technology, meaning that it can be used both for beneficial and harmful purposes. Bioengineering has already led to breakthroughs in fields like agriculture and medicine. However, bioengineering has also been used to produce weapons, such as the Soviet Union’s efforts to engineer a highly virulent, antibiotic resistant form of anthrax.27 And as biotechnology improves, its capacity to cause benefit or harm improves, too. Ongoing advances are making the technology more accessible, more affordable, and more powerful than ever. All this raises the stakes for bioengineering, including for bioengineering as a contributor to GCR.

A. Non-GCR Benefits of Bioengineering

Bioengineering has already led to an array of peaceful
technologies that are pervasive in many sectors of society. In agriculture, bioengineering has led to pest resistant crops that need fewer pesticides, drought resistant crops that have increased the amount of viable farming land, herbicide resistant crops that help eradicate weeds without killing crops, and virus tolerant crops. Bioengineering has also resulted in increased crop yields, with a 2010 survey showing that genetically engineered crops increased crop yield by 6 to 30 percent.28 Some crops can have increased nutritional value, such as “golden rice” that can help remedy Vitamin A deficiency for millions of children and pregnant women, although this particular food has been met with opposition and is not yet widespread on the market. Finally, the net economic benefit is massive: fueled by increased crop production, biotechnology boosted net income for farmers by $65 billion from 1996 through 2009 and $10.8 billion in 2009 alone.29 Bioengineering has also revolutionized many aspects of the medical field. First, bioengineering has been used to create an array of medicines that were previously unavailable. Insulin (formerly available only from sheep and cows), a Hepatitis B vaccine, and a HPV vaccine are just a few of the examples. Bioengineering has also opened the door to medical treatments for previously incurable diseases, such as through gene therapy techniques, which use genes to treat diseases. For example, one recent clinical trial successfully used gene therapy to attack the tumor cells of patients with recurrent B-cell acute lymphoblastic leukemia, and three out of five patients went into remission, with the two other patients being subject to unrelated complications.30 Finally, bioengineering shows great potential to promote environmental health. For example, scientists are developing bioengineered algae that convert CO2 into an efficient fuel source for motor vehicles. Scientists have also developed bioengineered microbes that are customized to gobble up pollution, such as oil slicks or toxic waste, although these have not successfully entered the marketplace. Bioengineered crops can also have a positive impact on the environment because farmers can use less harmful tilling practices, fewer pesticides and herbicides, and can produce more crops on the same amount of land. On the other hand, these benefits of bioengineering may have unknown effects on human, animal, and environmental health. For example, pesticide and drought resistant crops may spread into unintended areas and be difficult to eradicate. Furthermore, the overall environmental effect of GE crops is still unknown, and may not be known for decades, until the impact is irreversible. Bioengineered microbes in the environment could also have vast unknown environmental implications, potentially outcompeting their natural counterparts and transferring novel DNA to other microbes,31 although these risks likely fall short of a GCR.

B. Bioengineering as a GCR

While bioengineering has many benefits, bioengineering can also increase GCR. There are two general risks that arise
from bioengineering: first, the risk of an accident involving a dangerous bioengineered organism (a biosafety issue), and second, the risk of the purposeful release of a dangerous bioengineered organism (a bioweapons issue). Biosafety is a major concern across the globe. Biosafety lapses resulted in the apparent accidental release of Foot and Mouth Disease from a leaky pipe in the UK in 2007 and several instances of SARS infections from laboratories in 2003.32 Meanwhile, scientists are now able to use bioengineering techniques to create incredibly deadly organisms, such as a genetically modified H5N1 virus (bird flu), which scientists engineered to be airborne. The natural H5N1 virus killed an alarming 60 percent of reported infected individuals,33 which is a higher fatality rate than the 1918 influenza pandemic (Spanish flu).34 Although the airborne H5N1 virus never escaped from the lab, its accidental release, or the accidental release of a similarly dangerous virus, could cause a loss of life significant enough to qualify as a global catastrophe by its normal definition.
Bioweapons are also a growing threat. The technology to create dangerous bioengineered organisms is increasingly cheap and accessible. Custom DNA strands are available for shipment online for several thousand dollars, and consumers can even purchase DNA synthesis machines online, which can “print” customized strands of DNA.35 Individuals have also shown a willingness to engage in attacks using biological agents, such as the postal anthrax attacks of 2001, which killed 5 people in the United States. Furthermore, publicly available information on how to bioengineer extremely deadly viruses can be used as an instruction manual for individuals, organizations, or governments to create extremely deadly bioweapons.36 So with bioengineering technologies advancing, bioterrorists or other actors could create a biological weapon that is more deadly than anything existing in the natural world. Therefore, bioweapons that utilize bioengineered organisms pose a GCR.

C. Bioengineering to Reduce other GCRs

In addition to itself
being a GCR, bioengineering can also reduce the chances that other GCRs will occur. One such GCR is climate change. Catastrophic climate change scenarios could involve sea level rise of up to 10 meters, droughts, increased extreme weather events, loss of most threatened and endangered species, and temperature increases of 6 degrees Celsius.37 Still worse than that would be outcomes in which large portions of the land surface on Earth become too warm for mammals (including humans) to survive.38 And the worst scenario could involve climate engineering backfiring to result in extremely rapid temperature increase.39 Despite the risks of climate change, the international community has struggled to satisfactorily address the issue, for a variety of political, technological, and economical reasons. Bioengineering may be able to help. An army of bioengineered algae that is specifically designed to convert carbon dioxide into a “biocrude” fuel ready to be made into fuel for any vehicle type – a technology that Craig Venter’s Synthetic Genomics, Inc. is developing with a $600 million investment from ExxonMobil – could remove greenhouse gases from the atmosphere and provide a plentiful, carbon-neutral fuel source that does not pose many of the downsides of today’s biofuel options (although this technology has its own risks).40 Or, despite being a bizarre proposition, humans could be genetically engineered to reduce our CO2 output, such as by engineering humans to be intolerant to meat or to be smaller in size.41 Likewise, while a deadly bioengineered virus has the potential to escape from a laboratory and cause a global catastrophe, such research may be necessary to create vaccines for viruses that could cause worldwide pandemics. For example, the Influenza Pandemic of 1918-1919 (the Spanish flu) killed about 50 million people worldwide.42 Would modern bioengineering technology have been able to avoid this global catastrophe? In fact, researchers justified the airborne H5N1 virus, discussed above, as helping to prevent the spread of a similar strain that could mutate naturally. Overall, there is a dynamic relationship between bioengineering and other GCRs that should be assessed when considering how to respond to these risks.

IV. INTERNATIONAL REGULATION

International regulation of bioengineering is important for at least two reasons. First, bioengineering can influence (increase or decrease) GCR, which is an inherently global issue. Thus the entire world has a stake in ensuring that this technology is used safely. Second, bioengineering research and development can be conducted anywhere in the world. With only scattered national or regional regulation, harmful bioengineering could be conducted in unregulated areas. A deadly virus could be bioengineered by terrorists in an unregulated area, but it would have an impact on the entire planet. Thus international regulation has an important role to play in the overall management of bioengineering, as with GCR in general. Several international treaties already attempt to regulate various GCRs. The U.N. Framework Convention on Climate Change and the Kyoto Protocol regulate climate change; the U.N. Convention on Biological Diversity (CBD) regulates biodiversity loss; the Treaty on the Non-Proliferation of Nuclear Weapons (NPT) regulates nuclear weapons; and the Biological Weapons Convention (BWC) regulates biological weapons.
However, there is no international treaty specifically for bioengineering.¶ Nonetheless, several other treaties
regulate certain aspects of bioengineering, including the¶ CBD, the Cartagena Protocol, and the BWC. Still, the drafters of these treaties did not design them¶ with all the risks of
modern dual-use bioengineering in mind. And so these treaties provide¶ imperfect regulations that fall well short of the comprehensive international scheme that would be¶
necessary to minimize the GCR arising from bioengineering.43¶ The existing treaties fall short in several ways. First, while the Convention on Biological¶ Diversity (CBD)
requires parties to “regulate, manage or control the risks associated with the use¶ and release of living modified organisms,” the treaty does not establish specific steps that
countries¶ must take, such as requiring high-security laboratories or safety training for scientists handling¶ dangerous bioengineered pathogens (CBD, Article 8(g)). Second, the
Cartagena Protocol on¶ Biosafety expands on the CBD, requiring parties to conduct risk assessment and risk management for living modified organisms that may adversely affect
biological diversity. However, these¶ requirements only extend as far as a country’s self-determined protective goals,44 and the treaty is¶ overall too trade-focused to provide
sufficient regulation of bioengineering. Third, the Biological¶ Weapons Convention takes measures to ensure that countries do not stockpile, acquire, or retain¶ dangerous
microbial or biological agents, including dangerous bioengineered organisms, but there is¶ an exception for “prophylactic, protective or other peaceful purposes” (BWC, Article
1), so while¶ the BWC may be useful in preventing acts of bioterrorism, the treaty does not sufficiently cover¶ biosafety issues. Overall, while these treaties regulate certain risks
arising out of bioengineering,¶ they fail to provide a comprehensive regulatory regime that addresses all of the risks that arise from¶ dual-use bioengineering technologies.¶ A.
International Legal Regime¶ There are a variety of options of how to create an international legal regime for¶ bioengineering. The first is to create a framework treaty that covers
an assortment of emerging¶ technologies such as bioengineering, nanotechnology, artificial intelligence, and geoengineering.¶ Having one overarching governance scheme is
perhaps ideal because of the fundamental similarities¶ amongst emerging technologies: they are all dual-use technologies; they all have risks that increase¶ with research and
development; they call for similar forecasting techniques; and laboratory¶ transparency is central to their governance—just to name a few.45¶ Another approach to create a treaty
is to begin by first creating “soft law.” Soft laws are¶ nonbinding laws, meaning that countries have discretion over whether to follow them.46 Hard laws,¶ on the other hand, are
legally binding, such as treaties. This approach would involve creating soft¶ laws, either for bioengineering or emerging technologies in general, until there is the necessary¶
political will for a legally binding treaty, at which point the soft law could be integrated into the¶ treaty.47 Beginning with soft laws is a prudent approach when countries are
unlikely to otherwise¶ agree to a legally binding treaty. However, soft laws do not provide the sort of robust protections¶ that may be necessary to prevent a GCR from
bioengineering.¶ Finally, another option is to create a new international organization dedicated to mitigating¶ GCR and other risks from emerging technologies, including
bioengineering. This international¶ organization could be established by an existing international body like the U.N. General Assembly,¶ the World Health Organization (WHO),
the Economic and Social Council (ECOSOC), the United¶ Nations Environment Programme (UNEP), the U.N. Educational, Scientific and Cultural¶ Organization (UNESCO), the
Organisation for Economic Co-operation and Development (OECD),¶ or the Strategic Approach for International Chemical Management (SAICM), for example. Perhaps¶ the
best option is for several existing international organizations to jointly create and/or chair a new¶ body; for example, UNEP and the WHO co-chair SAICM. A new organization
that covers¶ emerging technologies could attempt to quickly begin treaty negotiations or else create soft law that¶ is later embodied in an international treaty, as described
above.¶ B. Universal Treaty¶ Finally, a treaty that regulates bioengineering (or several emerging technologies) should¶ have universal participation, meaning that every country is
a party to the treaty. As Charney48¶ argues, global threats like environmental degradation, climate change, terrorism, and nuclear¶ weapons require treaties that are binding on
all countries, or else the unregulated countries could¶ impede the purpose of the treaty. For bioengineering, universal participation is important¶ because otherwise
dangerous bioengineering technologies could be freely developed in the nonsignatory¶ countries, putting every other country in the world at risk. This type of situation¶
threatened to play out when physicist Richard Seed threatened to clone a human in Japan or Mexico if the technique was made illegal in the United States. If near-universal
participation is achieved,¶ another option is to impose sanctions on countries that do not comply with the terms of the treaty,¶ much like the Security Council imposes sanctions
on countries that violate certain provisions of the¶ NPT.¶ Another reason that a treaty on bioengineering should have universal participation is¶ because this would help prevent
an arms race for dangerous biotechnologies. According to¶ Pinson,49 an arms race could be triggered by advanced nanotechnologies that assemble novel¶ biological or chemical
compounds to be used as weapons. And Metzl50 argues that the genetic¶ manipulation of humans could spark an arms race to create populations that are smarter, physically¶
superior, and more advanced in other ways. A global treaty with universal participation may help¶ prevent similar arms races involving extremely dangerous and powerful
bioengineering¶ technologies, because all countries would be subject to the same obligations and therefore would¶ not be threatened with comparative disadvantages.¶ Of course,
achieving universal participation in a treaty is extremely difficult: Most of the¶ international community suffers from treaty fatigue, and developing countries have reservations¶
about diverting even more resources to implement treaties. Furthermore, an outright ban on¶ dangerous technologies does not always work, because this could hand over a
monopoly to rogue¶ actors49 and because countries want to make beneficial use of dual-use technologies, including as¶ defensive mechanisms against noncompliant countries.51
Despite these hurdles, forces such as the¶ threat of an imminent global catastrophe from bioengineering or massive public pressure could¶ create the requisite political will for
countries to conclude a global treaty on bioengineering or all¶ emerging technologies.¶ C. Precautionary Principle¶ If the international community does decide to create an
international treaty on¶ bioengineering or all emerging technologies, they should consider the precautionary principle (or¶ the “precautionary approach”) as a tool to reduce
GCR.52 There are many interpretations of the¶ precautionary principle, but the crux of the meaning is that precautionary measures should be taken¶ in the face of scientific
uncertainty, or, inversely, that scientific uncertainty should not be an excuse¶ for forgoing precautionary measures. Applied to bioengineering, this could require, for example,¶
that scientists develop dangerous bioengineered pathogens only once they can affirmatively¶ demonstrate that there is a satisfactorily small possibility that the pathogens will
escape from the¶ laboratory. Similarly, a treaty could impose moratoriums on certain bioengineering applications,¶ such as the release of bioengineered microbes into the
environment, until proponents of these¶ technologies can satisfactorily prove their safety or, at minimum, that they will not cause a global¶ catastrophe.¶ V. CONCLUSION¶
Advances in bioengineering significantly increase humanity’s capacity to affect itself and
the world it lives in – for better or for worse. Bioengineering could revolutionize medicine and help¶ slash global
greenhouse gas emissions, which would allow humans to live long, healthy lives on a¶ clean planet. Or, it could cause an
unprecedented pandemic from a biosafety lapse or bioterrorist attack, which could
devastate the global population and have other consequences like societal collapse. This
paper develops GCR as a key consideration for dual-use bioengineering. According to¶ a range of consequentialist ethics views, reducing GCR should be
the top priority for society today.¶ However, other ethics views do not consider GCR to be a priority. Indeed, some views even suggest increasing GCR.
The question of how bioengineering should proceed is fundamentally an ethics¶ question, with the issue of GCR being a major factor.¶ Given the global
nature of bioengineering and of GCR, regulating bioengineering should be¶ a global endeavor. Several existing international legal structures can help,
including the CBD, the¶ Cartagena Protocol, and the BWC. However, these structures are insufficient to cover the entirety of¶ bioengineering GCR. New
international laws could help with this, whether applied solely to¶ bioengineering or all emerging technologies, but creating them could be difficult. In
the meantime,¶ other options include soft law and new international organizations, either of which could eventually¶ transition into a new international
treaty.¶ Perhaps no formal governance structure could ever offer complete protection from¶ bioengineering-related GCRs. Only the most intrusive
surveillance regime could hope to watch over¶ every decision made by every researcher; such a regime would itself pose many ethics issues and¶ could
even lead to global catastrophe.53 In the absence of such a regime, it is incumbent that¶ everyone in the bioengineering community – researchers,
institutional homes, funders, and others –¶ keep all the implications of bioengineering in mind, making preventing a GCR from bioengineering¶ their
top priority. The
survival of human civilization could well be at stake.
Pandemics advantage (May/may not read)
CDC surveillance is failing to comply with the International Health
Regulations—timely reporting is key
Sullivan 16
Ryan Sullivan (J.D., University of Georgia). “Implementing the International Health Regulations (2005) with Search Engine-Based
Syndromic Surveillance.” 2016. http://digitalcommons.law.uga.edu/cgi/viewcontent.cgi?article=2371&context=gjicl
To effectively contain an epidemic, the epidemic must be identified quickly. Traditionally,
an outbreak is identified only after a patient comes in contact with the disease, develops symptoms, visits her healthcare provider for
a diagnosis, and the patient's health information is forwarded to a higher authority. 17 Next, the higher authority must interpret
these routine reports, eventually finding a pattern among a population and concluding that the population is in danger of an outbreak. In the United States, it takes the Centers for Disease Control and Prevention approximately two weeks to detect a disease through this process.19 If the process in the United States is unacceptably slow, the process in developing countries is even slower. Unfortunately, countries that are faced with responding to the outbreak of a deadly communicable disease are often those with the fewest resources.20 As a result, it sometimes takes longer to detect an outbreak and longer still to communicate this information to national and global authorities.21 To effectively control a potential international outbreak, disease monitors need to uncover
critical information concerning potential outbreaks faster than traditional methods of disease surveillance allow. Fortunately, traditional methods of disease surveillance, such
as those discussed above, are no longer the only option. Incredibly, scientists have proven that in some cases the outbreak of pandemic disease could have been detected within a
regional population before it was diagnosed in an individual.22 Gunther Eysenbach's Infodemiology: Tracking Flu-Related Searches on the Web for Syndromic Surveillance revealed a strong correlation between the number of clicks on search-term determined links and incidences of influenza in Canada.23 Eysenbach coined the term "infodemiology" to describe the collation and analysis of this Internet search-term data to predict disease outbreak.24 Eysenbach's results have been repeated in several more studies with different Internet forums. In 2009, a study was published showing that surveillance based on Internet search terms was able to predict influenza outbreaks in areas with many web search users faster than traditional surveillance methods. In another study, surveillance based on Internet search terms was able to predict outbreaks of dengue fever and influenza up to two weeks faster than traditional surveillance methods.26 Another study showed that the SARS epidemic could have been predicted more than two months faster using search terms rather than the traditional methods the World Health Organization employed.27 This information has proven so reliable and useful that Google has even created tools to help users track trends of search terms associated with both influenza28 and dengue fever29 in different languages around the world. Clearly, infodemiology's potential to predict deadly outbreaks weeks before such epidemics would otherwise be detected with traditional surveillance methods demonstrates it would be a valuable supplement to traditional disease surveillance. As a result, this Note argues that the World Health Organization should make use of infodemiology by adding it as a surveillance measure to its 2005 International Health Regulations (IHR (2005)). Indeed, as a surveillance technique, infodemiology offers a more centralized approach, further protection of individual liberty, and greater efficacy than traditional measures. This Note explains some of the benefits of infodemiology in predicting the outbreak of deadly diseases, particularly within the framework of IHR (2005). It also argues that infodemiology should be used more aggressively, in spite of privacy concerns, to identify potential travelers that should be investigated for possible isolation or quarantine before travelling internationally. This Note depends on Eysenbach's theory that disease outbreaks can be predicted using Internet search terms.30 This Note assumes that Internet use is widespread enough throughout the world for search terms to be meaningfully tracked. Finally, this Note will assume that infodemiology is possible in enough languages that it can be a useful tool in tracking worldwide disease, in spite of the fact that people speak many different languages in the parts of the world where pandemics tend to originate. These last two assumptions rely on the basic idea that with technological improvement, Internet use will increase and analysis of data will improve. Although infodemiology is a recent area of study, access to Internet and methods of analyzing Internet data have already increased significantly. Part II of this Note will explore the history, implementation, and goals of the IHR (2005), and will identify how IHR (2005) has failed to stop the spread of international disease. Part III argues that the World Health Organization should analyze data from Internet search engine query logs from around the world to predict outbreaks of deadly disease and focus further investigation at national points of entry and exit in accordance with the framework set up by the IHR (2005) and the applicable national law. Part IV concludes by discussing the consequences of ignoring this technology at a time when pandemics can spread faster than ever. II. BACKGROUND The IHR
(2005) became effective in June of 2007.31 The IHR (2005) are legally binding on 194 countries, including all World Health Organization member states.32 The regulations
represent a step forward in recognizing, treating, and preventing the spread of communicable disease across the world.33 Whereas the IHR (1969) only mandated the reporting of three diseases, the IHR (2005) require the reporting of any potential public health emergency of international concern (PHEIC).34 The fact that the IHR (1969) merely required cholera, plague, and yellow fever to be reported to the World Health Organization35 offers a reflection of a world in which few diseases posed a risk of being quickly and unexpectedly spread across the globe.36 The mandatory reporting of certain enumerated health events, regardless of local analysis of the dangers posed by these events, is additional evidence of an increase in international authority at the expense of national autonomy.37 Due to the importance of quick reactions to the outbreak of diseases that have historically caused extensive damage around the world, all cases of smallpox and SARS, as well as previously unidentified polio and flu viruses, must be reported to the World Health Organization.38 The IHR (2005) require the reporting of a much broader category of disease: any PHEIC.39 States Parties to the IHR (2005) are now required to analyze any potentially dangerous health event according to very general terms, and to report those events to the World Health Organization.40 The World Health Organization expects national surveillance systems to report variations in local conditions. These new requirements reflect the globe's modern susceptibility to a variety of unexpected diseases, and the preference for analysis that is universal and consistent rather than regional and fragmented.41 In addition to the much broader focus, the IHR (2005) sought to modernize global health surveillance by developing national focal points for increased national accountability and communication, facilitating communication between national focal points and World Health Organization contact points, predetermining responses to a PHEIC, and requiring each member state to achieve eight "core capacities."42 These core capacities represent the abstract capabilities that each nation must maintain in order to effectively control the spread of disease internationally.43 When the IHR (2005) went into force in 2007, States Parties were given until June 2009 to assess their existing national structures and resources in an effort to develop a plan of action to meet the regulation's new requirements.44 For the next phase, States Parties had until June 15, 2012 to implement the functioning core capacities that they lacked. Otherwise, they were required to seek an extension.45 In spite of this deadline, national governments have struggled to implement the IHR (2005) requirements, especially at the local level in developing countries.46 Gaps in surveillance data combined with a fear of the World Health Organization infringing on state sovereignty have proven to be costly barriers to international response to epidemics. For example, the World Health Organization did not declare the 2014 West African Ebola epidemic a global emergency until August 2014,
after a thousand African patients had already died. Slow response times to international disease epidemics suggest insufficient implementation of the IHR (2005)'s third core capacity: surveillance. This core capacity entails the rapid detection, risk assessment and notification of PHEICs.49 The IHR (2005) require three levels of disease surveillance in each country.50 The first level is the most localized and defined by both geography (within the community) and function (primary public health response).51 The local community level is responsible for detecting levels of disease morbidity or mortality above normal levels for the area, reporting these events to a higher surveillance level, and initiating preliminary control measures.52 The intermediate public health response level distinguishes between urgent and non-urgent reports.53 This level orchestrates additional control measures and reports urgent PHEICs to the national level. Finally, the national level must determine within forty-eight hours if the event is potentially of international health concern, and if it is statistically abnormal or unexpected. If the event is determined to be statistically abnormal and of international concern, the nation must notify the World Health Organization of the event within twenty-four additional hours.56 Each level is dependent on the information that is reported from the level below; therefore, the World Health Organization's international response to a disease depends on an incredible diversity of locally sourced data.
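The search-term surveillance the Sullivan card describes can be sketched as a toy lagged-correlation check. All data below are hypothetical, and `max_lag` and the functions are illustrative assumptions, not methods taken from the Note or from Eysenbach:

```python
from statistics import mean

def pearson(xs, ys):
    """Plain Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

def lead_time_correlation(searches, cases, max_lag=3):
    """Find the lag (in weeks) at which search volume best predicts
    later case counts, i.e. corr(searches[t], cases[t + lag])."""
    best_lag, best_r = 0, float("-inf")
    for lag in range(max_lag + 1):
        xs = searches[:-lag] if lag else searches
        r = pearson(xs, cases[lag:])
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag, best_r

# Hypothetical weekly data: search volume spikes two weeks before cases do.
searches = [10, 12, 50, 90, 60, 30, 15, 12]
cases    = [ 2,  2,  3,  4, 40, 80, 55, 25]
lag, r = lead_time_correlation(searches, cases)
print(lag, round(r, 2))  # → 2 1.0
```

The point of the sketch is the card's empirical claim in miniature: when searches lead cases, the correlation peaks at a positive lag, which is exactly the "two weeks faster than traditional surveillance" window the studies report.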
Single-payer is key to disease surveillance—monitoring fails with private
plans
Long et al 14
Millie D. Long, MD, MPH,1 Susan Hutfless, PhD,2 Michael D. Kappelman, MD, MPH,1 Hamed Khalili, MD, MPH,3 Gil Kaplan, MD,
MPH,4 Charles N. Bernstein, MD,5 Jean Frederic Colombel, MD,6 Lisa Herrinton, PhD,7 Fernando Velayos, MD, MPH,8 Edward V.
Loftus, Jr., MD,9 Geoffrey C. Nguyen, MD, PhD,10 Ashwin N. Ananthakrishnan, MD, MPH,3 Amnon Sonnenberg, MD, MSc,11
Andrew Chan, MD, MPH,3 Robert S. Sandler, MD, MPH,1 Ashish Atreja, MD, MPH,6 Samir A. Shah, MD,12 Kenneth Rothman,
DMD, PhD,13,14 Neal S. Leleiko, MD, PhD,12 Renee Bright, MS,6 Paolo Boffetta, MD,6 Kelly D. Myers,15 and Bruce E. Sands, MD,
MS6. “Challenges in Designing a National Surveillance Program for Inflammatory Bowel Disease in the United States.” 2014.
http://pubmedcentralcanada.ca/pmcc/articles/PMC4610029/
The Calgary Health Zone is a population-based health authority under a public,
single-payer system, and provides all levels of medical and surgical care to the residents of the city of Calgary and over
20 nearby smaller cities, towns, villages, and hamlets. The estimated population of the Calgary Health Zone is over 2 million people.
Administrative data are collected within the province. As these data are identifiable, they can be linked to other databases or to the electronic medical record to validate disease exposures and outcomes. The Data Integration, Measurement, and Reporting (DIMR) hospital discharge abstract administrative database captures all hospitalizations in the Calgary Health Zone of Alberta Health Services, Canada. The DIMR database contains 42 diagnostic and 25 procedural coding fields. The International Classification of Disease, Ninth Revision (ICD-9) was used up to 2001, whereas the ICD-10-CA (Canadian adaptation of ICD-10) and the Canadian Classification of Health Intervention coding were used after 2001.
Within the province, prevalence and inception cohorts were developed, with age- and sex-matched controls. Validation studies were
performed (via chart review) to confirm cases and exposures. Validation studies are essential to evaluate biases in an administrative
database. For example, in a recent validation study from within this cohort, administrative data identified the same risk factors as
chart review, but overestimated the magnitude of risk.28 This data source has been used in studies of IBD, including outcomes, risk factor analysis, surveillance of comorbidities or complications, tracking of infections, health services utilization and health economics/costs.29-32 For example, the incidence of IBD has been reported to be 25/100,000 person-years from 2003-2011 in this region (equating to 800 new cases per year). Data from this cohort have also shown that risk factors for postoperative mortality in IBD patients include age and disease severity.33 The advantages to the use of administrative data as an initial data source include the large sample size, the ability to completely capture any billed aspect of disease management (such as surgery) and detailed information on costs. The disadvantages include the need for validation of individual level data via chart review and the costs and time involved to do so. Detailed clinical information, such as phenotype, disease activity, and biological samples, are not available within administrative data, unless linked to other repositories. Unfortunately, due to the inability to gain access to identifiable data within conglomerate administrative data in the US, this type of prospective inception/prevalence cohort is not possible. Median time on a given commercial health plan in the US is only a few years. As we do not have the ability to track an individual across plans, long-term outcomes data would not be available.
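The linkage failure Long et al. describe can be shown in miniature. The records, member IDs, and ICD-10 codes below are hypothetical; the sketch only illustrates that a longitudinal history falls out of a simple group-by when one payer issues a stable identifier, and fragments when each commercial plan issues its own:

```python
from collections import defaultdict

def longitudinal_history(claims):
    """Group claims by patient identifier.
    Each claim is a (patient_id, year, icd10_code) tuple."""
    history = defaultdict(list)
    for pid, year, code in claims:
        history[pid].append((year, code))
    return {pid: sorted(events) for pid, events in history.items()}

# Single-payer: one stable ID across a decade of claims for one person.
single_payer = [("P001", 2003, "K50.9"), ("P001", 2008, "K50.1"),
                ("P001", 2012, "K50.1")]

# Plan churn: the same person appears under three unlinkable member IDs.
multi_plan = [("AET-44", 2003, "K50.9"), ("BCBS-9", 2008, "K50.1"),
              ("UHC-17", 2012, "K50.1")]

print(len(longitudinal_history(single_payer)))  # one 10-year trajectory
print(len(longitudinal_history(multi_plan)))    # three orphaned fragments
```

Under the single payer the group-by returns one patient with a complete trajectory; with churn it returns three apparent patients, which is why the card says long-term outcomes data "would not be available" without the ability to track individuals across plans.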
Scenario 1 is Disease
National capacity building under the IHR enables a coordinated, global
response to diseases
Katz and Dowell 15
Rebecca Katz and Scott Dowell (George Washington University, Millken Institute School of Public Health, and Bill and Melinda
Gates Foundation). “Revising the International Health Regulations: call for a 2017 review conference.” May 8th, 2015.
http://www.thelancet.com/pdfs/journals/langlo/PIIS2214-109X(15)00025-X.pdf
The revised International Health Regulations (IHR)1 entered into force on June 15,
2007, obligating (now) 196 States Parties to detect, assess, report, and respond to
potential public health emergencies of international concern (PHEIC) at all
levels of government, and to report such events rapidly to the WHO to determine
whether a coordinated, global response is required. In the 8 years since its entry into force, there
have been three declared PHEIC, including pandemic influenza H1N1 in 2009, re-emerging wild-type poliovirus in April, 2014, Ebola virus disease in west Africa in August, 2014, and the emergence of new diseases such as Middle East respiratory syndrome coronavirus and influenza H7N9, with still uncertain risks to global population health. Implementation of the IHR has been tested under real world conditions. The regulations have served as a valuable guidepost for national and international capacity building, coordination, and collaborations for global health security. Other international fora have also recognised the importance of the IHR as a global framework, and have focused discussions among nations on IHR-related core capacities in meetings of the Biological and Toxin Weapons Convention, the Global Health Security Initiative, the North American Plan for Pandemic and Avian Influenza, the
Convention on Biological Diversity, and United Nations Security Council Resolution 1540, in addition to debate at the World Health
Assembly. Other new partnerships have formed to strengthen the global response to public health threats. In February, 2014, the
USA, along with almost 30 nations and the Directors-General of the WHO, Food and Agriculture Organization, and Organization for
Animal Health, launched the Global Health Security Agenda to address several high-priority, global infectious disease threats.
Foundations have become increasingly engaged in global health security, becoming primary funders for capacity building around the
world, and the World Bank is playing an increasingly more prominent role in global health preparedness and response. Yet by 2012,
only 42 nations (21%) reported that they had fully implemented the IHR and built appropriate core capacities to detect, assess,
report, and respond to public health emergencies. With follow-up reporting in 2014, only 64 nations reported that they had fully
implemented the IHR—an increase of only 10% over 2 years. The other 67% of nations either requested another 2-year extension
(81) or reported nothing at all (48).2 National compliance statistics are themselves an indicator of the challenges
associated with IHR implementation, particularly the paucity of mandated
funding to support capacity building. Additionally, it has become clear that the methods for assessing
health security preparedness leave substantial room for interpretation, and there are ongoing disagreements over the mandate for
the airport or port health certification programme outlined in Article 20. Failure of the global community to respond rapidly and effectively to the Ebola virus disease outbreak in west Africa demonstrates that there remain major implementation challenges, even
beyond funding and political will. It is time to consider whether or not aspects of the foundation for global health security embodied
in the IHR (2005) are too vague, missing, or need to be strengthened in order for IHR to stay relevant and useful.
Early warning is key—cures will never come
Walsh 17
Bryan Walsh (contributor to TIME. Previously, he was TIME’s International Editor, its energy and environmental correspondent
and was the Tokyo bureau chief in 2006 and 2007. He lives in New York). “The World Is Not Ready for the Next Pandemic.” Time.
May 4th, 2017. http://time.com/4766624/next-global-security/
Since it can require years of testing and well over $1 billion to successfully develop
a single vaccine against a single pathogen, drug companies have increasingly
shied away from the business. "There's just no incentive for any company to
make pandemic vaccine to store on shelves," says Dr. Trevor Mundel, president of the
global health division at the Bill and Melinda Gates Foundation. That's why most
infectious-disease experts aren't hanging their hopes solely on new treatments or
vaccines. After all, that's not what ultimately contained the most recent lethal outbreak of Ebola. It chiefly fell to health workers
on the ground and to Frieden, director of the CDC for eight years under President Obama. And on no day did that effort come closer
to failure than on July 23, 2014. That was the day Frieden received news that Ebola had arrived in the Nigerian megacity of Lagos.
The virus had been killing people for months in Guinea, Liberia and Sierra Leone, but Ebola in Lagos--the biggest city on the African
continent, with a metro population of 21 million--represented a threat of an entirely different magnitude. "If it got out of control in
Lagos, it could spread through Nigeria and the rest of Africa," says Frieden. "It could still be going on today." But it isn't, thanks
largely to the herculean efforts of thousands of expert health workers--U.S. staff from the CDC and Nigerian officials who had been
trained in the international effort to stop polio--who were quickly diverted to fight Ebola. This is why Frieden, Gates and others are
so bullish about investing in science and foreign aid. Without aid, Nigeria would not have been able to stem the spread of Ebola. And
without the next-generation science that helped track the outbreak, far more people would have died. "It's very important that this
kind of work continues," says Frieden, "or America is going to be less safe." Make no mistake: for all our
high-tech isolation units, top-tier doctors and world-class scientists, the U.S. health
care system is not ready for the stresses of a major pandemic. As the infectious-disease
expert Osterholm notes, a pandemic is not like other natural disasters, which tend to be confined to a single location or region.
Disease can strike everywhere at once. In the event of a pandemic, even the best hospitals could rapidly run out of beds and
mechanical ventilators. The U.S. does have a national strategy for pandemics, and there have been welcome steps taken since the
bioterrorism fears that followed 9/11. In February, the military think tank DARPA launched a program aimed at producing effective
medicines within 60 days of the identification of a new, pandemic-causing pathogen. But the country hasn't been truly tested yet.
Melissa Harvey, who heads the division of national health care preparedness programs at HHS, is in charge of helping U.S. hospitals
get ready for the next big threat. She notes that while hospitals were able to handle a handful of sick
people during Ebola, a truly major crisis would be a different story. "In a situation like
the 1918 pandemic, the expectation is that the resources are not going to be there for everyone." If you look at the numbers, it's clear
that right now the U.S. government doesn't spend in a way that says fighting pandemics is a consistent national priority. Instead,
money gets issued on a disease-by-disease basis, often after a crisis has started. During Ebola, for instance, Congress appropriated
more than $5 billion in much-needed emergency spending--but it did so nearly five months after international health groups had
called it a crisis. The drawbacks of this scattershot way of investing in pandemic response became even clearer during Zika, when it
took nearly nine months for Congress to finally allocate $1.1 billion to fight a disease that had already begun spreading in the U.S.
Even then, Congress required that some of that come from existing Ebola funding that had been going to pandemic preparation. "We
literally had to rob Peter to pay Paul," says Ron Klain, who served as Ebola czar during the Obama Administration.
Pandemics cause extinction—US response is key
Dhillon et al 17
Ranu S. Dhillon (MD, is an instructor at Harvard Medical School and a physician at Brigham and Women’s Hospital in Boston. He
works on building health systems in developing countries and served as an advisor to the president of Guinea during the Ebola
epidemic), Devabhaktuni Srikrishna (the founder of Patient Knowhow, which curates patient educational content on YouTube. He
worked on the response to the Ebola outbreak in Guinea), and David Beier (is a managing director of Bay City Capital. He previously
served in several leadership roles at the intersection of government, policy, and technology, including chief domestic policy advisor
to then-Vice President Al Gore, vice president for government affairs and policy at Genentech, senior vice president of global
government affairs at Amgen, and counsel to the U.S. House Judiciary Committee). “The World Is Completely Unprepared for a
Global Pandemic.” Harvard Business Review. March 15th, 2017.
https://hbr.org/2017/03/the-world-is-completely-unprepared-for-a-global-pandemic
We fear it is only a matter of time before we face a deadlier and more contagious
pathogen, yet the threat of a deadly pandemic remains dangerously overlooked. Pandemics now occur with
greater frequency, due to factors such as climate change, urbanization, and
international travel. Other factors, such as a weak World Health Organization and potentially massive cuts to funding for
U.S. scientific research and foreign aid, including funding for the United Nations, stand to deepen our vulnerability. We also
face the specter of novel and mutated pathogens that could spread and kill faster
than diseases we have seen before. With the advent of genome-editing technologies,
bioterrorists could artificially engineer new plagues, a threat that Ashton Carter, the
former U.S. secretary of defense, thinks could rival nuclear weapons in deadliness. The two
of us have advised the president of Guinea on stopping Ebola. In addition, we have worked on ways to contain the spread of Zika and
have informally advised U.S. and international organizations on the matter. Our experiences tell us that the world is unprepared for
these threats. We urgently need to change this trajectory. We can start by learning four lessons from the gaps exposed by the Ebola
and Zika pandemics. Faster Vaccine Development The most effective way to stop pandemics is with vaccines. However, with Ebola
there was no vaccine, and only now, years later, has one proven effective. This has been the case with Zika, too. Though there has
been rapid progress in developing and getting a vaccine to market, it is not fast enough, and Zika has already spread worldwide.
Many other diseases do not have vaccines, and developing them takes too long when a pandemic is already under way. We need
faster pipelines, such as the one that the Coalition for Epidemic Preparedness Innovations is trying to create, to preemptively
develop vaccines for diseases predicted to cause outbreaks in the near future. Point-of-Care Diagnostics Even with such efforts,
vaccines will not be ready for many diseases and would not even be an option for novel or artificially engineered pathogens. With no
vaccine for Ebola, our next best strategy was to identify who was infected as quickly as possible and isolate them before they infected
others. Because Ebola’s symptoms were identical to common illnesses like malaria, diagnosis required laboratory testing that could
not be easily scaled. As a result, many patients were only tested after several days of being contagious and infecting others. Some
were never tested at all, and about 40% of patients in Ebola treatment centers did not actually have Ebola. Many dangerous
pathogens similarly require laboratory testing that is difficult to scale. Florida, for example, has not been able to expand testing for
Zika, so pregnant women wait weeks to know if their babies might be affected. What’s needed are point-of-care diagnostics that, like
pregnancy tests, can be used by frontline responders or patients themselves to detect infection right away, where they live. These
tests already exist for many diseases, and the technology behind them is well-established. However, the process for their validation is
slow and messy. Point-of-care diagnostics for Ebola, for example, were available but never used because of such bottlenecks. Greater
Global Coordination We
need stronger global coordination. The responsibility for
controlling pandemics is fragmented, spread across too many players with no unifying
authority. In Guinea we forged a response out of an amalgam of over 30 organizations, each of which had its own priorities. In
Ebola’s aftermath, there have been calls for a mechanism for responding to pandemics similar to the advance planning and training
that NATO has in place for its numerous members to respond to military threats in a quick, coordinated fashion. This is the right
thinking, but we are far from seeing it happen. The errors that allowed Ebola to become a crisis replayed with Zika, and the WHO,
which should anchor global action, continues to suffer from a lack of credibility. Stronger Local Health Systems International actors
are essential but cannot parachute into countries and navigate local dynamics quickly enough to contain outbreaks. In Guinea it took
months to establish the ground game needed to stop the pandemic, with Ebola continuing to spread in the meantime. We need to
help developing countries establish health systems that can provide routine care and, when needed, coordinate with international
responders to contain new outbreaks. Local health systems could be established for about half of the $3.6 billion ultimately spent on
creating an Ebola response from scratch. Access to routine care is also essential for knowing when an outbreak is taking root and
establishing trust. For months, Ebola spread before anyone knew it was happening, and then lingered because communities who had
never had basic health care doubted the intentions of foreigners flooding into their villages. The turning point in the pandemic came
when they began to trust what they were hearing about Ebola and understood what they needed to do to halt its spread: identify
those exposed and safely bury the dead. With Ebola and Zika, we lacked these four things — vaccines, diagnostics, global
coordination, and local health systems — which are still urgently needed. However, prevailing political headwinds in the
United
States, which has played a key role in combatting pandemics around the world, threaten to
make things worse. The Trump administration is seeking drastic budget cuts in funding for foreign aid and scientific research. The
U.S. State Department and U.S. Agency for International Development may lose over one-third of their budgets, including half of the
funding the U.S. usually provides to the UN. The National Institutes of Health, which has been on the vanguard of vaccines and
diagnostics research, may also face cuts. The Centers for Disease Control and Prevention, which has been at the forefront of
responding to outbreaks, remains without a director, and, if the Affordable Care Act is repealed, would lose $891 million, 12% of its
overall budget, provided to it for immunization programs, monitoring and responding to outbreaks, and other public health
initiatives. Investing in our ability to prevent and contain pandemics through revitalized national and international institutions
should be our shared goal. However, if U.S. agencies become less able to respond to pandemics, leading institutions from other
nations, such as Institut Pasteur and the National Institute of Health and Medical Research in France, the Wellcome Trust and
London School of Hygiene and Tropical Medicine in the UK, and nongovernmental organizations (NGOs have done instrumental
research and response work in previous pandemics), would need to step in to fill the void. There is no border wall against disease.
Pandemics are an existential threat on par with climate change and nuclear conflict.
We are at a critical crossroads, where we must either take the steps needed to prepare for this threat or become even more
vulnerable. It is only a matter of time before we are hit by a deadlier, more contagious pandemic. Will we be ready?
EHR advantage V2
Electronic health records are inevitable, but they’ll fail absent
interoperability
Hussain et al 17
Sadath Hussain, Thilini Ariyachandra, and Mark Frolick (Management Information Systems, Xavier University). “Gateway to
Clinical Intelligence and Operational Excellence through a Patient Healthcare Smart Card System.” Journal of Information Systems
Applied Research. April 2017. http://jisar.org/2017-10/n1/JISARv10n1p29.pdf
1. INTRODUCTION The use of information technology in our daily lives has been on
the rise; patient care information systems have now become an integral part of patient care support in the developed world
(Ash, Berg, & Coiera, 2014). In modern healthcare systems, "automation systems in hospitals and medical centers serve the purpose
of providing an efficient working environment for healthcare professionals" (Kardas & Tunali, 2006). Some argue that the use of
information technology is essential for keeping patients' records (Dick & Steen, 1991; Armony, Israelit, Mandelbaum, Marmor,
Tseytlin, & Yom-Tov, 2015). The use of information technology is believed to have increased the quality of health care services, and
decision support system for health care management, health education and research (Jones, Rudin, Perry, & Shekelle, 2014).
Healthcare providers that have seen great success in the area of research and treatment have tremendously benefitted from the use of
information technology in complex and sustained health management situations.
Information technology in healthcare has been greatly instrumental in enhancing the ability to apply the vast resources of
big data and analytics (Kayyali, Knott, & Van Kuiken, 2013). IT capabilities have empowered providers to be able to maneuver the
medium to their advantage. The widespread adoption of IT has also led to a better grasp on handling unintended and unexpected
issues when they arise (Bates, Saria, Ohno-Machado, Shah, & Escobar, 2014). Additionally, IT capabilities encourage more abstract
thinking about health care data, especially for research purposes. Yet this data is not fully integrated for
access during regular patient visits or for decision support and research analytics (Jensen, Jensen, &
Brunak 2012). The United States continues to have large integrated transactional and
decision support databases but they are NOT integrated across the nation (Koebnick,
Langer-Gould, Gould, Chao, Iyer, Smith, & Jacobsen, 2012). This paper proposes the creation of a distributed healthcare
information system that is grounded in past literature. It also lays out the process through which the adoption of a universal
medical record access and analytics (UMRAA) system would be created. The implementation of such a system will involve a pilot
study in a single US state. The potential success of the proposed plan in this particular state could be leveraged to help sell the idea to
the rest of the nation. The paper first discusses the state of healthcare and IT as presented in past literature done in the field. The
next part of the paper describes smart card use in healthcare, the proposed system, advantages and challenges as well as the plan for
the development and implementation of the Unified Medical Record Access (UMRAA) card in detail. Finally the potential challenges
that the proposal could face are discussed. 2. STATE OF HEALTHCARE AND IT Keeping in mind the current state of the US
healthcare system, it
is very timely and paramount for information systems and analytics to
continue to be a major contributor to every decision-making process that goes on in the
healthcare industry (Himss 2014). The growth of data collection and the inclusion of information technology in healthcare
continues to grow at a rapid pace. It is expected to reach $31.3 billion by 2017 (Bernie Monegain, 2013). The current state of
healthcare predominantly revolves around the following issues: (1) actual costs associated with getting quality care, (2) the
accessibility and availability of healthcare across the continental US, and (3) the education provided to individuals about
maintaining sound health and obtaining precautionary health check-ups in order to prevent major medical costs (Shi, & Singh,
2014). Another major issue with healthcare would be the potential inability to cater to the needs of the next wave of senior citizens
(Ou, Shih, Chin, Kuan, Wang, & Shih 2013). Additionally, the rise of medical errors and the potential for maltreatment due to lack of
availability of complete patient health information is also one of the major issues in the healthcare industry (Agha 2014). The
dissatisfaction with the healthcare system could increase over the next few years as a result of increased out-of-pocket expenses
associated with the weakening economy and increasing prescription drug prices (Haren, McConnell, & Shinn, 2009). The gradual
increase in uninsured individuals is only going to add to the increasing costs of health care in the future. Keeping these factors in
mind, bolstering
the US healthcare structure with the support of an integrated
information technology solution for access and analysis would be the potential solution
moving forward (IOM, 2009). One such change in the past that has shown enormous
success is switching from paper medical records to electronic medical records that
provided a centralized location for storing patient information that in turn streamlined healthcare management (Frolick, 2005;
Jacobus, Braun, & Cobb, 2014). The implementation of information system in health care practices is fraught with numerous risks.
The stored data on multiple locations in health practices can be a challenge to report essential and strategic information for various
stakeholders (AHRQ, 2006). The healthcare information management system in many developed countries like the US, Canada, and
many European countries is not well integrated (Brown 2003). Due to various patient information flows and routes, there has been
an inherent difficulty to integrate and report the data essential for the management, clinicians, policy makers and researchers (Poon,
Jha, Christino, Honour, Fernandopulle, Middleton, & Kaushal, 2006). 3. SMART CARDS AND MEDICAL HEALTH RECORDS The
idea of having a complete medical record on a smart card based system has been considered for several years now (Smart Card
Alliance 2012). Computer systems that could store medical histories on a smart card were invented more than a decade ago in
countries such as Hungary, France, and Spain (Naszlady & Naszlady 1998). However, the US has yet to implement such a system at a
national level. In 1998, a study involving an electronic chip card was carried out where 5000 chronically ill patients throughout
Hungary received a smart card that had the patient’s entire medical history stored in it (Naszlady & Naszlady 1998). The goal was to
achieve complete patient information and also to support the growing need for an integrated healthcare delivery model.
Currently, there is no national health care smartcard system in place in the US.
However, in some European countries such as Britain and France, pilot programs had been established over a decade ago. These
pilot programs have proved to be highly useful and easily implementable (Neame, 1997; Marschollek & Demirbilek 2006; Liu, Yang,
Yeh, & Wang, 2006). Today, health smartcards in France have served the purpose of carrying information related to health
insurance, and some ongoing health records and basic emergency health information. Furthermore, strengthening the evidence of
their usefulness, the Exeter Care Card (ECC) pilot program that was funded by Britain's Department of Health and carried out by
Exeter University (Hopkins, 1990), showed tremendous advantages of having such a smart card system in the health care industry.
The advantages of the ECC pilot were the reduction in the cost of prescribing; reduced cost of carrying out investigations; reduction
in risk of iatrogenic cases of illness; reduced times taken for data communication; ready access to necessary medical records
(Neame, 1997). A valuable addition to the result of the study was that it also showed high patient satisfaction levels. In 2006,
another study illustrated to the US healthcare industry the possibility of solving one of
the major hurdles to smartcard technology - which is interoperability (Marschollek &
Demirbilek, 2006). Since healthcare organizations do not use the same health information system software, there are
multiple sets of ways to code for the same information based on the type of software that is being used. These challenges can easily
be overcome using standardized software and technologies in order to facilitate interoperability with multiple healthcare
information systems, such as used in the German Health Card pilot program (Marschollek & Demirbilek, 2006). Another study
conducted by Wei Chen et al (2012), proposed to establish a portable electronic medical record system that applied streaming media
technology to access medical images and transmit them via the Internet. This is an example of a distributed information
management system in healthcare. Figure one shows a graphical representation of the structure of the portable electronic medical
record (EMR) system. The study proposed a system that is composed of the EMR query system, data exchanging, and the EMR
streaming media system. The proposed architecture provided local hospital users the ability to acquire EMR text files from a
previous hospital. It also helped access medical images as reference for clinical management. The proposed architecture shown in
figure one provides a diagrammatic illustration of what a distributed information system could look like (Wei Chen, 2010). One
major limitation to the system shown in the study is the system’s dependency on the internet for its data transfer functionality.
However, the concept proposed in this paper does not require the internet for its operability and functionality. The factors that have
been referenced from all the various studies described provide a compelling argument to implement a comprehensive, consolidated
and secure model for healthcare information system that can be easily and quickly made available and accessible to healthcare
providers. Adding
portability to the electronic medical record system in the form of the
UMRAA card maximizes efficiency and streamlines the whole patient-doctor
experience at a national level.
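The interoperability hurdle Hussain et al describe, where each vendor's software encodes the same clinical fact under a different local code, can be sketched with a toy crosswalk. All codes, patient IDs, and names below are hypothetical illustrations, not real vendor or ICD codes:

```python
# Toy illustration of the interoperability problem: two hospital systems
# record the same diagnosis under different local codes, so their data
# cannot be pooled until both are mapped onto one shared standard.
# Every code here is invented for illustration.

SYSTEM_A = {"patient-1": "DX-441"}    # vendor A's local code for the diagnosis
SYSTEM_B = {"patient-2": "D.MELL.2"}  # vendor B's local code for the same diagnosis

# A crosswalk table maps each vendor-specific code to one common standard code.
CROSSWALK = {
    "DX-441": "STD-DIABETES-T2",
    "D.MELL.2": "STD-DIABETES-T2",
}

def to_standard(local_code: str) -> str:
    """Translate a vendor-specific code into the shared standard code."""
    return CROSSWALK[local_code]

# Once normalized, records from both systems can be aggregated together.
merged = {pid: to_standard(code)
          for records in (SYSTEM_A, SYSTEM_B)
          for pid, code in records.items()}

print(merged)  # both patients now carry one comparable code
```

The standardized-software fix the card attributes to the German Health Card pilot amounts to agreeing on one such crosswalk (or one shared terminology) across all participating systems.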
The complexity of status quo billing codes means only single-payer ensures
interoperability for EHRs
Geyman 16
John Geyman, MD. “Electronic Health Records: Panacea vs. Unintended Consequences.” Physicians for a National Health Program.”
October 25th, 2016. http://pnhp.org/blog/2016/10/25/electronic-health-records-panacea-vs-unintended-consequences/
Electronic health records (EHRs) have become adopted for widespread use by a growing majority of U. S. physicians. It has been assumed that the wider adoption of EHRs
would improve efficiency and patient safety, reduce diagnostic testing and medical errors, improve continuity and quality of care, and save money. Their use was accelerated by
the Affordable Care Act (ACA) after its passage in 2010. The Centers for Medicare & Medicaid Services (CMS) have further stimulated their adoption by developing “meaningful
use” criteria tied to reimbursement levels. To be fair, EHRs have brought some useful capabilities to U. S. physicians, including electronic prescribing of medications, receiving
clinical test results, electronic lab orders, electronic administration tools, and communication with patients. They have been helpful in home monitoring of high-risk patients,
especially those with congestive heart failure, in reducing hospital re-admissions. A 2015 survey of 600 U. S. physicians found that one in four physicians offered telemonitoring
devices to patients to enable them to monitor their health care. That same study, however, found that less than one-half of surveyed physicians believed that EHRs improved
patient outcomes. (1) Some adverse impacts on medical practice Although EHRs have largely replaced paper records and brought some efficiencies to the process of delivering
health care, there are some important problems that call into question some of the assumptions made by their architects. These are some of the unintended consequences of the
widespread adoption of EHRs as they now are: A 2016 national study in four specialties (family medicine, internal medicine, cardiology, and orthopedic surgery), however,
identified some growing frustration with EHRs. Physicians were spending almost two hours of each clinic day for every hour of direct face-to-face time with patients. (2) Data
entry is time consuming and inefficient; physicians are forced to type into the computer during patient visits; when that burden becomes too diverting from relating to and
examining the patient, scribes are brought in to deal with computer entry. A 2013 report from the RAND Corporation
confirmed the inefficiency of EHRs, noting inadequate exchange of health information
and interoperability, and concluded that template-based notes degrade the quality of clinical documentation and care. (3)
A 2014 study found that less than one-half of U. S. hospitals can transmit a patient care document and that only 14 percent of
physicians can exchange patient data with outside hospitals or other providers. (4) Exchange of health information and
interoperability are problematic as manufacturers resist standardization and customize EHRs to their clients; as one example,
Wisconsin-based Epic, with the largest market share in the country for EHRs, has placed three different systems in nearby
Madison’s three hospitals, requiring attending physicians to learn each system. The hassle of dealing with EHRs has contributed to
increasing frustration and burnout of one-third of their physician users. (5) A 2012 study found that physicians’ access to EHRs did
not reduce their ordering of unnecessary tests. (6) Another 2012 study found that EHRs led to increased costs of tests performed,
and that many hospitals raised their ER billings to Medicare. Based on the above, we need to conclude that EHRs have brought
some efficiencies to U. S. health care but at a high cost, including high administrative costs and time, as well as adverse impacts on
the doctor-patient relationship without evidence to date of improved patient outcomes. They have also become a billing
tool that is vulnerable to gaming the reimbursement system by physicians and hospitals,
contributing to the ACA’s inability to contain health care costs. What lessons can we draw from this mixed experience? Despite
grudging acceptance of EHRs by most physicians, they are here to stay. Nobody wants to return to paper records. The above
adverse results are symptomatic of our profit-driven multi-payer financing
system that reimburses physicians, hospitals, and other health care professionals and
facilities within a hugely bureaucratic, fragmented and unaccountable health care
system. EHRs have become a billing instrument for a system out of control. For separate reasons that add further complexity,
believe it or not, we now have 140,000 different billing codes (not a typo!) (7) All of the
above outcomes will continue unchecked until we fundamentally change the financing
system by adopting single-payer Medicare for All and simplified administration, including standardized EHRs
that are interoperable and based on evidence-based services. Improvement of EHRs will probably require this
level of financing reform before they can include the kind of readily accessible
information about evidence-based services as well as sufficient personal information about patient preferences. EHRs should
become useful from physician to physician and among health care facilities anywhere in the country. Their content needs to be
re-thought so that repetitive templates of unnecessary clinical information are eliminated. Quality measures should be improved so
as to be better aligned with outcomes of care. (GAO, Report to Congressional Committees, October, 2016.) Billing codes need to be
reduced to rational and meaningful levels. Such
useful medical and billing records have been achieved
by many other advanced countries around the world with one or another form of
universal access based more on a service ethic than a competitive profit-maximizing
business “ethic.” They should be achievable if we have the political will, and should be the goals of our society on a
non-partisan basis for the common good.
Data linkages aren’t possible with for-profit multi-payer systems like the US
Long et al 14
Millie D. Long, MD, MPH,1 Susan Hutfless, PhD,2 Michael D. Kappelman, MD, MPH,1 Hamed Khalili, MD, MPH,3 Gil Kaplan, MD,
MPH,4 Charles N. Bernstein, MD,5 Jean Frederic Colombel, MD,6 Lisa Herrinton, PhD,7 Fernando Velayos, MD, MPH,8 Edward V.
Loftus, Jr., MD,9 Geoffrey C. Nguyen, MD, PhD,10 Ashwin N. Ananthakrishnan, MD, MPH,3 Amnon Sonnenberg, MD, MSc,11
Andrew Chan, MD, MPH,3 Robert S. Sandler, MD, MPH,1 Ashish Atreja, MD, MPH,6 Samir A. Shah, MD,12 Kenneth Rothman,
DMD, PhD,13,14 Neal S. Leleiko, MD, PhD,12 Renee Bright, MS,6 Paolo Boffetta, MD,6 Kelly D. Myers,15 and Bruce E. Sands, MD,
MS6. “Challenges in Designing a National Surveillance Program for Inflammatory Bowel Disease in the United States.” 2014.
http://pubmedcentralcanada.ca/pmcc/articles/PMC4610029/
The Calgary Health Zone is a population-based health authority under a public,
single-payer system, and provides all levels of medical and surgical care to the residents of the city of Calgary and over
20 nearby smaller cities, towns, villages, and hamlets. The estimated population of the Calgary Health Zone is over 2 million people.
Administrative data are collected within the province. As these data are identifiable, they can be linked
to other databases or to the electronic medical record to validate disease
exposures and outcomes. The Data Integration, Measurement, and Reporting (DIMR)
hospital discharge abstract administrative database captures all hospitalizations in the
Calgary Health Zone of Alberta Health Services, Canada. The DIMR database contains 42 diagnostic and
25 procedural coding fields. The International Classification of Disease, Ninth Revision (ICD-9) was used up to 2001, whereas the
ICD-10-CA (Canadian adaption of ICD-10) and the Canadian Classification of Health Intervention coding were used after 2001.
Within the province, prevalence and inception cohorts were developed, with age- and sex-matched controls. Validation studies were
performed (via chart review) to confirm cases and exposures. Validation studies are essential to evaluate biases in an administrative
database. For example, in a recent validation study from within this cohort, administrative data identified the same risk factors as
chart review, but overestimated the magnitude of risk.28 This data source has been used in studies of IBD,
including outcomes, risk factor analysis, surveillance of comorbidities or
complications, tracking of infections, health services utilization and health
economics/costs.29-32 For example, the incidence of IBD has been reported to be 25/100,000 person-years from
2003-2011 in this region (equating to 800 new cases per year). Data from this cohort have also shown that risk factors for
postoperative mortality in IBD patients include age and disease severity.33 The advantages to the use of
administrative data as an initial data source include the large sample size, the ability to
completely capture any billed aspect of disease management (such as surgery) and detailed information on
costs. The disadvantages include the need for validation of individual level data via chart review and the costs and time involved to
do so. Detailed clinical information, such as phenotype, disease activity, and biological
samples, are not available within administrative data, unless linked to other
repositories. Unfortunately, due to the inability to gain access to identifiable
data within conglomerate administrative data in the US, this type of prospective
inception/prevalence cohort is not possible. Median time on a given commercial health
plan in the US is only a few years. As we do not have the ability to track an individual
across plans, long-term outcomes data would not be available.
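The linkage failure Long et al describe can be made concrete with a short sketch: under a single payer, one persistent identifier follows the patient for life, so records from different years join; under commercial plan churn, each plan issues its own ID and the longitudinal chain breaks. All identifiers and events below are hypothetical:

```python
# Sketch of why plan churn breaks longitudinal linkage (hypothetical data).
# Single-payer case: one lifetime ID, so events years apart join trivially.
single_payer_records = [
    {"id": "SP-001", "year": 2010, "event": "diagnosis"},
    {"id": "SP-001", "year": 2018, "event": "surgery"},
]

# Multi-payer case: the same person receives a new ID with each commercial
# plan, and no shared key exists to join the two rows back together.
multi_payer_records = [
    {"id": "PLAN-A-77", "year": 2010, "event": "diagnosis"},
    {"id": "PLAN-B-12", "year": 2018, "event": "surgery"},
]

def longitudinal_histories(records):
    """Group events by patient identifier; linkage works only if IDs persist."""
    histories = {}
    for r in records:
        histories.setdefault(r["id"], []).append(r["event"])
    return histories

# One patient with a complete history under a persistent identifier...
print(longitudinal_histories(single_payer_records))
# ...versus two apparent "patients", each with a fragmented history.
print(longitudinal_histories(multi_payer_records))
```

The Calgary cohort works precisely because the provincial single-payer system plays the role of the persistent `SP-001` identifier in this sketch.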
Interoperable health records are necessary for new breakthroughs in
precision medicine
Bresnick 16
Jennifer Bresnick (editor-in-chief of HealthITAnalytics). “Big Data Interoperability a Must for Precision Medicine Progress.”
HealthITAnalytics. November 16th, 2016.
https://healthitanalytics.com/news/big-data-interoperability-a-must-for-precision-medicine-progress
The United States healthcare system still has a long way to go before it develops the
seamless health data interoperability required to take on the most pressing
precision medicine challenges, including finding cures for cancer. According to a November report from the President’s Cancer Panel, health IT could be an extremely
effective weapon in the fight against cancer and other serious diseases, but a lack
of big data interoperability,
patient access to data, and a fragmented research environment may be holding back
stakeholders from making key breakthroughs. “We live at a most exciting and critical time of
technological advances with potential to help individuals manage and improve their own health and support high-quality,
patient-centered cancer care,” wrote Barbara K. Rimer, DrPH, Hill Harper, JD, and Owen N. Witte, MD. “But today, many patients
cannot access or share their own health information; care teams experience electronic health record fatigue and frustration due to
lack of interoperability, among other challenges; and researchers do not have a central location to compile, analyze, or even access
critical data.” Data siloes often prevent providers, patients, public health agencies, and
researchers from collaborating to generate actionable insights from disparate sources of information, the Panel
added, and many members of the care continuum lack the tools and resources they need to leverage big data for measurable
improvements. But the Panel believes that these obstacles can and will be overcome as the healthcare system fine-tunes its
approaches to creating, storing, sharing, and analyzing big data. “Connected health technologies have the potential to maximize the
value of our nation’s investments in cancer by supporting empowered individuals and patients. The report concludes that connected
health is truly about people more than technologies, and that timely and equitable access to data is imperative to improve health
outcomes. In addition, a culture of collaboration is essential to accelerate progress.” The
report outlines a series of objectives and action items that may help to bring providers, patients, and researchers into alignment
around their use of big data and their ability to reach common precision medicine goals. Interoperability among institutions and
individuals is the first step – not just between clinicians and researchers investigating new therapies, but between providers and
patients from prevention to survivorship. “The potential benefits of interoperable connected health tools and systems are
particularly great for oncology because the delivery of care across the cancer continuum depends on access to accurate and complete
information, as well as extensive coordination among patients, caregivers, and diverse teams of providers,” the report states. Health
IT stakeholders should continue to focus on improving the nation’s health information exchange capabilities, overcoming technical
and policy barriers to seamless data exchange, and developing standards-based tools and APIs to create opportunities for precision
medicine apps and systems that support cancer care. Patients must also be actively involved in this process, which means that big
data must flow to and from the healthcare consumer – and consumers must have the mechanisms required to check for errors in
aggregated medical records and report concerns to their providers. “Connected health tools are needed to ensure that people at risk
for cancer, cancer patients, and cancer survivors have access to the information they need when they need it and in formats that
meet their needs,” the Panel says. Patient engagement and shared decision-making are key features of any cancer treatment plan,
and health IT organizations should pay close attention to developing intuitive, engaging, and useful tools to help patients participate
in their own care. The Panel also highlights the need to ensure that patients and healthcare organizations have regular, reliable
access to broadband internet in order to facilitate engagement and data sharing. “The full benefits of connected health cannot be
achieved unless everyone in the United States who wants to participate and the organizations that support health and deliver
healthcare have adequate access to high-speed Internet service.” “For individuals, access to online tools, such as patient portals, is
necessary to receive information from and communicate with healthcare providers. For healthcare providers and systems, robust
broadband access is needed to facilitate collection and sharing of increasing quantities of health-related data.” The report supports
initiatives that expand broadband access into underserved areas. Stronger broadband infrastructure and increased access to data
sharing tools are important for facilitating surveillance and public health research that can help to identify trends in disease
development, treatment, and outcomes. Breaking
down data siloes is also critical for fostering
large-scale surveillance and population health management, the report
stresses. As electronic health records allow providers to collect more and more clinical information, researchers must have
access to big data that can be mined for hidden insights. “In the past, health data remained wherever they were collected and
generally were used in limited ways to serve the specific needs of whoever collected them,” the Panel says. “These silos represent a
significant missed opportunity. Connected
health technologies have an important role to play by
facilitating linkages of systems and data sets and creating tools that enable
researchers, clinicians, and patients to use data in meaningful ways.” Big data analytics and
connected health technologies should eventually help the nation develop more fluid, centralized, and actionable data stores that
allow researchers and providers to improve their approaches to precision medicine challenges. “To achieve the development of a
national infrastructure to support sharing and processing of cancer data, technical and logistical challenges to data integration must
be overcome, and the cancer community must foster a culture of collaboration that encourages data sharing and free exchange of
ideas,” the Panel concludes. “The Panel urges all stakeholders – health IT developers, healthcare organizations, healthcare providers,
researchers, government agencies, and individuals – to collaborate in using connected health to reduce the burden of cancer through
prevention and improve the experience of cancer care for patients and providers.”
Single-payer generates the breadth and depth of data required for
personalized treatments
Stuss et al 15
Donald T. Stuss, PhD (Ontario Brain Institute), Shiva Amiri, PhD (Ontario Brain Institute), Martin Rossor, MD (University College
London), Richard Johnson, M.S., J.D. (C.E.O. of Global Helix LLC) and Zaven Khachaturian, PhD (Alzheimer’s Association). “How
we can work together on research and health big data: Strategies to ensure value and success.” Edited by Geoff Anderson and Jillian
Oderkirk, Dementia Research and Care: Can Big Data Help?, OECD Publishing. February 3rd, 2015. Google Books.
5.4 New value proposition: Moving from silos to systems The value lies in the creation of a system, integrating all partners from the
very beginning of research to catalyse, facilitate, and maximise scientific, health care and policy, and commercialization efforts. The
standardization of assessments and the sharing of data have many positive outcomes. Assessment
standardization in all of the clinical centres involved in the research means consistency of clinical evaluation at
the research level across regional and national boundaries and an increase in the
number of individuals involved in research activities. The sample size increase has
obvious benefits for research power, and the study of mechanisms of disorders across diseases. With a greater number of
individuals involved, and careful standardized characterization, the potential exists for good data and high quality
clinical trial platforms. There is an increased opportunity to observe the variability
and heterogeneity of disease expression (Georgiades et al., 2013; Stuss and Binns, 2008), and develop
well-characterised sub-groups. A direct and completely linked corollary is improved diagnosis
and treatment. This should be attractive to improve clinical trials and commercialization of neuroscience research, in both
neurotechnology and neurotherapeutics, i.e., the potential benefits of targeted pharmacological and behavioural treatments. In
essence, there is a real opportunity for product development based on a “personalised
medicine” approach. And this will only be enhanced if the full data sets from all past clinical trials are shared (Eichler et al.,
2013). Equally important is the need to link both research “deep data” and an individual’s
and population “broad data” (defined as the data in the health system—often the
greatest breadth of data is in single payer arrangements—of the patient’s medications, usage of
the health system, changes in personal health over time, the existence of co-morbidities, and the associated cost of this usage) about
AD and dementia with the vast amounts of data generated during clinical trials. It is important to take advantage of the new policies
adopted by many biopharmaceutical companies, social philanthropists, and government funders to increasingly share clinical data.
This provides a unique opportunity for health policy and health service delivery research. The OECD should identify and catalogue
these new polices and trends across different regulatory jurisdictions. For example, the US National Academies recently released a
new report proposing guiding principles for responsible sharing of clinical trial data (National Research Council, 2014).
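The linkage Stuss et al. describe, joining research “deep data” with a single payer’s “broad data” on a shared patient identifier, can be illustrated as a simple record join. The sketch below is hypothetical: all field names and values are invented for illustration and do not come from the card.

```python
# Hypothetical sketch: linking research "deep data" (trial phenotypes)
# with single-payer "broad data" (claims: medications, visits, costs)
# on a shared patient identifier. All records are invented examples.

deep_data = {  # from a clinical trial
    "p001": {"phenotype": "early-onset", "biomarker": 0.82},
    "p002": {"phenotype": "late-onset", "biomarker": 0.41},
}

broad_data = {  # from the single payer's administrative records
    "p001": {"medications": ["donepezil"], "visits": 12, "cost": 9400},
    "p003": {"medications": [], "visits": 2, "cost": 300},
}

def link_records(deep, broad):
    """Return one merged record per patient present in both sources."""
    linked = {}
    for pid in deep.keys() & broad.keys():  # shared patient IDs only
        linked[pid] = {**deep[pid], **broad[pid]}
    return linked

merged = link_records(deep_data, broad_data)
print(merged)  # only p001 appears in both data sets
```

The point of the sketch is the card’s claim about breadth: the more of the population sits under one payer, the more patient IDs from any research cohort will find a match in the administrative side of the join.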
Precision medicine solves antibiotic resistance
Bresnick 17
Jennifer Bresnick (editor-in-chief of HealthITAnalytics). “Precision Medicine Test for Bacteria May Aid Antibiotic Stewardship.”
HealthITAnalytics. June 30th, 2017.
https://healthitanalytics.com/news/precision-medicine-test-for-bacteria-may-aid-antibiotic-stewardship
Antibiotic stewardship – or the lack thereof – has become a major topic of concern among patient safety advocates as the number of
deaths due to antibiotic-resistant infections each year consistently tops 20,000. While
standardized protocols for escalating patients through a series of different antibiotics can help to ensure that providers don’t overuse
the most powerful weapons in the healthcare industry’s arsenal to begin with, a new approach to clinical decision support may help
to target the right antibiotic to the right bacteria more frequently. By applying precision medicine
techniques to bacteria, not just to the patients who host them, Boston University
researcher Ahmad (Mo) Khalil hopes to enhance providers’ abilities to use the most
effective antibiotic the first time – reducing the chance for superbugs to
develop further resistance to the limited number of treatments available for infectious
diseases. “With rising rates of drug-resistant infections, there is pressing need for new diagnostic methods that can rapidly
determine the most effective therapy for an infection,” says Khalil in an abstract posted on the National Institutes of Health’s online
research reporting portal. In 2016, Khalil received an NIH Director’s New Innovator Award for his work, which was also recently
showcased in an NIH blog post penned by Director Dr. Francis Collins. “Unfortunately, the current method for performing antibiotic
susceptibility testing (AST) involves growing microorganisms from clinical samples and determining their sensitivity to antibiotics
through cell growth,” Khalil continued. “This ‘gold standard’ technique is extremely time-consuming (minimum 48-72 hours) and
can result in significant delays in appropriate therapy, prolonged illness, greater risk of death, inappropriate antibiotic use, and
increased spread of resistance.” For patients with some infections, he added, AST is not even performed. Instead, providers start off
with a common antibiotic and only shift gears if the treatment is unsuccessful, which takes additional time that may contribute to
the risk of mortality. More than three quarters of hospitals may be routinely overusing antibiotic treatments, the CDC said in 2014.
In 2016, the Pew Charitable Trusts pointed out that up to half of all antibiotics may be prescribed inappropriately to patients, many
of whom are actually suffering from viruses, not bacterial infections. The CDC has a slightly more conservative estimate, stating that
about 30 percent of the 154 million routine antibiotic prescriptions written each year in the US may be unnecessary. Instead of
leaving providers to rely on guess work and familiar habits, Khalil is planning to develop a new diagnostic tool based on sequencing
the transcriptomes of a variety of bacteria. “He then uses that information to produce a panel of RNA sensors specific to each
particular bacterial strain, and freeze-dries those sensors onto strips of testing paper, creating what he thinks will be a highly specific
diagnostic test with a very long shelf life,” Collins explains. Providers could obtain a sample from a patient and expose it to a certain
antibiotic, wait 20 minutes, then add the sample’s cells to a test strip. “That liquid would serve to reconstitute the freeze-dried RNA
sensor reactions embedded on the paper, and those sensors would light up if the sample contains a bacterium that is a good
candidate for the antibiotic,” Collins writes. “Clearly,
to select the best antibiotic as quickly as
possible, doctors may want to prepare several aliquots of a patient sample, expose each
of them to a different antibiotic, and run several of the ‘freeze-dried’ RNA sensor tests in
parallel.” The technique has the potential to limit the dangerous practice of exposing
bacteria to a firehose of common antibiotics, which allows the bugs to mutate
and develop resistance. “If the source of the infection proves to be one of those common bacteria that is highly sensitive
[to common antibiotics], there’s a good chance the antibiotic will work,” Collins says. “But if not, the bacterium may survive and
grow more resistant to antibiotics. Not only might this make the patient’s own infection harder to treat, it could threaten the health
of others by adding to our growing public health problem of antibiotic-resistant infections.” Khalil plans to begin his work by
targeting bacteria of major concern to the CDC, such as N. gonorrhoeae. Currently, there are no existing clinical AST techniques to
identify this set of bacteria, he pointed out, which makes it even more important to define the bacteria’s RNA susceptibility
signatures for the first time. While the approach is still early in its development, the research
illustrates how precision medicine concepts are opening new doors for treatment
and patient care in a variety of different healthcare arenas. Allowing providers to access
speedy and accurate clinical decision support tools at the point of care may
change the game for antibiotic stewardship advocates, reducing the worrisome
rise of superbugs and improving the quality of care for patients.
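The workflow Collins describes, splitting a sample into aliquots, exposing each to a different antibiotic, and reading the RNA sensor strips in parallel, reduces to a simple selection over the test results. A minimal sketch, with invented antibiotic names and sensor readings (not data from the card):

```python
# Hypothetical sketch of the parallel AST workflow described above:
# each aliquot is exposed to one antibiotic, and the RNA sensor strip
# "lights up" if the sample's bacterium is a good candidate for that drug.
# The readings below are invented for illustration.

sensor_readings = {
    "amoxicillin": False,   # sensor dark: likely resistant
    "ciprofloxacin": True,  # sensor lit: likely susceptible
    "azithromycin": True,
}

def pick_candidates(readings):
    """Return antibiotics whose sensors lit up: plausible first-line choices."""
    return sorted(drug for drug, lit in readings.items() if lit)

print(pick_candidates(sensor_readings))  # ['azithromycin', 'ciprofloxacin']
```

Because every aliquot is tested at once, the provider gets the whole candidate list after a single 20-minute incubation rather than cycling through antibiotics one failure at a time, which is the escalation pattern the card blames for breeding resistance.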
Single payer independently solves ABR
Mike 6 (July 29, 2006, “Antibiotic Resistance and National Healthcare”, Mike is a
long-time writer and Scientist for ScienceBlogs, He insists upon anonymity and explains
why here http://scienceblogs.com/mikethemadbiologist/2011/09/01/program-announcement-im-moving/, below is a story that he quotes from Guernsey news, accessed 9/14/17, jmg)
While the main reason to use antibiotics only when needed is to preserve their
effectiveness, it’s always nice to have an economic incentive coupled with proper use of
these important drugs. From the Guernsey Press and Star: The States prescribing
support unit is claiming success in a campaign to encourage islanders to think more
carefully about their need for the drugs. Prescriptions fell by 983 courses – a reduction
of 3.3% on the previous year – between October 2005 and March 2006, reducing costs
to the States by £30,000. ‘That is a significant reduction in what is the peak season for
antibiotics use,’ said prescribing adviser Geraldine O’Riordan. Social Security minister
Mary Lowe said: ‘The saving to the Guernsey health service fund is very welcome. ‘But
the professionals will tell you that there are also important health reasons for using
antibiotics only when they are really needed. ‘It is vitally important that we continue to
reinforce this very important message. I hope that all will think more carefully about our
use of antibiotics to treat minor illnesses.’ Director of public health Dr David Jeffs, who
also chairs the island’s infection-control committee, said that resistance to antibiotics
was an increasing problem and poor prescription and administration were major
contributing factors. ‘Antibiotics certainly aren’t indicated for uncomplicated viral
infections, while many milder bacterial infections will resolve in a couple of days
without more active treatment. ‘It makes little sense getting a script for a five-day course
of antibiotics for something which would have got better in two, especially if this is
going to contribute to the growing problem of antibiotic resistance.’ The aim of the
winter campaign was to support local doctors in changing patient expectations about
antibiotic treatment. One advantage of a national healthcare system that is not built
around the profit motive is that it is able to anticipate and address emerging problems.
Because all the costs in a national system are internalized, whereas in a for-profit system
the costs are often externalized, the economic burden of antibiotic resistance can not be
pawned off onto someone else. Of course, it’s not just about the money: 14,000 people
per year die from hospital-acquired antibiotic resistant bacterial infections–and this is
probably a massive underestimate due to reporting ‘errors.’ Antibiotic resistance: one
more reason why the U.S. needs a national healthcare system.
ABR causes extinction
Srivatsa 17
Kadiyali M. Srivatsa (doctor, inventor, and publisher. He worked in acute and intensive pediatric care in British hospitals).
“Superbug Pandemics and How to Prevent Them.” The American Interest. January 12th, 2017.
https://www.the-american-interest.com/2017/01/12/superbug-pandemics-and-how-to-prevent-them/
It is by now no secret that the human species is locked in a race of its own making with “superbugs.” Indeed, if popular science
fiction is a measure of awareness, the theme has pervaded English-language literature from Michael Crichton’s 1969 Andromeda
Strain all the way to Emily St. John Mandel’s 2014 Station Eleven and beyond. By a combination of massive inadvertence and what
can only be called stupidity, we must now invent new and effective antibiotics faster than deadly bacteria evolve—and regrettably,
they are rapidly doing so with our help. I do not exclude the possibility that bad actors might deliberately engineer deadly
superbugs.1 But even if that does not happen, humanity faces an existential threat largely of its own making in
the absence of malign intentions. As threats go, this one is entirely predictable. The concept of a “black swan,” Nassim Nicholas
Taleb’s term for low-probability but high-impact events, has become widely known in recent years. Taleb did not invent the concept;
he only gave it a catchy name to help mainly business executives who know little of statistics or probability. Many have embraced the
“black swan” label the way children embrace holiday gifts, which are often baubles of little value, except to them. But the threat of
inadvertent pandemics is not a “black swan” because its probability is not low. If one likes catchy labels, it better fits the term “gray
rhino,” which, explains Michele Wucker, is a high-probability, high-impact event that people manage to ignore anyway for a raft of
social-psychological reasons.2 A pandemic is a quintessential gray rhino, for it is no longer a matter of if but of when it will challenge
us—and of how prepared we are to deal with it when it happens. We have certainly been warned. The curse we have created was
understood as a possibility from the very outset, when seventy years ago Sir Alexander Fleming, the discoverer of penicillin,
predicted antibiotic resistance. When interviewed for a 2015 article, “The Most Predictable Disaster in the History of the Human
Race, ” Bill Gates pointed out that one of the costliest disasters of the 20th century, worse even than World War I, was the Spanish
Flu pandemic of 1918-19. As the author of the article, Ezra Klein, put it: “No one can say we weren’t warned. And warned. And
warned. A pandemic disease is the most predictable catastrophe in the history of the human race, if only because it has happened to
the human race so many, many times before.”3 Even with effective new medicines, if we can devise them,
we must contain
outbreaks of bacterial disease fast, lest they get out of control. In other words, we have a
social-organizational challenge before us as well as a strictly medical one. That means getting sufficient amounts of medicine into the
right hands and in the right places, but it also means educating people and enabling them to communicate with each other to prevent
any outbreak from spreading widely. Responsible governments and cooperative organizations have options in that regard, but even
individuals can contribute something. To that end, as a medical doctor I have created a computer app that promises to be useful in
that regard—of which more in a moment. But first let us review the situation, for while it has become well known to many people,
there is a general resistance to acknowledging the severity and imminence of the danger. What Are the Problems? Bacteria are
among the oldest living things on the planet. They are masters of survival and can be found everywhere. Billions of them live on and
in every one of us, many of them helping our bodies to run smoothly and stay healthy. Most bacteria that are not helpful to us are at
least harmless, but some are not. They invade our cells, spread quickly, and cause havoc that we refer to generically as disease.
Millions of people used to die every year as a result of bacterial infections, until we developed antibiotics. These wonder drugs
revolutionized medicine, but one can have too much of a good thing. Doctors
have used antibiotics recklessly,
prescribing them for just about everything, and in the process helped to create strains of
bacteria that are resistant to the medicines we have. We even give antibiotics to cattle that are not sick and
use them to fatten chickens. Companies large and small still mindlessly market antimicrobial products for hands and home, claiming
that they kill bacteria and viruses. They do more harm than good because the low concentrations of antimicrobials that these
products contain tend to kill friendly bacteria (not viruses at all), and so clear the way for the mass multiplication of surviving
unfriendly bacteria. Perhaps even worse, hospitals have deployed antimicrobial products on an industrial scale for a long time now,
the result being a sharp rise in iatrogenic bacterial illnesses. Overuse
of antibiotics and commercial
products containing them has helped superbugs to evolve. We now increasingly face microorganisms
that cannot be killed by antibiotics, antifungals, antivirals, or any other chemical weapon we throw at them. Pandemics are the major
risk we run as a result, but it is not the only one. Overuse of antibiotics by doctors, homemakers, and hospital managers could mean
that, in the not-too-distant future, something
as simple as a minor cut could again become
life-threatening if it becomes infected. Few non-medical professionals are aware that antibiotics are
the foundation on which nearly all of modern medicine rests. Cancer therapy, organ transplants,
surgeries minor and major, and even childbirth all rely on antibiotics to prevent infections. If infections become
untreatable we stand to lose most of the medical advances we have made over the past
fifty years.