Response
By Brad Heath, Blake Morrison and Linda Mathews / USA TODAY
Feb. 6, 2009
Questions for USA Today on “The Smokestack Effect: Toxic Air and America’s Schools”
and associated Background Information
Submitted by the National Association of Clean Air Agencies
Members of the National Association of Clean Air Agencies (NACAA) have submitted
the following questions related to USA Today’s article entitled, “The Smokestack Effect: Toxic
Air and America’s Schools” and the associated background information. Please provide
responses to Mary Sullivan Douglas of NACAA at mdouglas@4cleanair.org.
QUESTION 1
In USA Today’s FAQ, the response to the question “How accurate are the school
rankings?” is as follows:
The EPA model produces estimates of how each school ranks nationally based on
its exposure to industrial air pollution, and the hazards those chemicals can pose.
But those rankings are estimates. They are based on emission reports companies
filed each year with the EPA. The model also makes other assumptions about how
those chemicals will travel through the air. For more information about how the
model works and its limitations, see a description of our methodology.
In some places, the model likely overstates the levels of toxic chemicals in the air.
For example, the model ranks Ashland City Elementary School in Ashland City,
Tenn., as having the worst air in the country. That estimate is based on emissions
reports by a hot water heater manufacturer in the city. A spokesman says most of
the manganese is trapped in tiny shards of steel that are swept up from the shop
floor and discarded, but is not emitted into the air, as its report said. USA
TODAY monitored the air near Ashland twice, and both times found manganese
levels significantly lower than what the model predicted (emphasis added). That
said, environmental experts such as Johns Hopkins University scientist Patrick
Breysse stress that monitoring for longer periods is necessary before residents can
know for certain what's in the air.
The reporters admit that the report is in error regarding the school having the worst air in
the country. This same error also affected the results for other schools in the area. However, the
website continues to report this school as having the worst air in the country. Do they intend to
correct known errors on the website or continue to report results known to be inaccurate and
misleading?
Rob Raney (Nashville, TN)
We believe we've been very careful to report what we know and what we don't know, and to
avoid drawing inferences our evidence does not support. Our reporting, both in print and online,
accurately reflects what we know about Ashland City. We offered this summary in a story
published Dec. 9:
Other locations appeared less troubling. In Ashland City, Tenn., for instance, the EPA
computer model indicated the air at Ashland City Elementary School was rife with
manganese. The model ranked it among the very worst schools in the nation for industrial
pollutants.
USA TODAY monitored the air twice near the school. Although the snapshot samples
aren't definitive, both tests found levels of manganese thousands of times lower than what
the model estimated would be in the air there.
Why the vast discrepancy? The EPA model relied on reports submitted by A.O. Smith, a
water heater manufacturer in Ashland City. The company reported to the EPA that it
released 33,788 pounds of manganese into the air in 2005.
A spokesman for the company, Mark Petrarca, says its emissions reports are accurate but
that its manganese is trapped in flakes that usually fall to the shop floor and are moved
off the site. Only "trace amounts," he says, would be emitted from the plant.
That's consistent with what USA TODAY found and underscores the need to monitor
before concluding that the air outside any school is dangerous.
Like you, we think our monitoring results offer a strong indication that manganese levels around
Ashland City Elementary School are not as high now as the model suggests they were in 2005.
But we believe it is important to interpret our conclusions narrowly: as compelling as they
seem, they cannot entirely foreclose the possibility that levels were higher there in 2005. As you
know, even in Ashland City, we collected samples for only two weeks. Additionally, the
emission reports used in the RSEI model predate our monitoring by three years. We believe our
report accurately conveys what we know: That the EPA model shows high levels of manganese
in the air. That the facility responsible for those emissions says its reports are accurate, but that
they overstate the impact. And that our own monitoring did not find significant levels of the
chemical during our sample periods last year.
In two cases, companies have confirmed to us that they have either contested or revised the
emissions reports they filed with the EPA under the Emergency Planning and Community
Right-to-Know Act (EPCRA). In both instances, we updated our database to indicate that the
facilities now claim their emissions were overstated. But that is not what happened in Ashland
City. There, not only has A.O. Smith not disputed its EPCRA report, it has insisted repeatedly
that it was accurate.
Even had A.O. Smith contested the data, our ability to update the database is limited. For
example, the calculations about which chemicals are likely to be in the air in a given place and
their levels were made by the EPA when it produced the RSEI model. We are unable to
recompute them. More to the point, had we arbitrarily determined that Ashland City Elementary
School should not have the ranking the EPA model gives it, what ranking should we have
ascribed?
That said, we believe the data presented on our website are precisely what we represent them to
be: The conclusions of the EPA's computer model, subject to all its inherent strengths and
weaknesses.
QUESTION 2
For the information they took from the Toxics Release Inventory (TRI) reports, did they
only include the chemicals/amounts “released to air” or did they include the total TRI reportable
number for each facility? (I have read the methodology and cannot confirm this).
Dona Bergman (Evansville, IN)
The air emissions component of RSEI looks primarily at chemicals released into the air. It
further differentiates between stack and fugitive releases. The model's technical documentation,
available online, offers additional details. That said, technical questions about RSEI -- its data
inputs, its dispersion modeling, etc. -- are best addressed by the EPA's Office of Pollution
Prevention and Toxics, which maintains the model. The office may be contacted at (202) 564-8790.
QUESTION 3
Are there actual tests capable of detecting the levels of each of the chemicals mentioned in the
report as impacting schools (those for each school listed) in the ambient air (so that if parents ask
for testing to be performed…is it actually possible to test for the chemicals they have reported)?
Dona Bergman (Evansville, IN)
A variety of sampling techniques are available to check for the chemicals identified in the EPA's
model (and in our database). Some are more complicated than others. We used relatively simple
passive monitors to measure the levels of volatile organic compounds (VOCs) in the air near 95
schools. In addition, we used somewhat more sophisticated pumps and filters to measure levels
of metals and polycyclic aromatic hydrocarbons (PAHs). (For more details about our monitoring,
see the answer to Question 9, below.) Other methods certainly are available. For example, state
and federal regulatory agencies have developed a network of sophisticated monitoring sites
capable of detecting a wide variety of hazardous air pollutants. For simpler sampling, we
provided a link from our website to Global Community Monitor, an organization that has helped
residents conduct basic air monitoring in communities across the country.
QUESTION 4
For the monitors that were in place for several days/hours, did they review and/or see any
differences in ambient air quality at different times of the day? Can they provide hour-by-hour
results or only total levels? (This attempts to get at the root of whether the levels are higher when
kids are being dropped off/picked up).
Dona Bergman (Evansville, IN)
Our monitors operated 24 hours a day, but were capable only of estimating total levels for the
monitoring period, which ranged from 72 hours to seven days, depending on the test. We were
not able to differentiate the results by date or time.
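
To illustrate what a time-integrated result represents, here is a minimal sketch in Python. The mass, flow rate and duration below are hypothetical placeholders, not actual study values, and the arithmetic shown applies to a pumped sampler (the passive VOC badges are analyzed differently):

    # Sketch: a time-integrated sampler yields one average concentration
    # for the whole monitoring period; hour-by-hour variation is averaged out.
    # All numbers below are hypothetical, not actual study data.
    mass_ng = 540.0   # total analyte mass collected on the filter (ng)
    flow_lpm = 4.0    # pump flow rate (liters per minute)
    hours = 96        # monitoring period (ours ranged from 72 hours to 7 days)

    air_volume_m3 = flow_lpm * 60 * hours / 1000.0  # liters -> cubic meters
    avg_conc_ng_m3 = mass_ng / air_volume_m3
    print(f"{avg_conc_ng_m3:.1f} ng/m3, averaged over {hours} hours")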
QUESTION 5
I view the results of the USA Today research with a lot of skepticism because of the
distance of the sources from the schools. In one example I looked at - Cony HS in Augusta,
Maine - the sources that were identified were a plant in Lewiston (1 hour drive away to the
southwest) and two paper mills (again at least an hour’s drive away to the north and northwest).
The results of the model may have used school locations as the receptor points for the model –
but in actuality the whole community is being exposed to similar levels of ambient air pollution
from those emission sources. The study seems to use the school receptor points as a way of
sensationalizing the issue and bringing people’s attention to the fact that air pollution for all of
us is a significant health concern.
I would agree that results of air pollution nearby schools which are located in close
proximity to a source could be a very significant source of concern for all kinds of HAPs from
solvents and other chemicals used in industrial processes, but think that in many of the cases the
broader concern is that we’re all breathing tainted air in these communities. In addition, I believe
there are also a lot of emissions sources right nearby the schoolyard that wouldn’t have been
included in the model results that can have a big impact on local air quality and health.
My first question is: What tools from the model or other evaluations can be used to more
specifically identify cases where there are more immediate and direct impacts on a schoolyard's
air quality, distinguishing them from cases where the situation simply reflects the
transported background air people are breathing in a community?
Deb Avalone-King (Maine)
We did not focus on schools to be sensational. Rather, we did so because children are widely
regarded as being far more susceptible than the rest of us to harm from air pollution.
You are certainly correct that the model does indicate some schools feel the effects of emissions
from industrial facilities many miles away. In most of the cases we've seen, the effects from distant
facilities are comparatively small. Those determinations are the product of the EPA's calculations
when it produced the RSEI model. Technical questions about RSEI -- such as those relating to
dispersion modeling -- are best addressed by the EPA's Office of Pollution Prevention and
Toxics, which maintains the model. The office may be contacted at (202) 564-8790.
In the case of Cony High School in Augusta, Maine, RSEI shows very low levels of overall
toxicity. That school ranks in the 73rd percentile. In other words, while RSEI may show an
impact from facilities that are a significant distance away, that impact remains slight. The
presence of other emissions sources closer to the school could be detected through monitoring.
We did not conduct monitoring at Cony.
We agree that in many locations, schools can be affected by emissions that do not appear in the
RSEI model. Those emissions could come from mobile sources, or from smaller industrial
facilities not required to report under EPCRA because their emissions do not exceed regulatory
thresholds. In our own monitoring, we found significant levels of chemicals such as benzene and
carbon tetrachloride near several schools, even though they did not appear in the RSEI model
output for those locations. Our monitors were not capable of differentiating the specific sources
of those emissions.
QUESTION 6
Where spot monitoring was done in school yards, were local emissions from diesel buses
and other vehicles as well as other air pollution sources identified as contributors to the pollutant
concentrations? Was any attempt made to partition out the contributions from such local sources
that were not included in the modeling emission factors when comparing the data results?
Deb Avalone-King (Maine)
The monitors were generally placed within 100 yards of each school, but, with one exception,
not on school grounds. We were not able to identify the sources of the chemicals identified by
those monitors, nor were we able to differentiate point sources from mobile or other area
sources.
QUESTION 7
Can you explain why the impacts of some of the facilities are so far-reaching (up to 50
miles or so)? We do not usually see these kinds of impacts in our modeling and risk assessment.
Olga Boyko (NJ)
Please see the answer to Question 5, above.
QUESTION 8
Carcinogens seem to be way down on the list of air toxics affecting New Jersey schools.
In our facility-specific assessments, and even in NATA, the carcinogens are almost always the
primary risk drivers. I have looked at EPA's "Risk-Screening Environmental Indicators
Methodology" (dated October 2007), section 4.1.3, "Algorithm for Calculating Toxicity
Weight." Although it shows inhalation adjustment constants of 1.8/RfC and IUR/0.00014, it does
not specify how these were derived. It also states that the "toxicity scoring system implies that
exposure at the level of the RfD is equivalent to a 2.5 x 10^-4 cancer risk" (page 27), but does not
address the comparison of the inhalation RfC to a specific cancer risk. Presuming that the
inhalation risk is also equivalent to a 2.5 x 10^-4 cancer risk, this is drastically different from the
target of a one-in-a-million cancer risk used for risk assessment by New Jersey and many other
states. It would be helpful if we could find out whether this difference, which is built into RSEI, is the
reason for the relatively insignificant impact of carcinogens in this USA Today study. It would
help us explain to our management and the public why the chemicals in the USA Today study
are different from the ones we are usually concerned with in our permitting and risk assessments.
Olga Boyko (NJ)
We see some distinctions that could explain the difference between what you've seen in the
National Air Toxics Assessment and what the RSEI model suggests. Most obvious is that NATA
also includes estimates of emissions from a wide array of area sources, whereas RSEI covers
only reported emissions from large industrial facilities. However, technical questions about how
the EPA derived its unit risk values and toxicity weights are best addressed to the EPA's Office
of Pollution Prevention and Toxics, which maintains the model. The office may be contacted at
(202) 564-8790.
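
One observation we can offer, as our own back-of-the-envelope reading rather than an official derivation: the two constants quoted from section 4.1.3 appear mutually consistent with the 2.5 x 10^-4 equivalence noted on page 27.

    # Our reading of the constants quoted above; not an EPA derivation.
    # Noncancer weight per unit concentration is 1.8/RfC, so exposure at
    # exactly the RfC earns a toxicity weight of 1.8.
    weight_at_rfc = 1.8
    # Cancer weight per unit concentration is IUR/0.00014, so an exposure
    # carrying lifetime risk r (= IUR * C) earns a weight of r / 0.00014.
    # Equating the two weights implies the risk level treated as
    # equivalent to exposure at the RfC:
    implied_risk = weight_at_rfc * 0.00014
    print(implied_risk)  # 0.000252, i.e. roughly 2.5 x 10^-4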
QUESTION 9
I read through the USA Today website materials on “The Smokestack Effect, Toxic Air
and America’s Schools” and find them lacking in an explanation of actual scientific methods
used. They did not describe in detail the types of samplers used, how the samples were prepared
and handled, the training of their operators, whether there was a quality assurance project plan (QAPP)
in place, or even whether the operators followed an SOP.
As you know, the type of sampler and laboratory methods used can introduce artifacts
and losses that can bias the sample. It would be nice to know at least the name, and model of
each sampling instrument used, the laboratory methods, the filter media, and what sampling
protocols were used. For example, the USA Today methodology states that "Filters that picked
up metals were analyzed by Johns Hopkins," but it does not describe the filter media (e.g.,
Teflon, glass fiber, quartz fiber, or other) or the manufacturer of the filters (Whatman, Gelman,
etc.). What sampler did they use for the filter-based metals? Was it an FRM or FEM for each
method? Was it located in a secure area following EPA guidance for siting criteria? What siting
scale (micro, middle, neighborhood, urban) was used at each site to locate the samplers?
There are many more questions that could be answered if they could provide the QAPP
or an SOP for each sampling and laboratory method. If this study is to be considered scientific,
perhaps they should consider publishing it in a peer reviewed journal.
Patrick R. McGraw (Colorado)
The monitoring program was designed by scientists at Johns Hopkins Bloomberg School of
Public Health and the University of Maryland School of Public Health. The field workers who
carried it out, primarily reporters from USA TODAY and local newspapers owned by Gannett
Co., USA TODAY’s parent company, followed a protocol written by the scientists. Linda
Mathews, USA TODAY's senior enterprise editor and the editor in charge of the overall project,
recruited the field workers, managed the distribution and return of the samplers and handled all
questions from the field. She and the two authors of the series, Blake Morrison and Brad Heath,
did about 25 percent of the monitoring themselves -- and nearly all of the more complex metals
and PAH monitoring -- after being trained at Hopkins and Maryland.
We used the RSEI model to identify schools where we monitored. About 70 percent of the
monitored schools were in the top 20 percent of the RSEI model, those where the model
suggested students were exposed to the highest levels of toxic chemicals. On the advice of
Hopkins and Maryland, the other 30 percent of the monitored schools were chosen from the
model’s lower quintiles -- 6 to 8 from each quintile.
We placed VOC monitors near 106 schools, in each case specifying that the monitor should be
shielded from the rain and wind and located within 100 yards of the school grounds. Each was
placed 5 to 7 feet above the ground, in an open area where air would circulate freely. We
provided tin buckets and messenger bags to protect the monitors from the elements and, in the
case of the messenger bags, required that they be hung on porches, under wide eaves or in
carports as long as there were no cars close by during the testing period. Because of the increased
complexity and cost of metals and PAH testing, we collected those samples at a smaller subset of
the 106 schools.
At 11 of the schools where we monitored, the equipment was stolen by passersby or damaged by
the elements -- in two cases, by the remnants of a hurricane. Thus, the lab analysis was run for
monitors from only 95 of the sites.
Our examination was based on findings from three types of monitors:
1. VOC samples were collected using the passive Organic Vapor Monitor (3M Co., St. Paul,
MN). The sampling durations were either 4 days (23 schools) or 7 days (72 schools). The
OVM monitors were placed outside the selected homes at a height of 5 to 7 feet. These
monitors were checked periodically by the field staff to ensure they were not damaged.
After completion of the sampling, the monitors were collected, sealed in a canister, and
shipped overnight in ice packs to Dr. Amir Sapkota's lab at the University of Maryland.
At Dr. Sapkota's lab, the OVM monitors were analyzed using a previously published
method (Payne-Sturges et al., 2004). In brief, the OVM monitors were spiked with 1 ug of
internal standard and the adsorbed VOCs were extracted from the activated charcoal pads
with a mixture of acetone and carbon disulfide (2:1, v/v), using an ultrasonicator. Following
extraction, samples were analyzed using a Shimadzu QP2010 Gas Chromatograph Mass
Spectrometer (GC/MS) that was operated in selective ion monitoring (SIM) mode. The
limit of detection (LOD) was obtained by multiplying the SD of the seven spiked samples
by the Student's t value associated with a 99 percent confidence interval at 6 degrees of
freedom. All samples that were below the LOD were assigned a value that was ½ the
LOD. All reported values were corrected for field blanks. (A worked sketch of this
computation appears after item 3, below.)
2. PAH samples were collected using an active sampling method. The PAH sampler
consisted of PUF/XAD-2/PUF cartridges (SKC, Eighty Four, PA) connected to a personal
sampling pump (SKC AirChek XR5000, SKC, Eighty Four, PA) operated at 4 L/min.
The duration of sampling for PAH samples was generally between 72 and 96 hours.
Flows through the samplers were measured in the field using the BIOS primary calibrator
prior to and after completion of the sampling process. Upon completion of the sampling
process, each PAH sampler was wrapped in aluminum foil to protect it from light and
shipped to the laboratory overnight using ice packs, where it was stored at -20 °C until
analysis. When all the PAH samples were received from the field, they were shipped to a
commercial laboratory for analysis. At the laboratory, the PAH samplers were spiked
with internal standards and Soxhlet-extracted with 6 percent diethyl ether in hexane and
concentrated to 1 mL. The extracts were analyzed using an Agilent 7890A GC equipped
with an Agilent 5975C Mass Spectrometer that was operated in SIM mode.
3. To monitor for metals, 37 mm air sampling cassettes (we get them from Pall Life
Sciences, but they can be bought from a number of vendors) were loaded with 37 mm,
2 µm pore-size Teflon filters (Pall Life Sciences). The filter cassettes were connected via
tubing to AirChek (SKC Inc.) pumps, with the airflow set at 4.5 L/min. The flow
was calibrated at the beginning of the monitoring process and rechecked at the end. The
flow rates and timing were noted on log sheets filled out in the field. The devices were
hung 5 to 7 feet above the ground, in protected places 100 yards or less from the schools.
The pumps were placed inside messenger bags, with the monitors exposed to the air
outside the bags. The pumps ran on electricity from outdoor power outlets. Roughly 10
percent of samples were field blanks. At the end of the monitoring, the monitors were
repacked and returned to the Johns Hopkins Bloomberg School of Public Health for analysis.
Particles were extracted from filters by microwave digestion using a MARS XPress
microwave (CEM Corporation, Matthews, NC) with Optima Grade nitric acid and
hydrofluoric acid (Fisher Scientific, Columbia MD). The digested samples were analyzed
for total metal content via ICP-MS analysis (Agilent 7500ce, Agilent Technologies, Santa
Clara, CA). The resulting concentrations of field blank samples were subtracted from
particle sample concentrations. Standard reference material (SRM) NIST 2709, San Joaquin
soil (National Institute of Standards and Technology, Gaithersburg, MD), was digested and
analyzed using the same method. A correction factor was applied to sample metal
concentrations if SRM recoveries differed by ≥10% from the target recovery. All sample
metals were run against a calibration curve with ≥4 points. Each metal limit of detection
(LOD) was obtained by multiplying 3 × the SD of a 5 ppb standard by the digest
volume and dividing by the nominal volume of air sampled (27 m3). The established
LODs for this analysis, in ng/m3, are: Be 0.026, Al 0.27, V 0.030, Cr 0.076, Mn 0.048,
Fe 0.024, Co 0.066, Ni 0.059, Cu 0.035, Zn 0.064, As 0.035, Se 0.194, Mo 0.053,
Cd 0.047, Sb 0.026, and Pb 0.012.
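
For concreteness, here is a minimal Python sketch of the two LOD rules described in items 1 and 3. The replicate values and digest volume are hypothetical placeholders; only the formulas follow the descriptions above, and the t value computation assumes scipy is available:

    # Hypothetical illustration of the LOD rules in items 1 and 3 above.
    import statistics
    from scipy.stats import t

    # Item 1 (VOCs): SD of seven spiked samples times the one-sided
    # Student's t value for 99 percent confidence at 6 degrees of freedom.
    spiked = [0.92, 1.05, 0.98, 1.10, 0.95, 1.02, 0.99]     # hypothetical
    voc_lod = statistics.stdev(spiked) * t.ppf(0.99, df=6)  # t(0.99, 6) ~ 3.14

    def censor(value, lod):
        # Per the protocol, results below the LOD are assigned 1/2 the LOD.
        return value if value >= lod else lod / 2.0

    # Item 3 (metals): 3 x SD of replicate readings of a 5 ppb standard,
    # scaled by the digest volume and divided by the nominal air volume.
    std_5ppb_ng_ml = [4.9, 5.1, 5.0, 4.8, 5.2, 5.05, 4.95]  # hypothetical
    digest_volume_ml = 10.0                                  # hypothetical
    metal_lod_ng_m3 = (3 * statistics.stdev(std_5ppb_ng_ml)
                       * digest_volume_ml / 27.0)            # 27 m3 of air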
QUESTION 10
My only question is whether we can have RSEI model input files. We cannot reproduce
their work based on the current TRI reports alone. As we work with the schools in the top 1% of
the study, we’d like to have that information so we can tell whether we’re identifying and
addressing the correct issues.
Jim Hodina (Linn County, IA)
Unfortunately, we cannot provide you with the raw RSEI data. While we appreciate your interest
and understand the value the raw data could have, two considerations constrain us. First, our
general policy is to not disseminate unpublished information gathered during the course of our
reporting. Second, the University of Massachusetts Amherst researchers who originally obtained
the data from the EPA's contractor did so after a considerable investment of time and expense.
We're therefore not sure it would be appropriate for us to further disseminate their data.
That said, the university's Political Economy Research Institute is under no such constraints, and
may be willing to share the RSEI microdata with you, in whole or in part. You may reach the
Institute at (413) 545-6355. Further, you could certainly obtain a copy of the full model from the
EPA.
Modeling
Concerning the use of the TRI data for the model, was the model using just air emissions or the
on- and off-site totals of each chemical?
See the answer to Question 2, above.
Sampling:
What was the brand, model #, or item # of the badges, pumps and filters used?
What were the detection limits of the sampling methods used?
See the answer to Question 9, above.
What were the sample durations for each sample?
The duration varied slightly by location. Where we monitored for metals and PAHs, the samplers
generally were placed on a Monday morning and collected the following Friday afternoon.
Where we monitored only for VOCs, the samplers were in place for seven consecutive days.
Where were the specific sampling locations?
For several reasons, including the privacy of the people who aided our sampling efforts, we have
decided not to disclose the specific addresses at which we conducted sampling. However, each
sampler was set up on private property, generally within 100 yards of school property. In
accordance with guidance from Johns Hopkins University and the University of Maryland, all
were placed in an outside area where they could be kept dry, generally 5 to 7 feet above the ground.
What were the sampling protocols (sampler height, placement protocols, what to do when it
rains, etc.)?
Is there a sampling SOP (Standard Operating Procedure) and can we get a copy?
Is there a sampling QA (Quality Assurance) Document and can we get a copy?
See the answer to Question 9, above.
Analysis:
What are the lab detection limits for the compounds reported?
Is there a laboratory SOP (Standard Operating Procedure) and can we get a copy?
Is there a laboratory QA (Quality Assurance) Document and can we get a copy?
See the answer to Question 9, above.
Sampling Data and Risk
Do blanks in the data table (that the DEP received from USA TODAY) mean that the compound
was not detected?
Yes.
What were the URFs (Unit Risk Factors) and/or RfCs (Reference Air Concentrations) used with
the sampling results to rank the schools in the two tables shown on the
web? (http://www.usatoday.com/news/nation/environment/school-air-snapshotchart.htm)
The risk factors and reference concentrations were derived primarily from the EPA's prioritized
chronic dose-response guidance. (See http://www.epa.gov/ttn/atw/toxsource/table1.pdf.)
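
As a generic illustration of how such values are applied (this is standard screening arithmetic, not our exact computation; the benzene numbers reflect our reading of EPA's IRIS entries and should be checked against the current tables):

    # Hypothetical screening example combining a measured concentration
    # with a unit risk factor (URF) and a reference concentration (RfC).
    benzene_ug_m3 = 2.0       # hypothetical measured concentration
    benzene_urf = 7.8e-6      # upper-bound unit risk per ug/m3 (IRIS range)
    benzene_rfc_ug_m3 = 30.0  # RfC of 0.03 mg/m3, expressed in ug/m3

    cancer_risk = benzene_ug_m3 * benzene_urf            # excess lifetime risk
    hazard_quotient = benzene_ug_m3 / benzene_rfc_ug_m3  # noncancer ratio
    print(f"risk: {cancer_risk:.1e}, HQ: {hazard_quotient:.2f}")
    # -> risk: 1.6e-05, HQ: 0.07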
What is the list of schools that had less than a 1-in-100,000 risk from the sampling results (in
other words, what other schools were tested that didn't have appreciable risk)?
Unfortunately, we are unable to share the complete list, in keeping with our general practice of
not disseminating information we have not published.
How were non-detects treated in the risk analysis? Was 1/2 the detection limit substituted?
Non-detects were excluded entirely from any published analysis.