E-Reporting Study Draft

Version 1
DRAFT – DO NOT CITE
June 6, 2011
Does Electronic Reporting of Emissions Information Generate Environmental Benefits?
Wayne Gray, Ron Shadbegian, and Ann Wolverton
ABSTRACT
In recent years, regulatory agencies, including the Environmental Protection Agency, have
become increasingly interested in moving from paper to electronic reporting of various types of
information, including emissions. An often-cited reason for moving to electronic reporting is
the potential for administrative cost savings for both regulated entities and government.
Another reason, for which far less empirical evidence exists, is that electronic reporting has the
potential to improve regulatory compliance and environmental quality. If electronic reporting
is perceived by regulated entities as increasing the overall ability of the agency to effectively
monitor and enforce regulations, those entities may put additional effort into improving their
environmental compliance. In addition, the ability to respond more quickly to ongoing noncompliance may increase compliance pressures on already-inspected plants. In this paper, we
study whether and how electronic reporting of water discharge monitoring report (DMR) data
affects compliance behavior of regulated entities. In particular, we examine whether or not the
adoption of an electronic reporting requirement increases the probability that regulated
entities are in compliance and reduces the length of time they spend out of compliance.
In this paper, we test whether or not electronic reporting requirements generate improved
compliance behavior with water discharge regulations by taking advantage of the fact that
some states have adopted electronic reporting requirements, while others have not. In
particular, we use a difference-in-differences approach to examine whether compliance
behavior for facilities in Ohio, before and after it adopted mandatory electronic reporting, is
similar to compliance behavior for facilities in other states that did not adopt such a system.
We conduct the analysis using annual compliance data from 2005 to 2010. We obtain facility-level information on DMR reporting rates and compliance status from EPA’s Permit Compliance System (PCS).
I. Introduction1
Administrative cost and time savings for Federal and state regulators as well as regulated entities are commonly cited benefits of government electronic reporting initiatives (e.g., Office of Management and Budget 2010). Recently, government agencies have raised the possibility
that moving from paper to electronic reporting could also result in improved compliance with
existing regulations. There is little empirical evidence on how electronic reporting may manifest
itself in compliance behavior. This paper explores whether any improvements in compliance
due to electronic reporting are discernible in the context of waste water discharges by
comparing facilities in Ohio, where electronic reporting of discharges has been mandatory since
late 2007, with facilities in states that do not offer electronic reporting as an option for
reporting to state regulators.
The expectation of improved compliance through electronic reporting is based on three arguments: data quality improves because manual errors on paper forms are reduced; data can be cross-checked quickly against permitted discharge levels to identify potential errors or compliance problems; and regulated entities receive timely feedback when anomalies arise so they can rectify the problem. In addition, if the advent of electronic reporting is viewed by regulated entities as
increase in the ability of regulators to effectively monitor and enforce existing regulations, then
the entities may decrease their discharges to reduce the probability that they are out of
compliance (referred to in the literature as specific deterrence). Even facilities already in
compliance or less likely to be audited may reduce their discharges in response to a perceived
increase in monitoring or enforcement capabilities (referred to as general deterrence).
Empirical research on the effect of monitoring and enforcement activity suggests that it has
historically resulted in substantial emissions reductions and therefore improvements in
environmental quality due to specific and general deterrence (see Shimshack 2007 for an
overview of this literature).
This paper is organized as follows. Section II discusses the literature on the benefits of electronic reporting. Section III discusses state programs that allow for the electronic reporting of waste water discharge information. Section IV lays out a methodology for empirically investigating whether and how electronic reporting of discharge monitoring report data affects compliance behavior, along with its challenges and limitations. The data used for this research are discussed in section V. Section VI presents summary statistics. Results are presented in section VII. Finally, section VIII concludes and discusses next steps.
1. The views expressed in this paper are those of the authors and do not necessarily represent those of the US Environmental Protection Agency. This paper has not been subjected to EPA’s review process and therefore does not represent official policy or views.
II. Existing literature on the potential benefits of electronic reporting
When thinking about how a move to electronic reporting could generate improved compliance
with environmental regulations, it is important to isolate the effect of moving to a mandatory
electronic reporting system from the broader array of benefits associated with having a
reporting requirement. For instance, in the case of waste water discharges EPA already
requires regulated entities to submit regular discharge monitoring reports (DMRs) via the
states. Thus, while the literature that addresses the benefits of implementing a reporting
system generally is not applicable, the literature that evaluates the effects of moving from a
paper to an electronic system is relevant. However, the published literature examining the
benefits of moving to an electronic reporting or filing system is remarkably thin. To date, we
have found published literature that analyzes the benefits of moving to electronic filing of tax
returns; the electronic submission and availability of environmental information; the availability
of quarterly financial data via EDGAR; and electronic access to medical patient records and the
recording of clinical patient diaries electronically. In some cases, these studies are unable to distinguish the benefits of electronic reporting itself from the benefits of having the data available in an electronically accessible database. However, since electronic
reporting facilitates quick release of such information to the public, we have included it in this
discussion as a closely related benefit of e-reporting.
Electronic filing of tax returns
According to Internal Revenue Service (IRS) officials, e-reporting of digital data has simplified
the Service’s ability to cross-reference the e-reported data against other data sources, allowing
errors to be caught and corrected more efficiently. The IRS notes that the error rate for
electronically filed returns is less than 1 percent, compared to an error rate for paper returns of
about 20 percent (e.g., Internal Revenue Service 2010). Electronic filing has also expedited the processing of tax payments and refunds. Given these benefits, the IRS has established a goal of having 80 percent of taxpayers e-file (Internal Revenue Service 2008). One paper also studied the empirical implications of electronic filing with regard to the earned income tax credit (EITC), which was substantially under-utilized by qualifying households in the early 2000s. Making use of differences across states with regard to electronic filing programs (i.e., a difference-in-differences empirical approach), the authors found that access to electronic filing had a significant and positive effect on EITC claims (Kopczuk and Pop-Eleches 2007).
Electronic submission and availability of environmental information
Evidence of the benefits of e-reporting in the environmental context is mostly qualitative in
nature, but still instructive. These papers point to increased efficiency in processing and reviewing emissions data, so that information is made available more quickly both to facilities regarding compliance and to the public regarding potential environmental hazards. Whether this
translates into measurable improvements in compliance and, thus, environmental outcomes
has not been well-studied.2
A project funded by the Environmental Council of the States reviewed the quality and efficiency of five technologies used by EPA to facilitate the electronic exchange of information with states, tribes, and local agencies.3 It found that automating the exchange of data has allowed states to
dramatically improve the quality, timeliness, and availability of environmental data
(Environmental Council of States 2006). EPA also has noted that use of electronic reporting in
the specific context of reporting of Toxic Release Inventory emissions data has reduced the
errors relative to what was observed for manual entry and facilitated easier detection of
remaining errors via automated audits instead of relying on more costly methods such as site
visits (U.S. EPA 1997).4 TRI data are also made available to the public more quickly due to
decreased time required to compile, verify, and analyze data (Karkkainen 2001). Finally,
electronic reporting of SO2 allowance and emissions data in the cap-and-trade system for
electric utilities has been credited with improving processing and verification. For instance, the
introduction of an allowance tracking system allowed EPA to reduce processing time for
transactions to reconcile actual emissions with the quantity of allowances held from up to 5
days to 24 hours for the vast majority of transactions. Likewise, the emissions tracking systems
enabled utilities to submit quarterly reports electronically, which allow EPA to perform
automated compliance checks, provide feedback, and resolve any compliance issues quickly
(Perez Henriquez 2004).
2. Bennear and Olmstead (2008) use a difference-in-differences approach to examine the impact of information disclosure to households through annual consumer confidence reports (CCRs) on drinking water violations. While utilities serving the largest populations are required both to mail the CCRs directly to households and to post them online, and medium-sized utilities are only required to mail them, the authors do not attempt to separate the effect of electronic availability from that of direct mailing. Instead they examine the effect of direct mailing relative to the lesser requirements on utilities serving even smaller populations. The authors find that medium and large utilities reduced total violations by 30-44 percent and more severe health violations by 40-57 percent as a result of direct mailing and, at times, electronic posting of data.
3. These are Air Quality System (AQS); Resource Conservation and Recovery Act (RCRA); Safe Drinking Water Information System (SDWIS); Toxics Release Inventory (TRI); and Electronic Discharge Monitoring Report (eDMR).
4. EPA’s Information Streamlining Plan projected dramatic reductions in the costs of reporting, storing, and analyzing conventional forms of information as a result of a planned transition to electronic data interchange and web-based reporting.
Electronic availability of financial data
A number of papers evaluate the effect of making quarterly financial data available to market participants via an online system called EDGAR. Under the prior filing method, an individual interested in the financial health of a company had to request the data from the Securities and Exchange Commission or the firm itself. These studies differ in the degree to which they find a
market response attributable to immediate posting of financial data via the electronic
database, EDGAR. One study compares a firm’s filing via EDGAR to a previous year’s filing via
the traditional paper method. The authors do not find a market response to firm financial data
when it is filed via the traditional method, but detect a discernible market response when the
data are filed via EDGAR. They also find that quarterly financial data are filed more quickly
through EDGAR than was the case with the earlier method (Asthana and Balsam 2001; Griffin
2003). Another study finds, however, that the market response attributed to the posting of
quarterly financial data via EDGAR may actually be attributable to the release of earnings
information by the firm on the same day (Li and Ramesh 2009).
Electronic access to medical data
Evidence on the potential benefits of moving to electronic patient records is suggestive but inconclusive. A recent survey of the literature found that while a number of
benefits have been posited – ranging from improved efficiency, time and cost savings, and
reduced errors to improved patient safety and quality of care – very few of these have been
empirically demonstrated (Uslu and Stausberg 2008). For instance, of twenty studies identified
that examine the impacts of electronic patient record systems, only four examine how they
affect treatment quality. All four find indirect positive effects due to improved communication
and ability to monitor patients, and improvements in data quality. However, the extent to
which electronic patient data results in measurable health gains has not been well established
as few published papers investigate this end point (e.g. Congressional Budget Office 2008). One
case study suggests there may be reasons to be cautious (Miller et al. 2005; Congressional
Budget Office 2008).5 Similarly, while one study found that the use of electronic diaries by clinical-trial participants suffering from chronic pain to record their symptoms substantially improved compliance with protocols when paired with reminders to record information in a timely manner and feedback on compliance, compared to paper diaries (Stone et al. 2003), other medical studies have not replicated similar improvements in compliance from the use of electronic diaries (Green et al. 2006; Blondin et al. 2010).
5. CBO (2008) also cites this explanation as a reason why one study did not find reduced adverse drug events after the introduction of an electronic health system.
III. State electronic reporting of DMR data
While the Clean Water Act delegates authority to the states to regulate surface water discharges, it requires that entities discharging wastewater hold an NPDES permit and monitor and report their discharges to the states. Prior to 1999, all monitoring data were collected via paper forms. Ohio was the first state to implement electronic reporting, in 1999, followed a short time later by Florida and Michigan. Both Ohio and Michigan have since moved to new, improved electronic reporting systems that allow state regulators to provide quick feedback to regulated entities and to transfer DMR data to the U.S. EPA.
Twenty-four states currently have electronic reporting of DMR data (see Table 1), twelve of
which began in 2009 or 2010 and one of which is still in the testing stage (i.e., Maine). Of these,
13 states transfer their DMR data for both major and non-major entities to the U.S. EPA.6 Three
types of electronic reporting systems dominate – 83 percent of states with electronic reporting
as of 2010 use either the e2, eDMR, or NetDMR system. In most cases, states do not require
that DMR data be submitted electronically, but they make it available as an option. Ohio is one exception, though paper submission can still be requested; as of October 2010, only 20 NPDES permit holders out of roughly 3,250 continued to use paper submissions. There also
may be additional states with plans to move to electronic reporting not captured in this list (for
instance, it has been reported that Alaska is considering a move to the e2 electronic reporting
system). However, for the analytic purposes of trying to identify differences in facility
compliance behavior across states with and without electronic reporting, we plan to rely on
available historical data. As a result, very recent adoption of an electronic reporting system (i.e.,
2009 – 2010) will also likely not be captured by this study.
The voluntary movement of a large number of states to electronic reporting of DMR data
suggests the existence of potential net benefits. In particular, cost savings for government in
administering the system and for regulated entities in submitting data have been noted.
Anecdotal evidence also points to the perception of environmental benefits associated with
moving to such a system. For instance, Michigan and Florida both point to improved accuracy
of compliance data through the elimination of potential errors that might otherwise be
introduced through manual data entry and improved overall program effectiveness due to an
enhanced ability to more quickly analyze data, assess compliance, and act on this information.7
6
Major municipal dischargers include all facilities with design flows of greater than one million gallons per day and
facilities with EPA/state approved industrial pretreatment programs. Major industrial facilities are determined
based on specific ratings criteria developed by EPA/state.
7
See posted questions and answers on electronic reporting for wastewater facilities for these two states. For
Michigan, see https://secure1.state.mi.us/e2rs/config/helpdoc/e2rs_faq.pdf, and for Florida, see
http://www.dep.state.fl.us/water/wastewater/wce/edmrqa.htm. Accessed 03/07/2011.
Table 1. States with Electronic Reporting as of 2010

System Attribute                          States
System Type
  e2                                      AL, FL, MI, OH, OK, PA, VA
  e-DMR                                   IL, IN, MS, NC, WV, WY
  NetDMR                                  AR, CT, HI, LA, TN, TX, UT
  Other                                   CA, SC, WA
Year Electronic Reporting Began
  Prior to 2008                           CA, FL, MI, MS, OH, PA, VA, WY
  2009 - 2010                             AL, AR, CT, HI, IL, IN, LA, NC, OK, SC, TN, TX, UT, WA, WV
Transfers all DMRs to U.S. EPA
  Yes                                     AL, AR, CT, HI, IL, LA, MI, OH, TN, TX, UT, WA, WV
  No                                      CA, FL, ME, MS, NC, OK, PA, VA, WY

Source: Email exchange, Office of Enforcement and Compliance Assurance, U.S. EPA.
Ohio requires that monthly and semi-annual DMR reports be submitted electronically via the e2 reporting system, and its program has therefore received relatively more attention than other state e-reporting programs. As of 2005 – 2006, prior to the implementation of its current system, Ohio collected 70 percent of its water discharge information electronically via emailed forms generated through a downloadable software program called SwimWare. However, SwimWare, while it required entering information via computer, bore little resemblance to modern electronic
reporting systems (Ross Associates 2010).8 It was fairly labor intensive as it required a high
degree of technical knowledge to run. It also did not allow for easy updating to correct data
errors or to reflect changes in permit conditions. A common complaint with the 30 percent of
DMR data still submitted via paper reporting was that these data were illegible or incomplete.
After doing substantial outreach and training, Ohio moved to its current electronic system,
which resides online, guides regulated entities through an “easy-to-use data entry wizard,” and
has been paired with an automated compliance tool that sends an email to permit holders
within 24 hours if they are above allowable limits or have a data error, and allows them to
correct data quickly if it was submitted in error. The emailed information also instructs the
regulated entity on what to do if there is an actual permit violation (Enfotech News 2008).
8. An online article also reported that SwimWare required regulated entities to build NPDES permit requirements into the software, which proved challenging for many, and was emailed to the state regulator in a zipped file (Enfotech News 2008).

Table 2 presents information on the phase-out of the SwimWare system and the introduction of the new eDMR system in Ohio by month over a 12-month period between October 2007 and September 2008. Roughly 3,250 permit holders reported wastewater discharges over this time period.9 The Ohio EPA reportedly rolled out the new eDMR system by introducing it to a new EPA district each month beginning in November 2007, of which there are five in total (Ross Associates 2010). While SwimWare was still the dominant form of reporting at the end of 2007, it was used by only 11 percent of regulated entities by March 2008; 70 percent of all reporting facilities were using eDMR by that point. By September 2008, only 2 percent of reporting facilities still used SwimWare and 83 percent were using eDMR. It is also worth noting that, in addition to the replacement of SwimWare with eDMR over this period, Ohio made efforts to reduce paper reporting, which declined from 24 percent of the sample in October 2007 to 15 percent a year later.
Table 2: Percent of Ohio Facilities Submitting by Paper, SwimWare, or eDMR

            Oct.   Nov.   Dec.   Jan.   Feb.   Mar.   April  May    June   July   Aug.   Sept.
            2007   2007   2007   2008   2008   2008   2008   2008   2008   2008   2008   2008
Paper        24%    24%    24%    22%    20%    19%    19%    18%    18%    17%    16%    15%
SwimWare     70%    64%    59%    33%    18%    11%     9%     8%     6%     4%     3%     2%
eDMR          6%    12%    17%    45%    62%    70%    72%    74%    77%    79%    81%    83%

Source: Enfotech News, 2008.
As of October 2010, 99 percent of all NPDES permit holders in Ohio reported their DMR data
electronically. Only 20 permit holders continue to use paper submissions. The case study by
Ross Associates notes a marked increase in the accuracy of Ohio’s compliance data after the
implementation of the current electronic reporting system, with the number of errors falling
from approximately 50,000 per month to 5,000 per month.
IV. Empirical approach
To discern whether or not e-reporting requirements could generate improved compliance with environmental regulations, we take advantage of the fact that some states have already adopted e-reporting requirements, while others have not. Differences across states in e-reporting requirements allow us to examine statistical differences in compliance behavior. In
particular, we examine whether there is a statistical difference between facilities in states with
and without e-reporting for a variety of end-points including DMR submission rates, compliance
rates, the degree of non-compliance, and the amount of time facilities spend out of
compliance.
9. The initial number of entities reporting discharges in Ohio in October 2007 was 3,214. The number of entities reporting peaked at 3,308 in January 2008 and hit its lowest point of 3,146 in September 2008 (though the previous month had 3,237 entities reporting).
We use a ‘difference-in-differences’ estimation method. This empirical technique examines compliance behavior before and after the year electronic reporting was introduced in Ohio as a function of whether a facility is in the treatment state (those in Ohio that are required to e-report) or a control state (those in a state without e-reporting), and any other relevant
factors that vary over time. To isolate the effect of e-reporting on compliance behavior, it is
important to control for other potential reasons for better compliance such as stricter
enforcement of regulations, ease of complying, and the cost of complying. One appeal of this
model is that any facility heterogeneity, including unobserved heterogeneity, that is constant
over time does not have to be accounted for in the regression since it does not explain
differences in compliance across facilities in the treatment and control states over time. This
feature greatly reduces the amount of information (i.e., number of explanatory variables) we
need to include in the analysis. For instance, factors such as the year a facility was established,
its industry, and its size (if it has not changed drastically over time) are not needed to account
for alternate explanations for changes in compliance.
One limitation of the difference-in-difference approach is that it requires the assumption that
states with and without e-reporting are not too different from each other so that any observed
differences can be adequately accounted for by the model. If states vary too widely – for
instance, if we attempt to compare the compliance behavior of facilities in Ohio and Texas – we
may be over-extending the empirical technique’s usefulness. Another potential problem with
the difference-in-difference approach is that it greatly reduces the amount of variation in the
explanatory variables that can then be used to explain variation in compliance behavior across
facilities in e-reporting and non-e-reporting states. In general, anything that removes large
amounts of variation in the explanatory variables has the potential to magnify measurement
error. However, as long as the explanatory variable of interest – in our case the existence of e-reporting – affects a well-identified segment of the sample at a specific time, which should be true for our analysis, the bias due to measurement error should be small.
There are several challenges in trying to identify how electronic reporting has affected facility-level compliance. First, Ohio had already put an early electronic system in place in 1999. Descriptions of the older and newer systems suggest that they are fundamentally different and that the main mechanisms through which we would expect electronic reporting to affect compliance behavior existed in the new system but were not present in the old system.
Nonetheless, it is possible that we could understate the benefits of moving to an electronic
system. Second, it is also possible that we do not have a long enough time series to be able to
identify changes in behavior after electronic reporting was introduced in Ohio, though
intuitively it would seem likely that regulated entities would make the largest adjustment at or
close to the introduction of the new electronic reporting system.
We begin with a simple model where compliance in year $t$ by facility $i$ is denoted $compliance_{it}$. The variable $\alpha$ captures the average compliance of plants in our control states. Define T as a time dummy that is set to 1 in the post-policy time period (2008-2010). It captures any general factors that result in changes in facility compliance over time in either state apart from the electronic reporting requirement in Ohio. The variable Ohio is a dummy that is set to 1 when a facility is in the treatment group (i.e., located in Ohio). It captures any pre-policy differences between facilities in Ohio and those in the control group. When we interact these two variables, T * Ohio, we get a dummy variable, referred to as e-report in equation (1), that is equal to 1 when a facility is in the treatment group (i.e., is required to submit an electronic DMR report) in the second period. Finally, $e_{it}$ is defined as the residual error term.

$$compliance_{it} = \alpha + \beta_1 T + \beta_2 Ohio + \beta_3 \, e\text{-}report_{it} + e_{it} \qquad (1)$$
The difference-in-differences estimate is the parameter $\beta_3$, where

$$\beta_3 = \left(\overline{compliance}_{Ohio=1,T=2} - \overline{compliance}_{Ohio=1,T=1}\right) - \left(\overline{compliance}_{Ohio=0,T=2} - \overline{compliance}_{Ohio=0,T=1}\right) \qquad (2)$$

The overline denotes that the parameter measures the expected value (or average) difference-in-differences across the two groups.
A second regression takes advantage of the panel nature of our data set by adding a facility-specific fixed effect, $a_i$, and year-specific dummy variables, $d_t$. The inclusion of the fixed effect and year dummies means that we can no longer independently identify the coefficients on Ohio or T, so they drop out of the specification:

$$compliance_{it} = \beta_3 \, e\text{-}report_{it} + a_i + d_t + e_{it} \qquad (3)$$
Following Bennear and Olmstead (2008), we also explore replacing the year dummies with a polynomial time trend, which reintroduces T into the specification and restores the ability to estimate $\beta_1$.
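The fixed-effects variants might be implemented as follows; again this is a sketch under the same hypothetical DataFrame, with e_report the Ohio-by-post interaction, facility an identifier, and time a year index, none of which are taken from the paper itself.

```python
import statsmodels.formula.api as smf

# Equation (3): facility fixed effects a_i and year dummies d_t absorb the
# levels of Ohio and T, leaving only beta_3 on e_report identified.
eq3 = smf.ols("compliance ~ e_report + C(facility) + C(year)", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["facility"]}
)

# Bennear and Olmstead (2008) variant: a quadratic time trend replaces the
# year dummies, which lets the post-policy dummy T (here 'post') re-enter
# the specification so that beta_1 can be estimated alongside beta_3.
trend = smf.ols(
    "compliance ~ e_report + post + time + I(time**2) + C(facility)", data=df
).fit(cov_type="cluster", cov_kwds={"groups": df["facility"]})
```

With several thousand facilities, an explicit C(facility) dummy expansion is slow but transparent; a within transformation or a dedicated panel estimator would be the practical choice.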
V. Data and Variable Definitions
Information on DMR reporting rates and compliance status over time for facilities in Ohio and
the comparison states is taken from EPA’s Permit Compliance System (PCS). The EPA began
migrating states from PCS to a new system, the Integrated Compliance Information System
(ICIS), in June 2006. While much of the same information is reported in the new system, it has
a markedly different structure. Data for Ohio prior to 2011 are in PCS. To ensure we have data
that are reported in a way consistent with those for Ohio, we have limited the control states to
those in PCS over the sample period. Later versions of this paper will explore expanding to
include comparison states from ICIS as well.
The comparison states are chosen from the 24 continental states where electronic reporting is not available even on a voluntary basis. As mentioned above, we limit the universe of possible comparison facilities to those with data reported in PCS at least in the first and last years of the data set, 2005 and 2010.10 Since Ohio requires monthly reporting of water discharges, we also limit our choice of comparison facilities to those with similar requirements (i.e., facilities with annual reporting requirements are dropped from consideration). Finally, we select states that are broadly similar in terms of the percent of economic activity coming from manufacturing, while ruling out states that are vastly different in terms of overall level of economic activity. We are left with five states in our control group: Kansas, Kentucky, Minnesota, Missouri, and New Jersey.

10. We do not impose the requirement that a facility report data in every year, which results in an unbalanced panel. However, to ensure that each facility was in business and active in both the beginning and end years, we require that it appear in the data set in both 2005 and 2010. Trends observed in the summary statistics are invariant to the imposition of a balanced panel.
Facilities that are listed as inactive and do not submit monthly discharge monitoring reports
(DMRs) over the 2005 – 2010 time period are dropped from the data set. We do this to ensure
that we do not artificially inflate the counts of non-compliance by including facilities that are
not required to report. There are a few instances where a facility is listed as inactive but
continues to submit monthly DMRs. We have included these facilities in our data set, though
we will examine whether the results are sensitive to their inclusion. Finally, we only have data
through September 2010. To make the data for 2010 comparable to other years, the data for
January – September have been scaled up to represent the full 12 months.
We use three measures of the degree of compliance with federal Clean Water Act requirements
as dependent variables. Percent Violation is defined as the percent of monthly DMRs submitted
by a facility in a given year that resulted in a violation. Percent Permit is defined as the percent
of a facility’s permits for which discharges were fully reported in a given year. Finally, Violation
Dummy is defined as equal to 1 if a facility has at least one violation in a given year, and zero
otherwise.
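To make these definitions concrete, the fragment below sketches how the annual dependent variables could be built from monthly DMR records, including the 12/9 scaling of the partial 2010 data described above. The input columns (facility, year, violation) are hypothetical stand-ins for the PCS fields; Percent Permit is omitted because it requires permit-level completeness records.

```python
import pandas as pd

def build_dependent_vars(dmr: pd.DataFrame) -> pd.DataFrame:
    """dmr: one row per submitted monthly DMR, with a 0/1 'violation' flag.
    Returns one row per facility-year with the dependent variables."""
    annual = (
        dmr.groupby(["facility", "year"])
        .agg(dmrs=("violation", "size"), violations=("violation", "sum"))
        .reset_index()
    )

    # Only January-September are observed for 2010, so counts are scaled
    # up to represent a full 12 months.
    scale = annual["year"].eq(2010).map({True: 12 / 9, False: 1.0})
    annual["dmrs"] = annual["dmrs"] * scale
    annual["violations"] = annual["violations"] * scale

    annual["percent_violation"] = 100 * annual["violations"] / annual["dmrs"]
    annual["violation_dummy"] = (annual["violations"] > 0).astype(int)
    return annual
```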
In the simplest cases of the difference-in-differences estimations, the independent variables are dummy variables representing the post-policy time period, T, facilities in Ohio, Ohio, and the interaction between the two dummy variables representing facilities subject to electronic reporting requirements in Ohio in the post-policy period, E-Report. In addition, we explore interacting the post-policy time dummy, T, with several facility-specific characteristics. First, we define a dummy variable to indicate a major facility, Major (equal to one for major facilities and zero otherwise). Major facilities generally face greater scrutiny by regulators, even absent electronic reporting. With more reliable and readily available data through electronic reporting, these facilities may improve compliance relatively more than other facilities in light of the attention received by regulators. Alternatively, major facilities could be less sensitive to a change in the form of reporting than other facilities, as they are already regularly inspected by regulators. On the other hand, minor facilities that previously faced far less oversight may adjust compliance behavior relatively more in reaction to the auditing component of the new system. Second, we define a dummy variable to indicate government-owned facilities, Public (equal to one for publicly-owned facilities and zero otherwise). Publicly-owned facilities such as wastewater treatment plants run by a municipality are operated under a different incentive structure than privately-held facilities or those operated by private and government partnerships, though it is hard to predict what this may mean for compliance after electronic reporting is introduced. We also explore using industry dummies in several of the regressions, though these results are not reported in the paper.11

11. Including industry dummies had no effect on the qualitative nature of our results – these results are available upon request.
VI. Summary Statistics
As previously mentioned, we compare the compliance behavior of facilities in Ohio with those from five other states: Kansas, Kentucky, Minnesota, Missouri, and New Jersey. Recall that all facilities in Ohio were required to submit discharge monitoring reports electronically by mid-2008, while this option is not available, even on a voluntary basis, in the comparison states. Table 3 presents the mean and standard deviation for key variables in Ohio and the five comparison states reported as a group. First, note that the number of Ohio facilities in the data set that are required to submit DMRs is larger than the number in the five comparison states combined. There are about 2,500 Ohio facilities in 2005 and 2010, while almost 1,100 facilities are located in the comparison states in these years.
Table 3: Summary Statistics for Ohio and Comparison States

                                              Ohio        Comparison States
Dependent Variables
  Percent Violation, 2005                  2.3  (3.9)        7.4  (13.1)
  Percent Violation, 2008                  2.1  (3.3)        7.3  (12.4)
  Percent Violation, 2010                  4.6  (14.1)      13.3  (25.1)
  Percent Permit, 2005                    56.3  (28.8)      45.5  (41.7)
  Percent Permit, 2008                    60.9  (24.1)      57.0  (38.2)
  Percent Permit, 2010                    58.1  (29.3)      39.0  (25.1)
Independent Variables
  Number of Major Facilities                 380               408
  Number of Publicly-Owned Facilities      1,195               701
  Number of Facilities in Manufacturing      364               102
Total Number of Facilities in 2005, 2010   2,544             1,073

Note: Standard deviations in parentheses.
It is also worth noting that facilities in Ohio tend to be out of compliance less often (Percent Violation) and submit a higher percent of complete discharge reports (Percent Permit) than facilities in other states, both before and after electronic reporting is introduced in 2008. The trends in these variables are similar across Ohio and the comparison group, however. Percent Violation declines from 2005 to 2008 but then increases between 2008 and 2010.12 Likewise, the percent of facilities’ discharge reports that are complete increases between 2005 and 2008 but decreases from 2008 to 2010. In other words, compliance behavior worsened in the post-policy time period. Furthermore, the increase in violations and the decrease in completed reports are much larger in the comparison states. In terms of the time-invariant independent variables we interact with the post-policy time period dummy, Ohio has a lower percent of its DMRs filed by major facilities (15 percent versus 38 percent), has fewer publicly-owned facilities reporting (47 percent versus 65 percent), and has a larger percent of its facilities in manufacturing (14 percent versus 10 percent).
12. These trends continue to hold true when the panel is artificially constrained to be balanced. The increases and decreases in compliance over time are not due to non-reporting by some facilities in the intervening years between 2005 and 2010.
VII. Results
We present a series of regression results in Tables 4 and 5. Column 1 in each of the tables presents pooled results using ordinary least squares regressions, essentially ignoring the panel nature of the data set, for the two dependent variables, Percent Violation and Percent Permit. The standard errors for the pooled regression results have been corrected for heteroskedasticity and cross-sectional autocorrelation. Columns 2 through 4 rely on panel fixed effect regression techniques. Specifically, column 2 includes year fixed effects. Because Ohio and T fall out of that regression, we also include a case in column 3 that adds a quadratic time trend (specifically, time + time-squared) to allow for the inclusion of the post-policy period dummy, T, in the regression. Column 4 is identical to the regression in column 3, except Major and Public are now included by interacting them with T. The standard errors for the panel regressions in columns 2 through 4 are corrected for heteroskedasticity and within-facility serial correlation. Finally, we convert Percent Violation into an indicator dummy, Violation Dummy, and run a panel logit model. These results are reported in columns 5 and 6 of Table 4. We do not do something similar for Percent Permit, since large numbers of positive values close to zero are not an issue in this case.
The results from the pooled regressions have the expected sign and are highly significant in all
four regressions. We begin with a discussion of the Percent Violation results in column 1 of
Table 4. Based on the summary statistics, it is not surprising to find that Percent Violation is
positively related to the post-policy time period, T (2008 and beyond). As expected, we also
find that Percent Violation is negatively related to facilities located in Ohio regardless of time
period. The summary statistics indicated that Ohio facilities tended to have much greater rates
of compliance than facilities in the comparison states. Our key policy variable, E-Report, is
negatively related to Percent Violation. In other words, facilities that are required to
electronically report DMRs in the post-policy period have higher rates of compliance. This
provides initial evidence that electronic reporting by itself is correlated with improved
compliance, apart from time, location, or industry differences across facilities. Finally, publiclyowned facilities are positively related while major facilities are negatively related to Percent
Violation, though only major facilities are related to compliance behavior in the post-policy
time period. Specifically, facilities listed as major are also negatively related to Percent
Violation after electronic reporting is introduced. The signs of Major and Major*T are
consistent with the story that major facilities may have a better compliance record due to the
greater scrutiny received from regulators. With the advent of more reliable and readily
available data through electronic reporting, these facilities see a further improvement in
compliance, perhaps in part due to the risk that they will face even greater scrutiny.
Table 4: Regression Results for Percent Violation

                         Pooled                    Panel Fixed Effects                       Panel Logit
                         With         Year         Time        Time Trend     Logit with     Logit with
                         Controls     Effects      Trend       with Controls  Year Effects   Controls
                         (1)          (2)          (3)         (4)            (5)            (6)
Ohio                     -0.06 ***
                         (0.003)
T                         0.013 ***                -0.001      -0.01 ***       0.28 ***       0.17
                         (0.004)                   (0.003)     (0.003)         (0.10)         (0.12)
E-Report                 -0.01 ***    -0.2 ***     -0.02 ***   -0.01 ***      -0.31 ***      -0.24 ***
                         (0.004)      (0.004)      (0.004)     (0.004)        (0.08)         (0.09)
Public                    0.01 ***
                         (0.001)
Major                    -0.03 ***
                         (0.002)
Public*T                  0.001                                 0.001                         -0.003
                         (0.003)                                (0.002)                       (0.08)
Major*T                   0.03 ***                              0.03 ***                       0.28 ***
                         (0.004)                                (0.004)                        (0.09)
Year dummies              N            Y            N           N              N              N
Quadratic time trend      N            N            Y           Y              Y              Y
Observations              21,099       21,099       21,099      21,099         21,099         21,099
Groups                    --           3,617        3,617       3,617          3,617          3,617

*** denotes significance at the 1 percent level.
The comparable results for the pooled regressions using Percent Permit, the percent of a facility’s required DMRs that are completely reported in a given year, as the dependent variable are presented in column 1 of Table 5. The results are consistent with those from the Percent Violation regressions. Note that the signs flip because the higher the Percent Permit, the better a facility’s compliance record, while it was just the opposite for Percent Violation. The post-policy period, T, is negatively related to the percent of complete reports filed, consistent with the summary statistics in Table 3. Facilities in Ohio and those that electronically reported in the post-policy period are positively related to the percent of complete reports filed. Major facilities are positively related while publicly-owned facilities are negatively related to the percent of completed DMRs filed.13 However, neither type of facility is significantly associated with a further change in compliance in the post-policy period.
Table 5: Regression Results for Percent Permit

                         Pooled         Panel Fixed Effects
                         With           Year         Time         Time Trend
                         Controls       Effects      Trend        With Controls
                         (1)            (2)          (3)          (4)
Ohio                      0.04 ***
                         (0.007)
T                        -0.13 ***                   -0.15 ***    -0.15 ***
                         (0.01)                      (0.01)       (0.01)
E-Report                  0.12 ***      0.12 ***      0.12 ***     0.12 ***
                         (0.01)        (0.008)       (0.008)      (0.009)
Public                   -0.13 ***
                         (0.007)
Major                     0.14 ***
                         (0.007)
Public*T                 -0.01                                    -0.009
                         (0.008)                                  (0.006)
Major*T                   0.01                                     0.007
                         (0.01)                                   (0.008)
Year dummies              N             Y            N            N
Quadratic time trend      N             N            Y            Y
Observations              21,099        21,099       21,099       21,099
Groups                    --            3,617        3,617        3,617

*** denotes significance at the 1 percent level.
When we compare the pooled regression results to those that make use of the panel nature of the data, through the inclusion of facility fixed effects along with either year dummies or a quadratic time trend (columns 2, 3, and 4), we find that the sign and significance of the key policy variable of interest, E-Report (recall this variable is the interaction between two dummy variables, Ohio and T), is remarkably consistent. It continues to be negatively related to Percent Violation and positively related to Percent Permit under every specification. Likewise, when Major and Public are interacted with T, we find that whether a facility is publicly-owned still has no significant explanatory power. Whether a facility is listed as major continues to be significant and positive in the Percent Violation regressions. It is still the case that neither variable is significant in the Percent Permit regressions. Results for the variable T, indicating the post-policy time period, switch sign and are sometimes insignificant in the Percent Violation regressions. However, T continues to be significant and negatively related to Percent Permit under all specifications.

13. We also replaced the dummy variable for publicly-owned with dummies indicating whether a facility was in manufacturing (SIC 20-39) or electric, gas, and sanitary services (SIC 49). The dummy for manufacturing was significant in both cases, while the dummy for electric, gas, and sanitary services was only significant for Percent Permit. All other variables retained the same sign and significance.
We face a similar challenge to that identified in Bennear and Olmstead (2008) for some measures of our dependent variable: data on violations can be dominated by zeros. They address this issue by running panel Poisson and negative binomial models. We have defined our dependent variable in percent terms, which converts it to a continuous variable. Because we have taken data reported on a monthly basis and aggregated it by year, we also have far fewer cases where there are no violations. However, for many facilities the Percent Violation is quite close to zero. We are still exploring the econometric challenges that this presents, but as an initial test we have converted the continuous measure of violations to an indicator variable called Violation Dummy, equal to 1 when Percent Violation is positive and zero otherwise, and apply panel logistic techniques.14 The results are reported in columns 5 and 6 of Table 4 and are remarkably consistent with those that use Percent Violation, reported in columns 3 and 4. There is one exception: the sign and significance of T, the dummy indicating the post-policy time period, is sensitive to both the specification used and whether additional control variables are included. However, the finding that electronic reporting in the post-policy time period significantly reduces the likelihood of a violation in a given year remains.

14. We are currently exploring the use of other, possibly more appropriate econometric models such as the Tobit model, several Type-II Tobit models, and a version of the Heckman sample selection model.
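As an illustration of this conversion, the sketch below fits a logit on the indicator outcome under the quadratic-trend specification. It uses a pooled logit with facility-clustered standard errors rather than the paper's exact panel logit estimator, and assumes the same hypothetical DataFrame and column names as the earlier sketches.

```python
import statsmodels.formula.api as smf

# Violation Dummy: 1 if the facility had any violation in a given year.
df["violation_dummy"] = (df["percent_violation"] > 0).astype(int)

# Pooled logit with the treatment interaction, post-policy dummy, and a
# quadratic time trend; standard errors clustered by facility.
logit_fit = smf.logit(
    "violation_dummy ~ e_report + post + time + I(time**2)", data=df
).fit(cov_type="cluster", cov_kwds={"groups": df["facility"]})
print(logit_fit.summary())
```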
VIII. Conclusion and Next Steps
Initial results suggest that the use of mandatory electronic reporting in Ohio has a significant and positive effect on the compliance behavior of facilities. We base this conclusion on results from difference-in-differences estimations and two measures of compliance: the percent of DMRs that resulted in violations and the percent of filed reports that were complete. We have not yet addressed the question of whether the benefits of introducing electronic reporting are relatively small or large. This will be an important question to address to help inform policymakers. It is also our intention to explore the relationship between electronic reporting and other measures of compliance in future versions of this paper, including the length of time that a facility remains out of compliance and the degree of non-compliance. We dropped from our sample facilities that are not required to report DMRs on a monthly basis. However, the data show that many facilities with annual or quarterly reporting requirements actually submit monthly DMRs. An important question is whether our results continue to hold when these facilities are added back into the data set. We also intend to investigate whether additional control states could be added from the ICIS data.
This analysis only examines the introduction of a particular form of mandatory electronic reporting in Ohio. However, some states have voluntary systems in place, and the effectiveness of the two types of systems is likely to differ substantially. Furthermore, electronic reporting systems vary in their essential attributes. For example, Ohio’s e2 reporting system provides near-instantaneous feedback from state regulators on the submitted DMR results, allowing facilities to fix any reporting errors or rectify a compliance violation relatively quickly compared to facilities in states whose e-reporting systems lack this feedback loop. As this paper expands, we will explore using facilities in states with voluntary systems as the control group to explore the impact of these types of systematic differences on compliance behavior.
It is also important to note that the transferability of the empirical results from this study to a national e-reporting system will depend on how closely it resembles the state systems studied. For example, if the national system includes fewer exemptions to electronic reporting (e.g., allows less paper reporting) or provides quicker feedback than many of the current electronic reporting systems, the state results could understate the potential benefits of a national electronic reporting rule. Likewise, if the national e-reporting system omits automatic auditing or feedback, the results from the current state-level analysis could overstate the potential benefits. Moreover, it is important to note that an electronic option for reporting DMR data is already available at the national level. Though electronic reporting would change from a voluntary to a mandatory system, there are no additional benefits (or costs) attributable to the rule for facilities that already make use of the electronic reporting system.
References
Asthana and Balsam (2001). “The Effect of EDGAR on the Market Reaction to 10-K Filings.”
Journal of Accounting and Public Policy 20: 349-372.
Blondin, J., K. Abu-Hasaballah, H. Tennen, and R. Lalla (2010). “Electronic versus Paper Diaries:
A Pilot Study of Concordance and Adherence in Head and Neck Cancer Patients Receiving
Radiation Therapy.” Head and Neck Oncology 2: 29-33.
Bennear, L., and S. Olmstead (2008). “The Impacts of the ‘Right to Know’: Information
Disclosure and the Violation of Drinking Water Standards.” Journal of Environmental Economics
and Management 56, 2: 117-130.
Congressional Budget Office (2008). Evidence on the Costs and Benefits of Health Information
Technology. Congressional Budget Office.
Enfotech News (2008). “Ohio Success Story – Computer Technologies Help Environmental
Compliance.” www.enfotech.com/enf/enfoWebApp/pages/news/news.apsx
Environmental Council of the States (2006). Environmental Information Exchange Network:
Exchange Network – Return on Investment and Business Process Analysis Final Report.
http://www.exchangenetwork.net/benefits/ROIReport_v1.0.pdf
Green, A., E. Rafaeli, N. Bolger, P. Shrout, and H. Reis (2006). “Paper or Plastic? Data
Equivalence in Paper and Electronic Diaries.” Psychological Methods, 11, 1: 87 – 105.
Griffin (2003). “Got Information? Investor Response to Form 10-K and Form 10-Q EDGAR
Filings.” Review of Accounting Studies 8: 433-460.
Internal Revenue Service (2008). Advancing E-file Study Phase 1 Report – Achieving the 80% E-file Goal Requires Partnering with Stakeholders on New Approaches to Motivate Paper Filers. Sept. 30.
Internal Revenue Service (2010). “IRS E-File: It’s Fast, It’s Easy, It’s Time” Accessed 12/09/10.
http://www.irs.gov/newsroom/article/0,,id=218319,00.html.
Karkkainen, B. (2001). “Information as Environmental Regulation: TRI and Performance
Benchmarking, Precursor to a New Paradigm?” Georgetown Law Journal 89: 257.
Kopczuk, W., and C. Pop-Eleches (2007). “Electronic Filing, Tax Preparers, and Participation in
the Earned Income Tax Credit.” Journal of Public Economics 91: 1351-1367.
Li and Ramesh (2009). “Market Reaction Surrounding the Filing of Periodic SEC Reports.” The
Accounting Review 84, 4: 1171 – 1208.
Miller, R., C. West, T. Brown, I. Sim, and C. Ganchoff (2005). “The Value of Electronic Health Records in Solo or Small Group Practices.” Health Affairs, 24, 5: 1127-1137.
Office of Management and Budget (2010). Report to Congress on the Benefits of the E-Government Initiatives.
http://www.whitehouse.gov/sites/default/files/omb/assets/egov_docs/FY10_EGov_Benefits_Report.pdf
Perez Henriquez, B. (2004). “Information Technology: The Unsung Hero of Market-Based
Environmental Policies.” Resources. Resources for the Future.
Ross Associates (2010). Case Study: Ohio Environmental Protection Agency’s Electronic
Discharge Monitoring Report (eDMR) System Reaches 99% Adoption.
Shimshack, J. (2007). Monitoring, Enforcement, & Environmental Compliance: Understanding
Specific and General Deterrence (prepared under contract for EPA).
http://www.epa.gov/oecaerth/resources/reports/compliance/research/meec-whitepaper.pdf
Stone, A., S. Shiffman, J. Schwartz, J. Broderick, and M. Hufford (2003). “Patient Compliance
with Paper and Electronic Diaries.” Controlled Clinical Trials 24: 182-199.
U.S. EPA (1997). Information Streamlining Plan. Environmental Protection Agency.
Uslu and Stausberg (2008). “Value of the Electronic Patient Record: An Analysis of the
Literature.” Journal of Biomedical Informatics 41: 675-682.