FTC Throws Down the Gauntlet To Information Providers and Users

2 February 2016
Practice Groups: Technology Transactions; Privacy, Data Protection and Information Management
By Holly K. Towle
The Federal Trade Commission (FTC) started the New Year by throwing down the gauntlet
to organizations that sell, buy or otherwise provide or use “Big Data”i analytics, particularly
including employers, creditors, landlords, housing real estate agents, insurance companies
and the data or technology companies offering data analytic services to them. The challenge
takes the form of a January 2016 FTC report entitled Big Data: A Tool for Inclusion or
Exclusion? (Report).ii It is a “must read” for anyone providing or using big data analytics.
Although the FTC’s analysis creates many debatable issues, the Report makes clear at least
the FTC’s regulatory agenda.iii
The Report responds to what the FTC describes as our era of big data created by uses of
smartphones, computers, social media and the upcoming “Internet of Things” (where
sensors reside in and transmit data from a new range of objects). The Report’s focus is
commercial uses of big data analytics on consumer information and the potential to harm
individuals, particularly low income and underserved populations. Although the Report
mentions data privacy and security concerns, that is not its focus. Its focus is to “encourage”
companies “to apply big data analytics in ways that provide benefits and opportunities to
consumers, while avoiding pitfalls that may violate consumer protection or equal opportunity
laws, or detract from core values of inclusion and fairness.”iv It warns that the FTC “will
continue to monitor areas where big data practices could violate existing laws…and will bring
enforcement actions where appropriate.”v We discuss below these laws and the FTC’s view
of how data brokers and users might be covered by them.
1. Fair Credit Reporting Act
A. Background. The federal Fair Credit Reporting Act (“FCRA”) applies to
“consumer reporting agencies” (“CRAs”), such as a credit reporting agency providing
“consumer reports” (e.g., traditionally, a credit report or background check) to third parties. A
third party who uses a consumer report to help determine eligibility of an individual for credit,
insurance, employment, or certain other purposes (e.g., tenant screening) also has duties
under FCRA (“User”). FCRA is the federal reason why employers and lenders must obtain
consent of an applicant before obtaining a background check and why they must deliver
“adverse action” notices if they reject the applicant or alter terms based on information in the
consumer report. FCRA is extensive, complex and, currently, a popular class action vehicle.
Companies often do not realize that they might be a CRA because they think of CRAs only
as credit bureaus such as TransUnion, Experian and Equifax. Lewis v. Ohio Professional
Electronic Network LLCvi illustrates the opposite. Lewis involved a database created by state
sheriffs’ associations from criminal records; ultimately the database was transferred to a
nonprofit that licensed several online providers to sell the data to commercial users. The
Lewis plaintiff applied for a job -- his interview went well but he was turned down as an
“unsavory character” and told not to recontact the prospective employer because the
database confused him with a murderer. To make a long story short, the Lewis licensor was
held to be a CRA because it fit the definition: for monetary fees or on a cooperative nonprofit
basis, it regularly engaged in whole or in part in the practice of assembling or evaluating
information on consumers for purposes of furnishing “consumer reports” to third parties.
The CRA definition requires more than mere receipt and transmission of information, but the
nonprofit had partitioned information so that the public arrest portion of the criminal records
could be made available to third parties, and that was enough under this case. The online
providers were not viewed as CRAs because they, essentially, forwarded customers to the
nonprofit but did not assemble or evaluate information. However, they were viewed as
“resellers” of consumer data (i.e., persons who procure a consumer report for purposes of
reselling it or information in it).
What was the consumer report? It was the arrest information provided by the nonprofit.
The court looked to the FCRA definition, describing a consumer report as: “Any …
communication of any information by a consumer reporting agency bearing on a consumer’s
creditworthiness, credit standing, credit capacity, character, general reputation, personal
characteristics, or mode of living which is used or expected to be used or collected in whole
or in part for the purpose of serving as a factor in establishing the consumer’s eligibility for …
[consumer] credit or insurance … employment purposes; or [other authorized purposes].”
In 2013 the FTC began bringing enforcement actions against data brokers alleging they were
CRAs.vii The 2016 Report is directed at all big data providers and users as a more general
warning.
B. The Report’s View of FCRA. We list below selected FTC statements to
provide a flavor of the FTC’s view of when a company might become an FCRA CRA or User:
The FTC intends to use FCRA as a vehicle to regulate big data:
“[A] growing trend in big data [is that] companies may be purchasing predictive
analytics products for eligibility determinations. Under traditional credit scoring
models, companies compare known credit characteristics of a consumer—such as
past late payments—with historical data that shows how people with the same credit
characteristics performed over time in meeting their credit obligations. Similarly,
predictive analytics products may compare a known characteristic of a consumer to
other consumers with the same characteristic to predict whether that consumer will
meet his or her credit obligations. The difference is that, rather than comparing a
traditional credit characteristic, such as debt payment history, these products may
use non-traditional characteristics—such as a consumer’s zip code, social media
usage, or shopping history—to create a report about the creditworthiness of
consumers that share those non-traditional characteristics, which a company can
then use to make decisions about whether that consumer is a good credit risk. The
standards applied to determine the applicability of the FCRA in a Commission
enforcement action, however, are the same.”viii
The FTC’s warning is both right and wrong, and it creates both a restraint on, and a Catch-22 for, the use of big data. For example, the Report explains that use of a zip code could be an indirect way of discriminating against low-income communities. Even if that is correct, the question is how those bound to comply with FCRA can know in advance which data is problematic.
The Report provides recommendations on how to vet data, but that implies that such data
analytic techniques and likely outcomes are knowable, readily available, and affordable. The
reality for big data exploitation (which is not the same as inference from or hypothesis testing
by statistical sampling) is to the contrary. A feature and defect of big data is that it indicates
correlations, which are not necessarily causations. One of the values of big data is that use
of “messy” data can lead to unexpected discoveries and knowledge. But it can also lead to
improper “guilt by association” or fallacious post hoc ergo propter hoc conclusions. The
Report states that the FTC will make fact-intensive decisions about whether big data has been misused -- but those will be made in hindsight, so the Report signals a perfect compliance
storm. The FTC will always know more, in hindsight, than the big data providers and users
bound to comply upfront. The FTC has developed “law” this way before as we discuss
below.
The FTC’s concerns about the impact of big data on individuals are legitimate and law
cannot be ignored. At the same time, compliance obligations need to be ascertainable
upfront. It may be that big data creates a need for new or amended laws that are better
nuanced to deal with the benefits and harms of big data.
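To make the vetting problem concrete, below is a minimal, hypothetical sketch (ours, not the FTC’s or the Report’s) of one way a provider or user might screen a non-traditional characteristic, such as a zip code, for proxy effects before feeding it to an eligibility model. Every record, group label and threshold here is invented for illustration, and passing such a screen would not establish compliance with FCRA or any other law:

```python
# Hypothetical illustration only -- not a test prescribed by the Report or
# the FTC. One rough way to "vet" a non-traditional feature (here, zip code)
# is to measure how heavily it concentrates members of a single protected
# group, i.e., whether it could act as a proxy for a protected characteristic.
from collections import defaultdict

# Invented applicant records: (zip_code, protected_group).
records = [
    ("00001", "group_a"), ("00001", "group_a"), ("00001", "group_a"),
    ("00001", "group_b"), ("00002", "group_b"), ("00002", "group_b"),
    ("00002", "group_b"), ("00002", "group_a"),
]

groups_by_zip = defaultdict(list)
for zip_code, group in records:
    groups_by_zip[zip_code].append(group)

REVIEW_THRESHOLD = 0.70  # invented cutoff; real vetting would require far more rigor

for zip_code, groups in sorted(groups_by_zip.items()):
    majority_share = max(groups.count(g) for g in set(groups)) / len(groups)
    # Heavy concentration of one protected group suggests the feature may
    # operate as a proxy and deserves closer legal and statistical review.
    status = "review as possible proxy" if majority_share >= REVIEW_THRESHOLD else "no flag"
    print(f"zip {zip_code}: majority-group share {majority_share:.2f} -> {status}")
```

Even this trivial screen illustrates the Catch-22 described above: the cutoff, the group definitions and the underlying data are all judgment calls that the FTC will evaluate only in hindsight.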
The Report impacts business models:
“The FCRA, however, does not apply to companies when they use data derived from
their own relationship with their customers for purposes of making decisions about
them. But if an unaffiliated firm regularly evaluates companies’ own data and
provides the evaluations to the companies for eligibility determinations, the
unaffiliated firm would likely be acting as a CRA, each company would likely be a
user of consumer reports, and all of these entities would be subject to Commission
enforcement under the FCRA.”ix
Under FCRA, there is an exception for a traditional problem: employer A receives a call
seeking information about a former employee from potential employer B. Employers like A
frequently do not answer B’s questions for fear of providing an FCRA “consumer report.”
FCRA tries to encourage answering (to facilitate subsequent employment) by providing,
essentially, that as long as A’s answer pertains only to A’s experience with the ex-employee
and does not include information about third party experiences, A’s information is not a
consumer report. If A employed thousands of people and didn’t have time to talk with
employers like B, it would be interesting to research whether A would still be protected if it
used a service provider to accomplish that task for A (one that would not otherwise share or
use the data).
Whatever the traditional answer might be, the Report sets forth a “No” answer for data
brokers and Users of big data. “A” must do everything on its own. If the FTC’s position is
correct, it works at cross purposes with FCRA: unless A wants to undertake FCRA
compliance burdens and unless the service provider is willing to be a CRA, A will have to do
everything on its own and forgo use of third-party data analytic expertise.
2. Equal Opportunity Laws
The Report is not confined to FCRA. It also discusses laws intended to prohibit
discrimination, including disparate impacts on vulnerable segments of society. The Report
includes these examples of such federal laws: the Equal Credit Opportunity Act (“ECOA”)
and its companion Regulation B, Title VII of the Civil Rights Act of 1964, the Americans with
Disabilities Act, the Age Discrimination in Employment Act, the Fair Housing Act, and the
Genetic Information Nondiscrimination Act. This is not a complete list, and states have similar or additional laws. Further, some of these laws or regulations require use of a statistical analysis based on traditional principles -- so use of “big data,” in the sense of non-statistical analysis, might already be precluded under them.
These kinds of laws prohibit discrimination based on protected characteristics such as race,
color, sex or gender, religion, age, disability status, national origin, marital status and genetic
information. The Report warns that “Companies should review these laws and take steps to
ensure their use of big data analytics complies with the discrimination prohibitions that may
apply.” The FTC provides this illustration based on the ECOA and Regulation B (emphasis
added):
“To prove a violation of federal equal credit or employment opportunity laws, plaintiffs
typically must show “disparate treatment” or “disparate impact.” Disparate treatment
occurs when an entity, such as a creditor or employer, treats an applicant differently
based on a protected characteristic such as race or national origin. Systemic
disparate treatment occurs when an entity engages in a pattern or practice of
differential treatment on a prohibited basis. In some cases, the unlawful differential
treatment could be based on big data analytics. For example, an employer may not
disfavor a particular protected group because big data analytics show that members
of this protected group are more likely to quit their jobs within a five-year period.
Similarly, a lender cannot refuse to lend to single persons or offer less favorable
terms to them than married persons even if big data analytics show that single
persons are less likely to repay loans than married persons. Evidence of such
violations could include direct evidence of the reasons for the company’s choices, or
circumstantial evidence, such as significant statistical disparities in outcomes for
protected groups that are unexplained by neutral factors.”x
These types of laws and legal outcomes under them are more complex than explained in the
Report, but the FTC’s basic warning is correct: various uses of big data can trigger
violations. Exactly what will be a violation will vary with the wording of each law and
available defenses. The core message, however, should be kept in mind: big data does not
come with a “free pass.” This latter point may become a contentious area to navigate given
the competing needs of big data providers to protect their algorithms and data and the need
of their customers for sufficient transparency to avoid violations of applicable laws.
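By way of illustration only, the “significant statistical disparities” referenced in the quoted passage are often screened, at least initially, with the well-known “four-fifths” (80%) rule of thumb used in employment-selection analysis. The sketch below uses invented numbers; neither the Report nor the quoted passage prescribes this test, and failing or passing the screen neither proves nor disproves a violation:

```python
# Hypothetical illustration of the "four-fifths" (80%) rule of thumb, a common
# first-pass screen for disparate impact in selection outcomes. It is only a
# screening heuristic: failing it does not prove a violation, and passing it
# does not establish compliance.

# Invented hiring outcomes after a big-data screening model was applied.
outcomes = {
    "group_a": {"applicants": 200, "selected": 60},  # selection rate 0.30
    "group_b": {"applicants": 150, "selected": 27},  # selection rate 0.18
}

rates = {group: o["selected"] / o["applicants"] for group, o in outcomes.items()}
highest_rate = max(rates.values())

for group, rate in sorted(rates.items()):
    ratio = rate / highest_rate
    # Under the screen, a selection rate below 80% of the highest group's rate
    # is treated as evidence of possible adverse impact warranting review.
    flag = "possible adverse impact" if ratio < 0.8 else "no flag"
    print(f"{group}: selection rate {rate:.2f}, ratio to highest {ratio:.2f} -> {flag}")
```

Even where such a screen flags a disparity, the legal question remains whether neutral factors explain it -- which is precisely the fact-intensive analysis the Report reserves to hindsight.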
3. FTC Act
Section 5 of the Federal Trade Commission Act prohibits unfair or deceptive acts or
practices. This is the law on which the FTC has created, largely through private enforcement
orders, a regime of data privacy and data security “rules” even though Section 5 does not
mention data privacy or security. Section 5 also does not mention “data” or “big data,” but
the Report signals that the FTC will attempt to build a Section 5 big data regime.
As noted by the FTC, “Section 5 is not confined to particular market sectors but is generally
applicable to most companies acting in commerce,” i.e., the challenges laid down in the
Report are confined only by the FTC’s jurisdiction. Coupled with the fact that state and
other federal regulators tend to follow the FTC’s lead when enforcing their own laws akin to
Section 5, the power of any regime developed by the FTC should not be underestimated.
With that in mind, three Report statements are particularly noteworthy:
• The FTC asks: “Are you honoring promises you make to consumers and providing
consumers material information about your data practices?”xi The Report indicates
that a “No” answer can expose a company to an FTC enforcement action.
• The FTC states: “[A]t a minimum, companies must not sell their big data analytics
products to customers if they know or have reason to know that those customers will
use the products for fraudulent or discriminatory purposes.”xii The FTC does not
explain how this approach, which makes an information provider liable for the acts of
third parties, squares with competing US legal principles such as the body of
countervailing First Amendment case law. If the FTC’s broad statement is legally
sustainable, data providers should not assume that they can head off liability by
simply providing notice to customers that certain uses are prohibited.xiii
• The FTC states: “One example of a potentially unfair practice is the failure to
reasonably secure consumers’ data where that failure is likely to cause substantial
injury. Companies that maintain big data on consumers should take care to
reasonably secure that data commensurate with the amount and sensitivity of the data
at issue, the size and complexity of the company’s operations, and the cost of
available security measures.”xiv This statement harkens back to the data security
regime that the FTC has created out of Section 5 of the FTC Act (see section B
below).
A Few More Observations
The Report is incomplete as to competing aspects of the laws discussed, but that is to be
expected given legal complexities. Less excusable is the absence of discussion about
competing core societal values and constitutional issues. The Report does explain that the
FTC seeks to avoid detracting from “core values of inclusion and fairness,” but does not
discuss the competing core values. Such a discussion is necessary in order to balance the
benefits and harms of big data. For example, there is no discussion of federal and state
constitutional rights such as free speech and due process, which we note briefly below.
A. Free Speech. Below are a few quotations from a 2011 U.S. Supreme Court
ruling regarding the federal constitutional right of free speech. They illustrate some of the
concepts that need to be part of any big data discussion:
“An individual’s right to speak is implicated when information he or she possesses is
subjected to ‘restraints on the way in which the information might be used’ or
disseminated.”
“This Court has held that the creation and dissemination of information are speech
within the meaning of the First Amendment …. Facts, after all, are the beginning
point for much of the speech that is most essential to advance human knowledge
and to conduct human affairs.”xv
Under this case, laws restricting free speech, including commercial speech, are subjected to heightened judicial scrutiny. As explained in the case, the First Amendment does not
prevent restrictions directed at commerce or conduct from imposing incidental burdens on
speech, but laws going beyond incidental burdens are at risk, including if directed at content
(marketing speech) and aimed at particular speakers. In that event, it does not matter that
the speech is commercial and a “strict scrutiny” standard is applied to determine whether the
law will pass constitutional muster.xvi The issue created by the Report is whether the FTC’s
analysis, if applied as stated in the Report, is too broad to survive a “strict scrutiny” analysis.
B. Due Process. Another constitutional issue concerns whether the Report is
sufficient to meet due process requirements that persons required to comply with a law be able to know in advance what is required, such that compliance can be achieved. The FTC
repeatedly notes that its broad statements are just that and that “only a fact-specific analysis
will ultimately determine whether a practice is subject to or violates”xvii the particular law. In
short, the FTC will know a violation when it sees it, but those actually bound to comply will
have to do their best to guess until either a court decides the matter in expensive litigation or the FTC, over years, develops “big data” law via private settlements binding no one but
the defendant.
That is how the FTC developed its current regime of data privacy and security under Section
5 of the FTC Act. Companies spent over a decade ignoring that development because they
thought due process required issuance of regulations and that nonbinding enforcement
orders were just that, nonbinding. The danger of that view was recently illustrated in the
Third Circuit’s ruling in FTC v. Wyndham Worldwide Corp.xviii The court did not decide that question, but it did decide a question that was pled: whether Wyndham had “fair notice” that its data security activities might violate Section 5. The answer was essentially yes
because of the decade of FTC enforcement orders.
C. Example of A and B, i.e., First Amendment and Due Process Chill Created
by the Report. To illustrate how both free speech and due process values are impacted by
the Report, consider this statement from the Report:
“Create new justifications for exclusion. Big data analytics may give companies
new ways to attempt to justify their exclusion of certain populations from particular
opportunities. For example, one big data analytics study showed that “people who fill
out online job applications using browsers that did not come with the computer . . .
but had to be deliberately installed (like Firefox or Google’s Chrome) perform better
and change jobs less often.” If an employer were to use this correlation to refrain
from hiring people who used a particular browser, they could be excluding qualified
applicants for reasons unrelated to the job at issue.”xix
It is difficult to know what to do with such a statement, other than to view it as a warning that
all non-traditional data analytics theoretically could be viewed as creating exclusion. Even if
that is true, the statement creates a Catch-22: the data vetting suggested in the Report to
eliminate prohibited discrimination or bias is not foolproof and is likely more of an art than a
science. Such vetting will not, necessarily allow employers to know in advance what data is
“good” or “bad” for purposes of FTC enforcement actions. Over time it might become clear
which data is which, but that knowledge will come too late to avoid claims against “firstmovers.” What is an employer to do with the FTC statement in light of real world
complexities:
• Recalling that big data analysis techniques tend to discover correlations without illuminating their cause, it follows that the true cause of a correlation may be benign, random or malignant. Suppressing big data discoveries without understanding the cause of the suppressed correlation may lead to its own bad result: what if the “browser” factor actually is a new discovery of a way to help determine which individuals are more likely to be highly motivated self-starters with a willingness to create change (e.g., because they don’t settle for what comes with their device and take steps to change that situation)? If so, use of the browser data might promote inclusion, not
exclusion, because those personalities can exist in any population and at any income
level.
• The browser factor also might mean nothing at all or promote exclusion, but not on a
basis covered by anti-discrimination laws. For example, applicants applying via a work
computer might be using a mandated browser because their employer’s systems or
training is premised on it. The browser factor might also cause exclusion, such as for
applicants using a free device provided under a program for low-income users with
training or donations that assume or mandate use of one browser.
• The employer’s online application system might collect browser data for authentication purposes. That employer will have browser data, so it might be accused of using it even if it
did not. By analogy, employers receiving a picture of an applicant might see race,
religion, disability or something else they had not intended to see and, accordingly, ask
applicants not to upload pictures. Yet in this hypothetical, the browser data will already
be in the system and might actually be needed or useful to website functioning or
authentication (e.g., to help authenticate that the person signing into applicant X’s account is X based on X’s usual browser, as sketched below).
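The authentication point in the last bullet can be made concrete with a minimal, hypothetical sketch (all names and data are invented) of browser data used for account security rather than eligibility:

```python
# Hypothetical sketch of the bullet's authentication example: browser data
# used not for eligibility decisions but to help confirm that a returning
# user is the account holder. All identifiers and data are invented.

# Usual browser (user-agent string) previously observed for each account.
usual_browser = {
    "applicant_x": "Mozilla/5.0 (Windows NT 10.0) Firefox/44.0",
}

def sign_in_check(account: str, current_browser: str) -> str:
    """Flag a sign-in for extra verification when the browser is unfamiliar."""
    if usual_browser.get(account) == current_browser:
        return "ok"
    # An unfamiliar browser does not mean fraud; it merely triggers a
    # secondary check (e.g., an emailed confirmation code).
    return "request secondary verification"

print(sign_in_check("applicant_x", "Mozilla/5.0 (Windows NT 10.0) Firefox/44.0"))
print(sign_in_check("applicant_x", "SomeOtherBrowser/1.0"))
```

The point is that identical data can serve a legitimate function wholly unrelated to eligibility, yet its mere presence in the system could draw an accusation of misuse.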
The FTC is correct that the browser here should not be a proxy for discrimination on a
prohibited basis. The reality, however, is that it might not even be possible for the employer
reasonably or affordably to determine if the browser is such a proxy or even to know whether
the Report has already warned against use of the browser factor. That illustrates a due
process issue. The FTC’s warning also illustrates the free speech issue, i.e., the chill on
speech created by the broad warning that if the employer uses the browser factor to take
advantage of possible new knowledge, it might be at risk of an FTC enforcement action.
What must employers actually do? That knowledge may be gleaned only from, and after, a series of FTC enforcement actions taken in hindsight and creating numerous non-precedential private settlement details that might or might not reflect actual law. As
illustrated by Wyndham, allowing such a series to build up without testing in litigation can
have a cumulative effect even on organizations entitled to ignore the settlements.
Conclusion
The Report is only one of several FTC or other governmental reports regarding big data. It
touches, however, upon laws that protect core societal values and, as to some, carry
statutory damages, allow significant fines and are the subject of frequent litigation. Although
the Report is not as thorough as it should be regarding counterpoint information, at base it
makes an important point: the fact that big data may bring astounding and even accurate
new insights does not mean that the insights can be used legally.
This is an odd result in a country devoted to the development and use of knowledge, but the
U.S. is also devoted to creating and guarding equal opportunities. Big data impacts all of
those values and companies need policies, procedures and contracts to navigate this
developing area.
That navigation will also need to do more than deal with U.S. law. First Amendment free
speech rights are constitutional in the U.S. so legislators and regulators are not empowered
to alter balances created by the U.S. Constitution. Many other countries also value freedom
of speech, but have different restraints and balancing rules.
Author:
Holly K. Towle
Holly.towle@klgates.com
+1.206.370.8334
Anchorage Austin Beijing Berlin Boston Brisbane Brussels Charleston Charlotte Chicago Dallas Doha Dubai Fort Worth Frankfurt
Harrisburg Hong Kong Houston London Los Angeles Melbourne Miami Milan Moscow Newark New York Orange County Palo Alto Paris
Perth Pittsburgh Portland Raleigh Research Triangle Park San Francisco São Paulo Seattle Seoul Shanghai Singapore Spokane
Sydney Taipei Tokyo Warsaw Washington, D.C. Wilmington
K&L Gates comprises more than 2,000 lawyers globally who practice in fully integrated offices located on five
continents. The firm represents leading multinational corporations, growth and middle-market companies, capital
markets participants and entrepreneurs in every major industry group as well as public sector entities, educational
institutions, philanthropic organizations and individuals. For more information about K&L Gates or its locations,
practices and registrations, visit www.klgates.com.
This publication is for informational purposes and does not contain or convey legal advice. The information herein should not be used or relied upon in
regard to any particular facts or circumstances without first consulting a lawyer.
© 2015 K&L Gates LLP. All Rights Reserved.
i For additional columns on Big Data, see these articles in our IP Law Watch blog:
Data, Database, Metadata, Big Data, Personal Data, Data Mining…
http://www.iplawwatch.com/2014/10/data-database-metadata-big-data-personal-data-data-mining/
Big Data Round-Ups http://www.iplawwatch.com/2014/12/big-data-round-ups/
Big Data Speaks Loudly and Carries a Big Stick http://www.iplawwatch.com/2015/01/big-data-speaks-loudly-and-carries-a-big-stick/
IP Rights in Big Data http://www.iplawwatch.com/2015/01/ip-rights-in-big-data/
ii “Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues” (January 2016) at
https://www.ftc.gov/system/files/documents/reports/big-data-tool-inclusion-or-exclusion-understanding-issues/160106big-data-rpt.pdf
iii The FTC is not the only regulator of laws discussed in the Report. The Consumer Financial Protection Bureau (CFPB), for example, is an FCRA regulator and may or may not issue guidance on areas covered by the Report. See e.g., Chris Bruce, “Don’t Expect CFPB Fair Lending Guidance on Big Data,” Bloomberg BNA Banking Report, 106 BBR 74 (1/10/16).
iv Id. at v.
v Id.
vi Lewis v. Ohio Professional Electronic Network LLC, 190 F. Supp. 2d 1049 (S.D. Ohio 2002).
vii See e.g., U.S. v. Instant Checkmate, Inc., S.D. Cal., No. 3:14-cv-00675-H-JMA, 3/28/14, and U.S. v. InfoTrack Information Services, Inc., N.D. Ill., No. 1:14-cv-02054, 3/25/14. In each case, the FTC alleged that the companies, which sold public record information about consumers to users such as prospective employers and landlords, did not take reasonable steps to make sure that the information in their reports was accurate and that their customers had an FCRA permissible purpose to have them, and that each operated as a consumer reporting agency without complying with FCRA.
viii See e.g., Report at ii and 15-16 (emphasis added).
ix Id. at 15 (emphasis added).
x Id. at 18 (emphasis added).
xi Id. at 24.
xii Id. at 23.
xiii See id. at 14.
xiv Id. at 22-23.
xv See Sorrell v. IMS Health Inc., 131 S. Ct. 2653, 2657–65, 180 L. Ed. 2d 544, 549–57 (2011).
xvi See id., 180 L. Ed. 2d at 559.
xvii See e.g., Report at ii and 17 (re FCRA); iv and 23 (re FTC Act, “The inquiry will be fact-specific, and in every case, the test will be whether the company is offering or using big data analytics in a deceptive or unfair way”).
xviii See FTC v. Wyndham Worldwide Corp., 2015 U.S. App. LEXIS 14839 (3d Cir. 2015) (with respect to a data security enforcement action, Wyndham argued that Section 5 of the FTC Act does not authorize the FTC specifically to regulate data security and that the FTC proceeded without fair notice; in evaluating whether Wyndham had constitutionally required “fair notice” of what practices could violate the FTC Act, the court concluded that FTC guidance and allegations helped put companies on notice of what practices might be unfair under the FTC Act; the court did not conclude that the practices actually were unfair or that the expanding body of FTC cybersecurity guidance and allegations in private enforcement actions were entitled to judicial deference -- those issues remained to be determined, and they will not be determined as framed in the case because the FTC and Wyndham settled). See also In the Matter of LabMD Inc., 2015 FTC LEXIS 272 (11/13/15) (administrative law judge dismissal of FTC complaint for an unfair act against clinical testing laboratory LabMD for failing to provide “reasonable and appropriate” security for personal information maintained on its computer networks; the FTC’s evidence was too speculative to prove that the LabMD security caused, or was likely to cause, substantial consumer injury as required by the act governing the FTC’s powers regarding unfair acts).
xix Report at 10-11.