Digital Targeted Marketing: Fair Lending Clickbait?

By: Jason M. Cover

Federal and state statutes prohibiting credit discrimination, such as the Equal Credit Opportunity Act and the Fair Housing Act, generally date from the 1960s and 1970s. Given that most homes then lacked a personal computer, and that widespread adoption of the Internet and smartphones was still decades away, these laws understandably did not anticipate modern credit practices stemming from our digital world. This has created unanticipated pitfalls and challenges for unwary financial institutions.

These risks have recently been brought into sharp relief by a practice known as “digital targeted marketing.” At its core, digital targeted marketing is a form of marketing whereby advertisements are disseminated through a variety of online platforms such as web services, paid search, banners, and social media using sophisticated data analytics that effectively preselect a precise target audience. Preselection generally occurs through either “self-selecting” or “look alike” programs offered by online platforms such as Facebook. In self-selecting programs, the advertiser itself selects the criteria used to determine recipients based on an array of attributes and characteristics provided by the platform. In contrast, look alike programs are conducted by feeding the advertiser’s existing customer data through a black box of proprietary data analytics provided by the platform to identify recipients that “look like” the advertiser’s customer base. Because the troves of consumer data possessed by these platforms are so large that they are beyond the ability of individuals or even most software to analyze, companies increasingly use “machine learning”—a type of artificial intelligence that learns on its own as it analyzes huge amounts of data—to comb through this data to find predictive attributes while simultaneously improving its own analysis. Financial institutions increasingly use these tools because they allow advertisement placement not only with those most likely to be interested in a given financial product, but also with those most likely to qualify for it.
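To make the look alike mechanism concrete, the sketch below ranks hypothetical candidate users by their similarity to an advertiser’s existing customer base. This is a minimal illustration only—actual platform algorithms are proprietary and far more complex—and the attribute names and data are invented for the example. Note how an innocuous-seeming attribute such as neighborhood density can end up driving selection, which is precisely how close proxies for prohibited bases can creep in.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length attribute vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def look_alike_audience(customers, candidates, top_k):
    """Rank candidate users by similarity to the customer-base centroid.

    customers: list of attribute vectors for existing customers.
    candidates: dict mapping user id -> attribute vector.
    Returns the top_k candidate ids most similar to the centroid.
    """
    n = len(customers)
    dims = len(customers[0])
    centroid = [sum(c[i] for c in customers) / n for i in range(dims)]
    ranked = sorted(candidates.items(),
                    key=lambda kv: cosine(kv[1], centroid),
                    reverse=True)
    return [user_id for user_id, _ in ranked[:top_k]]

# Hypothetical attribute vectors:
# [income_band, pages_per_visit, neighborhood_density, age_band]
customers = [[3, 8, 0.9, 2], [4, 7, 0.8, 3], [3, 9, 0.9, 2]]
candidates = {"u1": [3, 8, 0.9, 2], "u2": [1, 2, 0.1, 5], "u3": [4, 6, 0.8, 3]}
audience = look_alike_audience(customers, candidates, top_k=2)
print(audience)  # users most similar to the existing customer base
```

In this toy example, the candidate whose attributes diverge most from the customer centroid is excluded from the audience—without anyone ever naming a prohibited basis.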

As with the use of any consumer attribute, fair lending concerns can arise when protected classes are excluded, whether intentionally or unintentionally, from digital targeted marketing offers due to the presence of one or more attributes that align with prohibited bases or their close proxies. A recent flurry of regulatory activity and private litigation aimed at Facebook highlights this risk and includes:

  • A 20-month investigation into Facebook’s digital targeted marketing practices by the Attorney General of Washington state. This resulted in a consent order in which Facebook agreed to cease providing advertisers with the option to (1) exclude ethnic groups from advertisements for insurance and public accommodations; or (2) otherwise utilize exclusionary advertising tools that allow advertisers with ties to employment, housing, credit, insurance and/or places of public accommodation to discriminate based on race, creed, color, national origin, veteran or military status, sexual orientation and disability status.[1]
  • A civil suit brought by the National Fair Housing Alliance, the Communications Workers of America and several other consumer groups alleging discriminatory practices in Facebook’s digital targeted marketing practices. This resulted in a settlement of $5 million and an agreement by Facebook to make changes to its look alike campaigns for housing, employment and credit-related advertisements (e.g., prohibiting attributes related to age, gender and zip codes).[2]
  • An ongoing Charge of Discrimination levied by the Department of Housing and Urban Development (“HUD”) alleging discriminatory housing practices in violation of the provisions of the Fair Housing Act that prohibit discrimination based on race, color, religion, sex, familial status, national origin or disability. Specifically, HUD alleges that Facebook (1) enabled advertisers of housing opportunities to target audiences using prohibited bases; and (2) used an ad-delivery algorithm that would independently discriminate based on prohibited bases even where advertisers did not use prohibited bases to target audiences. [3]
  • An ongoing civil suit in the Northern District of California alleging that Facebook’s digital targeted marketing practices violated the Fair Housing Act, the Equal Credit Opportunity Act, and California fair lending laws.[4]

While enforcement and litigation have primarily focused on Facebook and its practices to date,[5] the New York Department of Financial Services recently expressed interest in investigating financial institutions and “Facebook advertisers to examine…disturbing allegations [of discriminatory practices]…to take whatever measures necessary to make certain that all financial services providers are in compliance with New York’s stringent statutory and regulatory consumer protections.”[6] This sentiment was echoed in a recent article by the Associate Director and Counsel to the Federal Reserve Board’s Division of Consumer and Community Affairs, which highlights the fair lending risk digital targeted marketing poses to financial institutions (i.e., steering and redlining) and notes that the “growing prevalence of AI-based technologies and vast amounts of available consumer data raises the risk that technology could effectively turbocharge or automate bias.”[7] The commentators further note that it is “important to understand whether a platform employs algorithms — such as the ones HUD alleges in its charge against Facebook — that could result in advertisements being targeted based on prohibited characteristics or proxies for these characteristics, even if that is not what the lender intends.”

With this in mind, financial institutions must evaluate and mitigate not only the risks associated with their own digital targeted marketing activities, but also the activities of the platforms with which they associate. In doing so, they should consider taking the following actions:

  • Evaluating the importance of digital targeted marketing to the financial institution and its risk tolerance with respect to same.
  • Attempting to obtain as much information as possible about the possible presence of prohibited bases or close proxies in digital-marketing algorithms.
  • Requiring indemnification in digital targeted marketing agreements, especially where platforms use proprietary black box analytics.
  • Where available, using “special ad audience” programs intended for industries subject to anti-discrimination laws (e.g., housing, credit, employment, etc.).
  • Considering using self-selected attribute criteria that avoid prohibited bases or close proxies in lieu of a platform’s look alike program.
  • Analyzing and testing responses to digital marketing campaigns for potentially disparate outcomes.
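As a starting point for the last item, one common screening statistic—borrowed from employment law’s “four-fifths rule” and often used as a first-pass check in disparate impact analysis—compares each group’s response or selection rate to that of the most-favored group. The sketch below uses hypothetical campaign data; a ratio below 0.8 is a conventional red flag warranting deeper statistical review, not a legal conclusion.

```python
def selection_rate(shown, responded):
    """Share of users shown the ad who received/responded to the offer."""
    return responded / shown if shown else 0.0

def adverse_impact_ratios(group_stats):
    """Ratio of each group's selection rate to the highest group's rate.

    group_stats maps group label -> (users shown, users responding).
    Ratios below 0.8 are conventionally flagged for further review.
    """
    rates = {g: selection_rate(shown, resp)
             for g, (shown, resp) in group_stats.items()}
    best = max(rates.values())
    return {g: (r / best if best else 0.0) for g, r in rates.items()}

# Hypothetical campaign outcomes broken out by group
stats = {"group_a": (1000, 120), "group_b": (1000, 60)}
ratios = adverse_impact_ratios(stats)
flags = {g: r < 0.8 for g, r in ratios.items()}
print(ratios)  # group_b's rate is half of group_a's
print(flags)   # group_b is flagged for further review
```

A flagged ratio does not by itself establish discrimination, but it identifies campaigns whose targeting criteria or platform algorithms merit closer examination.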

Given their potential benefits, financial institutions are unlikely to cease digital targeted marketing activities. Prudent institutions, however, should engage in reasonable due diligence regarding the platforms they use—weighing the benefits against the risks of their use—while monitoring for future regulatory guidance and legal precedent.


  1. In re Facebook, Inc., No. 18-2-18287-5SEA (Consent Order) (Wash. Super. Ct., July 24, 2018).

  2. Nat Ives, Facebook Axes Age, Gender and Other Targeting for Some Sensitive Ads, THE WALL STREET JOURNAL (March 19, 2019).

  3. HUD v. Facebook, Inc., HUD ALJ No. 01-18-0323-8 (Charge of Discrimination) (U.S. Dept. of Housing and Urban Development Office of Administrative Law Judges, March 28, 2019).

  4. Opiotennione v. Facebook, Inc., Case No. 3:19-cv-07185 (JSC) (Complaint) (N.D. Cal., October 31, 2019).

  5. HUD is reportedly investigating Google and Twitter for similar violations. See Tracy Jan and Elizabeth Dwoskin, HUD Is Reviewing Twitter’s and Google’s Ad Practices as Part of Housing Discrimination Probe, THE WASHINGTON POST (March 28, 2019).

  6. Press Release, Governor Cuomo Calls on DFS to Investigate Claims That Advertisers Use Facebook Platform to Engage in Discrimination (July 1, 2019), available at https://www.dfs.ny.gov/reports_and_publications/press_releases/pr1907011. Emphasis added.

  7. Carol A. Evans & Westra Miller, Fed. Reserve Sys., From Catalogs to Clicks: The Fair Lending Implications of Targeted, Internet Marketing, CONSUMER COMPLIANCE OUTLOOK, Third Issue 2019, at 7, available at https://consumercomplianceoutlook.org/2019/.
