The Power of Place: Geolocation Tracking and Privacy

By: Paige M. Boshell

IN BRIEF

  • The protections provided by current and proposed privacy law are limited.
  • A robust secondary location-data market exists that is not currently addressed by U.S. law.
  • The risks posed by location tracking and profiling are sufficient to warrant consideration of regulatory intervention at various points.

Abstract[1]

Location data tracking is ubiquitous. The tension between privacy and innovation in this space is exacerbated by rapid developments in tracking technologies and data analytics methodologies, as well as the sheer volume of available consumer data. This article focuses on the privacy risks associated with these developments. To the extent that current and proposed privacy law protects location data, such protection is limited to location data that is identified (or in some cases identifiable) to an individual. Requirements generally apply only to the initial data collector; however, recent media accounts and enforcement actions describe a robust secondary market in which (1) identified location data is regularly acquired and used by third parties with whom the individual has no direct relationship, and (2) de-identified or anonymized location data is regularly combined with identified personal data and used by third parties with whom the individual has no direct relationship to compile comprehensive profiles of the individual. These secondary-market practices are not currently addressed by U.S. law. This article proposes that the risks posed by location tracking and profiling are sufficient to warrant consideration of regulatory intervention at the following points: collection from the individual; use by the original data collector; transfer to and among secondary-market participants; identification of anonymized data to a specific individual; profiling of the individual; and decision-making based on profiling.[2]

I. Location Data Tracking Generally

Consumer location is tracked regularly by multiple systems and devices.[3] Many mobile applications (apps) continuously track user location; Facebook, Google, Apple, Amazon, Microsoft, and Twitter all track and use location data.[4]

Individuals often opt into location tracking through personal devices and their apps, such as fitness monitors, smartphones, and GPS trackers, so that the app can provide them with the underlying service, such as determining the distance run, providing the local weather forecast, or locating and obtaining directions to nearby restaurants.

Business use cases for identified individual location data include providing consumer goods or services (such as roadside assistance) and marketing and targeted advertising.[5] Aggregated location data (i.e., data that is identifiable by distinct data location points but not by individual) can help urban planners alleviate traffic problems, health officials identify patterns of epidemics, and governmental agencies monitor air quality. Commercial uses of aggregated location data include inventory and fleet control, retail location planning, and geofencing. Specified data points may be aggregated over a defined time period and then presented as an overlay to a geographic map. For example, a trucking company can view in real time the locations of its trucks and the demand for trucking services to more efficiently assign routes. Alternatively, the trucking company can geofence its trucks, which means that if a truck goes out of a designated geographical zone, the company will be alerted in real time. Location data is critical to certain types of commercial and public data analytics.
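
The geofencing example above reduces, mechanically, to a point-in-region test run against each vehicle's latest position report. Below is a minimal sketch of that check; the zone center, radius, truck IDs, and coordinates are all hypothetical, and a production system would rely on a mapping or fleet-telematics service rather than hand-rolled geometry.

```python
import math

# Hypothetical geofence: a circular zone defined by a center point and radius.
ZONE_CENTER = (33.5186, -86.8104)   # illustrative depot coordinates (lat, lon)
ZONE_RADIUS_KM = 50.0

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def check_geofence(truck_id, position):
    """Alert in real time if a truck's reported position falls outside the zone."""
    if haversine_km(position, ZONE_CENTER) > ZONE_RADIUS_KM:
        print(f"ALERT: {truck_id} left the designated zone at {position}")
    else:
        print(f"{truck_id} is inside the zone")

# Two simulated position reports.
check_geofence("TRUCK-17", (34.7304, -86.5861))   # far away -> alert
check_geofence("TRUCK-08", (33.4735, -86.7999))   # nearby  -> inside
```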

Recent journalistic investigations have revealed that location data is tracked by a wider variety of parties for a greater number of purposes in ways that exceed our understanding or control. The sheer volume of location data tracked, disclosed, and repurposed is tremendous. The widespread availability of location tracking technologies compounds this issue.[6] Furthermore, the use of multiple systems to track location, and the use of data analytics to combine location data with other personal data, enables both the identification of anonymous data and the compilation of comprehensive and precise profiles of tracked individuals.

Are we at a point yet where place itself acts as a consumer identifier? Unique location-tracking patterns can be used both to identify an individual and to develop a profile of that individual. A person’s lifestyle, priorities, professional and personal endeavors, and crimes and peccadilloes can all be inferred from continuous location tracking.

The power of place: A person cannot be in more than one place at the same time.

     A. Justification for the Initial Collection of Location Data

Location data is regularly collected by devices, apps, and other online services.

Generally, the basic app model is as follows. An individual downloads a map app in order to get directions. As part of the map app download, the individual agrees that his or her location will be tracked in order to provide personalized directions via the app. The app must know where the individual’s starting point is in order to give accurate directions to the individual’s destination. The individual’s smart phone hardware and the app software use GPS and other tracking technologies to determine the individual’s geographical location: the more accurate and recent the location data, the more accurate the app service.[7]

The wireless carrier transmits this real-time location data to a third-party company (the aggregator), subject to a nondisclosure agreement. The aggregator transmits the location data to the app so that the app can generate the directions to provide to the individual. The location data is tracked and disclosed in order to provide the requested transaction (i.e., directions) to the individual. The sharing of information with third parties is limited to these purposes, and the parties are bound by written nondisclosure agreements not to otherwise use or disclose the individual’s location.

This can be referred to as the initial transaction between the individual and the data collector. The justification for this sharing is that (1) it is necessary to (a) honor the customer’s request for app services and (b) ensure consistency of app usage quality across carriers and devices, and (2) the customer has consented to location tracking as part of his or her enrollment in the app service.
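
For readers who think in terms of data flow, the hand-off described above (device to carrier to aggregator to app) can be sketched as a short chain of components. This is a deliberately simplified illustration with hypothetical class names; real carrier and aggregator systems are far more complex and contractually constrained.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class LocationFix:
    """One location reading tied to a subscriber's phone number."""
    phone_number: str
    latitude: float
    longitude: float
    timestamp: datetime

class MapApp:
    """The app uses the fix only to fulfill the user's request for directions."""
    def handle_location(self, fix: LocationFix):
        print(f"Routing directions from ({fix.latitude}, {fix.longitude})")

class Aggregator:
    """Intermediary: relays the fix to the app that requested it, under an NDA."""
    def __init__(self, app: MapApp):
        self.app = app

    def forward(self, fix: LocationFix):
        self.app.handle_location(fix)

class Carrier:
    """Wireless carrier: sees the device's location and passes it downstream."""
    def __init__(self, aggregator: Aggregator):
        self.aggregator = aggregator

    def report_location(self, fix: LocationFix):
        self.aggregator.forward(fix)

# One simulated hand-off for a single (hypothetical) directions request.
chain = Carrier(Aggregator(MapApp()))
chain.report_location(LocationFix("555-0100", 40.7484, -73.9857,
                                  datetime.now(timezone.utc)))
```

Each party in this chain holds the same identified, real-time location fix; the secondary-market concerns discussed below arise when any of them passes it further.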

II. The Pandora’s Box of Location Data: The Secondary Location-Data Market

     A. Monetization of Location Data in Secondary Market

The purpose of the initial collection of location data is to enable the data collector to provide a service to the individual; the secondary-market purpose is to use that same location data to draw conclusions and make predictions about the tracked individual. The secondary location-data market monetizes location data for unrelated purposes, such as enabling a subsequent buyer to compile a profile of the individual and sell access to the individual (whether the individual is identified by name or as part of a data category, like “engaged female retail shopper”). Location data analytics drive a variety of business strategies:

Business data usually contains geographical or location data which mostly goes unused. This data can be as broad as city and country or as specific as GPS location. When this data is placed within the context of big data dashboards and data science models, it allows companies to discover new trends and insights.[8]

The secondary consumer-data market is huge. IBM claims that 90 percent of all consumer data currently in circulation was created in the last two years. This industry is expected to generate $350 million annually by 2020.[9] Location data is a big part of that business. The New York Times reported that:

At least 75 companies receive anonymous, precise location data from apps whose users enable location services to get local news and weather or other information, The Times found. Several of those businesses claim to track up to 200 million mobile devices in the United States—about half those in use last year. The database reviewed by The Times—a sample of information gathered in 2017 and held by one company—reveals people’s travels in startling detail, accurate to within a few yards and in some cases updated more than 14,000 times a day.

These companies sell, use or analyze the data to cater to advertisers, retail outlets and even hedge funds seeking insights into consumer behavior. It’s a hot market, with sales of location-targeted advertising reaching an estimated $21 billion this year.[10]

Location tracking data analytics support targeted advertising and marketing for retail and other business purposes. This profiling is intended to individualize the customer experience as much as possible to encourage purchases and loyalty:

[T]he scale of data collected by early adopters of [location tracking] technology is staggering. Location analytics firm RetailNext currently tracks more than 500 million shoppers per year by collecting data from more than 65,000 sensors installed in thousands of retail stores. A single customer visit alone can result in over 10,000 unique data points, not including the data gathered at the point of sale.[11]

In addition, the potential for combining and reusing location data is tremendous:

[B]y combining location data with existing customer data such as preferences, past purchases, and online behavioral data, companies gain a more complete understanding of customer needs, wants and behaviors than is achievable with online data only.[12]

In 2014, Shoshana Zuboff coined the term “surveillance capitalism” to describe how consumer data has become a business unto itself.[13] More recently, Zuboff explained how location data fits in this model:

[There] has been a learning curve for surveillance capitalists, driven by competition over prediction products. First they learned that the more surplus the better the prediction, which led to economies of scale in supply efforts. Then they learned that the more varied the surplus the higher its predictive value. This new drive toward economies of scope sent them from the desktop to mobile, out into the world: your drive, run, shopping, search for a parking space, your blood and face, and always … location, location, location.[14]

Data is generally sold on the secondary market as identified data (which is directly associated with a distinct individual) or as de-identified or anonymous data (which is aggregated and not associated with a distinct individual).

     B. Disclosure of Identified Location Data

          1. Disclosures by Aggregators 

Under the app model, the aggregators receive the individual’s location in order to send it to the app owner for purposes of furnishing the app service. Distribution of this data is much more widespread. Journalistic investigations reveal that aggregators routinely sell location data to a series of parties that are not intermediaries to the initial data transaction, leading to dissemination of location data beyond its intended purpose, and resulting in unrelated third-party access to the individual’s location data.[15]

One such aggregator, LocationSmart, regularly sold continuous cell tower location tracking to Securus Technologies, a prison contractor that provides and monitors calls to inmates. As an ancillary service, Securus “offers [a] location-finding service as an additional feature for law enforcement and corrections officials, [as] part of an effort to entice customers in a lucrative but competitive industry.” This service was used by a variety of law enforcement officials for a wide variety of purposes, including search-and-rescue operations, thwarting prison escapes and smuggling rings, and closing cases.[16]

The relationship between Securus and LocationSmart impacted almost all U.S. cell phone users, was unknown to them, and could not be opted out of:

So how was Securus getting all that data on the locations of mobile-phone users across the country? We learned more last week, when ZDNet confirmed that one key intermediary was a firm called LocationSmart. The big U.S. wireless carriers—AT&T, Verizon, Sprint, and T-Mobile—were all working with LocationSmart, sending their users’ location data to the firm so that it could triangulate their whereabouts more precisely using multiple providers’ cell towers. It seems no one can opt out of this form of tracking, because the carriers rely on it to provide their service.[17]

Another Motherboard investigation showed that wireless carriers also routinely sell assisted or augmented global positioning system (aGPS) location data. aGPS data is more precise location data that is collected for use with enhanced 9-1-1 services to allow first responders to pinpoint an individual’s location with greater accuracy. For example, a cellular call made to the 9-1-1 emergency service that relies solely on GPS satellites might indicate the caller’s location only within a given area, such as a building, and it might take several minutes to determine that location. aGPS relies on additional external data and systems to provide a faster, more precise location, such as a floor within a building.

Federal law expressly prohibits the sale of aGPS data.[18] The Federal Communications Commission issued an order in 2017 providing that data included in the National Emergency Address Database, which is collected using Wi-Fi and Bluetooth to locate 9-1-1 callers within a building, may not be used for any other purpose.[19] In addition, the Federal Trade Commission could enforce section 5 of the Federal Trade Commission Act prohibiting deceptive and unfair trade practices against carriers whose privacy policies were inconsistent with this practice.[20]

          2. Privacy Leaks and Security Breaches

In addition to intentional disclosures, LocationSmart exposed this real-time location data through a bug in its website, which enabled users to track anyone without credentials or authorization using a free demo and a single cell phone number:

Anyone with a modicum of knowledge about how Web sites work could abuse the LocationSmart demo site to figure out how to conduct mobile number location lookups at will, all without ever having to supply a password or other credentials.

“I stumbled upon this almost by accident, and it wasn’t terribly hard to do,” Xiao [a security researcher] said. “This is something anyone could discover with minimal effort. And the gist of it is I can track most peoples’ cell phone without their consent.”

Xiao said his tests showed he could reliably query LocationSmart’s service to ping the cell phone tower closest to a subscriber’s mobile device. Xiao said he checked the mobile number of a friend several times over a few minutes while that friend was moving and found he was then able to plug the coordinates into Google Maps and track the friend’s directional movement.[21]

Further, the Securus database was the subject of a data hack that separately exposed personal data. A Motherboard reporter obtained data that had been hacked from Securus’s database:

“Location aggregators are—from the point of view of adversarial intelligence agencies—one of the juiciest hacking targets imaginable,” Thomas Rid, a professor of strategic studies at Johns Hopkins University, told Motherboard in an online chat.

The data hack, which was attributed to a weak password reset feature, revealed personal data of thousands of law enforcement users and inmates.[22]

This means that Securus, acting as an unregulated entity and outside of the scope of its nondisclosure agreements with the wireless carriers, was responsible for innumerable disclosures of identified location data.

Other privacy failures involving identified location data can expose individuals to threats of physical danger. A recent privacy failure by a family-tracking app (React Apps’ “Family Locator”) exposed children’s identified location data for weeks; the very app that parents obtained to protect their children arguably put them at great risk:

Family tracking apps can be very helpful if you’re worried about your kids or spouse, but they can be nightmarish if that data falls into the wrong hands. Security researcher Sanyam Jain has revealed to TechCrunch that React Apps’ Family Locator left real-time location data (plus other sensitive personal info) for over 238,000 people exposed for weeks in an insecure database. It showed positions within a few feet, and even showed the names for the geofenced areas used to provide alerts. You could tell if parents left home or a child arrived at school, for instance.[23]

          3. Access by Unauthorized Third Parties

The same Motherboard reporter was able to identify the exact location of a smartphone using only the phone number and a $300 payment to a bounty hunter in an attenuated process that apparently happens regularly and in violation of the apps’ posted privacy policies and the parties’ written nondisclosure agreements.[24] In the Motherboard scenario, a wireless carrier sold an individual’s location data to an aggregator, that sold it to a skip-tracing firm, that sold it to a bail-bond company, that sold it to an independent bounty hunter. The bounty hunter had no written agreement with anyone and no relationship with the wireless carrier or the individual customer, and neither did its source.[25]

The article’s aftermath included revelations that all of the major wireless carriers sold location data to aggregators that ultimately sold the data to hundreds of bounty hunters.[26] Multiple lawmakers sent the major carriers and aggregators letters requesting an explanation of these location data sharing practices.[27]

The ensuing furor prompted the wireless carriers to commit to stop selling location data to aggregators.[28] The Wall Street Journal reported that Verizon, Sprint, T-Mobile, and AT&T all committed to end agreements with downstream location aggregators, and Zumigo (the initial aggregator in the bounty hunter scandal) cut off access by the intermediary aggregator to whom it sold the location data.[29]

          4. Privacy and Security Risks

These investigations indicate that real-time location data that is identified to a particular individual is regularly monetized and sold to third parties in a manner that is arguably inconsistent with the individual’s consent, the apps’ stated privacy policies, the data collector’s third-party nondisclosure agreements, and applicable law.

In other words, location data identified to a specified individual is routinely collected and sold by a variety of parties for a variety of purposes unrelated to the original transaction that justified the initial location data collection. This results in a myriad of privacy and security risks to the individual. Consider a stalker who tracks his or her victim’s location either by signing up for a free Securus or similar trial or by paying a bounty hunter. The victim may be taking strict precautions to elude location tracking and would not even be aware of this risk. In addition, the more entities that possess the victim’s location data, the greater the likelihood of a privacy exposure or data breach.

     C. Sales of De-identified or Anonymous Location Data

          1. Sales by App Owners

Separately, apps that receive individual user location data from aggregators frequently sell location data to third-party buyers for their own commercial purposes. The data is provided in large sets that do not identify the specific individuals who are tracked.[30] The purpose of the data set is to enable the buyer to identify patterns in location data. Such business use cases may involve allowing buyers to spot trends for investment[31] or marketing purposes.[32]

In this context, the justification for the sale and reuse is that the individual’s personally identifiable information (like phone number or name) is deleted from the data and replaced instead with a unique identifier.

The model is basically as follows. A map app organizes location data for a specified commercial neighborhood over a defined time period to show the number of people who walk through the neighborhood during the time period. This foot traffic may show times of day when foot traffic is greatest and areas in the neighborhood that may attract more or less foot traffic. This data may be sold to a retailer for purposes of deciding whether the neighborhood, or any particular part of it, would be suitable for establishing a brick-and-mortar location. The retailer purchases the data for research and investment purposes. Its interest is in the number and patterns of individuals who walk through the neighborhood.

For these purposes, the identity of the individual is not relevant to the data buyer and is not included in the data set. It is the traffic patterns or trends and not the individual’s identity that gives this data set value.
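
As a concrete illustration, the retailer’s analysis in this model needs only counts, not identities. The following is a minimal sketch of such an aggregation, assuming anonymized records of (device identifier, timestamp, block); all identifiers, times, and block names are hypothetical.

```python
from collections import Counter
from datetime import datetime

# Hypothetical anonymized records: (unique_device_id, timestamp, block)
records = [
    ("dev-a91f", datetime(2019, 3, 4, 8, 15), "Main St & 1st Ave"),
    ("dev-7c02", datetime(2019, 3, 4, 8, 40), "Main St & 1st Ave"),
    ("dev-a91f", datetime(2019, 3, 4, 17, 5), "Main St & 2nd Ave"),
    ("dev-03be", datetime(2019, 3, 4, 17, 20), "Main St & 2nd Ave"),
]

# Foot traffic by hour of day and by block: no names or phone numbers involved.
by_hour = Counter(ts.hour for _, ts, _ in records)
by_block = Counter(block for _, _, block in records)

print("Visits by hour:", dict(by_hour))    # {8: 2, 17: 2}
print("Visits by block:", dict(by_block))  # {'Main St & 1st Ave': 2, 'Main St & 2nd Ave': 2}
```

Note, however, that the persistent device identifier (dev-a91f appears twice) is exactly what makes the re-identification discussed in the next subsection possible.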

         2. Re-identification by Unknown Third Parties

Data sets may be used to identify the individual through other means, however.

In order to verify the authenticity of the data points that comprise the data set and to allow the app/seller to track an individual’s unique location data, each individual is assigned a unique identifier, and that identifier can remain the same over time and across data sets. Presumably, then, buyers could use the unique identifier to follow a single device over time and combine that history with other data to identify the individual subject.

Separately, using data analytics, location data can be combined with nonlocation data to ascertain an individual’s identity. For example, the retailer that buys the anonymous data set could note that a single data point, i.e., one individual, travels back and forth to a nearby residential address throughout the day. Matching the data point to that address enables identification of the individual.
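
The inference described in this example can be automated. A minimal sketch, under the assumption that the buyer holds a timestamped trace for one unique identifier: the most frequent nighttime location is a strong candidate for the home address, which can then be matched to a name through property or other public records. All values are hypothetical.

```python
from collections import Counter

# Hypothetical anonymized trace for one unique identifier: (hour_of_day, (lat, lon))
trace = [
    (1, (33.5021, -86.8072)), (2, (33.5021, -86.8072)), (3, (33.5021, -86.8072)),
    (9, (33.5186, -86.8104)), (13, (33.5186, -86.8104)), (23, (33.5021, -86.8072)),
]

def likely_home(points, night_hours=frozenset(range(22, 24)) | frozenset(range(0, 6))):
    """The most frequent location during nighttime hours is a strong home candidate."""
    nights = [loc for hour, loc in points if hour in night_hours]
    return Counter(nights).most_common(1)[0][0] if nights else None

print("Likely home coordinates:", likely_home(trace))   # (33.5021, -86.8072)
# Matching these coordinates to a street address, and the address to a name via
# public records, re-identifies the "anonymous" device.
```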

A more sensational example of this is the use by law enforcement of DNA information combined with location data to identify suspects in cold cases.[33]

The New York Times, with permission from a schoolteacher, was able to accurately associate anonymous location data with the individual teacher by reviewing four months of location data drawn from more than a million phones and combining it with knowledge of where she worked and lived.[34] The report posits that:

[t]hose with access to the raw [anonymized] data—including employees or clients—could still identify a person without consent. They could follow someone they knew, by pinpointing a phone that regularly spent time at that person’s home address. Or, working in reverse, they could attach a name to an anonymous dot, by seeing where the device spent nights and using public records to figure out who lived there.[35]

In fact, location data alone may be used to identify consumers in large anonymized data sets.

In 2013, MIT and Belgian researchers “analyzed data on 1.5 million cellphone users in a small European country over a span of 15 months and found that just four points of reference, with fairly low spatial and temporal resolution, was enough to uniquely identify 95 percent of them.”[36]

As technology has evolved and the use and dissemination of location data has proliferated, reidentification of individuals included in anonymized data sets has been greatly facilitated:

With an increasing number of service providers nowadays routinely collecting location traces of their users on unprecedented scales, there is a pronounced interest in the possibility of matching records and datasets based on spatial trajectories. Extending previous work on reidentifiability of spatial data and trajectory matching, we present the first large-scale analysis of user matchability in real mobility datasets on realistic scales, i.e. among two datasets that consist of several million people’s mobility traces, coming from a mobile network operator and transportation smart card usage. . . .We show that for individuals with typical activity in the transportation system (those making 3-4 trips per day on average), a matching algorithm based on the co-occurrence of their activities is expected to achieve a 16.8% success only after a one-week long observation of their mobility traces, and over 55% after four weeks. We show that the main determinant of matchability is the expected number of co-occurring records in the two datasets. Finally, we discuss different scenarios in terms of data collection frequency and give estimates of matchability over time. We show that with higher frequency data collection becoming more common, we can expect much higher success rates in even shorter intervals.[37]
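
The matching technique the study describes can be illustrated with a toy version: for each pair of identifiers across the two datasets, count spatio-temporal co-occurrences (observations in the same place during the same time window) and pair the identifiers with the highest counts. The sketch below uses tiny hypothetical traces and a deliberately coarse notion of “same place and time”; the published work operates on millions of traces at far finer resolution.

```python
from collections import defaultdict

# Hypothetical traces keyed by dataset-specific IDs: lists of (hour, cell) observations.
mobile_traces = {"M1": [(8, "cellA"), (9, "cellB"), (18, "cellC")],
                 "M2": [(8, "cellD"), (12, "cellE"), (19, "cellA")]}
transit_traces = {"T7": [(8, "cellA"), (18, "cellC")],
                  "T9": [(19, "cellA"), (12, "cellE")]}

def best_matches(dataset_a, dataset_b):
    """Pair each ID in dataset_a with the ID in dataset_b sharing the most co-occurrences."""
    scores = defaultdict(int)
    for id_a, obs_a in dataset_a.items():
        seen_a = set(obs_a)
        for id_b, obs_b in dataset_b.items():
            scores[(id_a, id_b)] = len(seen_a & set(obs_b))
    return {id_a: max(dataset_b, key=lambda id_b: scores[(id_a, id_b)])
            for id_a in dataset_a}

print(best_matches(mobile_traces, transit_traces))
# {'M1': 'T7', 'M2': 'T9'} -- two "anonymous" records linked across datasets.
```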

          3. Privacy and Security Risks

As tracking technologies become further developed and more widely accessible and data analytics become more sophisticated, anonymous data points (particularly when tracked over time) can be used to facilitate identification of the individual.

Consider the private investigation of various retail robberies.[38] If the retailer did not have a suspect’s name, its private investigator could identify possible suspects by:

  1. purchasing from an aggregator anonymized cell phone location data for all individuals near each robbed location during the time of each robbery;
  2. pinpointing unique IDs or data points for all phones present at some or all of the robberies (see the sketch after this list);
  3. requesting extended cell phone location data for the unique IDs or data points from the wireless carriers;
  4. purchasing larger pools of anonymized data from an aggregator and reidentifying data points within a given area and timeframe; or
  5. hiring a bounty hunter to track the numbers and locations of the phones tied to the unique IDs or data points.[39]
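
Step 2 of the list above is, computationally, a set intersection over anonymized device identifiers. The sketch below uses hypothetical device IDs to show how quickly the candidate pool shrinks once a device appears near more than one robbery, which is precisely why “anonymous” data offers little protection here.

```python
from collections import Counter

# Hypothetical sets of unique device IDs observed near each robbed store
# during the relevant time window (purchased as "anonymous" data).
robbery_1 = {"dev-19af", "dev-2277", "dev-9b3c", "dev-4410"}
robbery_2 = {"dev-2277", "dev-8d01", "dev-9b3c"}
robbery_3 = {"dev-2277", "dev-51e6"}

# Devices present at every scene are the strongest leads.
present_at_all = robbery_1 & robbery_2 & robbery_3

# Devices present at two or more scenes are secondary leads.
counts = Counter(dev for scene in (robbery_1, robbery_2, robbery_3) for dev in scene)
present_at_two_or_more = {dev for dev, n in counts.items() if n >= 2}

print("At all scenes:", present_at_all)            # {'dev-2277'}
print("At two or more:", present_at_two_or_more)   # {'dev-2277', 'dev-9b3c'}
```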

The City of Los Angeles passed rules requiring scooter companies to provide the per-trip location data of each scooter to city officials within 24 hours of the end of the trip. Although the rider’s identity is not disclosed to the city and the location data will be treated as confidential by the city, it will be accessible in aggregated form to various city agencies and accessible in per-trip form to law enforcement, subject to a warrant, and to third parties, in response to a subpoena. Given the sensitivity of location data and the ability to use location data itself to identify individuals, consumer advocates have framed this not as a matter between the scooter companies and the city but as a matter of governmental surveillance and a debate between individual citizens and the city:

“This data is incredibly, incredibly sensitive,” said Jeremy Gillula, the technology projects director for the Electronic Frontier Foundation, a San Francisco-based digital rights group.

The vast trove of information could reveal many personal details of regular riders — such as whom they’re dating and where they worship — and could be misused if it fell into the wrong hands, the nonprofit Center for Democracy and Technology told the city in a letter.[i]

De-identified, real-time location data is regularly monetized and sold to third parties for a variety of purposes unrelated to the original transaction that justified the initial location data collection. Location tracking use cases include the following scenarios:

  1. location data point identified to a specific individual;[40]
  2. location data point identifiable to a specific individual;
  3. location data point not identified to the individual;
  4. continuous location tracking identified to a specific individual;
  5. continuous location tracking identifiable to a specific individual;
  6. continuous location tracking not identified to the individual;
  7. development of a profile based on location tracking identified to a specific individual;
  8. development of a profile based on location tracking that is identifiable to a specific individual; and
  9. location data used to compile a profile of an unidentified individual.

As described above, the distinctions among these categories become less relevant in practice, and the risks posed by transfers of anonymized location data may be as great as those posed by sales of identified location data.

III. Location Tracking: Profiling the Individual

Precise tracking of an individual’s location over time can be used to discover information about the individual that may not be otherwise available (consider repeat trips to a bar, to the home of a person who is not the individual’s spouse, or to an oncologist), which, when combined with other data, can be used to develop a fairly comprehensive profile of the individual. Even anonymized data profiles can pose these risks to the individual due to the relative ease of reidentifying an individual, as described above.

     A. Data Profiling and Decision-Making

Profiling is done for a variety of purposes; targeted advertising and marketing are the most well-known. For example, if an Apple customer is in geographical proximity to an Apple Store, his or her phone could display ads for Apple TV. These ads may be more successful if the individual were located in a TV store near an Apple Store, or better yet, if the individual were located for several minutes in an Apple Store near the Apple TV demo.

Individual data profiling has become sophisticated and comprehensive, and location data is an integral part of profiling:

A profile is a combination of metrics, key performance indicators, scores, business rules, and analytic insights that combine to make up the tendencies, behaviors, and propensities of an individual entity (customer, device, partner, machine). The profile could include:

  • Key demographic data such as age, gender, education level, home location, marital status, income level, wealth level, make and model of car, age of car, age of children, gender of children, and other data. For a machine, it might include model type, physical location, manufacturer, manufacturer location, purchase date, last maintenance date, technician who performed the last maintenance, etc.
  • Key transactional metrics such as number of purchases, purchase amounts, returns, frequency of visits, recency of visits, payments, claims, calls, social posts, etc. For a machine, that might include miles and/or hours of usage, most recent usage time and date, type of usage, usage load, who operated the product, route of product usage (for something like a truck, car, airplane, or train)
  • Scores (combinations of multiple metrics) that measure customer satisfaction level, financial risk tolerance, retirement readiness, FICO, advocacy grade, likelihood to recommend (LTR), and other data. For a machine, that might include performance scores, reliability scores, availability scores, capacity utilization scores, and optimal performance ranges, among other things
  • Business rules inferred using association analysis; for example, if CUST_101 visits a certain Starbucks and a certain Walgreens, we can predict (with 90% confidence level) that there is an 85% likelihood that this customer will visit a certain Chipotle within 60 minutes
  • Group or network relationships (number, strength, direction, sequencing, and clustering of relationships) that capture interests, passions, associations and affiliations gained from using graphic analysis
  • Coefficients that predict certain outcomes or responses based upon certain independent variables found through regression analysis; for example, a machine’s likelihood to break down given a number of interrelated variables such as usage loads since last maintenance, the technician who performed the maintenance, the machine manufacturer, temperatures, humidity, elevation, traffic, idle time, etc.)
  • Behavioral groupings of like or similar machines or people based upon usage transactions (purchases, returns, payments, web clicks, call detail records, credit card payments, claims, etc.) using clustering, K-nearest neighbor (KNN), and segmentation analysis[41]
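
The business-rule bullet in the quoted list above (the Starbucks/Walgreens/Chipotle example) is the output of association analysis over visit logs, which themselves come from location tracking. The following toy sketch shows how the “85% likelihood” figure in such a rule would be computed as rule confidence; the outing data is hypothetical and far smaller than anything used in practice.

```python
# Hypothetical visit logs: for each outing, the set of places a customer visited.
outings = [
    {"Starbucks#12", "Walgreens#3", "Chipotle#7"},
    {"Starbucks#12", "Walgreens#3", "Chipotle#7"},
    {"Starbucks#12", "Walgreens#3"},
    {"Starbucks#12", "Chipotle#7"},
]

antecedent = {"Starbucks#12", "Walgreens#3"}   # "visits this Starbucks and this Walgreens"
consequent = "Chipotle#7"                      # "will visit this Chipotle"

# Confidence of the rule antecedent -> consequent: among outings containing the
# antecedent, the fraction that also contain the consequent.
matching = [o for o in outings if antecedent <= o]
confidence = sum(consequent in o for o in matching) / len(matching)

print(f"Rule confidence: {confidence:.0%}")   # 67% on this toy data
```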

Location data analytics are used to make a variety of decisions that may impact the individual. One use case for data profiling is credit-risk analysis. Such data profiles may arguably be considered “consumer reports” governed by the federal Fair Credit Reporting Act (FCRA). As the lines have blurred between online decision-making and targeted advertising, and between prescreening and marketing (the former activity in each pair is governed by FCRA; the latter is not), it certainly appears as if credit availability depends, in part, on secondary-market data that the consumer reporting agencies do not treat as “consumer reports” under FCRA.[42]

Payment-card fraud management can also be enhanced by developing profiles of each cardholder. By combining device location data with transaction histories, fraud detection is more precise:

New technologies . . . merg[e] a broader range of financial data, mobile-phone data, and even social-networking data to better establish the likelihood it’s actually you behind the transactions racking up on your cards or mobile device. Nguyen says that Feedzai’s system can improve fraud detection rates from 47 percent to almost 80 percent. Chirag Bakshi, founder and CEO of Zumigo, a company in San Jose, California, that provides location-based mobile services, says his company’s data algorithms reduce fraud losses by at least 50 percent.

“When fraudsters steal your identity, what they can’t do is steal your behavior,” Nguyen says. That, in fact, has long been the principle behind credit card fraud alerts. But a conventional credit card company is relying on information from your past to guess whether each attempted transaction is genuine. Today’s new technologies tap into your mobile phone and its more up-to-date information to see if your current behavior matches your purchase.

“[We can use] a SIM card as a proxy for a person,” says Rodger Desai, CEO of Payfone, which works with banks, mobile operators, and fraud detection companies to assess the legitimacy of a given payment. Payfone builds a profile of a user and tracks more than 400 types of data to create what it calls a persistent identity. Change phone company? Noted. Someone steal your phone or clone it? The company will catch that, too. Even if you’ve canceled your cellular data plan, it has ways of flagging the activity of someone who then tries to use the phone’s Wi-Fi connection.[43]
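
The quoted passage describes these systems only at a high level. One concrete (and deliberately simplified) way device location can feed a fraud decision is a consistency check between the phone’s last known location and the merchant’s location at the time of a card-present transaction. The sketch below assumes that framing; the card IDs, cities, and decision rule are hypothetical, and commercial systems such as those mentioned above weigh hundreds of additional signals.

```python
# Hypothetical profile data: the device's last known city, derived from carrier or app data.
last_known_device_location = {"card-1234": "Chicago, IL"}

def score_transaction(card_id, merchant_city, amount):
    """Coarse location check: flag card-present spending far from the cardholder's phone."""
    device_city = last_known_device_location.get(card_id)
    if device_city is not None and device_city != merchant_city:
        return {"decision": "review",
                "reason": f"device in {device_city}, card used in {merchant_city}",
                "amount": amount}
    return {"decision": "allow",
            "reason": "device and merchant locations consistent",
            "amount": amount}

print(score_transaction("card-1234", "Miami, FL", 899.00))   # flagged for review
print(score_transaction("card-1234", "Chicago, IL", 12.50))  # allowed
```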

Data analytics for decreasing fraud are likely welcome to the individual. Once a “persistent identity” is created by profiling the individual’s location and related data, however, there are few limits on how that profile may be used or sold:

Mobile location data firms interviewed for this story stressed their dedication to encrypting data to prevent direct connections to individuals, yet there are no industry-wide accepted practices or U.S. government regulations preventing the use of such data in ways that weren’t originally intended. For instance, data reflecting drinking or drug use arguably could find its way into data models for targeting ads for health insurance plans, or even find its way into formulas used to calculate health or auto insurance rates or job eligibility.[44]

     B. Behavioral Influencing

Use of predictive modelling has been extended to influence behavior:

It works like this: Ads press teenagers on Friday nights to buy pimple cream, triggered by predictive analyses that show their social anxieties peaking as the weekend approaches. “Pokémon Go” players are herded to nearby bars, fast-food joints and shops that pay to play in its prediction markets, where “footfall” is the real-life equivalent of online clicks.[45]

The intrusiveness of such profiles cannot be overstated. Facebook has shown advertisers:

how it has the capacity to identify when teenagers feel “insecure”, “worthless” and “need a confidence boost”, according to a leaked document based on research quietly conducted by the social network[, which] states that the company can monitor posts and photos in real time to determine when young people feel “stressed”, “defeated”, “overwhelmed”, “anxious”, “nervous”, “stupid”, “silly”, “useless” and a “failure.”[46]

Location data is key to this type of influencing:[47]

The next step—”from flat, to fast, to deep, is psychic,” Friedman believes. “I now know your whole psychographic from your phone. I will just push you your groceries, push you the supplies you need, push you the information you need.”

The use of profiles for behavioral targeting is likely as limitless as the use of profiles for predicting behavior:

Imagine a not-so-distant future where you’re just driving on the highway. Your car is sending real-time data about your performance behind the wheel to your insurance company. And in return, the insurance company is sending real-time driver behavior modification punishments back—real-time rate hikes, curfews, even engine lockdowns. Or, if you behave in the way they like, you get an instant rate discount.

In other words, the insurance company is shaping your behavior right then and there. Would you like that? What does it mean for our entire understanding of free will?[48]

Once the individual’s “persistent identity” is created, however, its uses are not limited. Consider another Facebook scandal: Cambridge Analytica. Cambridge Analytica combined personal user data obtained from a Facebook app developer (in violation of its nondisclosure agreement) with data from other sources, including location data, to compile profiles of voters around the world for the purpose of influencing elections through propaganda and direct marketing.[49] Up to 87 million Facebook users worldwide were profiled with the intent of waging “psychological warfare” against and targeting “influence operations” to these users.[50] Cambridge Analytica’s parent company’s reach exceeded “100 election campaigns in over 30 countries spanning five continents.”[51] Cambridge Analytica was a secondary-market user of the location data collected from Facebook profiles and from external sources. The Facebook users had no idea that the voter profiles were being compiled or that their location data was being used to identify them for specific political campaigns for the purposes of influencing their votes.

     C. Privacy and Security Risks

Use of location tracking data to create individual profiles is not addressed under current law and poses unique risks. The eventual data buyer that compiles the data profile or identifies an individual in relation to a profile may not be in privity with either the individual or the original data collector. Further, once a profile is created for a specific purpose, there are few limits on using the profile for other purposes:

The fact is that location data is flowing around the digital ecosystem with little control. Many of the firms that have built businesses on using mobile location data for ad targeting gather the data from ad calls made by programmatic ad systems. And audience segments like “frequent quick serve restaurant visitors” could be accessed for ad targeting as easily as they could be excluded from targeting parameters for health insurance ads, for instance.  “Even though data is used just for marketing, there’s no reason to think it will only be used just for that purpose,” said Dixon. “Those formulas—they are data hungry,” she said of data models used by insurance firms or other corporations.[52]

At this point, the uses and distribution of individual profiles based on location data appear limitless, even though the individual has no control over or knowledge of them and may not opt out of data profiling or access or correct data profiles. Moreover, use of such profiles has become increasingly intrusive as secondary-market participants seek to monetize their value.

IV. United States Law and Location Tracking

Federal law does not directly regulate location tracking or the collection, sale, or use of personal location data.[53] Location tracking has, however, been the focus of recent significant actions.

     A. FTC Enforcement Actions and Issuances

The Federal Trade Commission (the FTC) has focused on location tracking for several years through reports and a series of enforcement actions under section 5 of the Federal Trade Commission Act regarding unfair and deceptive trade practices (UDAP)[54] and the Children’s Online Privacy Protection Act (COPPA).[55]

          1. Sensitivity of Location Data

In its 2013 FTC Staff Report, Mobile Privacy Disclosures: Building Trust Through Transparency, the FTC stated that location tracking should be preceded by just‐in‐time disclosures made to the individual and subject to the individual’s affirmative express consent. The disclosures should clearly explain how the location tracking is conducted (i.e., one‐time versus persistent collection practices) and for which purposes.[56] In its 2012 Privacy Report, the FTC asserted that the precise location data of an individual should be considered “sensitive” information (similar to children’s data, health, and financial information) and should not be stored beyond the time period necessary for providing the service to the individual that justified the location tracking in the first place. The FTC clarified that affirmative express consent is generally required for location data collection, except when appropriate in context (e.g., when the individual searches for nearby weather or locations).[57]

          2. Misuse by Data Collectors

Uber uses real-time location tracking to locate drivers and riders and to schedule and administer rides. In 2017, the FTC pursued a UDAP enforcement action against Uber for collecting location data even when the app was not in use and for using such data for purposes other than administering rides:

The FTC entered into a consent order with Uber Technologies, Inc. regarding its use of the so-called “God View” feature of the Uber application software (“Uber App”), which implemented continuous geolocation tracking of all users (drivers and riders) at all times and allowed employee access to such tracking information, regardless of whether or not the users were actively using the Uber App or the Uber ride service.[58] The FTC complaint alleged the following unfair and deceptive trade acts and practices: Uber employees improperly accessed the user geolocation information for purposes other than picking up riders, including allegations that employees accessed the geolocation information of certain riders who were journalists critical of Uber’s business practices for the purposes of conducting “opposition research” on such journalists.[59] Uber subsequently publicized action taken to limit and monitor employee access to such geolocation information but such limits and monitoring were ultimately abandoned by Uber.[60][61]

The name “God View” is apt; real-time location tracking compiles a precise and continuous location record of the individual’s whereabouts indefinitely. (Facebook’s BOLO list (“be on lookout”) recently came under scrutiny. Like God View, BOLO uses Facebook app and website activity to monitor the real-time location of users Facebook has determined pose a credible threat to the company or its officers.[62])

          3. Access to and Use of Location Data by Third Parties

The FTC entered into similar consent orders with other companies that collected personal location data:

  1. A cell phone provider whose Chinese security vendor installed firmware on the phones that collected user personal data, including cell phone tower location data (UDAP).[63]
  2. A marketing enterprise platform service provider whose targeted advertising software tracked app user location data (including that of children) and combined such data with aggregated wireless network data to identify an individual user’s precise location for purposes of ad targeting (UDAP and COPPA).[64]

The FTC described the variety of systems and methodologies used to develop the marketing platform at issue in the second action above. The InMobi SDK platform allowed app developers to integrate their apps in the platform for purposes of monetizing their users’ location data by allowing third-party advertisers to target ads to the app users:[65]

So how did InMobi circumvent these protections to track the consumer’s location without consent? By creating its own geocoder database. As explained in more detail in the complaint, InMobi collected information through consumers’ devices that allowed it to map out the real-world latitude and longitude coordinates of Wi-Fi networks. InMobi then monitored the Wi-Fi networks that a consumer’s device connected to (on both Android and iOS), and in many instances, the Wi-Fi networks that a consumer’s device was in-range of (on Android). By collecting the BSSID (i.e., a unique identifier) of the Wi-Fi networks that a consumer’s device connected to or was in-range of, and feeding this information into its geocoder database, InMobi could then infer the consumer’s location. Until December 2015, InMobi used this method to track the consumer’s location even if the application that had integrated the InMobi SDK had never asked the consumer for permission to access location, and even if the consumer had turned off all location services on the device.
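
Mechanically, the practice the FTC describes amounts to building a lookup table from Wi-Fi network identifiers (BSSIDs) to coordinates, using devices that did share GPS, and then consulting that table for devices that did not. The following is a minimal sketch of that inference under those assumptions; the BSSIDs and coordinates are hypothetical, and this is not a reconstruction of InMobi’s actual code.

```python
from statistics import mean

# Geocoder database built from devices that DID report GPS alongside visible Wi-Fi
# networks: BSSID -> (latitude, longitude). All values here are hypothetical.
bssid_geocoder = {
    "a4:2b:8c:11:22:33": (37.7793, -122.4193),
    "f0:9f:c2:44:55:66": (37.7795, -122.4188),
}

def infer_location(visible_bssids):
    """Infer a device's location from nearby Wi-Fi BSSIDs alone -- no GPS permission needed."""
    known = [bssid_geocoder[b] for b in visible_bssids if b in bssid_geocoder]
    if not known:
        return None
    return (mean(lat for lat, _ in known), mean(lon for _, lon in known))

# A device with location services turned off, reporting only the networks it can see:
print(infer_location(["a4:2b:8c:11:22:33", "f0:9f:c2:44:55:66", "de:ad:be:ef:00:01"]))
# -> approximately (37.7794, -122.41905)
```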

          4. Notice-and-Choice Model

In all of these enforcement actions, the UDAP claims against the data collector were based on:

  1. the failure to give clear disclosures to the individual;
  2. the failure to obtain valid consent or failure to honor opt-out; and
  3. collection of location data in conflict with stated privacy policies.

Notice and choice are also key to COPPA compliance. The FTC sent COPPA warning letters in 2018 to two parental tracker manufacturers for failure to give direct notice of the real-time collection of precise location information and obtain verifiable parental consent in connection with the marketing and sale of children’s smart watches.[66] (Recent media scrutiny has focused on privacy vulnerabilities associated with children’s GPS tracking.)[67] The failure to give notice and obtain verifiable parental consent was also the crux of the COPPA claim in InMobi.[68]

This notice-and-choice model focuses on whether the individual understood the extent of the real-time location monitoring and consented to it or did not opt out of it. Based on the foregoing, location tracking that is clearly disclosed and subject to effective consent may be permissible under both UDAP and COPPA.

All of these actions were against the data collector and involved location data that was (1) identified to a particular individual and (2) collected over time. The FTC consistently expressed concern about the pervasiveness and intrusiveness of the continuous location tracking of the individuals.

Although the enforcement action in InMobi was against the initial data collector, the FTC’s concerns about combining location data with other data for purposes of monetizing the data in ways unrelated to the initial transaction between the individual and the app would also apply to secondary market use of the data. As currently enacted, however, neither UDAP nor COPPA grants the FTC enforcement authority over secondary market participants that are not in privity with the individual.

     B. United States Supreme Court: Carpenter v. United States[69]

Last year, the U.S. Supreme Court decided that law enforcement must have a Fourth Amendment probable-cause warrant to obtain an individual’s long-term historical location data from the individual’s wireless carrier.[70]

Like the FTC, albeit in a much different context, the Court was struck by (1) the intrusiveness and pervasiveness of continuous location tracking; and (2) the use of location data for purposes unrelated to the justification for the original collection.

The facts and background of Carpenter v. United States are as follows:[71]

This case involved a series of armed robberies and an order under the Stored Communications Act (“SCA”).[72] In 2011, a group of men robbed a series of Radio Shack and T-Mobile stores.[73] A suspect gave Federal Bureau of Investigation (“FBI”) agents the names and cellular phone numbers of several accomplices, including Carpenter. Based on that information, the FBI was able to obtain an SCA court order for Carpenter’s cellular phone records, including geolocation information, during the four-month period in which the robberies occurred. (The type of geolocation information at issue here is specifically cell-site location information, which is tied to the proximity of an individual phone to each of the wireless carrier’s radio antennae.[74])

The SCA prescribes limited circumstances under which the government can compel an electronic communications service provider (“SP”) to disclose user content or data.[75] The government must obtain a warrant, subpoena, or court order under the SCA (“SCA order”) requiring such disclosure without notice to the user.[76] The effect of these SCA requirements is to give the user an expectation of privacy in his SP records.[77]

In response to the SCA order, the FBI “obtained 12,898 location points cataloging Carpenter’s movements—an average of 101 data points per day.”[78] As a result, he was arrested for multiple counts of armed robbery and the federal crime of carrying a firearm. He argued that the FBI’s seizure of the geolocation records “violated the Fourth Amendment because they had been obtained without a warrant supported by probable cause.”[79]

At issue was whether Carpenter had a “reasonable expectation of privacy” in his personal location information, which was entitled to protection under the Fourth Amendment prohibition against “unreasonable search and seizure.” If the answer to the preceding question is “yes,” then access to such records would have required a “warrant supported by probable cause,” rather than the SCA order’s less stringent showing requiring the proffer of “specific and articulable facts showing that there are reasonable grounds to believe that the contents of a wire or electronic communication, or the records or other information sought, are relevant and material to an ongoing criminal investigation.”[80]

In a precursor to Carpenter, United States v. Jones, the Court considered the applicability of the Fourth Amendment to the use of a GPS tracking device on a suspect’s vehicle.[81] In that case, FBI agents tracked the vehicle’s movements continuously over a 28-day period. The majority opinion held that this tracking was subject to the Fourth Amendment and relied on a property-rights theory in doing so: the vehicle was the suspect’s private property, and the FBI’s physical intrusion on the undercarriage of the vehicle to place the GPS device was a trespass that constituted a search.

In his concurring opinion in Jones, Justice Alito focused more on the invasion of privacy resulting from the GPS tracking itself (rather than the placement of the tracker on the vehicle) and explained persistent GPS tracking surveillance as follows:

Prolonged surveillance reveals types of information not revealed by short-term surveillance, such as what a person does repeatedly, what he does not do, and what he does ensemble. These types of information can each reveal more about a person than does any individual trip viewed in isolation. Repeated visits to a church, a gym, a bar, or a bookie tell a story not told by any single visit, as does one’s not visiting any of these places over the course of a month. The sequence of a person’s movements can reveal still more; a single trip to a gynecologist’s office tells little about a woman, but that trip followed a few weeks later by a visit to a baby supply store tells a different story. A person who knows all of another’s travels can deduce whether he is a weekly church goer, a heavy drinker, a regular at the gym, an unfaithful husband, an outpatient receiving medical treatment, an associate of particular individuals or political groups – and not just one such fact about a person, but all such facts.[82]

The Court recognized its unique challenge in Carpenter:

The issue we confront today is how to apply the Fourth Amendment to a new phenomenon: the ability to chronicle a person’s past movements through the record of his cell phone signals. . . . [L]ike GPS tracking of a vehicle, cell phone location information is detailed, encyclopedic, and effortlessly compiled.[83]

The Court focused on the pervasiveness of the information collected, the completeness of the individual profile that may be compiled using such information, and the retrospective use of data collected on an individual who had not been a suspect during the time period of collection.[84] The Court held that the “tireless surveillance” of location data collection and the collection of such data by using the individual’s smart phone, which is such a personal and intimate device as to be considered an extension of the self, “invaded Carpenter’s reasonable expectation of privacy in the whole of his physical movements.”[85]

     C. Proposed Federal Law

There are competing views as to what federal data privacy legislation should look like.

Senator Rubio proposed a Senate bill requiring the FTC to promulgate regulations imposing privacy requirements on internet service providers that would give individuals access to records relating to them and the ability to dispute inaccuracies in those records. This law would preempt state privacy laws and exempt certain entities covered by other federal privacy laws.[86]

Senator Markey proposed a Senate bill, the CONSENT Act, that would require the FTC to issue regulations requiring: consumer consent for the sale of “precise geolocation information”; disclosures regarding the collection, use, and sharing of such data; and preservation of the anonymity of such data that has been de-identified.[87] This act would protect such data upon collection and from sale in the secondary market.

Senator Wyden has introduced, for discussion purposes, a draft of a much more comprehensive Senate data privacy bill based on the European Union’s General Data Protection Regulation.[88] This act would protect “any information, regardless of how the information is collected, inferred, or obtained that is reasonably linkable to a specific consumer or consumer device.” This definition appears to protect location data that is identified or identifiable. The act applies to “automated decision making” and disclosures to third parties, which may make it applicable to the secondary location data market.

In addition, big data companies and their industry associations have issued guidance on federal privacy legislation; Apple’s proposal is one of the few that recommends regulating data brokers and allowing individuals to access and delete data sold on the secondary market.[89]

Intel has proposed a draft “Innovative and Ethical Data Use Act of 2018,” which would be enforced by the FTC and focuses on “privacy risk” to individuals; this bill would require privacy notices to specify the intended uses of the data and limit the further use and dissemination of data. The collection and use of geolocation data (which the bill states creates “significant privacy risk” to the individual) would necessitate more explicit notice and heightened privacy protections.

This month the U.S. Senate Committee on the Judiciary held a hearing, “GDPR & CCPA: Opt-ins, Consumer Control, and the Impact on Competition and Innovation,” which squarely addressed location tracking, with Senator Josh Hawley questioning Google’s senior privacy counsel, Will DeVries, about Google’s location tracking practices:[90]

He wanted to know whether DeVries thought a user would expect that Google tracks “where she goes to work, where her boyfriend lives, where she goes to church” or to the doctor. “Do you think the user expects that? Do you think you’re communicating clearly when a user cannot turn off their location tracking?”

DeVries said Google’s use of location tracking is to make its services more effective, not to make money.

It’s “necessary to make services work,” DeVries said. “If we turned those off, your phone wouldn’t work like you’d expect,” he added, explaining that the operational aspects of it are complicated.

But Hawley wasn’t satisfied with that. 

“It’s not complicated,” he said. “What’s complicated is you don’t allow consumers to stop your tracking of them.”

He continued, “Here is my basic concern: Americans have not signed up for this, they think the products you’re offering are free; they’re not free. They think they can opt out; they can’t opt out. It’s kind of like that old Eagles song, ‘You can check out any time you like, but you can never leave.’ And that’s a problem for the American consumer; it’s a real problem. And for somebody who has two small kids at home, the idea that your company and others like it will sweep up information to build a user profile on them that will track every step, every movement and monetize that, and they can’t do anything about it, and I can’t do anything about it, that’s a big problem this Congress needs to address.”

V. State Law

State laws vary regarding whether a warrant must be obtained by law enforcement to obtain cell phone location information.[91] All 50 states have unfair and deceptive trade practices laws like the federal UDAP statute (state UDAP).[92] Recently, some state legislatures have begun to focus more specifically on the privacy of individual location data.

     A. Location Data Collection

California’s Consumer Privacy Act (CCPA) (effective July 1, 2020) provides that protected “personal information” includes “geolocation data” that “identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.”[93] The effect of this protection would be to enable consumers to (1) find out what types of location data are being collected and how it is used, and (2) direct companies to (a) delete location data under certain circumstances, and (b) refrain from selling location data to third parties.[94]

     B. Secondary Location-Data Market

Only one state has acted to regulate the secondary data market generally. Vermont enacted the first U.S. law governing data brokers, effective January 1, 2019.[95] The statute applies to companies that collect or sell “brokered personal information” regarding individuals who are not customers of the company (data brokers). “Brokered personal information” includes “other information that, alone or in combination with the other information sold or licensed, would allow a reasonable person to identify the consumer with reasonable certainty.”[96] Arguably, previously anonymous location information that is identified or identifiable to a specific individual would be protected by the statute. Data brokers must, among other requirements, maintain information security programs and register annually with the state; the annual registration must describe the manner in which individuals may opt out of the broker’s database, if an opt-out is offered.[97] The statute does not require data brokers to make disclosures to individuals or to provide database opt-outs.

Effective January 18, 2019, New York regulates the use of secondary-market data by life insurers in underwriting, particularly the use of data profiling as a “proxy for traditional medical underwriting.”[98] The Department of Financial Services explained the risks to the individual of external data analytics, including profiling:

First, the use of external data sources, algorithms, and predictive models has a significant potential negative impact on the availability and affordability of life insurance for protected classes of consumers. An insurer should not use an external data source, algorithm or predictive model for underwriting or rating purposes unless the insurer can establish that the data source does not use and is not based in any way on race, color, creed, national origin, status as a victim of domestic violence, past lawful travel, or sexual orientation in any manner, or any other protected class. . . . Second, the use of external data sources is often accompanied by a lack of transparency for consumers. Where an insurer is using external data sources or predictive models, the reason or reasons for any declination, limitation, rate differential or other adverse underwriting decision provided to the insured or potential insured should include details about all information upon which the insurer based such decision, including the specific source of the information upon which the insurer based its adverse underwriting decision.

     C. Proposed State Law

Several states are in the process of considering various types of privacy legislation.

The proposed Washington Privacy Act expressly provides that specific location data is a personal identifier.[99] Like the California Act, the Washington Act would give consumers more access to and control over how their identifiable location data is used.

Other states are in the process of considering privacy legislation based on the CCPA, including Hawaii, Maryland, Massachusetts, New Mexico, and Rhode Island.[100] These statutes would also protect location data that is identified or identifiable to an individual. Illinois, New Jersey, and New York are all considering statutes that apply to online services, which may cover location data tracked via app.[101]

New Jersey is considering legislation that would require operators of mobile device apps that collect GPS data to clearly disclose to the user what GPS data is collected, all third parties to whom the GPS data is disclosed, and how long the GPS data is retained, and to allow the user a meaningful opt-in to certain types of GPS data sharing.[102]

Oregon is considering legislation that would amend its UDAP statute (the Data Transparency and Privacy Protection Act, or “DATPA”) to require “express written consent” from an individual prior to collecting or selling geolocation data.[103] One summary explains:

Additional DATPA provisions require a business entity collecting, analyzing, deriving, selling, leasing, or otherwise transferring an Oregonian’s geolocation or audiovisual information to first disclose intent and methodology, and receive express consent from that individual. Geolocation refers to data displaying the location of a digital electronic device (cellular phone, tablet, etc.) on a map or similar depiction.[104]

The proposed DATPA would require an individual’s consent prior to “the collection, use, storage, analysis, derivation, sale, lease or other transfer” of geolocation information.[105] This would require both the initial data collector and each secondary market participant to obtain consent from an individual prior to using location data to profile the individual or transferring the location data to another party.

VI. Alternatives for Regulation of Location Data

Given the rapid evolution and seeming limitlessness of location data tracking and usage, regulating specific technologies or types of use may not be practical.

An alternative regulatory model would require location-based apps, by default, to limit collection, use, and disclosure to what is necessary to provide the app’s services. Individual consent could be obtained for marketing by the collector. Recipients of the location data could be directly regulated to limit their use to facilitation of app services and to prohibit other use or disclosure. This would prevent the sale of the individual’s location to unknown third parties for unknown purposes.

The following proposals are incomplete suggestions; one or more may act as a launchpad for regulatory discussion. A combination of approaches commensurate with the sensitivity of location data and the complexity of its uses may be appropriate.

     A. At the Point of Initial Collection or Use

          1. Ensuring That an Identified Individual Has Notice and Choice

Current laws and enforcement regulate the data collector and focus on location data that is identified to a particular individual. Pending and proposed laws may also protect location data that is personally identifiable.[106]

The current regulatory approach focuses on whether the individual has the right to know how his or her location data is collected, used, and shared, and whether he or she should have the right to opt out of or decline to consent to certain types of sharing. Heightened scrutiny is generally given to disclosures of location data by the data collector for commercial purposes unrelated to (1) the purposes of the initial collection, or (2) the relationship between the collector and the individual (i.e., secondary market usage). Several proposed laws follow this approach.

Enforcement and liability in this context may depend on the following issues: whether the privacy policy of the data collector and related settings are clear; whether the user consent is effective; whether the purpose of subsequent sharing by the data collector aligns with the disclosures or consents; and whether adequate nondisclosure agreements and security measures are in place to protect the location data from both disclosure and uses exceeding the data collector’s original notice.

Similarly, the digital marketing industry group “Data Marketing & Analytics” (DMA) has guidelines for direct, mobile location-based marketing that advise marketers to comply with the Telephone Consumer Protection Act[107] and “inform Individuals how location information will be used, disclosed and protected so that the Individual may make an informed decision about whether or not to use the service or consent to the receipt of such communications. Location-Based information must not be shared with third-party marketers unless the individual has given Prior Express Consent for the disclosure.”[108]

The consent model can be problematic in this context. Google’s consent practices for location tracking have come under recent scrutiny:

  1. If a user turns off “Location History” for Google services, this action does not stop location tracking, but only halts the user’s ability to view his or her location data going forward.[109]
  2. In order to stop location tracking by Google, the user must go to a separate setting, “Web & App Activity,” and opt out of tracking.[110]

The two settings are not in proximity to one another and do not cross-reference each other.

At issue is whether the following are deceptive: (1) the text of the first setting (“Location History”); (2) the text and placement of the second setting (“Web & App Activity”), which is not in proximity to or cross-referenced with the first; and (3) the fact that the default setting for both is real-time location tracking. Google argues that its disclosures are clear and that user consents to location tracking are valid.[111]

On the basis of these consent practices, Google was recently fined $57 million by France’s data protection authority for violating the General Data Protection Regulation’s requirements regarding clear disclosures and user consent.[112] In the United States, the FTC is under pressure to investigate Google for these practices under UDAP and an existing consent order with Google.[113] State attorneys general are beginning to consider pursuing state UDAP enforcement actions against Google.[114]

The City of Los Angeles recently sued the operator of The Weather Channel (TWC) app for “fraudulent and deceptive” trade practices: TWC app users’ location information was sold to advertisers and marketers for purposes of serving app users advertising targeted to their location.[115]

The TWC app’s location consent prompt states that location access will be used to provide personalized local weather. The consent does not mention marketing or disclose that location tracking continues even when the app is not in use:

For years, TWC has deceptively used its Weather Channel app to amass its users’ private, personal geolocation data—tracking minute details about its users’ locations throughout the day and night.[116]

The notice-and-choice model may not be workable:[117]

The free and informed consent that today’s privacy regime imagines simply cannot be achieved. Collection and processing practices are too complicated. No company can reasonably tell a consumer what is really happening to his or her data. No consumer can reasonably understand it. And if companies can continue to have their way with user data as long as they tell users first, consumers will continue to accept the unacceptable: If they want to reap the benefits of these products, this is the price they will have to pay.

There are also deficiencies in the app-development models espoused by several Big Tech companies, in which app developers are required to provide notice and choice but the platforms that make the apps available to individuals do not actually oversee them. An exhaustive study of pre-installed Android apps showed a lack of supervision by Google over the location data and other collection activities of apps that come automatically loaded on each Android device (and that often cannot be deleted by the individual).[118] Instead, Google apparently relied on privacy and security requirements and tools provided to app developers, much as Facebook apparently relied on disclosure limitations imposed on its app developers, without actually overseeing the app developers’ privacy practices.

          2. Substantive Limits on Collection

One difficulty with consent in this context is that the data collector does not know what all of the possible uses of the location data might be. Indeed, collection of location data often seems excessive in comparison to the purpose of collection. The effect of over-collecting continuous location data has been to motivate alternative uses and monetization of that data.

As an alternative or complement to the notice-and-choice model, substantive limits could be placed on how much and what types of location data may be collected at the outset. This model would treat the collector as a gatekeeper. For example, a collector could be limited to collecting location data only as necessary to provide the service for which the data is collected, reducing the exhaustive volume of location data gathered in the first place.
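
By way of illustration only, the following minimal sketch (in Python) shows one way a collector could enforce such a limit at the point of collection for a hypothetical weather-style app: location is captured only when the user requests a forecast and is coarsened to roughly city-level precision before it is stored or shared. The names and the one-decimal-place threshold are assumptions chosen for illustration, not a description of any actual app or statutory standard.

    # Minimal, hypothetical sketch: collect location only on request and
    # coarsen it to the precision a local-forecast service actually needs.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Location:
        latitude: float
        longitude: float

    def coarsen(loc: Location, decimals: int = 1) -> Location:
        # One decimal place is roughly 11 km of latitude: enough for a local
        # forecast, too coarse to single out a household or office.
        return Location(round(loc.latitude, decimals), round(loc.longitude, decimals))

    def location_for_forecast(precise: Location) -> Location:
        # The only location value retained or shared is the coarsened one.
        return coarsen(precise)

    if __name__ == "__main__":
        print(location_for_forecast(Location(33.5207, -86.8025)))
        # Location(latitude=33.5, longitude=-86.8)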

     B. At the Point of Transfer

          1. Ensuring That Individual Location Data Is Anonymous

If location data is identified or identifiable to a specific individual when collected, it may be entitled to protection under current and proposed privacy law. In order to transfer such data, the individual’s consent or failure to opt out may be required. Location data that is properly anonymized would generally be excludable from the definition of personal information under applicable law and not subject to the notice-and-choice model.

If rapidly evolving tracking technologies and data analytics methodologies enable an actual location to be used to identify a unique individual, however, then arguably unique location data is personal information. The key in this context is identifiability, which depends on the privacy and security measures taken to ensure the anonymity of the data and on the likelihood that the location data can be reidentified to an individual.

Location tracking use cases include the following scenarios:

  1. location data point identified to a specific individual;
  2. location data point identifiable to a specific individual;
  3. location data point not identified to the individual;
  4. continuous location tracking identified to a specific individual;
  5. continuous location tracking identifiable to a specific individual;
  6. continuous location tracking not identified to the individual;
  7. development of a profile based on location tracking identified to a specific individual;
  8. development of a profile based on location tracking that is identifiable to a specific individual;
  9. location data used to compile a profile of an unidentified individual.

As described above, the distinctions between these categories become less relevant in practice.

Is regulation of data collectors sufficient to address these privacy risks? Can de-identified location data be rendered truly anonymous?

If the pervasiveness and intrusiveness of continuous location tracking indicates that location data should be subject to heightened privacy rights (as in the Jones concurring opinion, Carpenter, and the relevant FTC actions and guidance), should location tracking data be regulated regardless of whether the data is identified to a particular individual?

The analysis is even more complicated when reidentification is accomplished by a secondary market participant. In that event, who would be liable for privacy violations? Would liability rest with the data collector that did not ensure true anonymity, or with the secondary market user that was not in privity with the individual? Would it matter if data collection predated the specific technology or methodology that facilitated later identification, whether by the data collector or another party?
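
The reidentification risk underlying these questions can be illustrated with a minimal, hypothetical sketch: given a pseudonymized set of location traces and only a few externally known space-time points about a target (for example, a home address in the morning and a known midday appointment), a secondary-market participant may be able to single out the matching trace. The data, pseudonyms, and matching rule below are invented for illustration and simply echo the mobility-trace research cited above.

    # Minimal, hypothetical sketch: matching a few known space-time points
    # against pseudonymized traces to single out one individual's record.
    # Each trace is a set of (rounded latitude, rounded longitude, hour).
    traces = {
        "pseudonym_0412": {(33.52, -86.80, 8), (33.45, -86.81, 13), (33.52, -86.80, 22)},
        "pseudonym_0977": {(40.71, -74.01, 9), (40.75, -73.99, 18)},
        "pseudonym_1283": {(33.52, -86.80, 8), (33.50, -86.75, 19)},
    }

    # Points observed about the target from outside the dataset
    # (e.g., home in the morning, a known midday appointment).
    known_points = {(33.52, -86.80, 8), (33.45, -86.81, 13)}

    # Any trace containing every known point is a reidentification candidate.
    candidates = [name for name, points in traces.items() if known_points <= points]
    if len(candidates) == 1:
        print("Unique match:", candidates[0])  # pseudonym_0412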

          2. Restricting and Prohibiting Transfer

The transfer of location data could be prohibited except as necessary to provide the underlying service to the individual, or prohibited for any secondary-market purpose. The transferee’s use could be similarly limited. The goal would be to limit the monetization of location data in the secondary data market. As demonstrated in the wireless-carrier aggregated-data scenarios described above, however, this approach is susceptible to abuse by the parties.

     C. Upon Individual Profiling Based on Location Data

Creating individual profiles using location data poses unique risks to the individual. Precise tracking of an individual’s location over time can be used to discover information about the individual that may not otherwise be available (consider repeat visits to a casino, visits to the home of a person who is not the individual’s spouse, a visit to Planned Parenthood, or repeat visits to an oncologist), which, when combined with other data, can be used to develop a fairly comprehensive profile of the individual.
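
A minimal, hypothetical sketch of how such inferences can be drawn from a location trace appears below. The place categories, visit counts, and thresholds are invented for illustration; they are not drawn from any actual profiler and are meant only to show how quickly repeat visits translate into sensitive attributes.

    # Minimal, hypothetical sketch: turning repeat visits in a location
    # trace into inferred (and sensitive) profile attributes.
    from collections import Counter

    # Place categories visited over one week, as derived from a location trace.
    visits = ["oncology_clinic", "casino", "grocery", "oncology_clinic",
              "casino", "casino", "gym"]

    counts = Counter(visits)
    inferences = []
    if counts["oncology_clinic"] >= 2:
        inferences.append("possible serious health condition")
    if counts["casino"] >= 3:
        inferences.append("frequent gambler")

    print(inferences)
    # ['possible serious health condition', 'frequent gambler']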

Arguably, if a comprehensive profile is generated by the original data collector and is identified or identifiable to a specific individual, the combined data may be protected under applicable privacy law; in the secondary market, however, the eventual data buyer is neither subject to current privacy regulation nor in privity with the individual or the original data collector. These secondary-market activities include:

  1. compilation of a data profile or adding data to the profile;
  2. identifying an individual in relation to a previously anonymous profile;
  3. resale of the profile;
  4. use of the profile to make decisions impacting the individual; and
  5. use of the profile to influence the individual’s behavior.

Very few of these activities are impacted by current U.S. privacy regulation, and none of them fit the current notice-and-choice model.

Consider again the Facebook/Cambridge Analytica scandal. Although Facebook is subject to UDAP for its collection of the data, the app developer that accessed it and the profiler that purchased it were unregulated; the risks of resale and unauthorized use are not addressed by current U.S. law.

New York’s letter to life insurance companies (described above) highlights the risks to individuals posed by secondary market usage and profiling in insurance underwriting. These same risks are present in credit marketing and underwriting, and development of use cases for data profiling is likely to explode in the same manner that monetization of data has, meaning that we cannot determine all of the possible use cases at a given point in time.

Data profiles are used daily to make decisions regarding the individual, without regard to whether the individual knows that the profile exists or that it is being used to make a decision affecting him or her; this precludes regulation solely through individual choice or consent. In that case, profiling, decisioning, and targeting based on profiles may be better regulated directly.

Regulators could prohibit profiling based on location data altogether or prohibit such profiling by secondary-market participants. Like limiting collection, use, and transfer, this would diminish the separate value and monetization of location data.

     D. Imposing a Duty of Care on Market Participants

Collectors, users, profilers, and third parties could be regulated directly to owe the individual a duty of care in collecting, profiling, sharing, and using location data, and to have other direct obligations to the individual. In this way, privity could be created between each individual and the market participants, and the risk of loss or error could be shifted from the individual to the market participant that caused the harm. This approach has precedent in the federal regulation of consumer reporting agencies and of the disclosers and users of consumer reports, but it would be much more comprehensive and complicated in scope.

VII. Conclusions

To the extent that current and proposed privacy laws protect location data, such protection is limited to location data that is identified (or in some cases identifiable) to an individual. Requirements generally apply only to the initial data collector.

As the technological lines blur between identified and identifiable, and identifiable and not identified or anonymized, the distinctions between the categories may become less relevant. This complicates the regulatory analysis.

Moreover, recent media accounts and enforcement actions reveal a robust secondary market:

  1. identified location data is regularly acquired and used by third parties with whom the individual has no direct relationship;
  2. de-identified or anonymized location data is regularly re-identified; and
  3. location data is routinely combined with other types of personal data and used by third parties with whom the individual has no direct relationship to compile comprehensive profiles of the individual and make decisions about the individual or attempt to influence behavior of the individual.

These secondary-market practices are not currently addressed by United States law.

Profile development and decisioning or influencing based on data analytics, whether relying solely on location data or combining location data with other data, is a distinct business from the initial transaction between the individual and the data collector, and poses unique risks to the individual not present during the initial collection. These secondary market uses are also complex.

If an individual can be identified by location and further characteristics or acts may be attributable directly to the individual by virtue of his or her geographical movements, then discussions of privacy regulation should include location tracking. If parties removed from the initial transaction between the individual and the data collector subsequently reidentify data to an individual or develop a profile of the individual, or make decisions regarding the individual, then consideration should also be given to whether regulation of the initial transaction and limitation of the use and disclosure by the data collector and its service providers adequately address the risks posed by the secondary location data market. The current notice-and-choice model alone is inadequate to close this Pandora’s Box.

The power of place: location tracking and location data profiling are big business. Each poses distinct privacy risks to the individual, and each remains largely unregulated in the United States.


[1] The author thanks Dr. Peter Alan Jezewski for his editing contributions.

[2] Although potentially applicable to a wider variety of personal data, this article focuses solely on location data. In addition, “data profiling” can refer to the process of reviewing source data to ensure its accuracy and integrity. In this article, the term is used to describe the process of characterizing an individual using data related to the individual.

[3] The presentation materials for the Future of Privacy Forum’s class, “Location Data: GPS, Wi-Fi, Spatial Analytics,” give an excellent overview of the types of systems, hardware, and software that are involved in location tracking. Future of Privacy Forum, Sources of Data: Mobile Sensors, Wi-Fi Analytics (Nov. 27, 2018).

[4] The Data Big Tech Companies Have On You (Or, At Least, What They Admit To), Security Baron, Sept. 30, 2018.

[5] These types of services are often referred to as “location-based services.” See D. Oragui, 7 Examples of Location-Based Services Apps, The Manifest, Sept. 28, 2018.

[6] Consider that you are in a store; your real-time location can be collected via your smartphone using all or a combination of the following systems: GPS (via satellite); cell tower proximity; Wi-Fi networks; Bluetooth or beacons; LED; and audio. The smartphone has distinct hardware and software to facilitate location tracking through this variety of external systems. In addition, the apps on the smartphone have their own software to facilitate collection. This data can include your precise latitude, longitude, and altitude, including location within a building. These systems are ubiquitous, and much of the technology is relatively inexpensive.

[7] Longitudinal data may show the individual’s movements during a specified timeframe, which would be helpful for a fitness tracker that calculates distance achieved and calories burned. Alternatively, the data tracked over time may be of a particular geographical location if a certain type of data traffic is more relevant than a single individual’s location; for example, understanding the number of measles cases within a month in a specific county allows public health planners to treat and prevent the spread of the disease.

[8] Y. Vilner, Location Analytics and Retail—Friends At Last, Hackernoon.com, Oct. 26, 2017.

[9] AlternativeData.org, Industry Stats, ALT. DATA.

[10] J. Valentino-DeVries, N. Singer, M. Keller, & A Krolik, Your Apps Know Where You Were Last Night and They’re Not Keeping It Secret, N.Y. Times, Dec. 10, 2018 [emphasis supplied].

[11] T. Costa, How Location Analytics Will Transform Retail, Harv. B. Rev. (Mar. 12, 2014).

[12] Id.

[13] S. Zuboff, A Digital Declaration, FAZ.net, Sept. 9, 2014.

[14] J. Naughton, “The goal is to automate us”: welcome to the age of surveillance capitalism, The Guardian, Jan. 20, 2019.

[15] The secondary data market is not limited to location data and is a topic in its own right. Discussion of that market here focuses on the risks involved when location data is sold on the secondary market, although some of these concerns may be general to other types of secondary-market data.

[16] J. Valentino-DeVries, Service Meant to Monitor Inmates’ Calls Could Track You, Too, N.Y. Times, May 10, 2018.

[17] W. Oremus, The Privacy Scandal That Should Be Bigger Than Cambridge Analytica, Slate, May 21, 2018.

[18] K. Bode, What A-GPS Data Is (and Why Wireless Carriers Most Definitely Shouldn’t Be Selling It), Motherboard, Feb. 7, 2019.

[19] J. Brodkin, Selling 911 location data is illegal—US carriers reportedly did it anyway, Ars Technica, Feb. 13, 2019.

[20] Id.

[21] B. Krebs, Tracking Firm LocationSmart Leaked Location Data for Customers of All Major U.S. Mobile Carriers Without Consent in Real Time Via Its Web Site, Krebs on Security, May 17, 2018.

[22] J. Cox, Hacker Breaches Securus, the Company That Helps Cops Track Phones Across the US, Motherboard, May 16, 2018.

[23] J. Fingas, Family tracking app leaked real-time location data for weeks. It would have let intruders spy on a child’s whereabouts., Engadget, Mar. 24, 2019.

[24] J. Cox, I Gave a Bounty Hunter $300. Then He Located Our Phone, Motherboard, Jan. 8, 2019.

[25] Id.

[26] J. Cox, Hundreds of Bounty Hunters Had Access to AT&T, T-Mobile, and Sprint Customer Location Data for Years, Motherboard, Feb. 6, 2019.

[27] E&C Republicans, Letters to Zumingo, Microbilt, T-Mobile, AT&T, Sprint, and Verizon on Location Sharing Practices (Jan. 16, 2019).

[28] J. Cox, AT&T to Stop Selling Location Data to Third Parties After Motherboard Investigation, Motherboard, Jan. 10, 2019.

[29] D. Fitzgerald & S. Krouse, T-Mobile, AT&T Pledge to Stop Location Sharing by End of March, Wall St. J., Jan. 11, 2019.

[30] For convenience’s sake, this data is referred to as “anonymous” and “anonymized,” although in practice there is an entire spectrum of de-identification, and the assumptions made in this article may vary depending on the level of de-identification technologies employed. See K. Finch, A Visual Guide to Practical Data De-Identification, Future of Privacy Forum, Apr. 25, 2016.

[31] R. Dezember, Your Smartphone’s Location Data is Worth Big Money to Wall Street, Wall St. J., Nov. 2, 2018.

[32] S. Ghosh, Location Data and The Growing Role in Marketing and Advertising Campaigns, Martech Series, June 8, 2018.

[33] S. Mervosh, Jerry Westrom Threw Away a Napkin Last Month. It Was Used to Charge Him in a 1993 Murder, N.Y. Times, Feb. 17, 2019.

[34] J. Valentino-DeVries, N. Singer, M. Keller & A Krolik, Your Apps Know Where You Were Last Night and They’re Not Keeping It Secret.

[35] Id. [emphasis supplied].

[36] L. Hardesty, How hard is it to “de-anonymize” cellphone data?, MIT News, Mar. 27, 2013.

[37] D. Kondor, B. Hashemian, Y. de Montjoye & C. Ratti, Towards matching user mobility traces in large-scale datasets, IEEE Trans. on Big Data (Abstract), Sept. 24, 2018.

[38] This fact set is based on the scenario in Carpenter v. U.S., discussed below.

[39] The focus of this article is on the privacy implications of commercial location data tracking. Many of these practices are used by law enforcement as well, but addressing the need for balance with public safety and law enforcement purposes is a topic in its own right and beyond the scope of this article.

[40] L. Nelson, L.A. wants to track your scooter trips. Is it a dangerous precedent?, L.A. Times, Mar. 15, 2019.

[41] B. Schmarzo, Best Practices for Analytics Profiles, Infocus, July 8, 2014 [emphasis supplied].

[42] E. Mierzwinski & J. Chester, Selling Consumers Not Lists: The New World of Digital Decision-Making and the Role of the Fair Credit Reporting Act, Suffolk U. L. Rev. 46 (2013).

[43] Laursen, Who Are You?, MIT Tech. Rev. (Jan. 2015) [emphasis supplied].

[44] K. Kaye, Why the Industry Needs a Gut-Check on Location Data Use, Ad Age, Apr. 26, 2017.

[45] S. Zuboff, ‘Surveillance capitalism’ has gone rogue. We must curb its excesses., Wash. Post, Jan. 24, 2019.

[46] S. Levin, Facebook told advertisers it can identify teens feeling ‘insecure’ and ‘worthless’, The Guardian, May 1, 2017.

[47] J. McKendrick, Information Technology Enters A ‘Psychic’ Stage, Forbes, Mar. 12, 2019.

[48] NPR Interview with Shoshana Zuboff, ‘We Are No Longer The Customers’: Inside ‘The Age of Surveillance Capitalism’, WBUR, Jan. 15, 2019.

[49] A. Chang, The Facebook and Cambridge Analytica scandal, explained with a simple diagram, Vox, May 2, 2018.

[50] Id.

[51] D. Ghoshal, Mapped: The breathtaking global reach of Cambridge Analytica’s parent company, Quartz, Mar. 28, 2018.

[52] K. Kaye, Why the Industry Needs a Gut-Check on Location Data Use, Ad Age, Apr. 26, 2017.

[53] Wireless carriers must certify that they do not use assisted GPS information other than for enhanced 9-1-1 purposes. 80 FR 45897 (08/03/2015). 47 U.S.C. § 222 protects “customer proprietary network information” (CPNI), which includes location data when a wireless customer makes or receives a call; it does not currently protect location data tracked via phone when calls are not being made. See EPIC, CPNI: Mobile Location Data as CPNI.

[54] Section 5 of the Federal Trade Commission Act (the FTC Act), 15 U.S.C. § 45, prohibits “unfair or deceptive acts or practices in or affecting commerce.” The FTC regularly prosecutes enforcement actions in the privacy and cybersecurity context under the FTC Act.

[55] The Children’s Online Privacy Protection Act, 15 U.S.C. §§ 6501–6505 and implementing regulation 16 C.F.R. pt. 312 (COPPA), generally require operators of website or online services that collect information from children under 13 years old to give parents clear notice of the collection practices and obtain “verifiable parental consent” to such collection.

[56] FTC Staff Report, Mobile Privacy Disclosures: Building Trust Through Transparency (Feb. 1, 2013).

[57] FTC Report, Protecting Consumer Privacy in an Era of Rapid Change (Mar. 26, 2012).

[58] Complaint at 3, In re Uber Tech., Inc., File No. 1523054 (FTC Feb. 2017).

[59] Id. at 2.

[60] Id. at 2–3.

[61] P. Boshell, Survey of Developments in Federal Privacy Law, 74 Bus. Law. 1, 193 (Winter 2018/2019) [emphasis supplied].

[62] S. Roderiguez, Facebook uses its apps to track users it thinks could threaten employees and offices, CNBC, Feb. 14, 2019.

[63] In re BLU Products, Inc., File No.1723025, Decision and Order (FTC Apr. 2018).

[64] United States v. InMobi Pte Ltd, Case No. 3:16-cv-3474, Stipulated Order for Permanent Injunction and Civil Penalty Judgment (N.D. Cal. June 2016).

[65] N. Sannappa & L. Cranor, A deep dive into mobile app location privacy following the InMobi settlement, FTC, Aug. 9, 2016.

[66] FTC Letter to Gator Group Co., Ltd (Apr. 26, 2018); FTC Letter to Tinitell, Inc. (Apr. 26, 2018).

[67] L. Mathews, Your Child’s GPS Watch Could Be Exposing Their Location In Real-time, Forbes, Feb. 5, 2019.

[68] InMobi, at 2.

[69] Susan Freiwald & Stephen Smith, The Carpenter Chronicle: A Near-Perfect Surveillance, 132 Harv. L. Rev. 205 (Nov. 9, 2018) (providing a thorough history of federal legislative and judicial authorities regarding modern surveillance, including GPS and cell phones).

[70] Carpenter v. United States, 585 U.S. ___ (2018).

[71] P. Boshell, Survey of Developments in Federal Privacy Law, at 198.

[72] 18 U.S.C. § 2703.

[73] Carpenter, 585 U.S. at ___.

[74] Id. at ___.

[75] Id. at ___.

[76] Id.

[77] Warshak v. United States, 631 F.3d 266 (6th Cir. 2007).

[78] Carpenter at ___.

[79] Id.

[80] 18 U.S.C. § 2703(d) (2018).

[81] U.S. v. Jones, 565 U.S. 400 (2012).

[82] Jones, 565 U.S. at 413 (Alito, J., concurring).

[83] Carpenter at ___.

[84] Id. at ___.

[85] Id. at ___ (emphasis supplied).

[86] S. ____, American Data Dissemination Act, 116th Cong. (2019).

[87] S. 2639, Customer Online Notification for Stopping Edge-provider Network Transgressions Act, 115th Cong. (2018).

[88] S._____, The Consumer Data Protection Act (2018).

[89] D. Abril, This Is What Tech Companies Want in Any Federal Data Privacy Legislation, Fortune, Feb. 21, 2019.

[90] A. Carson, At hearing, US Senate wants answers on location tracking, opt-in consent, The Priv. Advisor, Mar. 13, 2019.

[91] ACLU, Cell Phone Location Tracking Laws by State.

[92] National Consumer Law Center, Unfair & Deceptive Acts & Practices.

[93] Cal. Civ. Code § 1798.140(o)(1)(G) (2018).

[94] Cal. Civ. Code §§ 1798.100(a), (b); 1798.105(b); 1798.110; 1798.115; 1798.120(b); 1798.130; and 1798.135.

[95] 9 V.S.A. § 2430 (2019).

[96] Id.

[97] 9 V.S.A. § 2447 (2019).

[98] NY Dept. Fin’l Servs. 2019 Circular Letter No. 1.

[99] Washington Privacy Act, S. 5376, 66th Leg., Reg. Sess. (Wa. 2019).

[100] J. Cedarbaum, D. Freeman & L. Lichlyter, States Consider Privacy Legislation in the Wake of California’s Consumer Privacy Act, Wilmer Hale, Feb. 20, 2019.

[101] Id.

[102] An Act concerning certain mobile device applications and global positioning system data, S. 4974, 218th Leg. (NJ 2019).

[103] Proposed H.B. 2866, Data Transparency and Privacy Protection Act, 80th Leg., Reg. Sess. (Or. 2019).

[104] S. Pastrick, CUB Protects Oregonians’ Digital Privacy By Way Of HB 2866, Or. Cub, Feb. 5, 2019.

[105] Proposed H.B. 2866 at § 1(2)(a).

[106] U.S. Senator Ron Wyden’s draft Consumer Data Protection Act includes a do-not-track provision that would allow the individual to opt out of “personal information” sharing. Wyden has said that the bill would allow individuals to know what location data is being collected and to opt out of collection. Senator calls for regulation that would force tech companies to offer “do not track” option, CBS News, Jan. 10, 2019.

[107] Data Marketing & Analytics, Mobile Marketing Guidelines for Ethical Business Practice.

[108] Id.

[109] R. Nakashima, Google clarifies location-tracking policy, AP News, Aug. 16, 2018.

[110] E. Dreyfuss, Google Tracks You Even if Location History’s Off. Here’s How to Stop It, Wired, Aug. 13, 2018.

[111] A. Griffin, Google stores location data “even when users have told it not to”, Independent, Aug. 14, 2018.

[112] M. Rosemain, France fines Google $57 million for European privacy rule breach, Reuters, Jan. 21, 2019.

[113] E. Birnbaum, Consumer groups urge FTC to investigate Google over location tracking, The Hill, Nov. 27, 2018.

[114] T. Romm, Google’s location privacy practices are under investigation in Arizona, Wash. Post, Sept. 11, 2018.

[115] F. Navarro, Popular weather app may be selling your location data, Komondo, Jan. 7, 2019.

[116] M. Locklear, LA sues Weather Channel app owner over “fraudulent” data use, Engadget, Jan. 4, 2019.

[117] Editorial Board, Our privacy regime is broken. Congress needs to create new norms for a digital age, Wash. Post, Jan. 5, 2019.

[118] P. Day & P. Dave, Study shows limited control over privacy breaches by pre-installed Android apps, Reuters, Mar. 25, 2019.

