Federal courts may eliminate Fourth Amendment protections for cell phone data based on dubious claims about Google’s Location History.
On May 20, 2019, Okello Chatrie robbed a bank in Midlothian, Virginia. Wearing a fisherman’s hat and a traffic vest, he handed the bank teller a threatening note and pulled a gun on her. He then forced the bank manager at gunpoint to open the vault and made off with $195,000. Unable to identify the robber, Chesterfield police Detective Joshua Hylton served Google with a geofence warrant—a warrant targeting all cell phone users in a certain area at a certain time. He obtained data on cell phones located within a circle 300 meters in diameter centered on the bank during the hour surrounding the robbery. Hylton received data on 19 cell phones, one of which turned out to be Chatrie’s.
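For readers unfamiliar with the mechanics, the sketch below illustrates the kind of spatial and temporal filter a geofence request describes: take stored location points, keep those falling inside a circle of a given radius during a given window, and return the associated devices. The coordinates, field names, and records are hypothetical illustrations, not a description of Google’s actual internal systems.

```python
# A minimal sketch of the spatial/temporal filter a geofence request describes.
# The coordinates, field names, and records are hypothetical; this is not
# Google's actual Location History query pipeline.
from datetime import datetime
from math import radians, sin, cos, asin, sqrt

BANK_LAT, BANK_LON = 37.41, -77.56             # hypothetical coordinates for the bank
RADIUS_M = 150                                 # a 300-meter-diameter circle
WINDOW_START = datetime(2019, 5, 20, 16, 30)   # hypothetical one-hour window
WINDOW_END = datetime(2019, 5, 20, 17, 30)

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

def devices_in_geofence(records):
    """Return IDs of devices with at least one point inside the circle and window."""
    hits = set()
    for device_id, lat, lon, timestamp in records:
        inside_circle = haversine_m(lat, lon, BANK_LAT, BANK_LON) <= RADIUS_M
        inside_window = WINDOW_START <= timestamp <= WINDOW_END
        if inside_circle and inside_window:
            hits.add(device_id)
    return hits
```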
A federal district court found that the geofence warrant used in the Chatrie investigation violated the Constitution because it lacked the particularity required by the Fourth Amendment. But it also held that the geofence evidence would not be suppressed under the good faith exception to the exclusionary rule, since Detective Hylton had relied on the warrant in good faith. Google filed a brief and several declarations in the case, describing its Location History program. The district court eventually adopted several of these descriptions in its lengthy opinion. Google’s statements indicated that millions of Google users, including Chatrie, affirmatively opted in to using Location History by checking a box they accessed via Google settings. But a close examination of the trial records and Google’s later testimony regarding its Location History program reveals that this is not what happened to Chatrie, or to millions of other users like him.
Chatrie eventually appealed the district court’s decision to the U.S. Court of Appeals for the Fourth Circuit, which heard arguments in the case in December 2023. At oral argument, the judges repeated several of Google’s points, accepting them as central to resolving a crucial Fourth Amendment question at issue: whether cell phone users have any Fourth Amendment rights in the location data they share with cell phone apps. The Supreme Court’s most recent major decision on cell phone location data, Carpenter v. United States, held that users have a Fourth Amendment right in the location data collected by their cell service providers.
But the Fourth Circuit appears to have found a way around this decision. If users opt in to Location History and similar app services, they may be affirmatively, voluntarily disclosing their data—which may place it outside of the Fourth Amendment’s protection. Indeed, the Chatrie case threatens to undermine existing constitutional protections for location data—and future protections for virtually every other type of cell phone data as well.
Claims Regarding Location History
In oral argument at the Fourth Circuit, Judges J. Harvie Wilkinson and Julius N. Richardson seemed averse to Chatrie’s claims and sympathetic to the government’s arguments, signaling a potential ruling that cell phone app location data is wholly unprotected by the Fourth Amendment. The judges adopted several of Google’s descriptions of its Location History service—descriptions that appeared to play a central role in their thinking about the privacy ramifications of the case. They emphasized, much as the district court had, that Google’s location permission system was opt-in with a default choice of no tracking. Further, as Judge Richardson stated, only one-third of Google users chose to opt in to location tracking, while “two-thirds … have rejected it.” The upshot was that people like Chatrie made a voluntary choice to participate in the program, a choice that most similarly situated users supposedly do not make.
These facts were apparently central to the lack of Fourth Amendment protection for cell phone location data. As Judge Wilkinson said, “If the default position is that you were in unless you opted out, that would be one thing. But the default position is that you’re out unless you opted in.” People like Chatrie likely lack a constitutional right in their data, because “they can preserve their privacy with a simple step” by doing nothing and choosing not to opt in.
This is not an entirely accurate description of how Google’s Location History program works, however. While there is room for argument, Google obtained Location History permissions not via an opt-in system but mainly through bare-bones, forced choice permission screens, like many other apps. Google often applied additional pressure or manipulation to induce users to give permission. In addition, the real percentage of people who gave Google permission when they saw a permission screen was higher than one-third, and fewer than two-thirds of users declined permission. Finally, Chatrie himself did not affirmatively volunteer to use Location History but instead was told he was required to activate it in order for his phone to work properly.
Yet the Fourth Circuit judges are hardly to blame for any inaccuracies in their statements at oral argument. They simply relied on the facts found by the trial court and emphasized throughout its opinion. Those facts were in turn adopted from Google’s representations in the initial case.
Opt-In, Forced Choice, and User Permission for Data Collection
Google filed an amicus brief in support of neither party with the Chatrie district court in December 2019. (Disclosure: I briefly worked for the law firm filing the brief, WilmerHale LLP, many years ago and prior to anything involving this case.) Google indicated that users could activate Location History (LH) only by going to their account settings and enabling location reporting, essentially by checking a box. In Google’s words, “Google does not save LH information unless the user opts into the LH service in her account settings.” No other means of activating Location History were mentioned.
In reality, however, this is not the only means of turning on Location History, as any tech-savvy person who uses Google Maps can tell you. Rather, Location History can be turned on far more easily during the setup of any Google app that uses it, such as Google Maps, Google Photos, or Google Assistant. Users setting up these apps generally see two permission screens, neither of which gives much information about Location History or any information at all about Google’s potential disclosure of location data to third parties or its use of location data for marketing purposes.
The process of obtaining user permission via screens presented during app setup is generally known as a “forced choice” regime, where users are forced to either grant or deny permission for something before they can use an app. Forced choice regimes, and especially forced choices during app setup with very limited disclosures about information use, are notoriously bad at capturing actual consumer preferences. Most consumers just blindly click yes on any and all app permission screens. A forced choice approach is qualitatively different from an opt-in system, where users have to seek out a setting and turn it on, departing from a default choice of no information collection or the like. In a forced choice system like Google Maps’s, there is no default choice, just a request for permission without the information or context necessary to make an informed decision—and perhaps the implication that the app won’t work as designed if the user denies permission. And that’s in a best-case scenario. As explored further below, companies like Google sometimes pressure or manipulate consumers into giving permission, making the choice to grant permission even more meaningless as a measure of user preference.
Even setting that aside, forced choice permission screens should not be deemed sufficient to waive users’ constitutional rights. The information given on such screens is typically incomplete or misleading. For example, a typical permission screen might provide a generic sentence about how the app will use your location data, such as the Weather Channel app’s message: “You’ll get personalized local weather data, alerts, and forecasts.” This screen does not mention that the app will sell your location data to third-party vendors, advertisers, and marketing analysts; does not mention how long your data will be stored; and does not mention how your data may be combined with data from other apps and websites using a variety of tracking technologies. That information is buried deep within a separate privacy policy document that users do not read and likely could not understand. And some uses of information, such as the Weather Channel app’s use of location data to analyze foot traffic for commercial purposes, may not be disclosed at all. Google’s permission screen is similar: it does not mention the marketing and other uses Google makes of user data, how long the data is stored, or Google’s involvement in law enforcement investigations.
Further, even if permission screen disclosures were adequate, the idea that cell phone users are making informed decisions about data disclosures during app setup is wholly unrealistic. Users are often confronted with numerous app permission screens during the initial setup of their cell phones, and they may be rushed, distracted, and especially prone to just clicking “Accept” and hoping for the best. A single app can ask users for many permissions, with the average app asking for roughly five. The app setup process is likely to be confusing for users who lack technological expertise. Even experts may get confused, like the Google software engineer whose email was put in evidence in the Chatrie case, who thought they had not turned on location tracking but then discovered they had. Certainly, few users are likely to understand the underlying technologies of location tracking, data storage, third-party data sharing, digital ad networks, targeting algorithms, ad servers, data auctions, and cross-device tracking, to name only a few of the technologies implicated in the collection and processing of consumer cell phone data. The upshot is that most app users have little to no idea about the data collection practices and exposure risks they are accepting when they hit the Accept button. As an advertising industry executive told the New York Times, “Most people don’t know what’s going on.”
In short, cell phone users are not opting in to location tracking by Google, but are clicking yes on bare-bones, forced choice permission screens in order to be allowed to use their apps.
Pushing Users to Grant Permission
Forced choice app permission screens are poorly suited to obtaining informed consent, and entirely inadequate to secure a meaningful waiver of a person’s Fourth Amendment rights. But Google’s permission screens were flawed for other, even more obvious reasons. Google users likely felt pressure to give permission for the collection of their location data, beyond the normal pressure that causes most people to give permissions during setup. First, denying permission for Google to collect location information was likely to degrade the functionality of Google’s products. For example, in Google Maps, failing to give permission for Location History would cause the app to forget previously searched addresses. Users (the author included) who found themselves retyping a familiar address while on the road might feel compelled to forgo their privacy in order to avoid crashing their cars while navigating. The reduction of functionality for users who declined permission for location tracking was a primary complaint of a May 11, 2018, letter about Google sent by Sens. Richard Blumenthal (D-Conn.) and Edward Markey (D-Mass.) to the chairman of the Federal Trade Commission. (The senators also criticized Google’s characterization of its location permission system as “opt-in.”)
The pressure on cell phone users to grant permission for location tracking goes beyond just reductions in app functionality. Google’s permission screens also employed manipulative design elements intended to nudge users to give permission for Location History. These are examples of “dark patterns,” user interfaces that confuse users, make it difficult for users to express their actual preferences, or manipulate users into taking certain actions. On its Location History permission screen, Google highlighted the “Turn on” button in blue, making it easily visible and presenting it as the default choice. By contrast, the “No thanks” button was not highlighted and was written in light gray text that was less visible. The effect was to subtly manipulate users, particularly those hurrying through cell phone or app setup and faced with a barrage of confusing permission screens, into selecting the apparent default choice. And default choices exert a powerful effect, such that consumers select them with remarkable frequency and across a wide variety of domains. In other words, rather than an opt-in system, Google used something closer to an opt-out approach, where consumers have to reject the apparent default choice—“Turn on”—and select the unhighlighted and seemingly disfavored option. It’s also worth noting that it can be very difficult to change settings after selecting the default option. Most users would have to expend substantial time and effort figuring out how to access and change their Location History settings in order to deactivate location tracking once they’ve turned it on.
Not all of the measures used to convince users to grant Location History permission are so sophisticated, however. Some involve straightforward pressure. For example, when Chatrie was setting up his Android phone, he was automatically prompted to set up Google Assistant. According to an expert witness for the defense, Chatrie viewed a page entitled “Meet your Google Assistant” that directed users to “Give your new Assistant permission to help you.” That page then told users to give their permission for three different things, one of which was Location History (the other two involved access to audio activity and access to device information). The user was then told, apparently falsely, that “The Assistant depends on these settings in order to work correctly. Turn on these settings for [your account].” This is essentially the opposite of a process in which a cell phone user voluntarily opts in to having their location tracked. Rather, Google directs the user to give permission for location tracking during phone setup and tells them that a core program on their phone won’t work unless they give this permission. A rational cell phone user, told during setup that their phone won’t work properly unless they give various permissions, and then told directly by their phone to turn on those permissions for their account, will turn on the permissions. Indeed, cell phone users generally grant permissions blindly with far less coercion. The fact that Google’s description was apparently not accurate—according to testimony at the Chatrie trial, Google Assistant worked fine without Location History—only makes its pressure-inducing qualities more obvious.
The “One-Third of Google Users” Claim
Finally, Google’s Location History product manager, Marlo McGriff, told the district court at various points that only “roughly one-third of active Google users” had activated Location History on their accounts. This statement might reasonably be interpreted as meaning that two-thirds of Google users had seen Google’s permission screen and declined to give permission, perhaps by selecting the “No thanks” option instead of the highlighted-by-default “Turn on” option. But that isn’t what that statistic means, to the extent that it has any meaning at all.
According to McGriff’s later testimony, “active Google users” apparently included everyone with a Google account via any app or service, which would include Gmail, YouTube, and other apps not associated with Location History. Many of these users would never have seen a permission screen for Location History in the first place. For this and other reasons, the percentage of people who give Google permission when they see a permission screen is obviously higher than one-third. Further, as McGriff confirmed during later testimony, his denominator of “active Google users” also included users who live in countries where Location History is banned altogether. In other words, it is simply not the case that two-thirds of Google users rejected Location History. Rather, they likely either were not shown the Location History permission screen because they did not activate an app that used it, or they lived in a country where Google was not permitted to track their location. While Google has never disclosed the proportion of users who say yes to its permission screens, it’s likely to be high. Just imagine the percentage of users in Chatrie’s situation who give permission after seeing a confusing permission screen during phone setup, being told that permission is necessary for a core phone program to work, and being directed to give permission by the screen itself.
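A quick back-of-the-envelope calculation shows why the denominator matters. The numbers below are purely hypothetical, since Google has not disclosed how many users ever see the prompt or how many accept it; the point is only that an acceptance rate computed over all account holders understates the rate among users who actually face the permission screen.

```python
# Hypothetical illustration of the denominator problem in the
# "one-third of active Google users" statistic. None of these numbers
# come from Google; they only show how the arithmetic works.
active_users = 1_000_000_000          # everyone with any Google account
enabled_lh = active_users // 3        # "roughly one-third" have Location History on

# Suppose only half of account holders ever saw a Location History prompt
# (the rest use only Gmail or YouTube, or live where the feature is unavailable).
saw_prompt = active_users * 0.5

acceptance_rate = enabled_lh / saw_prompt
print(f"Acceptance rate among users who saw the prompt: {acceptance_rate:.0%}")
# Prints roughly 67% -- and higher still if fewer than half ever saw the prompt.
```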
None of this is to suggest that Google is unconcerned about user privacy. Google ultimately asserted in its Chatrie amicus brief that its users retain Fourth Amendment rights in their Location History data. Moreover, four days after the oral argument in the Chatrie case, Google announced that after a year or so it would no longer store Location History data except on users’ own devices, essentially ending Google’s participation in geofence warrants. Google has apparently decided to get out of the detailed location tracking business. Although this decision has both pros and cons, and may impede law enforcement investigations, Google’s apparent concern for its users’ privacy is both commendable and understandable. While once police officers were thought to need a warrant to obtain geofence data, the Fourth Circuit seemed at least somewhat likely to declare that such data could be obtained warrantlessly. Google may have felt trepidation at facilitating widespread warrantless location surveillance.
Voluntary Disclosure and the Fourth Amendment
Misunderstandings about cell phone users and their data disclosures are likely to arise when courts focus too closely on the alleged “voluntariness” of these disclosures. Users are not the only ones who have a poor understanding of the complexities and infrastructures of digital data transfers. Judges are unlikely to conduct their own independent studies of how user permissions are obtained, whether users are properly informed when they give permissions, or the myriad undisclosed and often sensitive uses of consumer data by companies like Google.
Voluntary data disclosure is a profoundly slippery concept in the digital world. What appears to be a sophisticated user checking an “opt-in” box deep in Google settings may turn out to be a confused person scolded by his phone to give several permissions at once and told that core programs on the phone will not work if he refuses. Further, even if Chatrie, or a federal judge, had read and understood all the information Google offered to its users, they still likely wouldn’t begin to understand the extent of Google’s data collection and use. Google did not just store the addresses a user typed into Google Maps or Assistant; it automatically tracked his location roughly 238 times per day. It tracked him constantly, whether he was moving or stationary, and even while he was asleep. It tracked the user even if he deleted the app used to activate Location History. Is this location data collection “voluntary” because the user clicked yes on a bare-bones permission screen years earlier during the setup of his phone? The question is both difficult to answer coherently and in danger of missing the point. Disclosure of one’s information to other parties is inevitable in the digital world. The real question is whether people nonetheless retain Fourth Amendment rights against sensitive data surveillance by government officials.
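For a sense of scale, the collection rate described in the Chatrie record works out to a location fix roughly every six minutes, around the clock. The short calculation below uses only the 238-points-per-day figure drawn from the case; everything else is simple arithmetic.

```python
# Arithmetic on the collection rate described in the Chatrie record:
# roughly 238 saved location points per day.
points_per_day = 238
minutes_between_points = 24 * 60 / points_per_day
print(f"About one location point every {minutes_between_points:.1f} minutes")
# Prints: About one location point every 6.1 minutes
```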
This is not to say that voluntariness has no place in a Fourth Amendment inquiry. Rather, some courts’ excessive concern with voluntariness may lead them astray in cases like Chatrie’s. Indeed, it previously led the Fourth Circuit and other courts astray in cases dealing with the disclosure of location information to cell phone service providers. In 2016, the Fourth Circuit (and several other circuit courts) held that the government could track cell phone users’ locations without a warrant, because users supposedly voluntarily disclosed their location information to their cell phone providers. The Supreme Court eventually disagreed, reversing those circuit court decisions and making clear that users had little choice but to disclose their location information, and that location tracking was of constitutional concern for reasons having nothing to do with voluntariness.
In any event, as discussed above, a close examination of the supposed voluntariness of app permissions reveals that they aren’t so voluntary after all. The process of obtaining app permissions from cell phone users is characterized by pressure, manipulation, misinformation, and confusion instead of informed consent. The Supreme Court is likely to be concerned about these issues, as it has been with similar issues surrounding cell phone use in the past. With any luck, the Court may ultimately set the record straight on user permissions for data collection—and ensure Fourth Amendment protection against government location tracking.
Disclosure: Google provides financial support to Lawfare. This article was submitted and handled independently through Lawfare’s standard submissions process.
– Matthew Tokson is a Professor of Law at the University of Utah S.J. Quinney College of Law, writing on the Fourth Amendment and other topics in criminal law and procedure. Published courtesy of Lawfare.