N.D.Ga.: Despite a technical mistake in this geofence warrant, it’s sustained under GFE

The charge is murder in aid of racketeering, and a geofence warrant was used. The process is described below; the application was defective at step two, but not so defective that the good-faith exception didn't apply. At the end, the court discusses the reasonable expectation of privacy in CSLI records. United States v. Brown, 2025 U.S. Dist. LEXIS 112603 (N.D. Ga. June 13, 2025):

The Court has reviewed all of the supplemental authority briefed by the parties in addition to the cases discussed in the R&R to aid in its consideration of the unsettled and controversial Fourth Amendment question of the validity of geofence warrants. After careful review, the Court concludes that, although the first geofence warrant was lacking in sufficient particularity at step two, which impermissibly allowed law enforcement unfettered discretion to obtain the identity of any person whose location history data placed him or her near the scene of a crime without any additional probable cause, exclusion is not warranted in this case under the good-faith exception.

. . .

To give important context to the precise issues at hand, the Court will partially recount the helpful history of the use of geofence warrants provided by the Fifth Circuit in United States v. Smith [110 F.4th 817 (5th Cir. 2024)]:

Google received its first geofence warrant request in 2016. Since then, requests for geofence warrants have skyrocketed in number. . . . By 2021, geofence warrants comprised more than 25% of all warrant requests Google received in the United States.

. . .

Unlike a warrant authorizing surveillance of a known suspect, geofencing is a technique law enforcement has increasingly utilized when the crime location is known but the identities of suspects are not. Thus, geofence warrants effectively work in reverse from traditional search warrants. In requesting a geofence warrant, law enforcement simply specifies a location and period of time, and, after judicial approval, companies conduct sweeping searches of their location databases and provide a list of cell phones and affiliated users found at or near a specific area during a given timeframe, both defined by law enforcement.

So far, Google has been the primary recipient of geofence warrants, in large part due to its extensive Location History database, known as the “Sensorvault.” Google collects data from accounts of users who opt in to Google’s Location History service. Location History is disabled by default. For Location History to collect data, a user must make sure that the device-location setting is activated, and that Location Reporting is enabled. … In October 2018, Google estimated that approximately 592 million—or roughly one-third—of Google’s users had Location History enabled. Once a person enables Location History, Google begins to log the device’s location into the Sensorvault, on average, every two minutes by tracking the user’s location across every app and every device associated with the user’s account. In other words, once a user opts into Location History, Google is always collecting data and storing all of that data in the Sensorvault.

. . .

Early on, when law enforcement officials first started requesting geofence warrants, they would simply ask Google to identify all users who were in a geographic area during a given time frame. However, Google began taking issue with these early warrants, believing them to be a “potential threat to user privacy.” Thus, Google developed an internal procedure on how to respond to geofence warrants. This procedure is divided into three steps.

Step 1

At Step 1, law enforcement provides Google with the geographical and temporal parameters around the time and place where the alleged crime occurred. Google then searches its Sensorvault for all users who had Location History enabled during the law enforcement-provided timeframe.

. . .

After Google searches its Sensorvault, it determines which accounts were within the geographic parameters of the warrant and lists each of those accounts with an anonymized device ID. Google also includes the date and time, the latitude and longitude, the geolocation source used, and the map display radius (i.e., the confidence interval). The volume of geofence data produced depends on the size and nature of the geographic area and length of time covered by the geofence request.

. . .

Step 2

At Step 2, law enforcement contextualizes and narrows the data. During this step, law enforcement reviews the anonymized list provided by Google and determines which IDs are relevant. As part of this review, if law enforcement needs additional de-identified location information for a certain device to determine whether that device is actually relevant to the investigation, law enforcement can compel Google to provide additional location coordinates beyond the time and geographic scope of the original request. The purpose of this additional data is to assist law enforcement in eliminating devices that are, for example, not in the target location for enough time to be of interest, or were moving through the target location in a manner inconsistent with other evidence. As a general matter, Google imposes no geographical limits on this Step 2 data. Google does, however, typically require law enforcement to narrow the number of users for which it requests Step 2 data so that the Government cannot simply seek geographically unrestricted data for all users within the geofence.

Step 3

Finally, at Step 3, law enforcement compels Google to provide account-identifying information for the users that they determine are relevant to the investigation. This identifying information includes the names and emails associated with the listed device IDs. Using this information, law enforcement can then pursue further investigative techniques, such as cell phone tracking, or sending out additional warrants tailored to the specific information received.

110 F.4th at 821-25 (internal citations, quotations, and alterations omitted).
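For readers who find the three-step procedure easier to follow as data flow, here is a schematic Python sketch of it. Everything in it is hypothetical: the toy coordinates, the `Ping` record, the box-and-window representation of the geofence, and the account table are illustrations only, not Google's actual data formats or API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Ping:
    device_id: str   # anonymized ID; Step 1 output carries these, not names
    minute: int      # minutes after the window opens (toy timestamp)
    x: float         # toy planar coordinates standing in for lat/lon
    y: float

# Step 1: for the warrant's geographic box and time window, return every
# anonymized record whose coordinates and timestamp fall inside both.
def step1(sensorvault, box, window):
    (x0, y0, x1, y1), (t0, t1) = box, window
    return [p for p in sensorvault
            if x0 <= p.x <= x1 and y0 <= p.y <= y1 and t0 <= p.minute <= t1]

# Step 2: law enforcement narrows the list to the device IDs it deems
# relevant, and for those IDs may obtain location data beyond the
# original geographic and temporal scope (the step found defective here).
def step2(sensorvault, relevant_ids):
    return [p for p in sensorvault if p.device_id in relevant_ids]

# Step 3: the provider de-anonymizes only the narrowed ID list.
def step3(account_table, relevant_ids):
    return {d: account_table[d] for d in relevant_ids}

# Toy Sensorvault: device "a" lingers at the scene, "b" passes through once,
# "c" was never inside the geofence at all.
vault = [
    Ping("a", 5, 1.0, 1.0), Ping("a", 7, 1.1, 1.0), Ping("a", 9, 1.0, 1.1),
    Ping("b", 6, 1.5, 1.5),
    Ping("c", 6, 9.0, 9.0),
    Ping("a", 40, 8.0, 8.0),   # outside the original window; Step 2 can reach it
]
accounts = {"a": "user-a@example.com", "b": "user-b@example.com"}

hits = step1(vault, (0.0, 0.0, 2.0, 2.0), (0, 30))
ids_in_fence = {p.device_id for p in hits}   # devices "a" and "b" only
relevant = {"a"}                             # analyst keeps the lingering device
expanded = step2(vault, relevant)            # now includes the minute-40 ping
identified = step3(accounts, relevant)       # name/email for "a" alone
```

Note how the Step 2 call pulls device "a"'s minute-40 ping from well outside the original box and window, which is the "unfettered discretion" the court found problematic in the first warrant.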

As to the third-party information, 2025 U.S. Dist. LEXIS 112603, at *33-36:

But again, while this ruling may have dealt with a novel type of information and changed how the third-party doctrine is applied, it remained tethered to the original aims of the Fourth Amendment—”to secure the privacies of life against arbitrary power” and “to place obstacles in the way of a too permeating police surveillance.” Carpenter, 585 U.S. at 305. Put another way by Judge Wynn in his excellent concurrence in Chatrie, “[i]nstead of ‘mechanically applying the third-party doctrine,’ Carpenter applied a new framework rooted in historical understandings of Fourth Amendment privacy rights but adapted to the particular surveillance technology at issue.” Chatrie, 136 F.4th at 120 (Wynn, J., concurring) (quoting Carpenter, 585 U.S. at 314).

We live in an age when many of the most vital and intimate parts of what we might deem integral to our private selves or our activities have been embedded in the cloud. Often, they are stored there intentionally to allow us later to save and revisit cherished memories, or perhaps for the more mundane administrative purposes of quickly locating tax returns or insurance information when needed on the fly. But much of this information storage is automatic, unthinking, just another feature of one of the dozens of apps we have on our phones. Ours is an era of convenience, where the near-constant documentation of our lives murmurs along steadily in the background of our living. We are accustomed to it and often give no thought to it. A maps or rideshare app prompts us to turn on location services to improve location accuracy, and we agree. See United States v. Chatrie, 590 F. Supp. 3d 901, 936 (E.D. Va. 2022), aff’d on reh’g en banc, 136 F.4th 100 (4th Cir. 2025) (“While the Court recognizes that Google puts forth a consistent effort to ensure its users are informed about its use of their data, a user simply cannot forfeit the protections of the Fourth Amendment for years of precise location information by selecting ‘YES, I’M IN’ at midnight while setting up Google Assistant, even if some text offered warning along the way.”). It proves more challenging with each passing year to apply 18th-century or even 1970s-era understandings of privacy to a world where faceless, voiceless, corporate or government “third parties” collect deeply sensitive information through automatic processes on our phones or other smart devices as we go about our days. It is for these reasons that “the third-party doctrine is an increasingly tenuous barometer for reasonable privacy expectations in the digital era.” Chatrie, 136 F.4th at 119 (Wynn, J., concurring).

As described above and in the Supreme Court’s Carpenter opinion, the modern rule from Katz is that “[w]hen an individual seeks to preserve something as private, and his expectation of privacy is one that society is prepared to recognize as reasonable, . . . official intrusion into that private sphere generally qualifies as a search and requires a warrant supported by probable cause.” Carpenter, 585 U.S. at 304. So written, the rule appears simple and straightforward in its application. But courts have struggled mightily to apply it faithfully in the five decades since Katz. For example, it is difficult to reconcile this formulation of the rule with the holding in Miller that bank records can be obtained without a warrant. Certainly, society generally recognizes bank records as private—so private they are password-protected and often doubly hidden behind two-factor authentication to boot. Yet longstanding precedent in the form of the third-party doctrine dictates that, because we allow bank employees access to such information, we would be unreasonable to expect that law enforcement be denied the same ready access. That did not ring true to many reasonable minds in the 1970s when Miller was decided and, since that time, the doctrine has become even more “ill suited to the digital age, in which people reveal a great deal of information about themselves to third parties in the course of carrying out mundane tasks[.]” Jones, 565 U.S. at 417 (Sotomayor, J., concurring). See also, e.g., Smith, 442 U.S. at 749 (Marshall, J., dissenting) (“Privacy is not a discrete commodity, possessed absolutely or not at all. Those who disclose certain facts to a bank or phone company for a limited business purpose need not assume that this information will be released to other persons for other purposes.”).

