WI: Google’s scans of user content for CSAM are a private search

It’s a private search when Google scans user content for child sexual abuse material (CSAM) and then reports what it finds to law enforcement. Google is not required to search, only to report what it finds. [Note that the court observed this was on a “sparse” record with stipulated facts, not that a fuller record would have made a difference.] State v. Sharak, 2026 WI 4 (Feb. 24, 2026):

2. Did Google act as an instrument or agent of the government here?

[*P20] Having established the relevant test, we turn to Google’s search in this case. Considering the totality of the circumstances, we conclude that Rauch Sharak has not met his burden to show that Google acted as an instrument or agent of the government.

[*P21] As an initial matter, we note that Rauch Sharak bears the burden to show that Google acted as an instrument or agent of the government, and here we have a sparse record. See Payano-Roman, 290 Wis. 2d 380, ¶23 (laying out the burden). There was no suppression hearing in this case. Instead, the circuit court said the facts were not in dispute and adopted facts from the State’s suppression-motion response. Rauch Sharak has not challenged any of the circuit court’s actions or findings in this regard. Considering the record in front of us, we determine Rauch Sharak has failed to meet his burden.

[*P22] The undisputed facts suggest that Google did not act as an instrument or agent of the government. To begin, the government was not involved in Google’s search. Google scanned Rauch Sharak’s files on its own. An employee opened and viewed the files without law enforcement participation. The government became involved only after Google created the CyberTip. Compare the facts here with Payano-Roman, for instance, where the defendant was in custody, police officers sat in his hospital room during the search, and an officer physically administered the laxative to get access to the swallowed evidence. See 290 Wis. 2d 380, ¶27; see also United States v. Booker, 728 F.3d 535, 540-41 (6th Cir. 2013) (emergency room doctor is a government agent for Fourth Amendment purposes where the defendant was in police custody, police brought the defendant to the hospital to have contraband removed from his rectum, and the police knew that the doctor would do a procedure to remove it). In short, the government became involved only after Google scanned, opened, and viewed Rauch Sharak’s files. That lack of government involvement suggests Google was not acting as an instrument or agent of the government.

[*P23] Moreover, Google had a business reason to complete this search. By searching for and removing CSAM from its platform, Google helps ensure that its users have a good experience using its products. See United States v. Ringland, 966 F.3d 731, 736 (8th Cir. 2020) (“Google acted out of its own obvious interests in removing child sex abuse from its platform.”); United States v. Miller, 982 F.3d 412, 425 (6th Cir. 2020) (“Google … sought to rid its virtual spaces of criminal activity … to protect [its] business[].”). The record reflects as much: Google’s terms of service state that users must “comply with applicable laws” to “maintain a respectful environment for everyone,” and Google’s abuse policies say that it checks for abuse on its site to “maintain a positive experience for everyone that uses Google products.” The fact that Google had a business purpose independent of any desire to help law enforcement suggests that Google was not acting as an instrument or agent of the government.

[*P24] Rauch Sharak’s arguments do not convince us otherwise. First, he argues that Google is a government agent because NCMEC participated in the search. He highlights the circuit court’s conclusion that NCMEC provided a list of known CSAM to Google and therefore participated in the search. We reject his assertion that providing a list of known CSAM amounts to involvement in a search—providing a tool to identify contraband is too attenuated to constitute participation in a search. Moreover, it appears that Google may be able to scan user content for CSAM without NCMEC’s list, contrary to Rauch Sharak’s argument. See Miller, 982 F.3d at 426 (determining that “Google did not even scan for any NCMEC-provided hash values” during the time when it flagged a user’s files as CSAM). In the end, NCMEC was not involved in Google’s scan of Rauch Sharak’s content or the viewing of his files, which, again, Google did independently.

[*P25] Rauch Sharak also argues that Google acted as an agent of the government because the federal government enacted and enforces various federal statutes related to ESPs and CSAM. Specifically, he points to 18 U.S.C. § 2258A, 47 U.S.C. § 230, and the criminal and civil statutes referenced in 47 U.S.C. § 230(e)(5). He says these statutes encourage Google to scan for CSAM and provide a law-enforcement reason for Google’s search.

[*P26] We first look at 18 U.S.C. § 2258A, which regulates CSAM reporting. That section creates a duty for ESPs to submit a CyberTip if they discover potential CSAM-related federal law violations. See § 2258A(a)(1), (2). The ESPs are required to submit CyberTips to NCMEC, which must forward the tips to law enforcement. See § 2258A(a), (c). Importantly, though, the statute contains a disclaimer that “[n]othing in this section shall be construed to require a provider to … affirmatively search, screen, or scan for” CSAM. § 2258A(f)(3).

[*P27] The disclaimer in 18 U.S.C. § 2258A(f)(3) blunts any argument that 18 U.S.C. § 2258A turns ESPs into instruments or agents of the government. Indeed, § 2258A(f)(3) says searches are not required. Many federal courts have relied on that disclaimer in determining that 18 U.S.C. § 2258A does not turn ESPs into government agents. See United States v. Rosenschein, 136 F.4th 1247, 1256 (10th Cir. 2025); United States v. Sykes, 65 F.4th 867, 877 (6th Cir. 2023); United States v. Rosenow, 50 F.4th 715, 730 (9th Cir. 2022); United States v. Meals, 21 F.4th 903, 907 (5th Cir. 2021); Ringland, 966 F.3d at 736.

[*P28] We turn to the other statutes that Rauch Sharak cites—47 U.S.C. § 230 and the statutes referenced in § 230(e)(5). Section 230 governs liability for online content and for ESPs who screen or block user content. See § 230(c). It says that “[n]o provider … shall be held liable on account of … any action voluntarily taken in good faith to restrict access to or availability of material that the provider … considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.” § 230(c)(2)(A). Paragraph 230(e)(5) provides that “[n]othing in this section (other than subsection (c)(2)(A)) shall be construed to impair or limit” enforcement of certain sex-trafficking-related civil and criminal violations. For instance, § 230(e)(5)(B) says that § 230 does not limit state-law prosecutions for conduct that would violate the federal child sex-trafficking provisions in 18 U.S.C. § 1591.

[*P29] Rauch Sharak argues that this scheme encourages ESPs to scan for CSAM by granting immunity to ESPs that moderate content and creating civil and criminal liability if ESPs do not scan for CSAM. While we acknowledge that the statutes Rauch Sharak identifies may encourage ESPs to search for CSAM, we do not think he has shown that the statutes turn Google into an instrument or agent of the government under the totality of the circumstances. Though § 230(c) may grant immunity to ESPs that choose to scan for CSAM, it does not require, reward, or incentivize scanning for CSAM in the first place. See Children’s Health Def. v. Meta Platforms, Inc., 112 F.4th 742, 762 (9th Cir. 2024) (saying that § 230(c) “is fundamentally unlike the regulations in Skinner” because it is “entirely passive”), cert. denied, 145 S. Ct. 2846, 222 L. Ed. 2d 1130 (2025). Moreover, § 230(c)(2)(A) grants immunity for “any action voluntarily taken in good faith to restrict access to” obscene material, which sweeps far more broadly than would be required to induce Google’s CSAM scan at issue here.

[*P30] Likewise, § 230(e)(5) does not turn Google into an instrument or agent of the government. Rauch Sharak claims that ESPs risk criminal liability for failing to remove CSAM from their platforms, but does not develop an argument for why that is so, even in light of evidence otherwise. See Does 1-6 v. Reddit, Inc., 51 F.4th 1137, 1145 (9th Cir. 2022) (holding that § 230(e)(5)(A) “requires that a defendant-website violate the criminal statute by directly sex trafficking or, with actual knowledge, ‘assisting, supporting, or facilitating’ trafficking, for the immunity exception to apply”). Rauch Sharak’s assertion about sex-trafficking liability is not enough to show that the government was involved in Google’s scan for CSAM here. In the end, these statutes are unlike the regulations in Skinner which required that employees comply with intoxication testing or lose their jobs and prevented employers from negotiating away the testing requirements. See 489 U.S. at 615. Even if the statutes encourage Google to scan for CSAM or provide a law-enforcement purpose, Rauch Sharak has not shown that they are enough to turn Google into an instrument or agent of the government.

[*P31] We note that our decision here puts us in good company. Seemingly without exception, federal circuit courts and other state supreme courts have held that ESPs like Google are private actors when searching for CSAM on their platforms. See, e.g., United States v. Bebris, 4 F.4th 551, 560-62 (7th Cir. 2021); Rosenow, 50 F.4th at 728-35; Miller, 982 F.3d at 425-26; United States v. Richardson, 607 F.3d 357, 364-67 (4th Cir. 2010); State v. Pauli, 979 N.W.2d 39, 51-52 (Minn. 2022); State v. Lizotte, 197 A.3d 362, ¶¶22-23, 208 Vt. 240 (Vt. 2018).
