(April 21, 2023)
– It makes it a crime for providers to “knowingly host or store” CSAM or “knowingly promote or facilitate” the sexual exploitation of children, including the creation of CSAM, on their platforms.
– It creates a new civil claim and corresponding Section 230 carveout to encourage private lawsuits against internet companies and app stores for the “promotion or facilitation” of child exploitation, the “hosting or storing of child pornography,” or for “making child pornography available to any person”—all based on the very low standard of negligence.
– It requires providers to remove (in addition to reporting and preserving) “apparent” CSAM when they obtain actual knowledge of the content on their platforms.
– It creates a notice-and-takedown system overseen by a newly created Child Online Protection Board, requiring providers to remove or disable content upon request even before an administrative or judicial determination that the content is in fact CSAM.
(…)
Because existing law already prohibits the distribution of CSAM, the bill's "promote or facilitate" language must be read to cover something beyond distribution, and could therefore be interpreted as reaching far more passive conduct, such as merely providing an encrypted app.
(…)
Not every platform will have the resources to fight these legal threats in court, especially newcomers competing with entrenched giants like Meta and Google.