As has been reported extensively over the past year, social media companies have come under scrutiny at both the national and international levels for not doing enough to prevent their platforms from being used to facilitate illegal drug sales. Despite reports that many popular platforms have stepped up efforts to police content – both TikTok and Meta have recently claimed to remove nearly 96% of drug-related videos and posts before they are even reported – recent reports suggest they are still falling short in protecting users. In the latest round of the Drug Enforcement Administration’s (DEA) enforcement actions against counterfeit controlled substances (CS), popular apps such as Snapchat, Facebook Messenger, Instagram, and TikTok were involved in one-third of the 390 investigations. These investigations, which took place from May to September, seized a staggering 10.2 million fentanyl pills and over 980 pounds of fentanyl powder. In addition to removing dangerous illicit drugs from circulation, these efforts, part of DEA’s One Pill Can Kill initiative, also seek to warn Americans that buying pills from illegal online sellers can be deadly. The risks are particularly acute for young people like Cooper Davis, a 16-year-old from Kansas who used Snapchat last year to purchase what he assumed was Percocet® from a drug dealer in Missouri; the pill was counterfeit, laced with fentanyl, and it killed him.

His story inspired the name of a bill introduced by Senator Roger Marshall (R-KS) in late September, which would not only require social media companies to report instances of illegal drug sales to law enforcement but also encourage them to share information that would aid in investigations. The Cooper Davis Act (S. 4858) amends the Controlled Substances Act (CSA) to require communication service providers to report to DEA “as soon as reasonably possible” any communications they become aware of detailing an apparent or imminent unlawful sale and distribution of CS. Providers found to have knowingly failed to report these crimes may face fines of up to $150,000 for a first offense and $300,000 for each offense thereafter.

In addition to establishing a duty to report, the bill also outlines the type of information platforms might provide, which includes the following: 

  • Identifying information of the perpetrator – email address, Internet Protocol (IP) address, uniform resource locator, payment information excluding personally identifiable information, and screen names or account monikers
  • Historical information – when and how the perpetrator uploaded, transmitted, or received content relating to the illegal sale and distribution of CS 
  • Geographical location – any information that would help identify where the perpetrator is located, including IP address, verified address, and area code or ZIP code 
  • Complete or partial communication – the complete communication describing the intent to unlawfully sell or distribute a CS or information contained therein, such as symbols, photos, video, icons, or direct messages

There is also a requirement to preserve the contents of the report and related information for 90 days, but the bill leaves it to the service providers’ discretion what information they ultimately include in their report to DEA. The bill also clarifies that it should not be interpreted as requiring communications providers to “affirmatively search, screen, or scan for the facts or circumstances.” 

This is a departure from other recent attempts by some members of Congress to hold social media platforms accountable for adopting content moderation practices that effectively deter criminal activity. For example, the See Something, Say Something Online Act (S. 27), introduced by Senator Joe Manchin (D-WV) in 2021, sought to establish a similar duty to report major crimes, including felony violations of the CSA. However, unlike the Cooper Davis Act, Manchin’s bill attempted to reform Section 230 of the Communications Decency Act, which shields interactive computer service providers from civil liability for the content their users create. Manchin’s bill also sought to establish stricter reporting requirements, including a 30-day deadline and mandates for what information a report must include.

Given these differences, the Cooper Davis Act offers a new legislative strategy for addressing this challenge – one that social media companies and other communications service providers, who have long argued that amending Section 230 would stifle competition and free speech, may be more willing to embrace. However, others have argued that the provisions in the bills Congress has put forth do not go nearly far enough to protect users. A report published in September by family members of victims and state law enforcement officials agrees in spirit with many of the provisions of the Cooper Davis Act but argues that, in many cases, they should go a step further, including by mandating more timely reporting and third-party oversight of compliance. With intense scrutiny on social media and pressure on Congress to take significant steps to address the opioid epidemic, expect to see the Cooper Davis Act and similar bills reintroduced in 2023.