This month marks one year since Charlie Ternan, a 22-year-old, soon-to-be college graduate, was found unresponsive in his room after taking what he thought was Percocet® to relieve his back pain. The pill was not a legitimate pain medication but a fentapill, a counterfeit drug containing the synthetic opioid fentanyl. Charlie purchased the pill through a social media site, unknowingly connecting with a counterfeit dealer. Nine months after Charlie’s death, Laura Berman, a famed relationship therapist and television host, lost her 16-year-old son, Sammy Berman, who had similarly bought a prescription drug containing fentanyl on Snapchat, a social media platform commonly used by teenagers. The convenience of internet shopping leads many young buyers to turn to social media and internet platforms, not only to buy ordinary goods but also medications. This is especially concerning as the world grows increasingly reliant on the internet for access to health care and public health information.
What some fail to recognize – especially young adults – is that most social media platforms are designed to keep users engaged by pushing content they believe the individual wants based on their activity on the platform. The algorithmic system that tracks users’ every move uses each click, search, like, and friend request to create a personalized feed for every user. Though algorithms were born out of platforms’ attempts to produce a list of recommended items to help users choose content, they have arguably become much more than a short, tailored menu. For every click or search a user makes on their own, the algorithm spits back recommended content based on assumptions about what is likely to entice the user to stay engaged. In this way, these algorithms are not designed to satisfy the user’s search; they are designed to keep users searching for more. And after multiple clicks or searches, the line between what a user was searching for and what content the algorithm is pushing can quickly become blurred.
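The dynamic described above – ranking by predicted engagement rather than by what the user asked for – can be illustrated with a toy sketch. This is a simplified, hypothetical model, not the actual code of any platform: real recommendation systems use machine-learned models over vast signal sets, but the core feedback loop looks roughly like this, where past interactions (clicks, likes, searches) are tallied by topic and candidate posts with the most-engaged topics rise to the top of the feed:

```python
from collections import Counter

def rank_feed(interactions, candidates):
    """Toy engagement-driven ranking: score each candidate post by how
    often its topic tags overlap the user's past clicks, likes, and
    searches, then surface the highest-scoring posts first."""
    # Tally the topics the user has previously engaged with.
    topic_weights = Counter(tag for event in interactions for tag in event["tags"])
    # Score candidates by accumulated topic weight, not by relevance to
    # any one query: the objective is predicted engagement.
    return sorted(
        candidates,
        key=lambda post: sum(topic_weights[t] for t in post["tags"]),
        reverse=True,
    )

# A user who has clicked a few pharmacy-related posts...
history = [
    {"tags": ["fitness"]},
    {"tags": ["pain-relief", "pharmacy"]},
    {"tags": ["pharmacy"]},
]
posts = [
    {"id": "cooking-101", "tags": ["cooking"]},
    {"id": "discount-pills", "tags": ["pharmacy", "pain-relief"]},
    {"id": "gym-tips", "tags": ["fitness"]},
]
# ...is shown more pharmacy-related content first, reinforcing the loop.
feed = rank_feed(history, posts)
```

Even in this crude sketch, a brief flurry of drug-related clicks is enough to push more drug-related content to the top of the feed, which is precisely the reinforcement loop that concerns policymakers.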
Some platforms further mislead the user by blending recommended content with credible advertisements and with pages or accounts the user has knowingly followed, reinforcing the user’s notion that the content they are viewing comes from a trusted source. This may seem harmless, but when an individual is searching to buy a drug, the platform will deliver more drug-related ads, pages, and hashtags to the user’s feed, further enabling the illicit activity. Even if a user is not actively participating on the platform by searching, liking, or sharing, algorithms use other indications of interest, such as time spent hovering over a post. As a result, social media algorithms have become so efficient at suggesting content to users, even passive ones, that they have created a perfect pathway for digital drug dealers to reach potential victims in a matter of a few clicks.
The sale of drugs on social media apps – and other illegal activity perpetrated on these platforms – is nothing new and has long been on the radar of policymakers. Year after year, the chief executive officers (CEOs) of Facebook, Twitter, and Google have come to Washington to testify before members of Congress on what they are doing to prevent their platforms from becoming conduits of dangerous misinformation and criminal activity, including the sale of drugs, which clearly violates their own policies. Though they pledge to do better, their solutions have fallen short in protecting users.
In the last two months alone, two congressional hearings have addressed algorithms and the larger conversation around the misinformation they fuel. On March 25, 2021, the House Energy and Commerce Committee’s Subcommittee on Communications and Technology and Subcommittee on Consumer Protection and Commerce held a joint hearing on disinformation and the role of social media. During this hearing, Committee members questioned the CEOs of Facebook, Twitter, and Google, highlighting the need for content moderation along with ending the sale of illegal drugs on the platforms. Then, on April 27, 2021, the Senate Judiciary Committee’s Subcommittee on Privacy, Technology, and the Law held a hearing entitled Algorithms and Amplification: How Social Media Platforms’ Design Choices Shape Our Discourse and Our Minds, which focused largely on the role algorithms played in fueling the misinformation that led to the attack on the United States Capitol on January 6, 2021.
Unfortunately, despite continued congressional focus on this issue, lives have already been lost. The stories of Charlie Ternan and Sammy Berman should be a warning to those considering purchasing drugs online to beware of what they are being sold, however unassuming or benign it may appear. Even more importantly, these stories should be a wake-up call for social media companies that there is an urgent need to implement safety measures to ensure their platforms do not become vehicles for the illegal sale of medications to unsuspecting consumers.