You know this: Instagram shows you content designed to hold your attention and keep you hooked on the app, because the time you spend there is sold to advertisers.
This is the Instagram business model.
So, let’s say you search for the name of a drug. Your feed will be loaded with content Instagram thinks you want to see: recovery programs, addiction stories, parties and, yes, information from and about illegal drug dealers. The good news? If you’re an addict, you’ll find support and help. The bad news? Just follow or like a dealer’s account and, well, you’re free to roam the world of Insta-drugs.
The FDA lays part of the blame for the opioid crisis on online platforms that are ineffectively policed, even though Facebook and Instagram (Instagram is owned by Facebook) are working on the problem.
The social media giants have built artificial intelligence software to identify Russian bots and terrorist accounts, and they’re now making progress on software that can do the same for drugs, recognizing photographs of pills and spotting phone numbers in posts.
Even though these social media sites are working on filtering illegal drug content, their progress is slow given how popular the platforms are, and therefore how fast these illegal sales can grow.
Pew Research Center reports that more Americans use Instagram than Twitter, Snapchat or WhatsApp. Last year the company said 800 million people were using the app every month.
And there are approximately 40,000 illegal online pharmacies.
Read more here from the Washington Post.
Lawmakers are considering changes to liability laws so that social media companies would no longer be shielded from responsibility for the content they allow on their sites.
But with this change, one of two things is likely to happen: either dealers will leave social media and shift to an underground network known as the dark web (Geek out on it here), or social media companies will be pushed to stop policing their sites and illegal social media dealers could thrive. Check out the Debater here.