As an avid TikTok cynic who has blocked TikTok on my home internet, I firmly believe that TikTok and similar products are hurting our society and will only continue to cause social harm.
The world has been enamored with algorithm-based social media products for years. These algorithms are built to keep people on a company's product for as long as possible, an approach as old as the commercial internet itself and a direct result of the internet's reliance on digital advertising and subscriptions.
It is well known that the longer people spend on a platform, and the more ads they view, the more money these companies make.
Each company builds its own algorithm, and each algorithm uses different data sets and objectives to deliver content to users. The other side of the algorithm encourages people to create certain types of content by rewarding them with likes, comments and engagement, giving the creator a dopamine hit and, in rare cases, a few cents as well.
These companies disproportionately encourage people to create salacious content. This type of content, especially when created in TikTok-style short videos, is supercharged by these companies’ algorithms. This artificial boost keeps people creating and keeps viewers watching.
The Wall Street Journal conducted an investigation last year into Meta's Reels algorithm, setting up test accounts that followed only young influencers.
These tests revealed that “Instagram’s system served jarring doses of salacious content to those test accounts, including risqué footage of children as well as overtly sexual adult videos—and ads for some of the biggest U.S. brands,” according to the Wall Street Journal.
The Wall Street Journal did this after finding that thousands of accounts run by, or starring, young creators had huge numbers of adult men among their followers.
They had also observed “that many of the accounts who followed those children also had demonstrated interest in sex content related to both children and adults. The [Wall Street] Journal also tested what the algorithm would recommend after its accounts followed some of those users as well, which produced more disturbing content interspersed with ads,” according to the WSJ.
The WSJ also published a feature on a mom and her teen daughter who, within months of creating the daughter's account, began seeing disturbing messages and comments from a male-dominated follower base.
“Men left public comments on photos of the daughter with fire and heart emojis, telling her how gorgeous she was. Those were the tamer ones. Some men sent direct messages proclaiming their obsessions with the girl. Others sent pictures of male genitalia and links to porn sites.”
The scariest part is that the people in these situations are forced to choose between what could easily be viewed as exploitation and what they see as their shot at fame and wealth.
“That was a reason to say no. There were also reasons to say yes. The mom felt the account had brought her closer with her daughter, and even second and third-tier influencers can make tens of thousands of dollars a year or more. The money could help pay for college, the mom thought.”
The incentive structure must change on platforms like Instagram and TikTok. Creators must be able to make content they can be proud of, content that will not endanger them now or in the future.
It is imperative that these platforms disincentivize the creation of this type of exploitative content. Children and adults should not feel that this is the only way to find success on these platforms. Additionally, platforms should fully enforce their own rules and help creators find success in ways they won't regret later in life. Only then will the platforms be the true success stories they continually claim to be.