Indiana is suing TikTok, but a serious conversation about social media regulation is still called for
Indiana has filed two lawsuits against the social media giant
The first time I heard about TikTok was at dinner right before my undergraduate graduation in 2019. A classmate of mine described it as this cool platform that was encouraging people to dance and share videos of themselves doing choreography they had learned on the app. I then did not hear about it again until the beginning of the pandemic when, with everyone stuck at home with nothing to do, it spread like wildfire. And if it had once been a platform for spreading the love of dance, that certainly was not what it was anymore.
Last week, the state of Indiana filed two lawsuits against TikTok. The suits were likely inspired by a €750,000 fine imposed on the platform in 2021 by the Dutch Data Protection Authority for violating child privacy, as well as by probes into the company from Ireland’s Data Protection Commission, currently the leading regulatory body in Europe.
In its lawsuits, Indiana similarly focuses on TikTok’s impact on children, alleging that the app (1) exposes children to mature content and (2) violates child privacy.
I do not believe that TikTok, or most social media for that matter, is a force for good in the world. While there are individual content creators who are doing positive work that spreads important messages, and for as much as I enjoy scrolling on Facebook and Instagram, and watching TikTok videos reposted to these places, I do not believe that taken as a whole, social media is a net positive for society. So, while I loathe speaking ill of anyone standing up to social media giants and calling out their problematic platforms, I have real concerns with the way that Indiana has put forward the arguments in their complaints.
In its complaint about exposing children to inappropriate content, Indiana alleges that the app “promotes” inappropriate content, including nudity, alcohol and drug consumption, and profanity, to children. On this basis, the state argues that the 12+ rating the app currently carries on the App Store and Google Play Store is incorrect, and that although TikTok is aware of this, it will not change the rating because the app’s core demographic is 13- to 17-year-olds. If the app were rated 17+, it would lose access to that core demographic.
Stepping away from the legal arguments momentarily and looking at the situation from a public policy standpoint, Indiana is probably right that TikTok should not be available to tweens and younger. The app is specifically designed to be more addictive than other forms of social media, and even if the content children are exposed to is not “mature” in the traditional sense, that does not mean it is not misinformation, or dangerous or inappropriate in other ways. However, the foundation of the argument that inappropriate content is “promoted” to children is a precarious one. The claim is premised on the fact that if you use the search feature on TikTok and type something like “strip-”, it will suggest “stripper Tok” as an autocomplete option. Similarly, if you search “shr-”, it will autocomplete an option to watch videos about hallucinogenic mushrooms.
Searching for content and finding it is not the same as having that content promoted to you. While the core demographic for TikTok may ride a school bus, they are not the only users of the app, and the fact that videos of pole dancing and alcohol consumption exist on the app is not, in and of itself, the issue. Nor should it be. A 14-year-old searching for videos of scantily clad women, while something we try to regulate to a certain extent, is not a phenomenon new to TikTok, and speaking as someone who was a teenager who wanted a Facebook account less than a decade ago, it’s pretty easy to check a box claiming to be 18 when no one is asking for any more proof than that.

But searchability is not the same as “promotion.” From a logical standpoint, for content to be “promoted” to a user, it would need to keep surfacing the longer they scroll through the app - essentially, what is the path from Baby Shark to naked shark? Though a child or someone in their early teens could have a feed filled with mature content, Indiana has not established in its complaint that such a user would not have needed to seek that content out, at least initially. This is a flaw in the argument that may be difficult for Indiana to overcome, especially given the existing protections for social media platforms in the United States.
While we should absolutely be concerned about what children and teenagers are learning on TikTok, we should be no less worried about what TikTok is learning about them. TikTok can collect data on a user’s location, phone model, contacts, facial features, voice, interests, and screen use patterns. Given the interest of the Chinese government in ByteDance, the company that created and owns TikTok, this data collection may not be something to take lightly, particularly because, as the second lawsuit alleges, it can be used to train AI technology which will, in turn, be able to produce content that influences users’ beliefs and actions in all of the ways social media is already so capable of doing. Though TikTok claims that it has not given and will never give user data to the Chinese government, it does provide access to ByteDance. Regardless, the level of data collection the company can engage in remains alarming, especially given that the average user spends over 90 minutes a day on the app.
The privacy lawsuit describes TikTok as a “wolf in sheep’s clothing.” While we are likely a very long way from a decision in this case, if it ever even gets to that stage, it is difficult not to view this statement, at the very least, as true. TikTok is filled with no end of bright colours, fun music, and every manner of entertaining content. It is incredibly easy to use on both the viewer and creator sides. But it is also very dangerous. There are very few safeguards around content, those that exist can be easily circumvented, and dangerous content and misinformation go viral with alarming ease.
The answer to what to do about TikTok, and other social media, is not easy or obvious, particularly in the United States, where section 230 of the Communications Decency Act, which is beyond the scope of this overview, has been consistently upheld as exempting social media platforms from liability for content that is uploaded and algorithmically fed to users. The answer is not, however, to do nothing, and it is time to take a serious look at the features and impacts of social media.
It’s almost Chanukka! While I think I’m a little old to be asking for presents, it would mean the world to me if you subscribe for free and help me to reach my goal number of subscribers by the end of the year! If you already subscribe, consider forwarding this newsletter to someone who might be interested.
As always, if you would like to support Horse of a Different Colour in other ways (and you’re not my mom), you can do so through the “Keep Me Caffeinated” button or by reaching out in any of the methods below.
If you are interested in a lengthier discussion of section 230 or any other topic related to law, Judaism, and beyond, comment below, reply to this email or hit me up on social media: @sadie_rw on Twitter and @horseofadifferent_colour on Instagram.