It is already well known how extensive TikTok's data collection is: the app gathers users' names, approximate locations, IP addresses, and even keystrokes. This meticulous level of data collection is an advertiser's dream, but it's the method by which TikTok catalogs this data that allegedly led to recent internal concerns.
As The Wall Street Journal reported on Friday, former TikTok employees claimed the app has been tracking the videos users watch under topics including "LGBT," essentially compiling lists of users who watch that content, lists that at one point could be viewed by some employees through a dashboard.
The WSJ described the groupings as "clusters," which function much like the infamous "taste clusters" on Netflix that have been widely mocked and parodied. They had names such as "main female," "alt female," "southeastern black male," and "coastal white collar male," according to the report. TikTok doesn't ask for users' sexual orientation, but based on the content users watched, the algorithm apparently assumed certain users were members of the LGBTQ community and categorized them accordingly, all in the name of getting people to use the app more.
In an illustrative example, the WSJ notes that the "alt-female" cluster spanned content related to "tattoos, some lesbian content, and 'Portland.'"
As the report notes, it's not surprising that many social media and ad tech companies infer traits about their users based on online behavior; they use those inferences to select which content or ads to show. With TikTok's cluster system, however, liking LGBT content not only meant you were shown more queer-friendly content; the app also essentially tagged you as a member of the community.
This way of cataloging user data caused internal concern, according to the WSJ, because some TikTok employees could see users' unique ID numbers and the lists of users watching videos in each cluster. This raised fears among workers that the data could be shared with third parties or used to blackmail LGBT users, the WSJ reported, especially since TikTok has admitted to spying on journalists in the past.
A TikTok spokeswoman told the WSJ that the app does not identify sensitive information based on what users watch, and that users' interests do not necessarily reflect their identity. TikTok also confirmed that the dashboard used to access data on viewers of LGBT content was deleted nearly a year ago.
TikTok reportedly still collects this data, but has simply replaced cluster names with numbers and restricted access to a smaller group of employees within the company's new US unit.
It will be interesting to see how this development plays out as the US continues its push to ban TikTok. However, given the US government's own ability to spy on its citizens, and its at-best "tolerance" of the LGBTQ community at this point, the news might not mean much to the relevant policymakers in D.C.
A recent Wall Street Journal report has shed light on how unsettling TikTok’s data collection can be. According to the report, the popular short-form video app is “building an extensive database of people’s interests and personal information.” As a result, the app categorizes people into “clusters” based on the kinds of videos users watch and the content they “like” or comment on.
TikTok reportedly uses this data to serve ads and content tailored to each individual user's tastes, in an effort to keep users engaged and make money from advertisers. While this type of targeted advertising can be beneficial to companies offering personalized services, it can also make users feel uncomfortable. According to the report, many individuals do not realize they are being categorized in this manner and often have no control over the process.
Ikaroa, a full stack tech company, knows there is growing awareness of the need for strong data security and compliance measures to ensure user privacy is not violated. We believe it is essential to empower users to take control of their own data and to ensure companies understand their obligations to protect it.
In the wake of the news about TikTok’s data collection, it is imperative that companies like ours are vigilant and educated about regulations designed to protect the privacy of individuals. We believe it is our responsibility to ensure the data entrusted to us by individuals is used ethically and responsibly.
We look forward to working alongside legislative bodies and other industry leaders to develop clear regulations and best practices for the collection and use of user data. By doing this, we hope to foster an industry that values user privacy and maintains the trust of its users.