TikTok has been in the headlines over the last couple of days because of its Creator Marketplace. Phrases that included “Black” and “Black Lives Matter” were flagged as “inappropriate content” on the platform. Ziggi Tyler, a 23-year-old content creator, posted a video showing that he was flagged for putting these phrases in his bio. On the same day, he posted a follow-up video showing that phrases like “I am a neo-Nazi” and “I am an anti-Semite” were not flagged.
Tyler’s two videos on this matter have already gained around a million views.
TikTok has apologised for what happened and says it is working on the issue.
How did this happen?
Tyler first noticed his content being flagged when he attempted to update his bio page in TikTok’s Creator Marketplace, a beta feature that allows creators to partner with branded sponsors and has been available since last year. When he tried to update his bio to include phrases like “Black Lives Matter”, “Pro-Black” or “Black Success”, a message popped up saying that his bio contained “inappropriate content”. When he instead used phrases such as “supporting white supremacy” or “pro-white”, those terms were not flagged.
Tyler told Forbes magazine that he wanted to highlight his racial background within the marketplace so that advertisers looking to focus on racial justice, or to broaden their campaigns, would choose him. TikTok explained that its algorithms flagged Tyler’s bio because it included the word “audience”. Since its AI tools were trained to flag bios containing the word “die”, the algorithm also flagged any word that contained “die” as a substring, such as “audience”. The platform also said that its algorithms would flag any combination of words like “die” and “Black”.
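The failure mode TikTok describes can be illustrated with a minimal sketch of naive keyword moderation: a substring check that matches “die” inside “audience”, and an unordered co-occurrence check that matches any pairing of flagged words. The keyword lists and function below are hypothetical illustrations, not TikTok’s actual code.

```python
# Hypothetical sketch of naive substring and co-occurrence flagging.
# These lists are illustrative only, not TikTok's real moderation rules.
FLAGGED_SUBSTRINGS = ["die"]
FLAGGED_COMBINATIONS = [{"die", "black"}]

def is_flagged(bio: str) -> bool:
    text = bio.lower()
    # Substring check: "audience" contains "die", so it trips the filter
    # even though the word itself is harmless.
    if any(term in text for term in FLAGGED_SUBSTRINGS):
        return True
    # Co-occurrence check: any bio containing both words of a flagged
    # pair is caught, with no regard to word order or context.
    words = set(text.split())
    return any(combo <= words for combo in FLAGGED_COMBINATIONS)

print(is_flagged("Black audience"))   # True: "audience" contains "die"
print(is_flagged("I am a neo-Nazi"))  # False: not on the naive list
```

This is why innocuous bios were blocked while genuinely hateful phrases passed through: the filter matched surface strings rather than meaning.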
A spokesperson for the app said that its Creator Marketplace protections “were erroneously set to flag phrases without respect to word order.”
TikTok said it was still developing the Marketplace section of the app and has promised to correct the AI issues that caused these phrases to be flagged.
“Our TikTok Creator Marketplace protections, which flag phrases typically associated with hate speech, were erroneously set to flag phrases without respect to word order. We recognize and apologize for how frustrating this was to experience, and our team has fixed this significant error. To be clear, Black Lives Matter does not violate our policies and currently has over 27B views on our platform.”
TikTok spokesperson in a statement to NBC News
Algorithms and content creation
This isn’t the first time that algorithms have been implicated in how content is created or surfaced on social media platforms. Last year, during the Black Lives Matter protests, multiple Black creators claimed that TikTok had suppressed content about George Floyd’s death; the company attributed the suppression to a “technical glitch”. In May, Media Matters, a non-profit media watchdog, found that TikTok’s algorithm promoted homophobic and transphobic content to viewers: liking a single anti-LGBTQ+ video on an account led TikTok to recommend more videos of that kind. Similar forms of algorithmic censorship have been reported on Twitter, Instagram and Facebook over the last few years, most recently around the violence towards Palestinians in Gaza earlier this year.
Away from algorithms, Black creators on TikTok have gone on strike to call out dance appropriation.
Charli D’Amelio and Addison Rae were criticised for not crediting Black creators when performing TikTok dances on their platforms or during televised appearances.
TikTok’s algorithms have once again raised questions about how these social media platforms moderate and allow content on their sites.
It also raises concerns over racial biases within the technology sector, which has been a widely discussed issue.
This story shows that however technologically advanced society becomes, we are still struggling to meet the demands that come with it.