By Niket Nishant
(Reuters) - Reddit will need to spend heavily on content moderation as it faces greater scrutiny as a newly public company, analysts said, a shift that threatens its longstanding policy of relying on an army of volunteers to maintain order on its platform.
The newly listed company warned in its initial public offering (IPO) paperwork that its unusual approach to content moderation can at times expose it to disruptions, as in 2023, when several moderators protested its decision to charge third-party app developers for access to its data.
Depending on volunteers is not sustainable, given the regulatory scrutiny that the company will now face, said Julian Klymochko, CEO of alternative investment solutions firm Accelerate Financial Technologies.
"It's like relying on unpaid labor when the company has nearly a billion dollars in revenue," he added. Reddit reported revenue of $804 million in 2023, according to an earlier filing.
Reddit will need to make substantial investments in trust and safety, which could lead to a "dramatic" rise in expenses, Klymochko said.
A spokesperson for Reddit said that alongside moderators, it also had robust internal safety teams, which used a combination of artificial intelligence, machine learning and human review to enforce its content policy.
The policy allows it to ban users and communities for harassment, incitement to violence and hateful content.
The company also invests in safety tools for moderators to help automate their tasks, the spokesperson added.
Stamping out offensive content is crucial for social media platforms, which risk an exodus of advertisers keen to keep their ads from appearing next to unsuitable material.
Many advertisers have been wary of Elon Musk's X