Content moderation in private for-profit companies

November 12, 2022

Moreover, content moderation presents many technical and social challenges. The two main avenues for such moderation are algorithmic and manual. Given the recent advances in AI that have accompanied the rise of social media, many large platforms now rely on automated methods to moderate content at scale. Though efficient, these algorithms still make frequent mistakes, sometimes allowing harmful content to slip through and other times censoring all discussion of controversial topics altogether. With recent studies showing that social media platforms amplify misinformation over the truth, these small blips can have tremendous impacts. Manual moderation, on the other hand, remains extremely time-consuming while also taking a psychological toll on the human moderators, who are inundated with harmful content daily. With recommendation algorithms pushing increasingly inflammatory messages, moderation is more necessary than ever as a counterweight to the polarization social media fosters.
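To make that trade-off concrete, here is a minimal Python sketch of threshold-based automated moderation. The posts, classifier scores and cutoff are all invented for illustration; this does not depict any platform's actual system.

    # Minimal sketch of threshold-based moderation. The scores stand in for
    # the output of a hypothetical toxicity classifier; all values are
    # invented for illustration.
    posts = [
        ("You should hurt yourself", 0.58),             # harmful, but scored low
        ("Great article, thanks for sharing!", 0.05),   # clearly benign
        ("Let's debate the new vaccine mandate", 0.75), # benign, but scored high
    ]

    THRESHOLD = 0.7  # a single cutoff forces a trade-off between error types

    for text, score in posts:
        if score >= THRESHOLD:
            print(f"REMOVED: {text!r}")  # false positives censor real debate
        else:
            print(f"ALLOWED: {text!r}")  # false negatives let harm slip through

Raising the threshold lets more harmful posts through; lowering it silences more legitimate discussion. No single cutoff eliminates both kinds of error.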

In light of these issues in the tech field, one company that has increased its focus on developing ethical technology is OpenAI, an AI research and development company. OpenAI Member of Technical Staff Valerie Balcom explains that the need for moderation stems from a mindset ingrained in big tech, where companies optimize for specific metrics to simplify an overarching problem.

“Speed and efficiency are the biggest challenges in emerging technologies; everything is always getting faster and smaller,” Balcom said. “If you’re targeting the wrong metric, you can end up with unexpected side effects and terrible outcomes. Unfortunately, [companies] are not incentivized to make safe incentives or metrics.”

YouTube executives, for instance, have previously said that they equate watch time with overall user happiness, which directs their focus primarily toward increasing that single metric. While such an assumption may simplify the situation, it also risks creating unintended consequences: social media platforms' pursuit of maximum engagement has increased polarization and exacerbated political divisions. This quandary of choosing the right metrics is yet another roadblock on the path to a more sustainable tech environment, one where privacy and truth are valued more than profit.
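A short Python sketch can show how optimizing a proxy metric alone produces such side effects. The feed items and numbers below are invented assumptions, not data from any real platform.

    # Toy sketch of proxy-metric optimization: ranking a feed purely by
    # predicted engagement (hours watched). All numbers are invented.
    # Each tuple: (title, predicted hours watched, inflammatory rating 0-1)
    candidates = [
        ("Calm policy explainer",          2.1, 0.1),
        ("Outrage bait: THEY lied to you", 9.4, 0.9),
        ("Local volunteering roundup",     1.3, 0.0),
        ("Conspiracy deep-dive",           7.8, 0.8),
    ]

    # Maximize the proxy: sort by predicted engagement, ignoring everything else.
    feed = sorted(candidates, key=lambda item: item[1], reverse=True)

    for title, hours, inflammatory in feed:
        print(f"{hours:4.1f}h  inflammatory={inflammatory:.1f}  {title}")

In this toy feed, the most inflammatory items rise to the top: watch time is only a proxy for the happiness the platform says it cares about, and the gap between the two is exactly where the unintended consequences live.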

We have seen the fragility of our country's democracy and the increasing divisiveness among citizens. Social media has only exacerbated these longstanding problems, torn between the interests of a private for-profit company and those of an equitable forum for the public exchange of ideas. As these platforms have zeroed in on profit, they push increasingly radical content toward users to drive further interaction, leaving behind the unintended consequence of a more polarized society.

As we proceed into this uncharted territory of technology, we have essentially handed control of our lives to the tech giants that loom over the field, and their algorithms have fed us harmful content in hopes of greater profits. We know such a system is unsustainable; we know it must change. But to enact the proper changes and protect our values of free speech and democracy, technologists assert that we must reconsider the compromise we have so easily made with corporations: sacrificing our privacy for the conveniences of technology and too easily consuming conversations orchestrated in the companies' interests rather than ones we build as active participants.
