YouTube Cracks Down on Unhealthy Community Conversations by Introducing Warnings and Filters

YouTube has recently announced further changes geared towards making its comment section a safer space for all. In addition to warning users who are about to post potentially hateful comments, the site will give creators access to filters that help weed out particularly hateful comments from their videos.

Changes such as these have been warranted for quite a while, but in an environment as politically and racially charged as today's, they have understandably faced setbacks. It is hard to draw the line between moderation for the sake of users' mental peace and obstruction of free speech. However, YouTube has yet to cross that line. The warnings attached to toxic comments, while urging commenters to reconsider their choice of words, still ultimately let users post and so respect their right to a voice. Such caution also seems necessary, as the technology behind recognising hateful comments is very much a work in progress.

To whittle a complex concept down to its basics, toxic comments will be recognised by an AI system. Using feedback and highlighted keywords from previously flagged comments, the AI will check whether what a user is writing resembles them. A pop-up bubble will then ask the user whether they wish to edit their comment or post it as is. As more comments get flagged and the AI receives more data to work with, the system should refine itself and become genuinely effective over time. Through the new filter option, creators will also be able to view and hide potentially offensive comments, which will be automatically held for approval.
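YouTube has not published how its detection model works, so as a purely illustrative sketch, the feedback loop described above — flagged comments supplying keywords that future drafts are checked against — might look something like this (all names and the threshold are hypothetical):

```python
# Hypothetical illustration only -- not YouTube's actual system.
# Idea: keywords harvested from previously flagged comments are
# compared against a draft comment; enough overlap triggers a warning.

flagged_keywords = {"idiot", "trash", "stupid"}  # stand-in seed data

def tokenize(comment: str) -> set[str]:
    """Lowercase the comment and strip basic punctuation from each word."""
    return {word.strip(".,!?").lower() for word in comment.split()}

def should_warn(comment: str, threshold: int = 1) -> bool:
    """Warn if the draft shares at least `threshold` tokens with flagged keywords."""
    return len(tokenize(comment) & flagged_keywords) >= threshold

def record_flagged(comment: str) -> None:
    """Feedback loop: fold tokens from a newly flagged comment back into the set."""
    flagged_keywords.update(tokenize(comment))
```

A real system would use a trained classifier rather than raw keyword overlap, but the refinement loop is the same: each newly flagged comment (`record_flagged`) expands the data the check (`should_warn`) draws on.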

Such improvements from YouTube, in at least attempting to steer its vast audience towards civil debate and discussion, are vastly overdue. Over the years, the website has drawn harsh criticism for how easily creators and users can foster a toxic environment and walk away with no more than a slap on the wrist. Flagging comments and videos does little more than remove them, with no further repercussions for the people behind them. Content creators such as Logan Paul have been especially lambasted for their role in propagating such an unhealthy online environment, facing no consequences for years afterwards. Stars who have online fights with each other (labelled "beef") often call on their young and impressionable fanbases to harass one another, which further inflames comment sections and unnecessarily drags more people into heated disputes.

So yes, even with new improvements on the horizon, YouTube has a long way to go before truly reaching a plane of unbiased, healthy, and objective discussion and commentary.
