
Facebook: the worrying state of mind of content moderators

Content moderators are employees of organizations to which Facebook has outsourced this work, and they face very high levels of mental stress.


Facebook relies on about 15,000 content moderators to review offensive content on its platform. Their task is to keep Facebook free of content that violates platform standards, which can include anything from misinformation and hate speech to graphic video. These workers are employed not by Facebook itself but by organizations to which it has outsourced the work. A detailed report highlighted the conditions under which many of these content moderators work at the Arizona, United States office of Cognizant, an IT company with which Facebook collaborates for this purpose.

Facebook, the working conditions of the moderators
According to one moderator, employees who deal with this kind of content are under constant tension, as they are exposed not only to nudity and pornography but also to extremely bloody images.

Secrecy also shields Facebook from criticism of the working conditions of these workers. Moderators claim they are pushed not to discuss the emotional cost of their work, even with their loved ones, which increases their feelings of isolation and anxiety.

Content moderators must also account for even minor interruptions; breaks such as prayer, in the case of religious workers, are not allowed. There is a constant fear of dismissal based on a weekly count of errors. In more than a few cases, employees have reportedly resorted to drugs to cope with the mental trauma of reviewing graphic content for hours at a time.

Following these complaints, Facebook has highlighted the efforts the company is making to ensure the mental well-being of content moderators. It did not address the specific reports directly, but it acknowledged awareness of the concerns and accusations regarding its content review practices. Facebook has emphasized its collaboration with organizations such as Accenture, Cognizant, and Genpact, among others, arguing that these organizations are known for their standards of employee care.

Key commitments for content moderation partners reportedly include an agreement to provide employees with good working facilities, complete with predetermined breaks. Facebook says it conducts weekly calls, regular site visits, and monthly and quarterly reviews with these partners' employees.