Two ex-Microsoft workers are suing the software firm for failing to protect them from the psychological effects of viewing illegal material such as child abuse images and films.
A complaint filed on behalf of the two men and their families alleges they are unable to look at children or use computer equipment without being “triggered” after having viewed “inhumane and disgusting content” as part of their jobs working as Microsoft moderators.
As well as content depicting child sexual exploitation, the complaint says the men were also required to view images and film of murder, violent adult pornography and bestiality. Working as part of Microsoft’s online safety team, Greg Blauert and Henry Soto were charged with reviewing flagged material to decide whether it needed to be removed or reported to law enforcement agencies.
The pair claim they now suffer from severe psychological disorders as a result of the work they carried out for Microsoft. They allege the company failed to warn them what their jobs would involve and has since refused to provide them with therapy to treat their conditions.
Speaking with the Guardian, Ben Wells, one of the lawyers who filed the suit in Washington state, said: “It’s bad enough just to see a child get sexually molested. Then there are murders. Unspeakable things are done to these children.”
Responding to the men’s complaint in a statement, Microsoft said: “We disagree with the plaintiffs’ claims. Microsoft takes seriously its responsibility to remove and report imagery of child sexual exploitation and abuse being shared on its services, as well as the health and resiliency of the employees who do this important work.
“This work is difficult, but critically important to a safer and more trusted internet. The health and safety of our employees who do this difficult work is a top priority.”
Blauert and Soto allege Microsoft knew the work they were doing was dangerous, but failed to take adequate steps to protect them.
In 2013, Google, Microsoft and a number of other technology firms agreed to make it more difficult for paedophiles to find child abuse content online, removing tens of thousands of search results that led to indecent images of children.
Speaking at the time, Google’s European Communications Director Peter Barron told the BBC: “We’re agreed that child sexual imagery is a case apart, it’s illegal everywhere in the world, there’s a consensus on that. It’s absolutely right that we identify this stuff, we remove it and we report it to the authorities.”
While companies such as Microsoft and Google now have robust processes in place to identify child abuse content shared on their networks, paedophiles are increasingly using the dark web to access and distribute indecent images and film.
According to Europol, most child sex offenders do not operate as part of traditional organised criminal networks, but regularly use the dark web to link up with like-minded individuals to share knowledge and resources and gain access to a larger group of children.