Major Tech Companies Form Partnership to Review Trust and Safety Practices

Written by Dominic Whitlock

Nine of the world’s biggest technology companies, including Facebook, Google and Microsoft, are collaborating to produce an industry-wide structure for dealing with harmful content and improper conduct.

The ‘Digital Trust & Safety Partnership’, which also includes Twitter, Reddit, Discord, Pinterest, Shopify and Vimeo, wants to ensure users are protected and bad actors are dealt with appropriately.

The initiative comes in response to potential changes by the US government to the level of responsibility these companies bear for user-generated content.

The partnership’s framework will be built around five commitments: anticipating the potential for misuse, adopting clear and consistent rules, enforcing those rules, keeping pace with changing risks, and regularly reporting on what action has been taken.

Alex Feerst, an advisor to the new group, explained in a statement: “Trust and safety is a critical function at many of the world’s leading digital services companies. These are the people who spend their days working to keep consumers safe from abuse. 

“The Digital Trust & Safety Partnership was formed to advance the best practices and assessments needed to support this emerging field. This is the beginning of a process, and we look forward to collaborating with stakeholders to foster greater transparency.”

Each company will first conduct an internal review of its own policies, after which the group will release a joint state-of-the-industry report later this year.

The partnership is also committed to working with consumer and user advocates, policymakers, law enforcement, relevant NGOs and subject-matter experts to make the practices as appropriate and effective as possible.

Visit the Digital Trust & Safety Partnership website here.
