The Digital Services Act (DSA) is a set of regulations enacted by the European Union (EU) to harmonize content rules and establish specific processes for online content moderation. The regulations apply to a range of online services, including marketplaces, app stores, video-sharing platforms, and search engines. To comply with the DSA's requirements, Google has adapted its trust and safety processes and modified how its services operate.
Google believes that prioritizing user safety benefits both users and its business. Over the years, the company has invested significantly in the people, processes, policies, and technologies that address the goals of the DSA. For example, Google established the Priority Flagger program, which prioritizes review of content flagged by experts, in line with the DSA's Trusted Flagger provision. YouTube also gives creators the option to appeal video removals or restrictions, and the YouTube team reviews these appeals to decide whether the original decision should be upheld or reversed. The DSA will require all online platforms to implement similar measures and establish internal complaint-handling systems.
Google has also taken steps to protect underage users. After consulting with parents, educators, and child safety and privacy experts, the company decided in the summer of 2021 to block personalized advertising to anyone under the age of 18. The DSA will require other providers to adopt similar approaches.
Transparency is another important aspect of complying with the DSA. Google has been working to increase transparency and accountability around its content responsibility efforts, including publishing the YouTube Community Guidelines Enforcement Report to share more information about its work protecting users from harmful content. The company also collaborates with content moderation experts through its Google Safety Engineering Center in Dublin, which has consulted with over a thousand experts at various events.
In adapting to the DSA's requirements, Google draws on its experience complying with regulations at scale. The company has invested in complying with the EU's General Data Protection Regulation (GDPR) and has built processes and systems to handle requests covering over five million URLs under Europe's Right to be Forgotten. Google is now expanding these efforts to meet the specific requirements of the DSA.
Google plans to expand its Ads Transparency Center to meet DSA provisions and provide additional information on targeting for ads served in the EU. The company also intends to increase data access for researchers who want to understand more about how Google’s services, such as Search, YouTube, Google Maps, Google Play, and Shopping, work in practice. This will help researchers analyze systemic content risks in the EU.
To provide more visibility into content moderation decisions, Google is making changes to its reporting and appeals processes. It is rolling out a new Transparency Center where users can access information about Google’s policies on a product-by-product basis and find reporting and appeals tools. The scope of transparency reports will also be expanded to include more information about how Google handles content moderation across its services.
Google is committed to assessing risks related to its online platforms and search engine in line with DSA requirements. The company will report the results of its assessment to EU regulators and independent auditors and will publish a public summary at a later date.
In conclusion, Google is adapting its trust and safety processes and the operation of its services to comply with the EU's Digital Services Act. The company is prioritizing user safety, increasing transparency, and giving researchers more data access to understand content moderation practices. Google believes that complying with these regulations is important for the well-being of users and for its business.