Dr. Martin Degeling
Stiftung Neue Verantwortung
IFIP summer school on privacy and identity management - 09.08.2023
Slides at: martin.degeling.com/slides/ifip
"The Digital Services Act is an EU Regulation that defines obligations for online services regarding liability for illegal content, content moderation, transparency and due diligence obligations for service providers."
| New obligations | Intermediary services | Hosting services | Online platforms | Very large online platforms |
|---|---|---|---|---|
| Transparency reporting | ● | ● | ● | ● |
| Requirements on terms of service, taking due account of fundamental rights | ● | ● | ● | ● |
| Cooperation with national authorities | ● | ● | ● | ● |
| Points of contact | ● | ● | ● | ● |
| Notice and action / provide information to users | | ● | ● | ● |
| Reporting criminal offences | | ● | ● | ● |
| Complaint and redress mechanism and out-of-court dispute settlement | | | ● | ● |
| Trusted flaggers | | | ● | ● |
| Measures against abusive notices and counter-notices | | | ● | ● |
| New obligations | Intermediary services | Hosting services | Online platforms | Very large online platforms |
|---|---|---|---|---|
| Special obligations for marketplaces | | | ● | ● |
| Bans on targeted adverts to children | | | ● | ● |
| Transparency of recommender systems | | | ● | ● |
| User-facing transparency of online advertising | | | ● | ● |
| Risk management obligations and crisis response | | | | ● |
| External & independent auditing | | | | ● |
| User choice for recommender systems | | | | ● |
| Data sharing with authorities and researchers | | | | ● |
| Codes of conduct | | | | ● |
| Crisis response cooperation | | | | ● |
After the EU published the list of designated platforms in April 2023, some companies objected: Amazon complained that it was unfair to be singled out, and Zalando argued that it is a safe platform.
Still, many of the proposed measures are already being implemented.
| | Subscriptions | Network | Algorithm |
|---|---|---|---|
| Examples | Podcasts, RSS | The "old" Facebook, IG | TikTok FYP, IG Reels, Twitter |
| Selection by | Active selection by the user | Selection by others | Weights (based on implicit feedback) |
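The three content-selection models (subscriptions, network, algorithm) can be contrasted in a toy sketch. This is NOT any platform's real algorithm; the post fields, signal names, and weights are all illustrative assumptions.

```python
# Toy sketch (not any platform's real algorithm) contrasting three
# content-selection models on the same pool of candidate posts.

def subscription_feed(posts, my_subs):
    """Subscriptions: only sources the user actively selected, newest first."""
    return sorted((p for p in posts if p["author"] in my_subs),
                  key=lambda p: p["age_hours"])

def network_feed(posts, my_network):
    """Network: content surfaced because accounts the user follows shared it."""
    return [p for p in posts if p["shared_by"] & my_network]

def algorithmic_feed(posts, weights):
    """Algorithm: ranked by a weighted sum of implicit-feedback signals."""
    score = lambda p: sum(w * p["signals"][s] for s, w in weights.items())
    return sorted(posts, key=score, reverse=True)

# Illustrative data: two posts with hypothetical engagement signals.
posts = [
    {"author": "a", "shared_by": {"friend"}, "age_hours": 2,
     "signals": {"watch_time": 0.9, "likes": 0.1}},
    {"author": "b", "shared_by": set(), "age_hours": 1,
     "signals": {"watch_time": 0.2, "likes": 0.8}},
]
weights = {"watch_time": 2.0, "likes": 1.0}
```

Note how only the algorithmic feed depends on behavioural signals the user never explicitly chose, which is why recommender transparency is singled out in the DSA obligations above.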


VLOPs are required to conduct these audits on their own platforms. But since we have little trust that they will be thorough, researchers and civil society should conduct them too.
Get a good understanding of the platform. Use this information to determine the profiles of stakeholders who should be involved in the process. Depending on the experience and expertise needed, stakeholders could be platform developers, researchers, legal experts and representatives of the parties affected.
Define and prioritise scenarios. A scenario is a description of specific issues related to a 'systemic risk'. It breaks down abstract risks into concrete testable hypotheses by defining the affected party and its characteristics, the harm, the involved elements of the platform and the further impact. A systemic risk may often involve several scenarios; therefore, selecting scenarios and deciding if they have a 'high' priority is necessary.
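A scenario as described above can be captured as a small structured record. The field names and the example values below are our own illustration, not terminology prescribed by the DSA.

```python
# Hypothetical structure for an audit scenario; field names and the
# example values are illustrative, not prescribed by the DSA.
from dataclasses import dataclass

@dataclass
class Scenario:
    systemic_risk: str            # abstract risk the scenario concretises
    affected_party: str           # who is harmed, with characteristics
    harm: str                     # the concrete harm hypothesised
    platform_elements: list       # e.g. recommender, ads, moderation
    impact: str                   # wider societal impact
    priority: str = "low"         # 'high'-priority scenarios are tested first

scenario = Scenario(
    systemic_risk="negative effects on the mental health of minors",
    affected_party="teenagers showing interest in dieting content",
    harm="repeated exposure to content promoting disordered eating",
    platform_elements=["For You recommender"],
    impact="normalisation of harmful behaviour",
    priority="high",
)
```

Making the affected party, harm, and platform element explicit fields forces each abstract risk into a testable hypothesis, which is exactly the decomposition the step above calls for.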
Develop measurements to understand the scenario. There are different types of algorithm audits, as well as platform elements to consider. These can range from automated measurements that look at the actual implementation to user perspectives through surveys. An auditor needs to develop multiple measurements and then prioritise them to find the best measurement(s) to test a specific scenario.
Select audit type or platform element:
Connected elements
After conducting the measurements, you need to analyse the results and write an audit report. The report should foster observability, enable reproducibility, and recommend mitigation measures.
There are various possibilities to study risks on different elements of the platform with different methods.

Goal: Better understand the platform's processes and the company's motivation.
Parameters TikTok lists:
- "The value of a video varies with respect to actors"
- "How do different (software) products work together to create the platform experience"

The "internal memo":
- "The total video views of heated videos accounts for a large portion of the daily total video views, around 1-2%, which can have a significant impact on overall core metrics."
Using automated means to simulate users (aka scraping)
Using mitmproxy and adb, we can automate the use of TikTok.
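A minimal sketch of this setup: drive the app with adb swipes while mitmproxy, configured as the device's HTTPS proxy, records the API traffic. The swipe coordinates, timings, and the idea of pausing to produce a "watch time" signal are illustrative assumptions; `adb shell input swipe` is a real command.

```python
# Sketch of user simulation on Android: scroll the TikTok feed via adb
# while mitmproxy (set up separately as the device's HTTPS proxy)
# records the recommendation API traffic.
# Coordinates and timings are illustrative, not tuned values.
import subprocess
import time

def swipe_cmd(x1=500, y1=1500, x2=500, y2=300, duration_ms=300):
    """Build the adb command for one upward swipe (advance to next video)."""
    return ["adb", "shell", "input", "swipe",
            str(x1), str(y1), str(x2), str(y2), str(duration_ms)]

def watch_feed(n_videos=10, watch_seconds=3.0):
    """Simulate a user scrolling through n videos, pausing on each.

    The pause before swiping is what generates the implicit
    'watch time' feedback the recommender is assumed to use.
    """
    for _ in range(n_videos):
        subprocess.run(swipe_cmd(), check=True)
        time.sleep(watch_seconds)

# watch_feed(20)  # requires a connected device with TikTok in the foreground
```

Varying `watch_seconds` per video is one way to test whether simulated attention shifts the feed, which connects the automated measurement back to a concrete scenario.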

TikTok is not an open-source oriented company.

But adversarial methods can help. See MobSF (Mobile Security Framework).

Platforms are pushing new transparency features and promising new APIs. But this is often mere PR:
Follow our research: github.com/snv-berlin/tiktok-audit
or me :) chaos.social/@mrtn3000