D-day for compliance with the Digital Markets Act (DMA) is fast approaching. From March 2024, the regulatory framework’s substantive obligations will apply to the six designated gatekeepers in relation to 22 of their core platform services (CPSs) (on the first designation decisions issued by the European Commission see here). In the meantime, the European Commission (EC) has been preparing detailed arrangements for bringing the regulatory instrument to life.

Aside from the Implementing Regulation (see comments here and here), which it issued in exercise of its powers to adopt implementing acts under Article 46 and in compliance with the attendant procedural safeguards, the European Commission has taken to issuing templates, which lie outside the procedural tools formally at the EC’s disposal. These templates are all-encompassing in the sense that they flesh out the main details that the gatekeepers must abide by if they do not want to risk being in breach of the DMA. However, they carry no binding legal value before the EU Courts. One could argue that the templates carry a weight similar to that of the EC’s soft law on the application of the prohibitions under Articles 101 and 102 TFEU, in that they rein in the public authority’s administrative action (similarly, see Post Danmark, para 52). They cannot, however, take the place of the legal benchmarks set out in the regulation, especially because they may be updated at any time by the European Commission without further notice (a circumstance the EC acknowledges in the header of each template – the first update was made to the Template on the gatekeeper’s obligation to inform about its concentrations, see the latest version here).

In any case, the EC has been adamant about releasing these templates to provide the gatekeepers with a blueprint for compliance, which includes, at the time of writing: i) the Template Form for Reporting pursuant to Article 11 DMA (see the review of the draft template here); ii) the Template relating to the Obligation to Inform About a Concentration Pursuant to Article 14 of the DMA (see the review of the Template here); iii) the Template relating to the reasoned request for a specification process pursuant to Article 8(3) DMA; iv) the Template relating to the submission of a reasoned request under Article 9 DMA; and v) the Template relating to the submission of a reasoned request under Article 10 DMA (the last three were jointly analysed here).

Last July, the EC also issued for public consultation the Template relating to the audited description of consumer profiling techniques pursuant to Article 15 of the DMA (the Template) (see the draft’s outline here). This blog post disentangles the differences between that first version and the finalised version issued by the EC in late December, which included substantive changes concerning the role to be played by third parties in the DMA’s enforcement as well as the expansion of the limits of the obligation under Article 15 DMA. A red-line comparison of the draft and the final version is available.

 

The scope of Article 15 DMA and Recital 72

Article 15 DMA imposes on the gatekeeper the obligation to submit to the Commission an independently audited description of any techniques for the profiling of consumers that the gatekeeper applies to or across its core platform services. Article 15(1) opens the door for the EC to transmit that audited description to the European Data Protection Board (EDPB). This implicitly means that the data protection supervisory authorities will be able to draw on the DMA-imposed audit for their enforcement of EU data protection law. Recital 72 expands on the scope of the obligation and sets out the two distinct objectives pursued by the provision: first, to ensure an adequate level of transparency of the profiling practices employed by the gatekeepers and, second, to secure contestability, since transparency puts external pressure on gatekeepers not to make deep consumer profiling the industry standard (and enhances the capacity of undertakings providing CPSs to differentiate themselves through superior privacy guarantees).

The obligation is set against the background of the legislator’s assertion that online advertising services provided to business users are often opaque and non-transparent, and that they have become less transparent since the introduction of new data protection legislation (Recital 45).

Despite the all-encompassing nature of Article 15 DMA, Recital 72 lists the items that the gatekeeper should disclose in its audit: “the (legal) basis upon which profiling is performed, including whether personal data and data derived from user activity in line with (the GDPR) is relied on, the processing applied, the purpose for which the profile is prepared and eventually used, the duration of the profiling, the impact of such profiling on the gatekeeper’s services, and the steps taken to effectively enable end users to be aware of the relevant use of such profiling, as well as steps to seek their consent or provide them with the possibility of denying or withdrawing consent”.

The picture that the European Commission has finally drawn in establishing the list of items that a gatekeeper’s audit must include to comply with the obligation under Article 15 DMA can only be understood by acknowledging the EC’s expansion of the terms of the provision (even relative to the draft version of the Template). On that basis, it quickly becomes apparent that the European Commission has substantively enlarged the gatekeeper’s responsibilities under the provision, both in relation to those items that fall within the scope of Recital 72 (in-scope obligations) and those that fall outside it (out-of-scope obligations).

 

In-scope obligations

The table below compares the two versions of the Template, bearing in mind the limits and contours of the items first listed in Recital 72. The phrases in bold correspond to the passages added in the finalised version of the Template:

DMA (Recital 72): Whether personal data and data derived from the user is relied on
Draft Template:
– A numbered list with a detailed description of each category of personal data and data derived from user activity (in particular, distinguish data and personal data categories actively provided by consumers from observed data) and sources for each of these categories of data and personal data processed for profiling consumers applied to or across the designated core platform services (in particular, distinguish data and personal data originating from the gatekeeper’s services, including core platform services, from data and personal data originating from third parties) (Section 2.1.c)
– A detailed description of the inferred data about consumers from the processing of the data and personal data listed in point c) (Section 2.1.d)
Final version of the Template (last updated 12 December 2023):
– A description of each category of personal data and data derived from user activity (in particular, distinguish data and personal data categories actively provided by consumers from observed data) and sources (e.g., first or third party service) for each of these categories of data and a description of personal data processed for profiling consumers applied to or across the designated core platform services (in particular, distinguish data and personal data originating from each of the gatekeeper’s services) (Section 2.1.b)
– A description of each category of personal data and data originating from third parties (in particular, distinguishing data and personal data originating from third parties, such as advertisers, publishers, developers, or others) and/or derived from user activity on third parties’ services (in particular, distinguishing data and personal data categories actively provided by consumers from observed data and inferred data originating from third parties) (Section 2.1.c)
– A detailed description of the inferred data about consumers derived from the processing of the data and personal data listed in point (b) and/or (c) as well as an explanation of how such derived or inferred data were created (Section 2.1.d)

DMA (Recital 72): The processing applied
Draft Template: The processing applied (Section 2.1.g)
Final version of the Template: Eliminated.

DMA (Recital 72): The purpose for which the profile is prepared and eventually used
Draft Template: The specific purpose(s) pursued by the profiling technique(s) and for which they are used (Section 2.1.a)
Final version of the Template: The specific purpose(s) pursued by the profiling technique(s) and for which they are used (Section 2.1.a)

DMA (Recital 72): The duration of the profiling
Draft Template: The retention duration of each category of data and personal data listed in points c) and d) of the profiling itself (Section 2.1.e)
Final version of the Template: The retention duration of each category of data and personal data listed in points (b), (c), and (d), or duration of retention of the profile itself (Section 2.1.e)

DMA (Recital 72): The impact of such profiling on the gatekeeper’s services
Draft Template: Qualitative and quantitative impact or importance of the profiling techniques in question for the business operations of the gatekeeper (Section 2.1.i)
Final version of the Template: Qualitative and quantitative impact or importance of the profiling techniques in question for the services and business operations of the gatekeeper. Under this point, please also include information on the number of end users exposed to each profiling technique per year, and the number of business users using the gatekeeper’s services based on profiling per year, within the core platform service and, where relevant, across multiple core platform services (Section 2.1.k)

DMA (Recital 72): The steps taken to effectively enable end users to be aware of the relevant use of such profiling
Draft Template: Actions taken to effectively enable consumers to be aware that they are undergoing profiling and the relevant use of such profiling (Section 2.1.j)
Final version of the Template: Actions taken to effectively enable consumers to be aware that they are undergoing profiling and the relevant use of such profiling (Section 2.1.l)

DMA (Recital 72): The steps to seek consent or provide users with the possibility of denying or withdrawing consent
Draft Template: Where consumer consent is required for the given purpose under Regulation (EU) 2016/679, Directive 2002/58/EC and/or Regulation (EU) 2022/1925, a description of any steps taken to seek such consent to profiling, including details on how consumers can refuse consent or withdraw it, and any consequences of such refusal or withdrawal (Section 2.1.k)
Final version of the Template:
– Whether consent is required under Article 5(2) of Regulation (EU) 2022/1925 for the processing of data and personal data listed in points (b), (c) and (d) for each purpose of profiling consumers. The reporting under the present point should distinguish between consent under points (a) to (d) of Article 5(2) of Regulation (EU) 2022/1925. In addition, if consent is not required, the reporting under the present point should provide an explanation (Section 2.1.g)
– Where consumer consent is required for the given purpose and obtained by the gatekeeper under Regulation (EU) 2016/679, Directive 2002/58/EC and/or Regulation (EU) 2022/1925, a description of any steps taken to seek such consent to profiling, including visual representations (click-by-click) on how consumers can refuse or withdraw consent, any consequences of such refusal or withdrawal, and how any such consequences are notified to the consumer (Section 2.1.h). Added footnote: It should be clear from the description what measures (e.g. in design) the gatekeeper takes to guarantee a neutral presentation of choices to the end user, and the level of facility or ease (e.g. how many clicks) for an end user to refuse or change their consent. The consequences of such refusal or withdrawal should also be clear from the description.
– Where consumer consent is required for the given purpose and obtained by third parties (e.g., as required under Article 5(2)(a) of Regulation (EU) 2022/1925), a description of any steps taken to seek consent to the sharing of personal data with the gatekeeper for the purpose of profiling, including visual representations (click-by-click) on how consumers can refuse or withdraw consent, and how the gatekeeper ensures respect of consumer’s consent refusal or withdrawal (Section 2.1.i)

 

Even though the items listed in the table above remain within the scope of Recital 72, the EC introduces four main ideas in this last version of the Template that were not previously addressed in the draft Template.

First, the gatekeeper needs to distinguish between data originating within its own data operations (first-party data) and data stemming from third-party services (third-party data) (Sections 2.1.b) and 2.1.c) of the Template). The nuance added to distinguish the two types of data follows the recommendations of the Joint EDPB-EDPS contribution to the public consultation on the draft template (the Joint EDPB-EDPS contribution), which urged that the Template should not only include a description of each category of personal data and data derived from user activity used to profile consumers, but also a description of two types of data coming from third parties: i) data originating from third parties (for instance, data produced as a result of a third party’s advertising services); and ii) data derived from the user’s activity on the services of third parties. By this token, the data captured by the gatekeeper within its CPSs are relevant to assessing compliance with Article 15 DMA alongside data originating outside the boundaries of the gatekeeper’s CPSs (and even of its first-party services) that are later instrumentalised by the gatekeeper to perform the tasks of combination, cross-use and/or profiling. One cannot but suspect that the information the EC will receive as a result of the gatekeepers’ compliance with Article 15 DMA may later be exploited by the Commission to map the organisational structure of the gatekeeper’s data operations, both concerning the datasets lying within the DMA’s scope of application and those clusters of data that remain separate from the gatekeeper’s services.

Second, the European Commission concretises how the gatekeeper should report the qualitative and quantitative impact or importance of the profiling techniques it performs for its services and business operations. This is one of the few instances where the EC explicitly fleshes out indicators that may serve to steer compliance in one direction or another (De Streel and Feasey stressed this same point, and I have criticised the lack of benchmarking elsewhere, too). In this particular context, the EC follows the suggestions received via its public consultation by specifying indicators for measuring the quantitative impact of the profiling techniques. For example, the gatekeeper shall include the number of end users exposed to each profiling technique per year within the CPS and, where relevant, across multiple CPSs (Section 2.1.k) of the Template). It is as yet unclear whether those benchmarks are exhaustive, and the jury is still out on the indicators that will apply to the qualitative impact of the gatekeeper’s profiling techniques. Bearing in mind the DMA’s ‘qualitative’ appreciation of the undertaking under Article 3(8) for the designation process, these indicators could bear some resemblance to the economics-driven elements included there. For instance, one could imagine that a profiling technique’s qualitative impact could be measured against its contribution to business user or end user lock-in.

Third, the final version of the Template addresses in detail the intricate loopholes built into the prohibition on combining, cross-using and processing personal data across CPSs and other services under Article 5(2) DMA (Section 2.1.g of the Template). Although the provision prohibits this conduct, the legislator wrote into the mandate that the gatekeeper may still be exempted from its application if it has effectively been granted the end user’s consent in the terms set out by Articles 4(11) and 7 of the GDPR (on the potential friction between the prohibition and the exemption, see Botta’s and Borges’ contribution and my criticism of the circularity introduced by the legislator). Accordingly, the last version of the Template introduces (exceeding the EC’s powers to adopt an implementing act to concretise the methodology for compliance with Article 15 DMA) the need for the gatekeeper to explicitly disclose how the user granted consent for each of the activities listed under the provision. By doing this, the European Commission kills two birds with one stone.

On the one hand, it reinforces the scrutiny of Article 5(2) DMA by adding to the gatekeeper’s documentation efforts the need to flesh out where consent was sought from the consumer when profiling takes place, building upon the equivalent obligation already included in the Template relating to the gatekeeper’s reporting obligation under Article 11 (albeit without the particular reference to profiling). On the other hand, it undermines the legislator’s initial intention to allow the prohibition to be exempted in full. Article 5(2) is drafted en bloc: the prohibitions are listed in points (a) to (d), and the exemption applies to all of them. Thus, it is sensible to assert that the gatekeeper can ask the consumer for consent to all the operations at once. If consent is granted, the prohibition is overridden. The EC’s understanding of the provision is different, however: the gatekeeper must ask for the consumer’s consent for each of the activities prohibited under the provision. This is why the Template establishes that “the reporting under the present point should distinguish between consent under points (a) to (d) of Article 5(2)”. The difference in interpretation is nuanced but substantive: with a minute change in wording, the EC multiplies the gatekeeper’s efforts in seeking the consumer’s consent by four, even though the regulatory instrument was intended to reduce dark patterns and consumer fatigue as much as possible. For instance, once consent has been refused or withdrawn, the gatekeeper shall refrain from repeating its request more than once within a period of one year. Bearing in mind the Template’s addition, compliance with Article 5(2) becomes more intricate for the gatekeeper, even though it was already utterly complex given the presumed imbalances of power, under which consent would only be effectively granted in exceptional circumstances.

In a similar fashion to the substantive changes regarding indicators, the fourth and final difference introduced in the final version of the Template concerns how the gatekeeper shall demonstrate that it presents consumers with sufficient choice when seeking their consent to profiling. In particular, the EC followed the Joint EDPB-EDPS contribution as well as the comments provided via the public consultation by BEUC and EDRi (and others) in compelling gatekeepers to include a visual representation of each step displayed to the consumer when consent for profiling is sought. The final version of the Template establishes that the visual representation must be provided click-by-click, that is, by including screenshots of each of the steps in the consumer’s journey when the gatekeeper seeks consent, with special reference to how consumers can refuse or withdraw consent, the consequences of doing so, and how those consequences are notified to the consumer (Section 2.1.h) of the Template). In the spirit of the anti-circumvention clause under Article 13 DMA, the Template adds a footnote specifying that the design of such measures to seek consent should guarantee a neutral presentation of choices to the end user, so that refusing or changing consent is not made too burdensome.

 

Out-of-scope obligations

The Template’s final version enlarges the scope of the items that remain within the scope of Recital 72 DMA, and the same movement applies to the items that sit outside it. In the previous blog post reviewing the draft Template’s inconsistencies, I already distinguished between two types of items that I termed out-of-scope: i) those necessary to concretise the items contained in Recital 72; and ii) those unrelated to its content. The table below compares the changes made by the EC from the draft Template to its last version (again, the passages added by the EC appear in bold):

Type: Necessary concretisation of Recital 72
Draft Template: The legal ground relied on by the gatekeeper under Article 6(1) of Regulation (EU) 2016/679 and whether consent is required under points a) to d) of Article 5(2) of Regulation (EU) 2022/1925 for each purpose of profiling consumers (Section 2.1.b)
Final version of the Template (last updated 12 December 2023): The legal ground relied on by the gatekeeper under Article 6(1) and, where applicable, Article 9(2) of Regulation (EU) 2016/679. The reporting under the present point should distinguish the legal ground relied on under Regulation (EU) 2016/679 for the processing of personal data collected directly by the gatekeeper from the legal ground relied on for the processing of personal data originating from third parties (Section 2.1.f)

Type: Unrelated to the content of Recital 72
Draft Template: A numbered list with a detailed description of the technical safeguards in place to avoid the presentation of advertisements on the gatekeeper’s interface based on profiling of minors or children, including a description of how user data is collected, used or processed in a way that allows the gatekeeper to identify a user as a minor, as well as quantitative indicators to measure the successful identification of minors (Section 2.1.f)
Final version of the Template: Eliminated.

Type: Unrelated to the content of Recital 72
Draft Template: Whether automated decision-making takes place on the basis of an applied profiling technique, the number and object of such automated decisions, the legal effects the automated decision-making mechanism is producing or may produce, and a description of the algorithms underpinning the automated decision mechanism (Section 2.1.h)
Final version of the Template: Whether automated decision-making takes place on the basis of an applied profiling technique, the number and object of such automated decisions, the legal effects and other similarly significant effects that the automated decision-making mechanism is producing or may produce, and a description of the algorithms underpinning the automated decision mechanism (Section 2.1.j)
Adjusted footnote: A decision produces legal effects when the subject’s legal rights are impacted. This could include, for example, any resulting effect on their right to vote, their ability to take out a loan, and their position in e-recruitment.

Type: Unrelated to the content of Recital 72
Draft Template: Statistics on how many consumers choose to undergo profiling if they are given a choice (Section 2.1.l)
Final version of the Template: Statistics on how many consumers choose to undergo profiling and how many refuse it, if such choice is given (Section 2.1.m)

Type: Unrelated to the content of Recital 72
Draft Template: Whether and when the profiling technique has been the object of a data protection impact assessment and the conclusion of such assessment (Section 2.1.m)
Final version of the Template: Whether and when the profiling technique has been the subject of a data protection impact assessment and the main conclusions thereof (Section 2.1.n)
Additional footnote: Asking for alternatives to profiling allows an assessment of whether gatekeepers have considered less intrusive measures and is particularly informative in terms of accountability.

 

The most salient idea underlying all these obligations relates to the looming overlap between the DMA and other pieces of regulation. Despite the ‘without prejudice’ clause contained in Recital 12, the DMA substantially overlaps with the obligations imposed on data controllers pursuant to the GDPR and on the addressees of the DSA (see Bania’s criticism of the clause and the reasons why it might not work as planned). For the sake of consistency, the EC backpedalled on the latter overlap: it eliminated from the final version the need for gatekeepers to flesh out the technical safeguards against the presentation of ads based on the profiling of minors (Section 2.1.f) of the draft Template). This is one of the few instances where the contributions of the designated gatekeepers through the public consultation were embraced, given that they are also subject to compliance and audit obligations under Articles 28 and 37 DSA.

However, the same conclusion cannot be reached regarding the overlap of the DMA with the GDPR, insofar as the references to the legal grounds relied on by the gatekeeper under Article 6(1) of the GDPR and to the data protection impact assessment remain in the Template (Sections 2.1.f) and 2.1.n) of the Template). The efforts of the outspoken contributors to the public consultation who advocated for their elimination, on the ground that they fall outside the scope of the transparency goals of Article 15 and Recital 72 DMA, fell on deaf ears.

 

The social impact of the gatekeepers’ business models based on profiling

Aside from the comparison of the two versions, the essential modification that underlies the whole spirit of the Template sits under Section 2.1.j) (formerly Section 2.1.h) of the draft), where the EC finally recognises that automated decision-making on the part of the gatekeepers does not only produce legal effects but also other types of effects, which are equally relevant to the exercise of restoring fairness to the digital arena. Under the terms of the draft Template, the gatekeeper had to disclose the legal effects that automated decision-making caused.

However, both the Joint EDPB-EDPS contribution and the responses of EDRi, Privacy International and BEUC (amongst others) to the public consultation pushed for recognition that these tasks can produce comparable impairments for consumers that go well beyond legal effects. For this reason, the European Commission clarified that a decision produces legal effects when the subject’s legal rights are impacted, e.g., any effect on the subject’s ability to take out a loan or their position in e-recruitment. Additionally, Section 2.1.j) of the Template makes visible the fact that other similarly significant effects may arise as a result of automated decision-making. Thus, effects that do not directly impact the subject’s legal rights are to be factored into the rationale by which the European Commission will measure compliance with Article 15. The development is significant: the widening gap between social effects and the competitive dynamics of digital markets is narrowed, as the consideration of legally ambivalent effects slowly permeates the EC’s enforcement of the DMA.

 

The role of third parties: a win for consumer organisations, but still insufficient for the external monitoring of the DMA

The ‘informal’ regulatory dialogue taking place between the designated gatekeepers and the European Commission has been confirmed by officials (see here). One would think that these two actors are the most relevant to the enforcement of the DMA’s provisions. In the formal and procedural sense of the word, they are.

However, a range of other agents have a substantial role to play in making the DMA’s provisions effective. For example, Olivier Guersent recently confirmed that the EC’s success depends on how the new opportunities and benefits triggered by the regulatory instrument will be seized by competitors (i.e., business users) and customers (end users). At the same event, Martijn Snoep expressed his concern at the worryingly low level of engagement with the DMA’s provisions by local businesses. Thus, one of the cornerstones of the regulatory framework’s success is that third parties can provide their take on the DMA’s shortcomings and grounds for improvement. This is not entirely evident when one goes through the regulatory instrument’s provisions, since the direct intervention of third parties in the enforcement of the DMA is largely curtailed and the circumstances in which they can intervene are scarce and targeted. For instance, Recital 68 and Article 11(2) DMA provide that the compliance report should be published in the form of a non-confidential summary so that third parties can assess whether the gatekeepers comply with the obligations laid down in the DMA.

Following this same idea, Section 6 of the draft Template provided that a clear and comprehensive non-confidential overview of the audited description of each technique for profiling consumers applied to or across the CPSs listed in the designation decision should be made publicly available, in the abstract. The final version of the Template, however, substantially expanded on this possibility and recognised the intervention of third parties in the process of the DMA’s enforcement by adding that the “non-confidential overview should enable third parties to obtain an adequate understanding of those profiling techniques and, consequently, to provide meaningful input on them to the Commission”.

By doing this, the European Commission mirrors the requirements of the compliance reporting obligation under Article 11 and adds the requirement that the overview take the form of a self-standing text providing a faithful, comprehensive and meaningful picture of the audit. The amendment of Section 6 of the Template responds to the call for a DMA that is more open and accessible to third parties, recognising the role they hold in supporting the European Commission’s enforcement efforts. Nonetheless, third parties have flagrantly missed out on the DMA’s institutional design and enforcement strategy (for instance, the Implementing Regulation hinders their capacity to submit their views in the EC’s open proceedings), and their intervention is relegated to little more than a formality in light of their subservient and secondary function.

