On the 31st of July, the European Commission issued for public consultation its (fourth) Template relating to the audited description of consumer profiling techniques pursuant to Article 15 of the DMA (the Template). The Template forms part of the Commission’s wider transparency strategy, providing a stream of implementing acts to secure immediate compliance with the DMA.

The Template is structured into five sections, aside from a preliminary introduction setting out the requirements to fulfil the obligation under Article 15 of the DMA, namely: i) the information on the identity of the gatekeeper (Section 1); ii) the information and items to be disclosed by the gatekeeper in the audit, framed as minimum information (Section 2); iii) the general information on the audit (Section 3); iv) the information on the audit process (Section 4); and v) the gatekeeper’s customary declaration on the completeness and accuracy of the information disclosed (Section 5).

At the time of writing, the Commission has published and made applicable: i) its Procedural Implementing Regulation, especially directed at the gatekeeper designation stage, see comments here and here; and ii) the Template Relating to the Obligation to Inform about Concentrations under Article 14 DMA, see review here. The Commission’s final version of the Template for Compliance Reports under Article 11 DMA has not yet been made available, see the summary of the draft released for public consultation here.

This post reviews the latest Template issued by the Commission, with reference to its legal basis as well as to the concepts of consumer profiling and auditing in light of past experience in the field of data protection regulation and the application of the GDPR.

 

The content of Article 15 of the DMA (and Recital 72)

Data-related practices and (mis)conduct play a crucial role in the DMA and, as such, a range of safeguards are established throughout the regulatory instrument to secure the provisions relating to (and, hopefully, reshaping) the data-intensive business models of the gatekeepers to be designated. Aside from the anti-circumvention clause set out in Article 13 or the obligation to notify all transactions related to core platform services pursuant to Article 14, Article 15 of the DMA stands out as a standalone obligation designed to contribute to effective enforcement.

The gatekeepers’ obligation to submit an “independently audited description of any techniques for profiling of consumers that (…) (it) applies to or across its core platform services” will apply within six months after their designation. That is to say, the obligation under Article 15 of the DMA will apply simultaneously with the provisions set out under Articles 5, 6 and 7.

Recital 72 builds on the rationale underlying the provision: the potential negative effects, in terms of data protection and privacy, of the gatekeepers’ collection and accumulation of large amounts of data from end users. Thus, not only the right to data protection under Article 8 of the Charter (the legal basis of the GDPR) is concerned, but also the right to privacy under Article 7 of the Charter. To that end, the Union legislator deems it relevant that, at a minimum, the profiling practices employed by the gatekeepers be communicated to the Commission. Article 15(3) DMA provides that the description of such profiling will not only be transmitted to the EC but also made available to the public in summarised form.

Surprisingly, however, the ultimate purpose of the obligation is not transparency as such. Recital 72 does acknowledge that “ensuring an adequate level of transparency of profiling practices (…) puts external pressure on gatekeepers not to make deep consumer profiling the industry standard, given that potential entrants or start-ups cannot access data to the same extent and depth, and at a similar scale“. In the same vein, enhanced transparency should enable entrants to those same markets to differentiate themselves through superior privacy guarantees.

The rationale goes: if gatekeepers have to disclose their profiling activities (which have so far been largely kept in the dark despite the GDPR’s efforts to unveil them), public outrage will force them to steer away from the data-intensive and targeting activities imposed upon their end users. In turn, the backlash will force them to embrace more privacy-enhancing technologies and alternatives in their business models.

Recital 72 sets out the limitations on the obligation, both as to its scope and as to the future steps that the Commission may adopt once the gatekeeper has complied with it. On one side, the obligation comprises an independently audited description of the basis upon which profiling is performed, including: i) whether personal data and data derived from user activity in line with the GDPR are relied on; ii) the processing applied; iii) the purpose for which the profile is prepared and eventually used; iv) the duration of the profiling; v) the impact of the profiling; vi) the steps taken to effectively enable end users to be aware of those profiling activities; and vii) the steps taken to seek end users’ consent or to provide them with the possibility of denying or withdrawing it altogether.

Once the gatekeeper submits the audited description to the Commission, it will also be passed on to the European Data Protection Board (EDPB) to inform the enforcement of the data protection rules, i.e., the GDPR and the e-Privacy Directive. Despite the without prejudice clause under Recital 12 (see Bania’s paper on its conflated application here), the fleshing out of the terms of Article 15 and Recital 72 of the DMA may collide with the (perhaps misguided) interpretation and approach adopted by the Template vis-à-vis related enforcement in the field of the GDPR.

 

The expansion of the scope of Recital 72

Even though Article 15 of the DMA (Article 13 in the Commission’s initial draft) did not provide a definition of profiling, the version that emerged from the trilogue negotiations explicitly provides that profiling is to be understood within the meaning of Article 4(4) of the GDPR (ex Article 2(31) of the DMA). Thus, profiling should be understood as “any form of automated processing of personal data consisting of the use of personal data to evaluate certain personal aspects relating to a natural person, in particular, to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements“.

In the GDPR, profiling is later referenced as a species of automated decision-making in Articles 13(2)(f), 14(2)(g), 15(1)(h) and 22. Except for Article 22, these provisions are directed at securing transparency for the data subject, whereas Article 22 enshrines the data subject’s right “not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her“.

The Template neither references the definition under Article 4(4) GDPR nor alludes to the guidelines that apply to the particular topic of profiling, which were first established by the Article 29 Working Party (see the Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679, the Guidelines) and were later endorsed by the EDPB in 2018.

According to the Guidelines, profiling is characterised by three main elements: it has to be an automated form of processing, it has to be carried out on personal data, and its objective must be to evaluate personal aspects of a natural person. Since profiling refers to any form of automated processing (as opposed to the narrower concept under Article 22 GDPR, which relates to ‘solely’ automated processing), human involvement is not pre-emptively factored out of the equation. Profiling thus encompasses data collection, automated analysis identifying correlations, and the application of those correlations and patterns to an individual.

In line with the GDPR, the Guidelines also draw a line separating profiling from automated decision-making, given their difference in scope. This does not imply, however, that the two cannot intersect in practice. Solely automated decision-making implies the controller’s ability to make decisions by technological means without human involvement, based on any type of data (data provided directly by the individual, observed data or inferred data). Both can happen separately, but a particular activity may start off as an automated decision-making process and later become a process based on profiling, depending on how the data are used. Hence, the GDPR addresses three main manifestations of profiling: general profiling, decision-making based on profiling, and profiling as a species of solely automated decision-making producing legal effects on the data subject.

Despite the differences, the Template does conflate the three approaches and amalgamates all of the gatekeeper’s collection and processing of data into the wider category of profiling.

 

In-scope obligations

Section 2 of the Template addresses the items that the gatekeeper must disclose in the audit, with reference to the data protection principles that the Guidelines provide in interpreting the GDPR. However, in listing the fifteen items that the gatekeepers shall disclose, the Commission expands on the DMA’s scope (as per Recital 72). The table below compares the items across the DMA (Recital 72), the Guidelines and the Template, with reference to the provisions initially covered by Recital 72:

DMA (Recital 72): whether personal data and data derived from the user is relied on.
Guidelines: only personal data.
Template: a numbered list with a detailed description of each category of personal data and data derived from user activity (in particular, distinguishing data and personal data categories actively provided by consumers from observed data) and the sources for each of these categories of data and personal data processed for profiling consumers applied to or across the designated core platform services (in particular, distinguishing data and personal data originating from the gatekeeper’s services, including core platform services, from data and personal data originating from third parties) (Section 2.1.c); and a detailed description of the inferred data about consumers derived from the processing of the data and personal data listed in point c) (Section 2.1.d).

DMA (Recital 72): the processing applied.
Guidelines: fair and transparent processing (interpretation of Article 5(1)(a) GDPR) and the purpose limitation principle (Article 5(1)(b) GDPR).
Template: the processing applied (Section 2.1.g).

DMA (Recital 72): the purpose for which the profile is prepared and eventually used.
Guidelines: fair and transparent processing (interpretation of Article 5(1)(a) GDPR) and the purpose limitation principle (Article 5(1)(b) GDPR).
Template: the specific purpose(s) pursued by the profiling technique(s) and for which they are used (Section 2.1.a).

DMA (Recital 72): the duration of the profiling.
Guidelines: principle of storage limitation and retention periods (interpretation of Article 5(1)(e) GDPR).
Template: the retention duration of each category of data and personal data listed in points c) and d), and of the profiling itself (Section 2.1.e).

DMA (Recital 72): the impact of such profiling on the gatekeeper’s services.
Guidelines: purpose limitation principle (Article 5(1)(b) GDPR) and data minimisation principle (Article 5(1)(c) GDPR).
Template: the qualitative and quantitative impact or importance of the profiling techniques in question for the business operations of the gatekeeper (Section 2.1.i).

DMA (Recital 72): the steps taken to effectively enable end users to be aware of the relevant use of such profiling.
Guidelines: fair and transparent processing (interpretation of Article 5(1)(a) GDPR) and Article 12(1) GDPR.
Template: actions taken to effectively enable consumers to be aware that they are undergoing profiling and of the relevant use of such profiling (Section 2.1.j).

DMA (Recital 72): the steps to seek consent or provide users with the possibility of denying or withdrawing consent.
Guidelines: consent as a legal basis for processing (Article 6(1)(a) GDPR).
Template: where consumer consent is required for the given purpose under Regulation (EU) 2016/679, Directive 2002/58/EC and/or Regulation (EU) 2022/1925, a description of any steps taken to seek such consent to profiling, including details on how consumers can refuse consent or withdraw it, and any consequences of such refusal or withdrawal (Section 2.1.k).

For those items that can be clearly linked to the Guidelines and to the DMA’s preliminary (and proposed) items in Recital 72, the expansion of the safeguards imposed on profiling is clear, starting with their scope. Even though the DMA defines profiling with reference to Article 4(4) GDPR, the auditing of consumer profiling techniques is broader in scope, relating to both personal and non-personal data used to correlate patterns and deliver insights to the gatekeepers.

Furthermore, the purpose limitation-like item contained in Section 2.1.a) diverges substantially from the initial intentions of Recital 72, insofar as the locus of the Template is not so much profiling as an activity within the undertaking’s operations, but profiling as an overall, all-encompassing technique in the hands of the gatekeeper. By this token, the purpose limitation principle (inspired by Article 5(1)(b) of the GDPR) does not apply so much to the profiling itself as to the different techniques employed by the gatekeepers and their distinct uses. In a similar vein, the Template translates the gatekeeper’s obligation to disclose the real-life consequences of its profiling activities into tangible metrics, both qualitative and quantitative (one need only think of the number of decisions that the gatekeepers adopt on the basis of profiling).

Moving forward, the item relating to the duration of the profiling is not so much expanded in scope as transformed in nature. Recital 72 provided that the gatekeeper should disclose the duration of the profiling (that is, the focus was centred on the activity), whereas the Template establishes that the gatekeeper shall communicate the retention duration of the underlying data derived from user activity and of the inferred data that the gatekeeper obtained as a result of processing the former.

Finally, in terms of securing effective consent, the Template takes stock not only of Article 6(1)(a) of the GDPR as a legal basis for the processing of personal data but also of the narrower notion of consent contained in the e-Privacy Directive. Hence, in the effective application and scrutiny of this particular item, and given the gatekeepers concerned (for instance, Apple and Alphabet have notified the Commission that they meet the notification thresholds and are awaiting a designation decision expected around early September), a narrow concept and interpretation of consent deriving from the e-Privacy Directive’s provisions will apply in this context. This may raise the thresholds of intervention in favour of the Commission (and the data protection authorities concerned) and restrict the gatekeepers’ scope of action within their business models.

 

Out-of-scope obligations

The Commission must engage with, interpret and enforce the DMA in light of the principles of proportionality and necessity, and in coherence with the objectives of contestability and fairness (Recital 28). In principle, the EC must only apply those obligations which serve these purposes. However, when setting out the items to be presented by the gatekeeper in the Template, the Commission decided to expand on the elements initially provided by Recital 72.

In this regard, two main groups may be differentiated. The first comprises items that derive from those already enshrined in Recital 72 and that serve as their necessary concretisation in practice: for instance, the legal ground under Article 6(1) GDPR relied on by the gatekeeper to perform profiling activities (Section 2.1.b). If consent is to play a role in the gatekeeper’s data-related practices, it follows that the gatekeeper should disclose whether consent is relied on as a legal basis to process personal and non-personal data across its core platform services and across proprietary and third-party services. In principle, these items serve to identify and give operative contours to the substance of Recital 72, and the Commission should not be censured in this regard.

The second group of elements, which may conflict to a larger extent with the principles of necessity and proportionality, comprises those that are directly unrelated to the content of Recital 72. Four items from the Template fall into this group, namely: i) the safeguards that the gatekeeper must establish to avoid presenting ads on the basis of profiling minors (Section 2.1.f); ii) the use of automated decision-making in terms of the number, object, legal effects and mechanisms used for the gatekeeper’s operations, including a description of the underlying algorithms (Section 2.1.h); iii) the statistics on how many consumers ultimately choose to undergo profiling (Section 2.1.l); and iv) whether and when the profiling techniques have been the object of a data protection impact assessment (Section 2.1.m).

The inclusion of these items is rather more contentious, insofar as the Commission directly engages in an expansionist instrumentalisation of Recital 72, elevating the threshold of protection guaranteed under the GDPR without a proper legal basis to sustain it (and with the without prejudice clause looming in the background). In my own view, this is the point where the Commission should draw a thick line between the enforcement of the DMA and that of the Digital Services Act (DSA). Many of these topics are already addressed in the latter, with more instruments at the EC’s disposal. Conversely, the DMA cannot serve as a lifeboat for the DSA’s regulatory gaps, even if the Commission is legitimised to protect interests other than those directly addressed in the DMA’s substantive text.

A clear-cut example is the protection of minors, translated into the gatekeeper’s obligation to submit a detailed description of the technical safeguards put in place to avoid the presentation of advertisements on its interfaces based on the profiling of minors or children. Pursuant to Recital 38, the specific protection of children regarding their personal data is acknowledged as an “important objective of the Union and should be reflected in the relevant Union law“. Nevertheless, the regulatory instrument redirects the attainment of that objective to a Regulation on a single market for digital services, i.e., the DSA, notwithstanding that the DMA does not exempt the gatekeeper from the obligation to protect children laid down in applicable Union law. No further reference to the specific objective of the protection of children is made throughout the DMA.

In the Template, however, the Commission takes the objective a notch too far by directly addressing the protection of children’s personal data in the application and enforcement of the DMA. It is one thing to recognise its importance within the wider framework of Union law (Recital 38); it is quite another to elevate the matter into a cornerstone of the gatekeeper’s audit submission regarding consumer profiling techniques.

Moreover, the inclusion of the particular item on minors’ protection in the Template defies the purpose of complementarity with the DSA, given that the latter already puts in place a range of instruments addressing those same problems with an identical purpose (see Recitals 71, 81, 83, 89 and 104 of the DSA), namely the adoption of appropriate and proportionate measures to protect minors (e.g., designing their online interfaces, or parts thereof, with the highest level of privacy, safety and security for minors by default where appropriate, or adopting standards for the protection of minors). To that end, the DSA works around the pre-emptive idea of a high level of protection of minors in terms of their privacy and personal data by securing transparency (Articles 14(3) and 28 DSA) and by requiring risk assessments of the services’ functionalities (on the particular matter of minors, see Article 34(1)(d) of the DSA) within the wider aim of mitigating those same risks (Article 35(1)(j) of the DSA). Notwithstanding, scholars have already highlighted that risk mitigation as a regulatory solution (also contemplated in the AI Act) might be naïve in terms of measuring its long-term impact, see Fassiaux and Almada.

In fact, Section 2.1.f) of the Template replicates in full and in substance the safeguards established in Article 28 of the DSA, so duplicating the gatekeeper’s efforts to demonstrate that it protects children’s personal data will not add anything new, even if it instrumentalises a different tool from those already put forward by the DSA. If one analyses the duplication not from an objective viewpoint but from a subjective perspective, however, the intervention may seem justified, insofar as some of the gatekeepers captured by the DMA (seven undertakings notified their gatekeeper status to the Commission by July 2023: Alphabet, Amazon, Apple, ByteDance, Meta, Microsoft and Samsung) have not been designated on the basis of the DSA, or some of their core platform services fall outside the scope of the DSA’s enforcement. For example, although Apple is expected to be designated under the DMA, only its App Store has been designated as a Very Large Online Platform (VLOP), and the same goes for Alphabet (only Google Play, Maps and Shopping are included, plus Google Search as a Very Large Online Search Engine), Amazon (Amazon Store) and Meta (Facebook). Even so, the Commission’s imposition of the DSA’s substantive obligations on the gatekeepers may amount to an ultra vires intervention regarding their business models.

The same conclusion can be drawn from the Commission’s inclusion of Section 2.1.h) (the gatekeeper’s documentation of automated decision-making when applying its profiling techniques; again, techniques and not profiling as an activity), but this time in relation to the DMA’s complementarity with the GDPR. The Commission directly translates the content of Article 22 of the GDPR into the Template, but goes further than that. Perhaps the reaction comes as a result of the data protection authorities’ failure to hold the largest data controllers in the EEA accountable for their processing activities through audits of their profiling activities, just as a range of stakeholders put forward in the last of the workshops held by the European Commission regarding the DMA’s data-related provisions (see a review here).

In a similar vein to the upgrading of the right to data portability from Article 20 of the GDPR to Article 6(9) of the DMA, the Template also raises the thresholds of protection under Article 22 of the GDPR by adding that the gatekeeper shall describe the algorithms underpinning the automated decision-making mechanisms related to profiling. As with the protection of children, the EC’s inclusion of this item is not supported by the DMA’s actual content, but by a wish to expand the contours of the scrutiny over the gatekeeper’s profiling activities (techniques, as per the Template’s narrative) to the deployment of algorithms (and that, even before the AI Act‘s provisions start to apply in the EEA).

Furthermore, Section 2.1.h) cannot be read in isolation, but only in relation to Section 2.1.m) (the gatekeeper’s obligation to disclose whether it has performed a data protection impact assessment), insofar as the latter also transposes the obligations already imposed on data controllers pursuant to Article 35(3)(a) of the GDPR to the particular activities of the gatekeepers. To the extent that the audit information may be transmitted by the Commission to the EDPB for further analysis, the inclusion of the item may make sense, but it adds nothing new to the already-regulated scope of scrutiny of profiling over data subjects. By the same token, it is hard to imagine what the Commission’s reaction would be if the gatekeeper responds in the negative, other than passing that same information on to the public bodies in charge of enforcing data protection regulation.

Finally, Section 2.1.l) of the Template is also special in its own way, given that the Commission compels the gatekeeper to disclose whether the deployment of the obligations set out in the DMA, i.e., Articles 5(2), 6(10) and 6(11) (and compliance with the rest of the items included in the Template), has any impact beyond its reporting on paper. The submission of this item, however, may prove more deceptive than helpful for the Commission when assessing compliance with Article 15 of the DMA, insofar as consumer biases may come to the fore, and the real-life consequences in the digital arena of disclosing end users’ preferences might not be as straightforward as expected.

For example, a well-known (and widely discussed) case that illustrates how deceptive it can be to read opt-in rates to profiling as a story of success is the deployment of cookie banners online. As a result of the GDPR’s entry into force, cookie banners have become ubiquitous across the websites that we access every day: every time a consumer accesses a web page, a pop-up (a different one each time, with different options) appears asking for consent to deploy cookies on the particular site. The results of implementing cookie banners on the Internet have been skewed, insofar as a wide range of biases have influenced opt-in rates: some argue that cookie banners are directly ignored by most consumers (where they are allowed to do so), whereas others hold that individual personality traits and nudges steer the user’s decision far from a binary choice between yes and no. In addition, IAB Europe’s Transparency and Consent Framework, which was first pushed as a standard in response to the GDPR’s entry into force and widely adopted by Google in 2018, was found by the Belgian data protection authority to infringe data protection rules, so the matter is far from resolved even in the realm of interpreting the GDPR. In the same spirit as the preceding sections, this last item incorporated by the Commission may not produce the conclusive indicators that the enforcer may have foreseen in the immediate future, at least as long as the EC’s capacity is not enlarged to the point of fine-tuning the gatekeepers’ interfaces and directly interpreting the principles of transparency and lawfulness in the terms of the GDPR.

 

The legal basis and objectives for the implementing act: expansive enforcement in disguise?

In light of the foregoing, the Template navigates between the European Commission’s expansive reading of its own capacity for intervention and a broad interpretation of the terms of Article 15 and Recital 72. The introduction to the Template points in the same direction: the Commission recognises that Section 2 is particularly addressed at “meeting the objectives set out in Recital 72 of the DMA, including enhancing transparency and accountability regarding gatekeeper’s profiling techniques as well as facilitating fairness and contestability of respective core platform services“.

If one traces the regulatory steps back to Recital 72 of the DMA (which the Template directly references), the purpose of Article 15 is quite different. The obligations imposed upon the gatekeepers are directed at ensuring an adequate level of transparency of their profiling practices with the goal of facilitating contestability of core platform services. No reference is made, however, to the elements of accountability and fairness in relation to the obligations deriving from Article 15 of the DMA. Consequently, the Commission expands the provision’s purposes well beyond the line drawn by the Union legislator.

Along the same lines, the Commission’s draft Template does not seem to adhere to the mandate that the Union legislator conferred upon it. The legal basis contained in Article 46(1)(g) of the DMA enables the Commission to issue an implementing act laying down detailed arrangements for the application of Article 15 of the DMA, but not without limitations. The provision confers this capacity upon the Commission to concretise the “methodology and procedure for the audited description of techniques used for profiling of consumers provided for in Article 15(1)“. Even though the Template does develop the concrete items to be disclosed by the gatekeeper under Article 15 of the DMA, it is difficult to argue that the in-scope obligations constitute the methodology for the audits (given that they relocate the content of Recital 72 directly onto the Template), whereas the items that fall outside the scope of the DMA’s initial intentions add new requirements to the obligations (mimicking the DSA’s and GDPR’s provisions, unnecessarily or not) and do not nail down the pre-existing foundations of Article 15 of the DMA.

Hence, one could argue that, at least as far as Section 2 of the Template is concerned, the legal basis under Article 46(1)(g) of the DMA cannot sufficiently justify the Commission’s issuing of the implementing act. Sections 3 and 4 (touched upon below) legitimise the EC’s intervention in terms of fleshing out the procedural aspects behind the term “independently audited description of the basis upon which profiling is performed” (Recital 72), but methodology (understood as a system of ways of doing, teaching, or studying something, as per the Cambridge Dictionary’s definition) is not directly considered, insofar as the Commission does not delve into the indicators and benchmarks that the audit should follow.

Additionally, when developing the draft implementing act for this purpose, “the Commission shall consult the European Data Protection Supervisor and may consult the European Data Protection Board, civil society and other relevant experts” (Article 46(1)(g) of the DMA). Given the Template’s lack of reference to a prior consultation with the EDPS, its opinion appears to be pending (and shall be gathered before the final version of the Template is issued after its public consultation). A related question is whether the Commission will consult the EDPS directly or whether it will do so in the course of the next meeting of the High-Level Group (see review of its creation here). The most likely scenario is that the Commission will engage directly with the EDPS (although no meeting has been scheduled, to the author’s knowledge, according to the EDPS’ agenda), insofar as the High-Level Group’s next meeting will be held after the designation process has been completed (at the latest, on the 6th of September, as agreed at its first meeting; see the minutes here).

Against this background, the path to fine-tuning the draft Template into its final version will be a patchy and substantive one, which might require re-working some of its core elements and objectives.

 

The framing of the ‘independently audited description of the basis upon which profiling is performed’ in Sections 3 and 4 of the Template

In Section 3, the European Commission sets out the general information that the gatekeeper must provide regarding the auditor and the third parties that it might consult to draw up the audit. In turn, Section 4 of the draft Template delves into the descriptions that the gatekeeper must disclose about the methodology employed by the auditor (Section 4.1), with reference to the information relied upon as audit evidence (Section 4.2).

The sole requirement for the designation of the auditor is, thus, that it be external to the gatekeeper; an auditor with a previous commercial or contractual relationship with the audited gatekeeper may also be designated for the task at hand (Section 3.1.c). However, the auditor(s) will not necessarily act in isolation from the gatekeeper, free from any risk of interference. Instead, the Commission opens the door for any member of the gatekeeper’s organisation, or external experts, to contribute to the drafting of the submitted description of the consumer profiling techniques, as long as they are disclosed as per Section 1.2 of the Template.

The allocation of incentives between the auditor(s) and the audited gatekeepers thus remains unclear. On one side, the auditors will have every incentive to retain the gatekeeper’s business when drawing up the audit, both in the form of fees and in terms of gaining sufficient access to data to perform their analysis (as highlighted by Laux, Wachter and Mittelstadt). Thus, the threat of audit capture looms over the gatekeeper’s compliance with the obligations set out under Article 15 of the DMA, just as it has with similar auditing obligations imposed on economic operators in sectors such as accounting, finance and product safety (as put forward by Allan, Joyce and Pollock in 2018). The related risk is that no quality audits may be produced as a result, and no effective governance may accompany the effective enforcement of the DMA as far as its data-related obligations are concerned. One must not forget that the auditor’s assessment (in the form of a ‘positive’, ‘positive with comments’ or ‘negative’ conclusion) will be key to securing compliance with Article 15 of the DMA (Section 5.1.a).

On the other side, due to the gatekeepers’ gravitational pull on their auditors, they might steer the methodologies applied to assess compliance away from the emerging methods and approaches in the field, towards bespoke solutions that might argue their case better before the Commission. Following Laux, Wachter and Mittelstadt, incorporating the principles and rationale behind the Auditing Directive into the Template might act as a safeguard securing complete, accurate and precise audits, notwithstanding the gatekeeper’s interaction with an auditor it directly appoints.

 

Key takeaways

The draft Template relating to the audited description of consumer profiling techniques pursuant to Article 15 of the DMA provokes a long stream of thoughts relating to the Commission’s limited mandate to apply and enforce the DMA’s provisions proportionally (Recital 24) and to its capacity to expand on the regulatory instrument’s obligations at its own discretion. By this token, four main fields for improvement can be identified for the EC’s drafting of the Template’s final version:

  • First, a clear-cut definition of the concept of profiling must be provided to ensure the Template’s coherence with the DMA’s obligations, insofar as the definition referenced in Article 4(4) of the GDPR is narrower in scope than the interpretation that both the DMA and the Commission have in mind. As the text of the implementing act currently stands, this narrower definition would apply, and it would be difficult to argue that the audit must encompass both personal and non-personal data processed and ‘profiled’ by the gatekeeper.
  • Second, the legal basis sustaining the Template’s legitimacy (i.e., Article 46(1)(g) of the DMA) must be observed, insofar as it compels the Commission to draw up both the methodology and the procedure applicable to the obligation. As analysed above, the in-scope items reproduce the contents of Recital 72, whereas the out-of-scope items are new to the regulatory instrument and cannot be sanctioned under the premise that they flesh out the provision’s methodology.
  • Third, the complementary nature of the DMA alongside the DSA as a regulatory bundle must be reviewed in those instances where the Template seeks to satisfy legal interests other than the broader objectives of contestability, fairness and transparency, notably the protection of children in terms of the processing of personal data. The same applies to the legal transplants operated from the GDPR (e.g., Article 22 of the GDPR), which should be reconsidered to avoid duplicating procedures and the regulatory burden on the targets of regulation under the DMA.
  • Fourth, Sections 3 and 4, relating to the procedural aspects of the auditing, must be re-worked to align the incentives that the Commission, the auditors and the gatekeepers (in their role as audited agents) hold towards compliance with Article 15 of the DMA, in order to curtail the threat of audit capture in the realm of the regulatory instrument.

The European Commission’s reaction to the public’s response to its first draft Procedural Implementing Regulation was substantial and significant: a range of provisions were altered and fine-tuned as a result. The same should apply to the draft Template analysed above, in order to avoid the perils of confining compliance to a formalistic endeavour.

