Miguel Amaral
OECD
Case Studies on the Regulatory Challenges Raised by Innovation and the Regulatory Responses
2. Case 1. Data-driven markets: regulatory challenges and regulatory approaches
Abstract
Data-driven markets have become increasingly widespread in economies and societies and now support many of our daily activities. They entail a number of regulatory challenges for governments, which strive to enable innovation in these markets while ensuring a “sufficient” level of protection for people and businesses. This case study documents the range of regulatory challenges raised by the development of data-driven markets as well as some of the regulatory responses that have been implemented by governments. It shows, in particular, that the development of data-driven markets will require new institutional solutions to strengthen co-operation across government agencies, including across borders, in order to tackle the transversal challenges of data-driven markets.
“To achieve an economics for the common good in this new world, we will have to address a wide range of challenges, from public trust and social solidarity […] Success will depend, in particular, on whether we can develop viable new approaches to antitrust, labour law, privacy, and taxation” (Tirole, 2019[1])
Context
Data-driven markets have become increasingly widespread in economies and societies and now support many of our daily activities. These markets share a number of economic properties, which allow data-driven businesses to quickly increase the scale of their operations and may lead to a high level of concentration. Critical economic features include (OECD, 2019[2]), (OECD, 2018[3]), (OECD, 2019[4]) and (OECD, 2019[5]):
Direct network effects: in data-driven markets, users’ utility usually increases with the number of end-users consuming the same product or service;
Indirect network effects: in multi-sided markets, end-users’ utility on one side of the market usually depends positively on the number of users on the other market side;
Cross-jurisdictional scale without mass: while the development of data-driven businesses might imply significant upfront (i.e. fixed) costs, the production of digital services generally entails near-zero or zero marginal costs. This allows companies to scale without mass, including across borders and, in some cases, without any physical presence (see the stylised formulation after this list);
Lock-in effects: the combination of network effects and switching costs (which might be psychological) holds the potential to lock consumers into a specific service.
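A stylised way to see how the first and third of these features interact is to write them as simple formulas. The linear utility function and the cost function below are illustrative assumptions chosen for exposition; they are not drawn from the OECD sources cited above.

```latex
% Stylised illustration (illustrative functional forms, not from the sources cited).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Direct network effects: with $n$ end-users on a platform, an individual
user's utility can be written as
\[
  u(n) = v + \beta\, n, \qquad \beta > 0,
\]
so that utility rises with the number of users consuming the same service.
Scale without mass: with an upfront fixed cost $F$ and a marginal cost of
(re)production $c \approx 0$, the average cost of serving $q$ users,
\[
  AC(q) = \frac{F}{q} + c,
\]
declines over the entire range of output. Falling average costs combined
with network effects are what allow data-driven businesses to scale rapidly
and can push markets towards high levels of concentration.
\end{document}
```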
The development of data-driven markets, which often takes place within complex ecosystems, bears a number of consequences for markets and societies. Their impacts on production and consumption systems should be properly understood in order to help governments navigate the regulatory challenges and target appropriate regulatory responses. These effects can be broken down into two broad categories: the implications for market structures and the impacts on firms’ strategies (for a more detailed and comprehensive presentation of these effects, see (OECD, 2018[6]), (OECD, 2019[7]), (OECD, 2019[8]), (OECD, 2019[9]) and (OECD, 2020[10])):
Impact on market structures
Shift towards services: while the development of the service sector is a longstanding phenomenon that preceded the rise of the digital economy, the rapid development of data-driven markets has further reinforced this evolution (OECD, 2019[4]);
Impact on transaction costs: as stated by (OECD, 2019[5]) and (OECD, 2019[2]), digitalisation may contribute to reducing transaction costs (even for cross-border transactions), allowing the development of new business models;
Development of networks: data-driven markets trigger the development of vast networks for different purposes, ranging from e-commerce and the sharing economy to social interaction (OECD, 2018[3]).
Impacts on firms’ strategies
The development of data-driven markets is affecting firms’ strategies along two key dimensions: changing competition dynamics and the growing power given to consumers.
Competition dynamics
Monopolisation: the economic properties of online platforms (network effects and cross-jurisdictional scale without mass) may create a tendency towards the creation of (natural) monopolies and the rise of undue barriers to entry;
New forms of anti-competitive behaviour: while big data and algorithms offer great opportunities to enhance pricing models and foster competition, they may also favour the emergence and the sustainability of tacit collusive agreements without any human interaction (OECD, 2017[11]). Beyond algorithmic collusion, concerns about anticompetitive conduct in data-driven markets include the anti-competitive manipulation of search results, the anti-competitive bundling of apps, the anti-competitive use of data by platforms that are also downstream competitors, and collusion in online advertising (OECD, 2019[2]).
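To make the idea of algorithm-driven tacit collusion more concrete, the following toy simulation (an illustrative sketch, not a description of any actual pricing system) shows two sellers who each independently run a simple “match the rival’s price, never go below cost” rule. Starting from a high price, the parallel use of these individual rules keeps prices at the initial supra-competitive level without any communication, since any unilateral price cut would simply be matched in the next period, removing the gain from undercutting.

```python
# Toy illustration of parallel pricing algorithms sustaining a high price.
# All values are illustrative; no real pricing system is modelled here.

COST = 10.0           # marginal cost (illustrative)
INITIAL_PRICE = 30.0  # starting price well above cost (illustrative)


def match_rival(own_price: float, rival_price: float) -> float:
    """Match the lower of the two observed prices, but never price below cost."""
    return max(min(own_price, rival_price), COST)


def simulate(periods: int = 10):
    """Return the sequence of prices when both sellers apply the same rule."""
    p_a, p_b = INITIAL_PRICE, INITIAL_PRICE
    history = []
    for _ in range(periods):
        # Each seller observes the other's last price and updates simultaneously.
        p_a, p_b = match_rival(p_a, p_b), match_rival(p_b, p_a)
        history.append((p_a, p_b))
    return history


if __name__ == "__main__":
    for period, (price_a, price_b) in enumerate(simulate(), start=1):
        print(f"period {period}: seller A = {price_a:.2f}, seller B = {price_b:.2f}")
```

In this stylised setting no agreement is ever concluded, which is precisely why such outcomes are difficult to capture with traditional concepts of agreement and tacit collusion.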
Growing power of consumers
Data-driven markets show great potential to enhance consumer choice and subjective well-being. Their development has indeed created opportunities for promoting wider choice and reducing information asymmetries between consumers and businesses on the quality of a product or service (OECD, 2019[2]), allowing in turn markets to work more efficiently.
Key issues for governments and regulators
A need to rethink traditional approaches and existing tools to address the challenges raised by data-driven markets
Data-driven markets can be seen as a double-edged sword. On the one hand, they clearly hold the potential to bring important benefits by increasing consumer choice, improving market efficiency, and fostering cross-border trade. At the same time, network effects and cross-jurisdictional scale without mass have allowed data-driven businesses to gain outsized market power in some cases. A lingering concern is that the market structure may lead to anticompetitive conduct resulting in inefficient outcomes (in terms of prices, quality and incentives to innovate) to the detriment of consumers.
For that reason, governments originally addressed the challenges brought by multi-sided platforms through antitrust laws, as illustrated by the series of actions launched against Google by the European Commission in 2010, 2017 and 2018. Yet, for a number of reasons, the underlying economic features of data-driven businesses might challenge this initial approach and confuse the rationale for regulatory intervention.
First, the development of data-driven markets creates strong interplay between competition concerns, data privacy and data security. As an illustration, it is often assumed that a response to the potential competitive concerns stemming from the accumulation of data by data-driven businesses (e.g. consumer lock-in) might be to promote data portability and interoperability measures from one platform to another in order to empower consumers and, in turn, foster competition. Such measures, however, create a fundamental tension with data privacy and security issues, which should be properly considered and addressed by governments. Part of the solution to help governments balance these competing concerns certainly lies in co-operation and the development of joined-up approaches between competition authorities, consumer protection authorities and data protection agencies, including across borders.
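The tension between portability and privacy can be illustrated with a minimal sketch of a data export routine. The record structure, the field names and the choice of which fields to export are hypothetical assumptions made for illustration; actual portability schemes and their safeguards are defined by the applicable legal frameworks.

```python
# Minimal sketch of the portability/privacy tension (hypothetical fields).
import json

PROFILE = {
    "user_id": "u-123",
    "display_name": "Alice",
    "contacts": ["u-456", "u-789"],             # data about other people
    "inferred_interests": ["running", "jazz"],  # data derived by the platform
}

# Fields a user might take to a competing platform in a portable copy.
PORTABLE_FIELDS = {"user_id", "display_name"}


def export_portable_copy(profile: dict) -> str:
    """Return a machine-readable export limited to user-provided fields.

    Dropping contacts and inferred data illustrates the trade-off discussed
    above: a broader export would lower switching costs and strengthen
    competition, but would also move third-party and derived data across
    organisational boundaries, raising privacy and security questions.
    """
    portable = {key: value for key, value in profile.items() if key in PORTABLE_FIELDS}
    return json.dumps(portable, indent=2)


if __name__ == "__main__":
    print(export_portable_copy(PROFILE))
```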
Second, the economic properties of data-driven markets might raise the need to rethink antitrust tools used in traditional markets to make sure that they remain effective in the context of multi-sided platform markets. As highlighted by (OECD, 2018[12]), the standard market definition exercise appears, for example, to be a less valuable tool for multi-sided platforms: “a traditional starting point for framing an analysis of the competitive effects of a merger, an action or an agreement is to define the relevant market(s) that might be affected […]. For multi-product or multi-location firms, the answer is the result of the market definition exercise, which identifies the scope of the market, and hence whether those different products and locations fall within the same or different markets. In contrast, for multi-sided platforms, the product that a platform provides to one side of the market does not compete with the product it provides to another side. In the case of multi-sided markets the question of how many markets to define cannot be answered within a market definition exercise, instead it is a conceptual question that requires an answer before any exercise to define the scope of the market can be carried out”. The emergence of new forms of anticompetitive strategies also calls into question the analytical tools used by competition agencies. In the case of algorithmic collusion, for example, a key question that needs to be addressed is whether antitrust agencies should revise the traditional concepts of agreement and tacit collusion for antitrust purposes and how traditional antitrust tools might be used to tackle some forms of algorithmic collusion. While a specific regulatory approach (e.g. rules on algorithmic design) could be considered to deal with this anticompetitive practice, it must be acknowledged that any regulatory initiative may impose costs (e.g. new barriers to entry and adverse effects on the incentives to invest in proprietary algorithms) that could outweigh its expected benefits.
Third, the economic properties of digital platforms raise additional concerns around the rationale for regulatory intervention. Indeed, network externalities, the capacity to scale without mass and the economies of scope characterising online platforms can give rise to natural monopoly conditions and create barriers to entry for potential competitors, with substantial risks that excessive prices and a lack of innovation will follow. At the same time, digital transformation offers potential to stimulate competition: the same economic properties may eventually shift in favour of innovative entrants, which might be able to grow rapidly and gain market share over incumbents once they bring a new product to market, often with few employees, few tangible assets and a limited geographic footprint. They can even replace an incumbent in a relatively short time simply by offering a qualitatively superior good or service. Hence, becoming a dominant platform at a given point in time does not come with a guarantee that the leading position will be maintained permanently or that it is invulnerable to competition. In sum, the potential for increasingly concentrated markets arguably raises fewer concerns in situations where digital markets are contestable. This creates a clear need to understand and capture the dynamics of the industry, rather than defaulting to static or short-run market analysis. It should also be underlined that these different (and sometimes counteracting) effects may confuse the rationale for regulatory intervention, as any initiative will influence the nature of competition between the incumbent and (potential) new entrants. On the one hand, regulators may be prompted to ensure a level playing field to increase the (dynamic) competitive pressure and foster the contestability of data-driven markets (through lower switching costs, for example). On the other hand, undue regulatory intervention may threaten the entry of new players and curb innovation. As a consequence, there remain active debates about which regulations are necessary, particularly in the light of their potential adverse consequences, whether intended or unintended. While it is probably difficult to define a one-size-fits-all policy for these issues, governments will certainly need to reconsider existing regulations to provide efficient responses. OECD tools, such as the Competition Assessment Toolkit or the Product Market Regulation Indicators, already provide governments with valuable analytical frameworks to review the impact of regulations on competition in data-driven markets, but more work would still have to be done to further understand competition dynamics in online markets and define appropriate policy measures.
Data privacy and security
Central to the discussions raised by data-driven markets are the concerns around data privacy and security, notably because they are fundamental drivers of trust. As underlined by (OECD, 2019[13]), “almost 30% of Internet users do not provide personal information to social networks because of security or privacy concerns” and the results of a survey undertaken in 2015 show that “about 3% of individuals on average in OECD countries reported experiencing a privacy violation in the past three months”. While these policy concerns are not new in themselves, the high and increasing number of platform users, combined with the unprecedented abundance of data shared with online platforms (both willingly and unknowingly) and the evolving uses of digital technologies, is substantially changing the scale and scope of digital privacy and security challenges. These evolutions put strain on governments and regulators, who need to devise appropriate regulatory regimes and encourage businesses to better manage digital privacy and security risks to foster trust and improve consumer protection.
Against this backdrop, the OECD has long insisted on the need for national strategies to mitigate digital privacy and security risks, using different legal instruments such as the 2013 OECD Guidelines Governing the Protection of Privacy and Transborder Data Flows (OECD, 2013[14]) or the 2015 OECD Recommendation on Digital Security Risk Management for Economic and Social Prosperity (OECD, 2015[15]). (OECD, 2019[13]) notes that, while technological developments offer avenues to help governments address data privacy and security challenges, there is still a need to develop “national data strategies, supported at the highest levels of government, that incorporate a whole-of-society perspective to strike the right balance between various individual and collective interests. Such strategies would provide clear direction to reap the social and economic benefits of enhanced reuse and sharing of data while addressing individuals’ and organisations’ concerns about the protection of privacy and personal data, and intellectual property rights”. Digital security and privacy concerns also raise a critical need to foster international regulatory co-operation given the importance of cross-border data flows for data-driven markets. Strengthening co-ordination and co-operation across borders appears critical to avoid costly inadvertent regulatory divergence, which leads to the erection of non-tariff trade barriers and can result in a reduction of regulatory protections as regards data privacy and security.
Socio-ethical challenges
While the development of artificial intelligence (AI) associated with data-driven businesses brings outstanding opportunities in different sectors such as health, business or education, it also raises new types of policy concerns for governments in comparison to previous technologies. A well-documented risk associated with AI systems is the potential for algorithms to create biases that could lead to unfair or unlawful discrimination, creating, perpetuating or exacerbating inequalities.
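One way such biases can be detected in practice is by comparing an algorithm’s decision rates across groups. The sketch below computes a single, commonly used indicator (the gap in selection rates, often called the demographic-parity gap) on made-up data; the data, the group labels and the choice of metric are illustrative assumptions, and real audits rely on several metrics and on the applicable legal tests.

```python
# Illustrative bias check: selection rates by group and their gap.
from collections import defaultdict

# (group, algorithmic decision) pairs, e.g. loan approvals by an AI system.
DECISIONS = [
    ("group_a", 1), ("group_a", 1), ("group_a", 0), ("group_a", 1),
    ("group_b", 0), ("group_b", 1), ("group_b", 0), ("group_b", 0),
]


def selection_rates(decisions):
    """Share of positive decisions per group."""
    totals, positives = defaultdict(int), defaultdict(int)
    for group, outcome in decisions:
        totals[group] += 1
        positives[group] += outcome
    return {group: positives[group] / totals[group] for group in totals}


if __name__ == "__main__":
    rates = selection_rates(DECISIONS)
    gap = max(rates.values()) - min(rates.values())
    print("selection rates:", rates)
    print(f"demographic-parity gap: {gap:.2f}")  # a large gap warrants scrutiny
```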
As reported by the OECD AI Policy Observatory, a large number of strategies and initiatives have been developed at national and international level to harness the opportunities promised by AI while mitigating its unintended effects and upholding protection for citizens. The Observatory gathers over 700 AI policy initiatives from 60 countries, territories and the EU (OECD.AI, 2021[16]). Recently, OECD member countries approved the OECD Council Recommendation on Artificial Intelligence (OECD, 2019[17]), which identifies five complementary values-based principles for the responsible stewardship of trustworthy AI. In 2021, the European Commission also published a proposal for an Artificial Intelligence Act (European Commission, 2021[18]), which can be seen as the first attempt to introduce a comprehensive regulatory regime to address AI-related issues. In addition, as the fruition of an idea developed within the G7, 15 countries created the Global Partnership on AI in 2020 (the partnership now counts 25 members), a state-led multi-stakeholder initiative which aims to promote the responsible development of AI. Yet, as stated by (Cameron et al., 2021[19]), “AI policy around the world seems to have reached a tipping point, with governments now seeking ways to operationalize ethical principles into concrete policy provisions or detailed guidance for AI developers and deployers; at the same time, governments are also in the process of adapting their general AI framework and strategies to the specificities of individual policy domains and industry sectors. This tipping point presents a unique opportunity to strengthen international cooperation in AI policy and development while governments around the world are still in the early stages of understanding the issues and developing their approaches. Moreover, we see broad recognition that AI is of such magnitude in multiple dimensions that it requires nations to work together”.
Another challenge associated with the development of data-driven markets, and social media platforms in particular, lies in the fact that they may contribute to the spread of false, inaccurate or misleading information. This is raising strong concerns for governments as it holds the potential to decrease public trust in government, undermine evidence-based democratic processes and decrease citizen participation. While there is growing agreement among governments on the need to rethink existing approaches to tackle this information challenge, this is still an area of high complexity, notably because any regulatory intervention might create risks in terms of freedom of information and expression.
Regulatory challenges for governments
Data-driven markets bring new challenges for governments as they may not fit well within existing regulatory regimes and some of them may operate in regulatory grey areas. They are putting many regulatory regimes under pressure by creating goods or services for which the regulatory framework may be unclear, redundant or overlapping. Adapting regulatory frameworks requires, in the first place, a precise understanding of the challenges data-driven markets pose to the rule-making activities of governments.
Pacing problem
As with other technological developments, governments face major uncertainties about the potential immediate and tertiary risks raised by data-driven markets. Both foreseen and unforeseen risks are amplified by the accelerating speed and complexity of technological development in these markets. It is not only the types of technology that challenge existing regulatory frameworks but also the sheer pace of technological change underlying the development of data-driven markets. While the pacing problem has always been a concern for governments, it has acquired a new urgency in data-driven markets due to the scope and the speed at which businesses are scaling.
Challenges to the existing regulatory frameworks
The traditional regulatory framework, often designed on an issue-by-issue, sector-by-sector or technology-by-technology basis, may not be a good fit for the challenges brought by data-driven markets. The economies of scope that characterise digital platforms are, by definition, blurring sectoral boundaries and affecting the landscape for market competition. This may challenge governments, as policy implications may extend across what are in many cases separate policy domains delineated by ministries, departments or agencies. This may require co-ordination, harmonisation or integration, often demanding a multidisciplinary perspective. As an example, the fact that digital platforms are increasingly performing similar functions to media businesses challenges traditional approaches to media regulation.
The development of data-driven markets also creates the risk of new market failures (such as implicit transactions, incomplete markets, information asymmetries, hold-up and lock-in phenomena) that should be carefully addressed by governments. One way to deal with this issue would be to make digital data transactions explicit and empower consumers to exercise their (existing) property rights, thereby exerting a decentralised discipline on data-driven markets. The definition of data property rights could enable owners to explicitly exchange information in data-driven markets or exclude any other party from accessing or using it. Yet the definition of ownership regimes as regards data raises a critical challenge: there is a fundamental distinction to be made between raw data provided by consumers and data processed by companies. If this distinction were easy to establish, it would be possible to implement a simple ownership regime: raw data would belong to consumers (and could be transferred at their wish) and processed data would belong to companies (and be protected through intellectual property regimes, for example). However, a major drawback is that the boundaries between these types of data are not always simple to establish, notably because the quality of the data may strongly depend on efforts made by the company. Some argue that platforms should pay for data shared by consumers but, again, this solution raises a number of difficulties:
In a number of cases, data take the form of public goods: information goods are non-rival in consumption (data can be replicated with no loss of quality). At the same time, data generate positive externalities and, without a proper pricing regime, data may be under-exploited or under-shared;
Replicating an information good is generally associated with zero or near zero marginal costs;
Data can often be reused ad libitum for different objectives (sampling, repackaging, versioning, etc.);
Some argue that platforms do pay for the data, although this payment does not take the form of a financial transfer: platforms indeed provide a service or a commercial transaction in exchange for the data shared by consumers;
Beyond the very low marginal costs of information goods, the value of a single data point is likely to be very low. Most of the economic value of a data point may indeed result from its aggregation with other data, as the stylised formulation below suggests.
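A stylised formulation of this last point, using illustrative notation rather than any estimate from the sources cited:

```latex
% Stylised formulation of the aggregation argument (illustrative, no estimates).
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Let $v(d_i) \approx 0$ denote the stand-alone value of an individual record
and $D = \{d_1, \dots, d_n\}$ a dataset. Complementarities in aggregation
imply
\[
  v(D) \;\gg\; \sum_{i=1}^{n} v(d_i),
\]
so a pricing scheme that paid consumers record by record would capture very
little of the value actually created, which is one reason per-record payment
schemes are difficult to design.
\end{document}
```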
Challenges to regulatory enforcement
Data-driven markets challenge regulatory enforcement in several ways. One of the issues has to do with the fact that traditional notions of liability may no longer be fit for purpose due to difficulties in apportioning and attributing responsibility for damages caused – for instance, in accidents involving AI-embedded machines or devices.
As with other innovations, technological developments in data-driven markets challenge regulatory enforcement because the categories that underpin regulations, and the specific rules that are supposed to be verified and enforced, are often not strictly applicable to new situations, products and services. Depending on legal frameworks and enforcement approaches, regulators can end up either cracking down indiscriminately on innovations that do not fit previously existing categories, or being powerless to respond to emerging risks, or (not so rarely) both at the same time.
In data-driven markets, regulatory enforcement is also challenged by the shift in liability from digital platforms to individual market participants and, more generally, by the shift from traditional regulation (e.g. labour law) towards contractual relations and private governance arrangements. These shifts restrict the ability of government authorities to oversee, regulate and enforce obligations in this space. Data-driven business models have also given rise to a fundamentally new way of distributing content that makes intellectual property rights difficult to enforce.
Digitalisation also offers new ways of hiding from the law. Money laundering can be facilitated by the complex flows of data worldwide and the possibility of using the Internet to conceal certain activities or transactions. In the same vein, the collection and exploitation of data, network effects and the emergence of new business models such as multi-sided platforms exacerbate the challenges to existing tax rules.
Institutional and transboundary challenges
The inherently transboundary nature of a number of data-driven markets poses new types of policy challenges that put increasing strain on regulators operating within the limits of their own jurisdictions. Indeed, businesses operating digital technologies can span multiple regulatory regimes, creating the potential for confusion and risks. Moreover, digitalisation pays no regard to national or jurisdictional boundaries and drastically increases the intensity of cross-border flows and transactions. The phenomenon of global value chains has been “turbocharged” by Internet openness. Firms from around the world are now able to participate in supply chains and open up new markets for products and services (Centre for International Governance Innovation; Chatham House, 2016[20]). Data-driven businesses gain global reach while being able to locate various stages of their production processes or service centres across different countries. This feature enables companies to “forum shop” by choosing the jurisdiction most advantageous to them, potentially avoiding compliance with certain regulatory requirements in areas such as taxation, data protection or other regulated domains. These transboundary challenges are exacerbated by the pacing problem: the fact that regulatory frameworks cannot accommodate the increasing pace of technological development expands the avenues for regulatory arbitrage.
The global reach of these markets makes it hard to identify, prevent and respond fully to their myriad effects across the globe. At the same time, the policy challenges posed by these fast-evolving technologies are faced by most countries in parallel. Yet they are addressed by governments through traditional institutional frameworks built around line ministries and agencies and focused on the national legal framework alone, each following its own legal, cultural and political traditions. The erosion of hitherto clearly delineated sectoral boundaries, as well as the blurring of the distinction between consumers and producers, compounds this challenge.
In a number of cases, the traditional institutional frameworks underpinning regulations are no longer adapted to address or effectively keep up with data-driven markets. The mismatch between the transboundary nature of data-driven markets and the fragmentation of regulatory frameworks across jurisdictions may undermine the effectiveness of action and therefore people’s trust in government. It may also generate barriers to the spread of beneficial innovations on those markets.
Regulatory approaches
This section presents a selection of regulatory approaches that have been implemented across countries to cope with the governance and regulatory challenges brought by data-driven markets. It is worth noting that, given the number of policy measures taken across countries, this section is certainly not meant to be exhaustive but aims merely to shed light on interesting initiatives in this area. A number of examples come from communications and media regulatory bodies. Indeed, while all sectors are impacted by the rise of digital markets, these bodies are usually at the front line, as they traditionally regulate communications networks and very often the services provided by digital platforms are substitutes for traditional communications, information and audio-visual services.
Co-regulation and self-regulation
Self-regulation and co-regulation are instruments with no or limited government involvement. Self-regulation typically involves a group of economic agents acting together to adopt among themselves (and for themselves) rules or common guidelines that regulate behaviours. In fast-moving data-driven markets, such initiatives can lead to faster regulatory responses than approaches relying solely on governments.
Striking examples include the Global Internet Forum to Counter Terrorism (GIFCT) (Twitter, 2017[21]), created by Facebook, Microsoft, Twitter and YouTube in 2017, and the EU Code of Conduct on countering illegal hate speech online (European Commission, 2016[22]).
In a context where conventional regulation and enforcement frameworks struggle to tackle the challenges raised in data-driven markets, some argue that self-regulation (and spontaneous regulation) by digital platforms holds the potential to create new governance schemes, which would entail important implications for the existing paradigms framing the regulatory functions of governments (Cantero Gamito, 2017[23]). These self-regulation properties could indeed give rise to a new, decentralised form of regulation in which platforms would take part in (or compete with) the rule-making activities of governments (including in its enforcement dimension, as online platforms might also be able to offer dispute resolution mechanisms). While this prospect of private legal ordering and enforcement offers interesting avenues to address some of the challenges raised by data-driven markets, recent academic research highlights that further work is needed to examine the strengths and weaknesses of these emerging trends vis-à-vis traditional regulatory approaches.
Informal co-ordination mechanisms such as reputation and trust could also exert a strong decentralised discipline in data-driven markets. Quality compliance can, to some extent at least, be fostered through users’ ratings or peer-review systems. While this remains a relatively unexplored terrain, such incentives can certainly be helpful to deal with information asymmetries in data-driven markets, as a complement to traditional regulation.
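As a concrete sketch of how such a reputation mechanism can work, the snippet below computes a “Bayesian average” rating, a common heuristic that shrinks an item's mean rating towards a global prior when it has few reviews, limiting manipulation by a handful of ratings. The prior weight and the example ratings are illustrative assumptions, not a description of any particular platform's system.

```python
# Toy reputation mechanism: Bayesian-average rating (illustrative values).

GLOBAL_MEAN = 3.5   # prior belief about average quality
PRIOR_WEIGHT = 10   # number of "virtual" reviews the prior is worth


def bayesian_average(ratings) -> float:
    """Blend observed ratings with the prior; a few reviews move the score little."""
    n = len(ratings)
    return (PRIOR_WEIGHT * GLOBAL_MEAN + sum(ratings)) / (PRIOR_WEIGHT + n)


if __name__ == "__main__":
    new_seller = [5.0, 5.0]           # two perfect reviews only
    established_seller = [4.6] * 200  # long, consistent track record
    print(f"new seller score: {bayesian_average(new_seller):.2f}")
    print(f"established seller score: {bayesian_average(established_seller):.2f}")
```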
Adapting the regulatory framework
Guidelines and policy recommendations developed by economic regulators
As underlined in (OECD, 2020[25]), economic regulators are at the “forefront of interaction with consumers, business, and government” and, as illustrated by the initiatives below, they can play an essential role in helping governments understand the regulatory challenges at stake and target the appropriate regulatory response.
In 2020, three Italian regulators (the Telecommunications Authority, the Competition Authority and the Data Protection Authority) jointly published guidance for the legislator on big data regulation and platforms (AGCOM, AGCM and Garante, 2019[26]). The report aimed to exploit synergies among the three Authorities and identify the most suitable tools for future enforcement. The recommendations include the following:
Implement an appropriate legal framework that addresses the issue of effective and transparent use of personal data in relation to both individuals and society as a whole;
Promote a single and transparent policy on data protection;
Strengthen international co-operation for the governance of Big Data;
Reduce information asymmetries between digital corporations/platforms and their users (consumers and firms);
Identify the nature and ownership of the data prior to processing;
Promote online pluralism through new tools, transparency of content and user awareness of information provided on online platforms;
Reform merger control regulation so as to strengthen the effectiveness of the authorities’ intervention;
Facilitate data portability and data mobility between platforms through the adoption of open and interoperable standards;
Strengthen investigative powers of the AGCM and AGCOM and increase the maximum financial penalties for the violation of consumer protection laws.
In the same vein, the Australian Competition and Consumer Commission (ACCC) published the final report of its Digital Platforms Inquiry in 2019 (Australian Competition and Consumer Commission, 2019[27]). The report looks at the impact of digital platforms on consumers, on businesses using platforms to advertise to and reach customers, and on news media businesses that use platforms to disseminate their content. A number of recommendations have been put forward, including:
Changes to merger law to incorporate, in particular, the likelihood that the acquisition would result in the removal from the market of a potential competitor;
Proactive investigation, monitoring and enforcement of issues in markets in which digital platforms operate;
Process to implement harmonised media regulatory framework;
Designated digital platforms to provide the Australian Communications and Media Authority (ACMA) with codes of conduct governing their relationships with media businesses;
Digital Platforms Code to counter disinformation;
Strengthen protections in the Privacy Act (e.g. update personal information definition, strengthen consent requirements and pro-consumers defaults, enable the erasure of personal information, higher penalties for breach of the Privacy Act);
Broader reform of Australian privacy law.
European Commission’s legislative initiatives: Digital Services Act (DSA) and Digital Markets Act (DMA)
In December 2020, the European Commission proposed two legislative initiatives to reform the rules governing digital services in the European Union: the Digital Services Act (European Commission, 2021[28]) and the Digital Markets Act (European Commission, 2021[28]).
The two proposals have different goals: as per the DMA's impact assessment, the Digital Markets Act (DMA) addresses risks to contestability and fairness in digital markets where “gatekeeper platforms” are present. The proposal builds on an acknowledgement of sorts that pure antitrust-based approaches have reached their limits (both proposals encompass ex ante requirements, as opposed to traditional ex post interventions).
The Digital Services Act (DSA), in turn, "addresses risks derived from the fact that very large platforms have become de facto public spaces, playing a systemic role for millions of citizens and businesses, creating a need for more accountability for the content which these providers distribute on their platforms".
Digital Services Act (DSA)
The general objective of the DSA is to ensure the proper functioning of the single market, especially the provision of cross-border online intermediary services. This translates into a set of specific objectives:
Maintaining a safe online environment;
Improving conditions for innovative cross-border digital services;
Empowering users and protecting their fundamental rights online;
Establishing an effective supervision of digital services and co-operation between authorities.
The proposal targets illegal content, services or goods, and comes as a complement to the European Democracy Action Plan, which sets out measures to counter disinformation in particular. The Commission's proposal puts forward an asymmetric approach whereby very large online platforms (reaching more than 10% of the European Union's population, i.e. 45 million users) will be subject to more stringent requirements. Self-regulation (e.g. codes of conduct) would also be part of the policy mix. Crucially, the proposal also seeks to strengthen oversight and enforcement. It includes provisions to create a board of national Digital Services Coordinators (independent regulatory authorities in each member state that will need to co-ordinate amongst themselves). In addition, the Commission would have supervisory and sanctioning powers (with fines amounting to up to 6% of annual turnover).
It also encompasses measures to increase transparency (e.g. on algorithms used for targeted advertising), put more information in the hands of the public and provide researchers with access to platform data. Transparency obligations in both the DSA and DMA proposals are expected to contribute to better enforcement of obligations under the General Data Protection Regulation.
Digital Markets Act (DMA)
The general objective of the DMA is to ensure a competitive Single Market for digital services. The proposed regulatory approach is expected to “increase the contestability of digital markets”, “help businesses overcome the barriers stemming from market failures or from gatekeepers’ unfair business practices” and “foster the emergence of alternative platforms” (consumer surplus gains have been estimated at EUR 13 billion per year; in the long run, reducing fragmentation in the internal market is also expected to enhance growth potential).
Some of the key features are the following:
The scope of application is restricted to “major providers of the core platform services most prone to unfair practices, such as search engines, social networks or online intermediation services” that are considered “gatekeepers” either on the basis of quantitative thresholds or market investigation;
In addition to banning a series of "unfair" practices (e.g. users would no longer be prevented from un-installing pre-installed software or apps), “gatekeepers” would be obliged to comply with provisions that would shift market power from platforms towards their business users. For example, business users would be entitled to usable, portable copies of their data in real time and to access to the data generated by them and their users (including inferred data);
“Gatekeepers” would also need to ensure the interoperability of the software of third parties with their own services;
Sanctions for non-compliance are foreseen and would include fines of up to 10% of the worldwide turnover of “gatekeepers”, as well as the potential break-up of certain businesses in case of recurrent infringement (a stylised sketch of the quantitative thresholds and fine cap follows this list);
Market investigations by the EC are also foreseen with a view to ensuring that rules remain fit for purpose and “keep up with the fast pace of digital markets”.
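A stylised sketch of how such a quantitative designation test and fine cap operate is given below. The threshold values and the example platform are placeholders chosen for illustration; the actual criteria are those set out in the proposal and its subsequent amendments.

```python
# Stylised "gatekeeper" designation test and fine cap (placeholder thresholds).
from dataclasses import dataclass

EU_TURNOVER_THRESHOLD_EUR = 6.5e9    # placeholder, illustrative
MONTHLY_END_USERS_THRESHOLD = 45e6   # placeholder, illustrative
FINE_CAP_SHARE = 0.10                # up to 10% of worldwide turnover


@dataclass
class Platform:
    name: str
    eu_turnover_eur: float
    monthly_end_users: float
    worldwide_turnover_eur: float


def presumed_gatekeeper(platform: Platform) -> bool:
    """True if the platform meets the illustrative quantitative thresholds."""
    return (platform.eu_turnover_eur >= EU_TURNOVER_THRESHOLD_EUR
            and platform.monthly_end_users >= MONTHLY_END_USERS_THRESHOLD)


def maximum_fine(platform: Platform) -> float:
    """Upper bound of a fine for non-compliance, as a share of worldwide turnover."""
    return FINE_CAP_SHARE * platform.worldwide_turnover_eur


if __name__ == "__main__":
    example = Platform("ExamplePlatform", 8.0e9, 120e6, 30.0e9)
    print("presumed gatekeeper:", presumed_gatekeeper(example))
    print(f"maximum fine: EUR {maximum_fine(example):,.0f}")
```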
Merger of regulators in France
In order to deal with the cross-cutting challenges raised by digitalisation in the audiovisual landscape, the French government passed a new bill in 2019 implementing a substantial regulatory reform. A key measure is the merger of HADOPI (the authority for the dissemination of works and the protection of rights on the internet) and the CSA (the media regulator) to form a single regulatory body in charge of audiovisual and digital communications (ARCOM). The objective is to improve the regulatory capacity to handle all communication issues raised by the rapidly changing digital environment, including copyright protection.
The role of traditional regulatory policy tools
Regulatory impact assessment
An interesting illustration of the use of ex ante impact assessment in digital markets comes from the European Commission, through its legislative initiatives to revise the regulatory framework for digital markets.
The impact assessment for the Digital Services Act uses the evaluation of the 2000 e-Commerce Directive as a starting point. This evaluation concluded that the Directive's core principles remain valid, but that “some of its specific rules require an update in light of the specific challenges emerging around online intermediaries and online platforms in particular”. The impact assessment points out three key problems:
Citizens are exposed to increasing risks and harms online, especially from very large online platforms;
Online platforms' supervision is not well co-ordinated: “the limited administrative co-operation framework set by the e-Commerce Directive for addressing cross-border issues is underspecified and inconsistently used by Member States”;
"Member States have started regulating digital services at national level leading to new barriers in the internal market. This leads to a competitive advantage for the established very large platforms and digital services."
It concludes that EU-level regulatory action is needed to “reduce legal fragmentation and compliance costs, enhance legal certainty, ensure equal protection for citizens and a level playing field for businesses, strengthen the integrity of the single market, and enable effective supervision across borders”.
The impact assessment developed for the DMA considers the proposal to be “coherent with and complementary to the proposal for the update of the e-Commerce Directive under the DSA. The DSA is, in this context, a horizontal initiative focusing on liability of online intermediaries for third party content, safety of users online, etc., with risk-proportionate obligations. The DMA, in turn, focuses on economic imbalances, unfair business practices by gatekeepers and their negative consequences, such as weakened contestability of platform markets”. The impact assessment also notes that, “to the extent that the DSA contemplates an asymmetric approach which may impose stronger due diligence obligations on very large platforms, consistency will be ensured in defining the relevant criteria, while taking into account the different objectives of the initiatives”. The DMA also builds on the 2019 Platform to Business Regulation.
The impact assessment points out three key concerns due to the emergence of “gatekeeper platforms”:
Weak contestability of and competition in platform markets (entrenched dominant position of gatekeeper platforms, which control access to digital markets/ecosystems);
Unfair business practices vis-à-vis business users;
Fragmented regulation and oversight of market players operating in these markets (as a result of the emergence of regulatory initiatives at national level), which “puts at risk the scaling-up of start-ups and smaller businesses and their ability to compete in digital markets”.
Moreover, it concludes that the market failures underlying these problems, chiefly barriers to entry and the high dependence of platform business users, will not self-correct. This situation may lead to higher prices and lower quality, and risks undermining innovation.
National regulatory co-operation
Given their cross-jurisdictional nature, regulating data-driven markets calls for increased dialogue and coherence amongst government bodies to tackle fragmentation. This may require specific institutional responses, such as the creation of One Stop Shops for Business in Denmark or the Centre for Data Ethics and Innovation in the United Kingdom. The Centre for Data Ethics and Innovation is an independent advisory body whose mission is to build on the wealth of expertise and evidence across the UK to analyse the risks and opportunities posed by data-driven markets and provide guidance to the government. Its key objectives are the following:
Analysing risks and opportunities and anticipating gaps in governance and regulation that could impede the ethical and innovative deployment of data and AI;
Agreeing and articulating best practices, codes of conduct and standards that can guide ethical and innovative uses of AI;
Advising governments on the specific policy or regulatory actions required to address or prevent barriers to innovative and ethical uses of data.
The Centre is a core component of the Digital Charter (Department for Digital, Culture, Media and Sport, 2019[29]), the government's rolling programme of work to agree norms and rules in the face of data-driven markets.
In 2019, the French telecoms (ARCEP) and media (CSA) regulators signed an agreement establishing a joint division between the two institutions. The aim is to leverage the two authorities’ complementary expertise to sharpen their shared technical and economic analysis of digital technology markets: content distribution methods and quality, consumer habits, and vertical and horizontal relationships between digital tech value chain players, including over-the-top companies and digital platforms. The joint division will also focus on a number of topics, including supervisory methodologies, rules and benchmarks, data-driven regulation tools for digital platforms, data collection, utilisation and retrieval, and the analysis of platforms’ algorithms. This co-operation aims to delineate new regulatory tools to deal with the challenges raised by data-driven markets.
In 2019, seven French regulatory bodies co-operated to define a common approach to data-driven regulation (Autorité de la concurrence, AMF, Arafer, Arcep, CNIL, CRE and CSA, 2019[30]). The report highlights, in particular, that data-driven regulation might be a powerful tool to reduce information asymmetries and improve transparency for consumers in data-driven markets. In practice, this would require not only collecting detailed information from regulated players, but also expanding the scope of the data collected (through crowdsourcing tools for example), developing simulation-based approaches, and building comparison engines. The report states that the development of data-driven regulation raises the need for regulators to increase their capacities and extend their traditional regulatory tools.
Promoting good practices, sharing expertise and developing joined-up approaches through international co-operation
Data-driven markets are a key area of focus for the Agile Nations, a network created in 2020 to promote global cooperation on rulemaking in response to innovation. Co-operation activities include, in particular, sharing foresight and evidence on the opportunities and risks raised by innovation in these markets, exploring opportunities to jointly test approaches to rulemaking, supporting innovative firms to navigate participating governments’ rules and co-ordinating enforcement activities as necessary to manage cross-border risks.
Beyond the Agile Nations, an interesting illustration of the opportunities provided by international co‑operation to share expertise is the joint project launched by the French and the German competition authorities on the potential competitive risks associated with algorithms in data-driven markets (Autorité de la concurrence and Bundeskartellamt, 2019[31]). The report examines three practical scenarios in which algorithms may enhance collusion: explicit direct collusion, algorithm-driven collusion involving a third party and collusion induced by the parallel use of individual algorithms. The report concludes that both authorities should continue to share their expertise on the topic and to engage more broadly with businesses, academics and other regulatory bodies.
Conclusion
The rapid development of data-driven markets has far-reaching socioeconomic impacts, notably on market structures and firms’ strategies. In addition, it entails a number of regulatory challenges that call for governments and regulators to ratchet up efforts to ensure the quality of their rule-making activities. There is a clear need to rethink traditional antitrust tools with a view to addressing risks of algorithmic collusion as well as the anticompetitive use of data by dominant platforms. In addition, the rationale for regulatory intervention will in many cases have to be revisited in view of the economic properties of digital platforms, such as their capacity to scale without mass and the presence of network externalities as well as economies of scope.
Regulatory action will need to rely on a thorough understanding of market dynamics (as opposed to defaulting to static or short-run market analysis) and make use of the full range of regulatory tools at governments’ disposal, including experimental approaches, self-regulation and co-regulation. In the same vein, regulating digital platforms will require new institutional solutions to strengthen co-operation across government agencies, including across borders, in order to tackle the transversal challenges of data-driven markets. While issues pertaining to data privacy and security are not new, the associated regulatory challenges have acquired a completely new dimension due to the high number of platform users and the unprecedented amount of data collected by online platforms. These challenges, together with growing concern about the ethical and social issues brought by data-driven markets (e.g. regarding transparency and equity), warrant adapting regulatory frameworks and enforcement approaches accordingly.
References
[26] AGCOM, AGCM and Garante (2019), “Guidelines and Policy Recommendations on Big Data”.
[27] Australian Competition and Consumer Commission (2019), “Digital Platforms Inquiry”.
[31] Autorité de la concurrence and Bundeskartellamt (2019), “Algorithms and Competition”.
[30] Autorité de la concurrence, AMF, Arafer, Arcep, CNIL, CRE and CSA (2019), “Memorandum on data-driven regulation”.
[19] Cameron, F. et al. (2021), Strengthening international cooperation on AI.
[23] Cantero Gamito, M. (2017), “Regulation.com : self-regulation and contract governance in the platform economy : a research agenda”, European journal of legal studies, Vol. 9/2, pp. 53-68.
[20] Centre for International Governance Innovation; Chatham House (2016), Global Commission on Internet Governance: One Internet, https://www.cigionline.org/sites/default/files/gcig_final_report_-_with_cover.pdf (accessed on 8 November 2020).
[24] Cusumano, M., A. Gawer and D. Yoffie (2021), “Can self-regulation save digital platforms?”, Industrial and Corporate Change, http://dx.doi.org/10.1093/icc/dtab052.
[29] Department for Digital, Culture, Media and Sport (2019), “Digital Charter”.
[28] European Commission (2021), “Proposal for a digital services act”.
[18] European Commission (2021), “Proposal for a Regulation laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain union legislative acts”.
[22] European Commission (2016), “Code of Conduct on Countering Illegal Hate Speech Online”.
[10] OECD (2020), “Going Digital integrated policy framework”, OECD Digital Economy Papers, No. 292, OECD Publishing, Paris, https://dx.doi.org/10.1787/dc930adc-en.
[25] OECD (2020), Shaping the Future of Regulators: The Impact of Emerging Technologies on Economic Regulators, The Governance of Regulators, OECD Publishing, Paris, https://dx.doi.org/10.1787/db481aa3-en.
[2] OECD (2019), An Introduction to Online Platforms and Their Role in the Digital Transformation.
[7] OECD (2019), An Introduction to Online Platforms and Their Role in the Digital Transformation, OECD Publishing, Paris, https://dx.doi.org/10.1787/53e5f593-en.
[13] OECD (2019), Going Digital: Shaping Policies, Improving Lives, https://www.oecd-ilibrary.org/docserver/9789264312012-en.pdf?expires=1603724789&id=id&accname=ocid84004878&checksum=3011deb039edc657b5ac662f578f3ddf (accessed on 26 October 2020).
[8] OECD (2019), Going Digital: Shaping Policies, Improving Lives, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264312012-en.
[17] OECD (2019), “Recommendation of the Council on Artificial Intelligence”, OECD/LEGAL/0449.
[4] OECD (2019), Regulation and IRC: challenges posed by the digital transformation.
[5] OECD (2019), Vectors of digital transformation, https://dx.doi.org/10.1787/5ade2bba-en.
[9] OECD (2019), “Vectors of digital transformation”, OECD Digital Economy Papers, No. 273, OECD Publishing, Paris, https://dx.doi.org/10.1787/5ade2bba-en.
[12] OECD (2018), “Rethinking Antitrust Tools for Multi-Sided Platforms”, http://ww.oecd.org/competition/rethinking-antitrust-tools-for-multi-sided-platforms.htm.
[3] OECD (2018), Tax Challenges Arising from Digitalisation – Interim Report 2018: Inclusive Framework on BEPS, https://dx.doi.org/10.1787/9789264293083-en.
[6] OECD (2018), Tax Challenges Arising from Digitalisation – Interim Report 2018: Inclusive Framework on BEPS, OECD/G20 Base Erosion and Profit Shifting Project, OECD Publishing, Paris, https://dx.doi.org/10.1787/9789264293083-en.
[11] OECD (2017), “Algorithms and Collusion: Competition Policy in the Digital Age”, http://www.oecd.org/competition/algorithms-collusion-competition-policy-in-the-digital-age.htm.
[15] OECD (2015), Recommendation on Digital Security Risk Management for Economic and Social Prosperity, OECD, http://dx.doi.org/10.1787/9789264245471-en.
[14] OECD (2013), “Guidelines Governing the Protection of Privacy and Transborder Data Flows”.
[16] OECD.AI (2021), Database of national AI policies, https://oecd.ai/ (accessed on 17 September 2021).
[1] Tirole, J. (2019), “Regulating the disrupters”, Project Syndicate, https://www.project-syndicate.org/onpoint/regulating-the-disrupters-by-jean-tirole-2019-01.
[21] Twitter (2017), “Global Internet Forum to Counter Terrorism”, https://blog.twitter.com/official/en_us/topics/company/2017/Global-Internet-Forum-to-Counter-Terrorism.html.