Reconciling the conflicting goals of privacy protection and competition policy through making platform use data saleable – an institutional perspective on data markets

Kai Reimers (School of Business and Economics, RWTH Aachen University, Aachen, Germany)
Xunhua Guo (School of Economics and Management, Tsinghua University, Beijing, China)

Journal of Electronic Business & Digital Economics

ISSN: 2754-4214

Article publication date: 27 May 2024


Abstract

Purpose

It has become increasingly clear that the objectives of privacy and competition policy are in conflict with one another with regard to platform data. While privacy policies aim at limiting the use of platform data for purposes other than those for which the data were collected in order to protect the privacy of platform users, competition policy aims at making such data widely available in order to curb the power of platforms.

Design/methodology/approach

We draw on Commons' Institutional Economics to contrast the current control-based approaches to ensuring the protection as well as the sharing of platform data with an ownership approach. We also propose the novel category of platform use data and contrast this with the dichotomy of personal/non-personal data which underlies current regulatory initiatives.

Findings

We find that current control- and ownership-based approaches are ineffective in balancing these conflicting objectives and propose an alternative approach that makes platform data saleable. We discuss this approach in view of its capacity to balance the conflicting objectives of privacy and competition policy and its effectiveness in supporting each separately.

Originality/value

Our approach clarifies the fundamental difference between data markets and other concepts such as data exchanges.

Citation

Reimers, K. and Guo, X. (2024), "Reconciling the conflicting goals of privacy protection and competition policy through making platform use data saleable – an institutional perspective on data markets", Journal of Electronic Business & Digital Economics, Vol. ahead-of-print No. ahead-of-print. https://doi.org/10.1108/JEBDE-12-2023-0032

Publisher

Emerald Publishing Limited

Copyright © 2024, Kai Reimers and Xunhua Guo

License

Published in Journal of Electronic Business & Digital Economics. Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial and non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


Motto:

If it were asked–What discovery has most deeply affected the fortunes of the human race? We think after full consideration it might probably be said–The discovery that a Debt is a Saleable Commodity. (Macleod, 1875, p. 172)

Introduction

How can the emergence of data ecosystems be enabled instead of continuing to foster the rise of digital superstars? This question lies at the heart of the challenges that arise with the expected digital transformation of almost all sectors of the economy. This transformation is characterized by the emergence of platforms that aim to dominate whole industries, often created by outsiders with an intention to “disrupt” extant forms of organizing economic activities (Chen, 2021). While, from a policy perspective, such disruptions are generally welcomed since they are seen as instances of radical innovation, the accompanying rise to dominance of platforms is increasingly a cause for concern (Denemark, 2022). A monopoly position would be seen as acceptable as long as it can be challenged by newcomers, i.e. as long as it is contestable (European Commission, 2020a). However, the intense competitive manoeuvres of would-be platform players in the quest for dominance are presumably undertaken in view of the prospect of erecting high entry barriers, which would effectively shield the winners of this race from future competition and which, at the same time, are the cause of increasing anti-trust worries and regulatory action.

Network effects, the increase in value each user derives from using a service or product as the number of users increases (Katz & Shapiro, 1985; Farrell & Saloner, 1985), are often seen as constituting such entry barriers. However, empirical evidence points to the possibility of effective competition under conditions of network effects, where promoters of new technologies can successfully oust presently dominant platform players (Cennamo, 2018). In fact, such a possibility is predicted when both extant and expected network effects are considered since, under suitable conditions, expected network effects can create a self-fulfilling prophecy, described as excess momentum in contrast to the notion of excess inertia that is traditionally seen to underlie the entrenchment of monopolistic positions based on network effects (Krugman, 1991). Hence, extant network effects may not constitute entry barriers as effective as is often assumed. Gregory, Henfridsson, Kaganer, and Kyriakou (2021, 2022) have identified a “data network effect”, which points to the potentially crucial role of data collected on platforms for erecting entry barriers. A data network effect obtains when the benefits of using a platform increase as the platform operator learns from the data that it collects from the users to improve its service in real time (Gregory et al., 2021, 2022). Indeed, recent efforts by the European Commission to curb the dominance of platforms focus on data, not on network effects. The main objective is to prevent platform operators from unfairly benefiting from the data collected on their platforms by limiting what they can do with such data and by enforcing access to such data by other companies through the so-called Digital Markets Act (DMA, European Commission, 2020a). Similarly, in its recently proposed so-called Data Act (DA), the EU envisages rendering markets for devices that collect data about users (such data are called IoT, Internet of Things, data in this proposal) more competitive, specifically with regard to the repair and maintenance of such devices (so-called aftermarkets), by enabling users to share such data with third parties (European Commission, 2022).

However, one recent assessment of the DMA, while praising the access to platform data that it enforces for other companies, has raised concerns that such regulation might dampen the incentives of platform operators to innovate by collecting and analyzing such data (Expertenkommission Forschung und Innovation, 2023). Similar concerns have been formulated with regard to the DA (Drexl et al., 2022; Leistner & Antoine, 2022). But how else could these regulatory objectives be reached without diluting the incentives of platforms for data-driven innovation? It seems to us that a promising candidate for such an alternative approach would be to allow platform operators to sell their data to other companies. While the creation of data markets has been recommended for the purpose of fostering innovative use of such data, especially with regard to the application of machine learning techniques (Guse, Thiebes, Hennel, Rosenkranz, & Sunyaev, 2022; European Commission, 2022), such efforts have not yet been linked to the regulatory objective of curbing the power of platforms (Abbas, Agahari, van de Ven, Zuiderwijk, & Reuver, 2021). Even the Data Governance Act (DGA), another regulatory proposal by the European Commission that was published in the same year as the DMA and that is often portrayed as providing the legal basis for the creation of data markets (e.g. Expertenkommission Forschung und Innovation, 2023), is not linked to the threat of the monopolization of industries through platforms but to the aim of fostering the development of machine learning technologies (European Commission, 2020b). While the Data Act aims to “unlock” the aftermarket for IoT devices by making IoT data available to competitors of the device manufacturers, its reach is explicitly limited to such devices and does not extend to platforms or the data collected by them (European Commission, 2022).

One likely reason for the absence of such a discussion is that the notion of selling platform data seems to be in strong conflict with accepted principles of privacy and data protection. However, our point of departure in this paper is that extant privacy principles and practices – not network effects – hinder the emergence of data ecosystems and lay the foundation for the power of platforms because they severely restrict the sharing of platform data in such a way that only the platforms themselves can benefit from their full potential (Höppner & Westerhoff, 2021; Jurcys et al., 2021; Huq, 2021). The question then arises of how the conflicting objectives of competition and privacy policy can be balanced if allowing platforms to sell the data they collect is seen as an effective means of sharing such data without diluting the incentives to collect them. Currently, both data markets and privacy rules are framed in terms of physical control of platform use data: a data market is understood as a place where data are physically exchanged (Koutroumpis, Leiponen, & Thomas, 2020; European Commission, 2022) and privacy rules are based on regulating and limiting physical access to data. Hence, balancing the requirements of competition law and privacy law entails the negotiation and implementation of prohibitively complex configurations of access rights and data partitioning (Lubyová, 2022; Geiregat, 2022; Jurcys et al., 2021). In this paper, we therefore ask: How can data markets be conceptualized so as to simultaneously consider the conflicting demands of competition and privacy policy?

We draw on Commons’ (1990) Institutional Economics to argue that answering our research question requires a conceptual move from understanding data markets as the site of the physical exchange of data towards an institutional understanding of such markets where ownership of the data, instead of the data themselves, is exchanged. Moreover, we follow Commons’ approach of defining ownership in terms of liberty (of the owner) and exposure (of all others), allowing us to encapsulate the trade-off between the conflicting objectives of sharing and protecting data in a single relation: ownership. We do not address the question of how markets for data should be designed in order to become commercially successful (on this question, cf. the systematic literature review by Abbas et al., 2021) or the question of how property rights to data can be monitored and enforced (on this question, cf. the ideas of Koutroumpis et al., 2020 regarding the use of technologies such as distributed ledgers to improve the provenance of data).

We begin by defining our notion of platform use data and distinguishing it from the notion of personal data as defined in European law. In the main section, we first show that present concepts of data markets and platform regulation are indeed based on the notion of physical control over such data. Second, we show how the move from data holdership to data ownership can be made based on Commons’ Institutional Economics. Third, we address the question of who should be assigned ownership rights. In the discussion, we then show how our approach can be used to balance the conflicting requirements of data sharing and data protection and why platforms might actually be willing to sell their data.

What are platform use data?

Discussions about the creation of data markets or data infrastructures often proceed as if it did not matter what kinds of data are to be traded or exchanged (Abbas et al., 2021). But, with equal justification, one might talk about creating “goods markets” or “services markets”. While all these categories – data, goods, services – are distinct from one another and thus have a specific meaning, the physical and institutional characteristics of different kinds of goods and services are so variable that finer categories are needed in almost all cases to specify the institutional characteristics of markets for goods and services. In particular, while it is certainly true that data differ from goods in that the costs of copying are negligible for data but not for goods, this characterization is not sufficient for deriving the institutional requirements of data markets. For example, markets for consumer profiles certainly need to meet very different institutional requirements than markets for stock prices. While markets for consumer profiles must, above all, be concerned with privacy rules, the most important concern for markets trading stock prices is the timeliness of those prices (Koutroumpis et al., 2020; Mulherin, Netter, & Overdahl, 1991). Thus, a more nuanced approach to defining platform use data is called for, one which balances the objective of enabling a free flow of data with that of privacy protection (Leistner & Antoine, 2022).

In this paper, we are concerned with the kind of data that need to be shared in order to curb the power of platforms. What kind of data are these? We submit that these are the data that are generated on the platforms themselves, since such data cannot be obtained other than from the platform operators. For example, connections between data objects are often created on a platform, such as when a user “puts” a “product” into a “shopping basket”. Here, data representing the user, such as an IP address or other identifying data, are connected with data representing a product. By contrast, many kinds of data collected by platforms can also be obtained from other sources. For example, data that connect a person with a bank account are typically not created on a platform but may have to be provided by a user in order to complete a transaction (we refer to such connections as “pre-existing”). Data generated on a platform therefore do not comprise pre-existing connections that may have to be provided to the platform; we refer to data generated on a platform as “platform use data” and define them as records of connections between data objects when such connections are created through actions of users on a platform. Such records can be created, among other ways, through clicking on elements of a website/app; moving the cursor across a website/app; opening a website/app; entering search terms/questions and queries. This definition excludes, among others, the following kinds of data: address data, financial data, health data, and data secretly recorded by the device about the user [1].
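To make this definition more concrete, the following minimal sketch shows one way such records of user-created connections could be represented. It is only an illustration under our own assumptions; the names (DataObject, ConnectionRecord, created_on_platform) are hypothetical and not drawn from any regulation, standard or existing system.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Literal

# Illustrative sketch only; all names and fields are assumptions made for this example.

@dataclass
class DataObject:
    """Anything referred to on a platform: a user identifier, an IP address, a product, a search term."""
    kind: Literal["user", "ip_address", "product", "search_term", "bank_account"]
    value: str

@dataclass
class ConnectionRecord:
    """A record of a connection between two data objects."""
    source: DataObject
    target: DataObject
    action: str                # e.g. "click", "cursor_move", "open_page", "search_query"
    timestamp: datetime
    created_on_platform: bool  # True if the connection arose from a user action on the platform

def is_platform_use_data(record: ConnectionRecord) -> bool:
    """Platform use data: records of connections created through user actions on the platform itself.
    Pre-existing connections merely provided to the platform (e.g. user and bank account) are excluded."""
    return record.created_on_platform

# A user "puts" a product into a "shopping basket": a connection created on the platform.
basket_event = ConnectionRecord(
    source=DataObject("ip_address", "203.0.113.7"),
    target=DataObject("product", "sku-4711"),
    action="click",
    timestamp=datetime.now(),
    created_on_platform=True,
)

# Bank details supplied by the user to complete a transaction: a pre-existing connection.
payment_detail = ConnectionRecord(
    source=DataObject("user", "user-123"),
    target=DataObject("bank_account", "DE00 0000 0000 0000 00"),
    action="checkout_form",
    timestamp=datetime.now(),
    created_on_platform=False,
)

assert is_platform_use_data(basket_event)
assert not is_platform_use_data(payment_detail)
```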

Our definition of platform use data thus excludes data that can be used for purposes considered to be particularly problematic in view of privacy protection, such as user profiles which can be used to exclude certain users or user groups from a company’s product or service offerings. On the other hand, data about patterns of use of a website, such as temporal patterns of interest in certain products or services or interest in certain combinations of products, can be made available, through trade, to companies other than the platform operators who originally collected them. Since such data are only available from the platform operators, factual or legal barriers to sharing them will only increase the market power of the platforms, since the platforms can still use such data for the purpose of training their algorithms.

Our definition intersects with the notion of “personal data” as defined by European law in the General Data Protection Regulation (GDPR), which defines personal data as

… information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person … (GDPR, Article 4, Point 1)

The sole criterion in this definition is whether data can be related to a natural person. Our definition of platform use data includes personal data when one of the data objects connected through such a record refers to an identifiable natural person. It excludes personal data if connections between data objects contained in such records have been established elsewhere (pre-existing connections). And lastly, records may be created about connections between data objects where all links to personal data have been removed, e.g. when a connection has been established between two products by having both “placed” in a “shopping basket” without also recording the link to the user. The relationship between our definition of platform use data and that of personal data as given by the GDPR is depicted in Figure 1.
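As a rough illustration of the relationship depicted in Figure 1, the following self-contained sketch classifies records into the three subsets described above. The subset labels follow our reading of Figure 1 as described in the text; the attribute and function names are illustrative assumptions only.

```python
from typing import NamedTuple

# Illustrative sketch; the subset labels (a), (b), (c) follow Figure 1 as described in the text.

class Record(NamedTuple):
    created_on_platform: bool  # was the connection created through a user action on the platform?
    relates_to_person: bool    # can the record be related to an identifiable natural person (GDPR)?

def figure1_subset(r: Record) -> str:
    if r.relates_to_person and not r.created_on_platform:
        return "(a) personal data that are not platform use data (pre-existing connection)"
    if r.relates_to_person and r.created_on_platform:
        return "(b) platform use data that are also personal data"
    if r.created_on_platform:
        return "(c) platform use data with all links to persons removed"
    return "outside both definitions"

# Bank details provided at checkout: personal, but the connection pre-exists -> subset (a).
print(figure1_subset(Record(created_on_platform=False, relates_to_person=True)))
# A user clicking on a product, recorded with a link to the user -> subset (b).
print(figure1_subset(Record(created_on_platform=True, relates_to_person=True)))
# Two products co-occurring in a basket, recorded without the link to the user -> subset (c).
print(figure1_subset(Record(created_on_platform=True, relates_to_person=False)))
```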

From control to ownership

Current efforts to curb the power of platforms, as they have become manifest in the Digital Markets Act (DMA) and the Data Act (DA), focus on regulating physical control of data by limiting what platform operators can do with the data they have collected and, conversely, by enforcing access to subsets of such data by other firms, including competitors. Likewise, regulating physical control of and access to data is also the basis of efforts to protect consumers from harmful effects of such data, manifest in the General Data Protection Regulation (GDPR), and even of regulatory initiatives to create data exchanges, manifest in the Data Governance Act (DGA) and also the Data Act. We begin this section by briefly illustrating this claim – that all four regulatory initiatives are based on a preference for physically controlling what can be done with data – and then show how the notion of physical control differs from the institutional notion of ownership. We will also discuss the question of who should be assigned ownership of platform use data.

The control approach

The point of departure of the Digital Markets Act (DMA) is the idea that platforms act as gatekeepers between businesses and their customers. This is seen as a potential problem because platform operators may engage in the same commercial area as their business users, i.e. they may also act as their competitors. Since business users and their customers must, as part of doing business through the platform, provide data about themselves and their transactions, platform operators may use their privileged access to such data to compete with their business users, which is seen as unfair (European Commission, 2020a, p. 15). To prevent such behavior, gatekeepers must not use data “generated” by business users or their customers on the platform to offer services which compete with those of their business users if such data are not publicly available (Article 6a). Conversely, business users and their customers should be given effective and immediate (real-time) access to the data that they “generated” on the platform free of charge (Article 6h, i). Further versions of these general rules are given for more specific cases.

First, it is noteworthy that the DMA does not define the notion of user-generated data, even though the terms “data” (any digital representation of acts, facts, and information), “personal data” (by reference to the GDPR), and “non-personal data” (all other data) are defined (Article 2, Points 19–21). Hence, it is not clear to which subset(s), as defined above (see Figure 1), these rules refer. It is possible to argue that all three subsets are, in some way, generated by users. The rules furthermore comprise positive and negative elements, i.e. they prescribe certain actions and prohibit others. The prescriptions specify the modalities of physical access to data. The prohibitions cannot likewise deny physical access to data because platform operators, by definition, have access to them. They therefore prohibit use of the data under certain conditions, which amounts to denying access to these data because, for the platform operator, the effect is the same. The DMA thus regulates access to data by opening and closing access routes.

While the recently proposed Data Act (DA) does not concern platforms but devices that collect data about their users in the course of normal use (such data are referred to as “IoT data”), it is worth mentioning here since, in it, the “European strategy for data” finds full application. This strategy aims at creating “… a genuine single market for data …” (European Commission, 2022, p. 1). However, the European Commission has explicitly rejected the use of property rights for creating such a market in favor of a control approach (Geiregat, 2022). In essence, the proposal envisages that, by giving users control over such data, they will be able to share these data with third parties who could then, for example, offer maintenance and repair services for their devices in competition with the manufacturers themselves. However, it is also envisaged that such data sharing will stimulate innovation in areas unrelated to the device (European Commission, 2022, p. 13).

The General Data Protection Regulation (GDPR) makes control of personal data by users its cornerstone: “Natural persons should have control of their own personal data” (The European Parliament and Council of the EU, 2016, p. 2). To this effect, it focuses on regulating the modalities of “processing” personal data (Article 1, Point 1), where processing “… means any operation or set of operations which is performed on personal data or on sets of personal data …” (Article 4, Point 2). It is directed towards “controllers” defined as a “… natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data …” (Article 4, Point 7) and operates by restricting processing through the controller, where “… ‘restriction of processing’ means the marking of stored personal data with the aim of limiting their processing in the future …” (Article 4, Point 3). Hence, regulating physical access is, again, the dominant mode of operation. It is noteworthy that the GDPR informally alludes to ownership in the above cited phrase (“… their own personal data …”), which is seen to rest with the natural person about whom some data are.

The Data Governance Act (DGA) aims to make “… public sector data available for re-use, in situations where such data is [sic!] subject to rights of others” (European Commission, 2020b, p. 1). It does so through the creation of a “… harmonized framework for data exchanges” (European Commission, 2020b, p. 10). However, the term “exchange” is not understood as a market but as a “… governance framework for data access and use, in particular regarding the re-use of certain types of data held by the public sector …” (European Commission, 2020b, p. 11). The terms “data holder”, “data user”, “data sharing”, and “access [to data]” are all defined in terms of physical control over data (Article 2, Points 5–8). By contrast, the notion of ownership of data is not defined but is indirectly thematized in the sense that the regulation should not affect any extant intellectual property rights (European Commission, 2020b, p. 14), while the granting of exclusive use rights to data is prohibited (Article 4, Point 1), unless that would serve a “general interest” (Article 4, Points 2 and 5). Thus, the aims of the regulation are again meant to be achieved through controlling physical access, and institutional means, such as exclusive use rights, are expressly prohibited or limited.

The ownership approach

From an institutional perspective, for something to be tradable it needs to be owned, since selling and buying things means transferring ownership of something (Commons, 1990). Data can, in principle, be owned (Drexl et al., 2017; Hazel, 2020; Jurcys et al., 2021; Thouvenin & Tamò-Larrieux, 2021) [2], and property in price data was established more than a hundred years ago when another technological innovation, the telegraph, disrupted then-extant forms of doing business (Mulherin et al., 1991) [3]. Ownership of personal data by data subjects is also increasingly seen as a more effective means of control over personal data than oversight by public authorities in their stead (Huq, 2021; Jurcys et al., 2021), although this position has also been challenged (cf. Thouvenin & Tamò-Larrieux, 2021 for a summary of the earlier critique of property rights in data and Cofone (2021) and Drexl et al. (2022) for a recent restatement of that critique in the context of the public discussion of the European Commission’s DA proposal).

But what is ownership? Thouvenin and Tamò-Larrieux (2021) argue with regard to data ownership that there are two main legal understandings: ownership as private property and ownership as control. From an institutional perspective, however, ownership is private property (legal control), not physical control. In fact, the main argument by Commons is that ownership and (physical) control have not been sufficiently separated by the classical and neo-classical economists and that clearly separating these concepts is the hallmark of Institutional Economics. The most fundamental difference between these concepts is that physical control refers to the relationship between a person and nature, whereas ownership refers to the relationship between persons. Ownership is often characterized as the right to exclude others from using something (e.g. Jurcys et al., 2021 with regard to data ownership). But Commons argues that, since ownership characterizes a relationship between persons, it always implies two legal moments. The right to exclude others from using something corresponds to the no-right of all others (the whole world) to mobilize collective action – through a court decision – against the owner’s doing what he or she likes with what is owned (Commons, 1990, p. 82). This two-way relationship creates two economic states: a state of exposure of the whole world to the consequences of the actions of the owner, which they have to tolerate, and a state of liberty on the part of the owner to do what he or she likes with what is owned. These two states of exposure and liberty are the legal basis of any market since they create the possibility of selling the right in question. The institutional approach does not ignore the aspect of physical control – the relationship between a person and nature – but places it in the future: “Legal control is future physical control.” (Commons, 1990, p. 87). Ownership, and the transfer of ownership through selling it, therefore always look to the future. A thing is evaluated on the basis of an expectation of the beneficial consequences – income, profit, pleasure – it will yield in the future (Commons, 1990, p. 408).

The states of liberty and exposure created through ownership are not unlimited, and a large part of the institutional work underpinning the functioning of markets consists of defining these limits and continuously adjusting them to changing circumstances. For example, in Germany it has been decided that the liberty of the owner of a forest is limited by the rule that he or she must not prevent the public from entering it, because a forest also serves as a recreational area for city dwellers. Likewise, laws have been created at the state level in Germany that limit the liberty of owners of a house or apartment by the rule that they must not leave it unoccupied for a longer period of time, in order to avert speculation. In a similar manner, ownership of data can and should be limited, for example in order to protect data subjects from harmful effects of the use of such data. However, since the value of something owned consists of the expected benefit derived from it in the future, the tightening of these limits amounts to a taking of property in the sense that the value of what is owned will be diminished. A fundamental characteristic of the US-American institutional system therefore consists of the principle of due process: the definition of ownership must not be changed without due process of law (Commons, 1990, p. 63).

Who should own platform use data?

When ownership has been proposed in the context of personal data, authors typically recommend that ownership should be assigned to data subjects. Two recent papers offer alternative justifications for this proposal. Huq (2021) argues that data subjects, by owning data about them, can control such data even when the data are physically held by others elsewhere, e.g. by platform operators on their servers. By contrast, Jurcys et al. (2021) argue that even when data are physically controlled by data subjects, e.g. in a so-called “private data cloud”, assigning ownership to data subjects would strengthen their degree of control even further while also providing a more flexible approach to sharing such data with service providers who could “run” their services “on top” of personal data held in private data clouds (Jurcys et al., 2021).

However, these proposals do not aim to make personal data tradable. Rather, they aim at strengthening the control users have over such data, similar to the objective pursued by the European Commission with its proposal for the Data Act. In fact, the possibility that data subjects might sell their personal data is seen as a strong argument against assigning ownership rights to them because this would diminish the control rights that they currently have under modern data protection regimes (Cofone, 2021; Thouvenin & Tamò-Larrieux, 2021). By contrast, our main question is how platform use data can be made saleable. We therefore propose to assign ownership rights to platform operators, not data subjects, because otherwise the transaction costs for platform operators to acquire such data from data subjects would be prohibitive (Thouvenin & Tamò-Larrieux, 2021; Drexl et al., 2022; Kerber, 2023). We limit this proposal to platform use data, i.e. to subsets (b) and (c) as shown in Figure 1, which excludes a large part of personal data as defined by the GDPR (subset (a) in Figure 1). The rationale of our argument is that, in this way, platforms may have an effective incentive for sharing their data (below, we will discuss the strength of this incentive and also propose how the degree of exposure of data subjects created through such ownership could be limited). But apart from this functional legitimation, what moral basis could be cited in support of this argument?

So far, two moral justifications have been given to support ownership by data subjects (see Jurcys et al., 2021). On the one hand, it has been argued that such data are created by the labor of users, as manifest in the often-used term “user-generated data”, and therefore users have a right to the fruits of their work. This is a version of the labor-theory justification of private property originally proposed by John Locke (see Jurcys et al., 2021), which has been rejected on the grounds that user involvement in the production of such data is often negligible and does not amount to labor in the accepted sense of the word (Hylton, 2019). On the other hand, it has been suggested that personal data are a virtual extension of the body and personality of the data subject and therefore belong to them to the extent that one has a natural property right to one’s body and personality (Jurcys et al., 2021; Cofone, 2021).

We propose that a moral basis for our proposal to endow platform operators with property rights in platform use data could be provided if the issue is broadened to include the ownership of platforms. This is an extension of the argument by Mansell and Sison (2020), who propose, based on an analysis of the medieval corporation, that corporations should be owned by their members on the grounds that they are all, to variable degrees, involved in producing the collective good that is the corporation’s output. We think that this argument is even stronger for the case of platforms, whose value, as is widely acknowledged, is based on positive network effects produced by their users. Users therefore create a large portion of the value that can be extracted from operating a platform, yet this value is currently typically appropriated by a small group of investors who own the platform, to the exclusion of users. If users were considered to be co-owners of the platform, similar to members of the medieval corporation, then it would seem that assigning ownership in platform use data to the platform operator could be morally justified too, in addition to the efficiency argument for platform ownership of such data based on transaction cost theory.

Discussion

The institutional understanding of ownership, as laid out above, integrates two conflicting objectives: that of granting liberty to the owner to do with what is owned as she or he likes, including selling it, and that of limiting the exposure of all others to the consequences of such actions. Applied to the realm of platform use data, these objectives are, on the one hand, enabling platform use data to be shared more widely in order to curb the power of platform operators and, on the other hand, protecting platform users from the detrimental effects of data use by businesses. By defining the limits of both the liberty of owners and the exposure of all others, these objectives can be traded off against each other. By contrast, the current control-based approach prevents any such trade-off by taking the protection of personal data as an absolute cornerstone of platform regulation so that any data sharing is only possible within the constraints of such protection (Thouvenin & Tamò-Larrieux, 2021). Similarly, current property rights-based approaches to both data protection and data sharing define property rights as the right to exclude others from using data (Drexl et al., 2017; Geiregat, 2022). While this is certainly an important aspect, focusing on this aspect alone glosses over the possibility of tempering ownership rights, which therefore appear to give absolute liberty to the owner and to leave all others completely exposed to the actions of owners. By conceptualizing ownership as a liberty/exposure relation, the possibility of limiting both, and thus of tempering the otherwise absolute right of exclusion, comes into view.

Current control- and ownership-based proposals for platform use data are thus both ineffective with regard to reconciling these conflicting objectives. Moreover, in our view, they are unlikely to achieve even each objective taken separately. The control-based approach gives rise to a highly complex web of contingent access rights which tends to cement the dominant position of platforms since only they have the resources to cut through this web of rules effectively (Jurcys et al., 2021; Hazel, 2020; Huq, 2021; Drexl et al., 2022). Conversely, it does not prevent platforms themselves from using user data with detrimental effects for platform users (Huq, 2021), mostly through forms of price discrimination, as has poignantly been shown by Clough and Wu (2022). Current ownership-based proposals, which assign ownership of platform use data to data subjects, also seem ineffective. On the one hand, the transaction costs of selling, and thus sharing, such data are prohibitive (Thouvenin & Tamò-Larrieux, 2021; Drexl et al., 2022; Kerber, 2023). On the other hand, ownership of personal data by platform users does not prevent businesses from training their algorithms on other data and then using the results against those users who have not allowed businesses to use their data, for example for purposes of price discrimination, as has been argued by Jurcys et al. (2021).

By contrast, our proposal to assign property rights in platform use data to platform operators, while limiting both the liberty and the exposure created through such rights, promises to be effective with regard to both objectives. On the one hand, any limits imposed on property rights in platform use data would apply both to businesses that buy such rights and to platform operators who collect such data and may sell the rights to them. Defining and limiting property rights in platform use data would thus effectively protect users from harmful actions, including those of platform operators, for example by prohibiting the use of platform use data for purposes of extreme price discrimination. On the other hand, this would enable effective sharing of such data through selling them and thus contribute to curbing the power of platform operators.

However, would platform operators actually want to sell platform use data if they could? There are two reasons which support such an expectation. First, it is likely that there exist firms that can make better use of platform use data than the platform operators themselves, i.e. that there are benefits of specialization. In an ingenious study, Agogo (2021) has empirically demonstrated widespread informal data sharing among websites, which he convincingly explains in terms of such specialization benefits. Second, platform operators may decide in favor of selling platform use data because this would be an effective means to alleviate any suspicion of anti-trust authorities that they are misusing their dominant position.

The proposal to assign ownership of platform use data to platform operators seems, at first glance, counter-intuitive with regard to the objective of curbing platform power and absurd with regard to protecting platform users from detrimental effects of the use of such data. But, on closer inspection, these objectives can be effectively achieved through our proposed approach, which also allows for trading these objectives off against each other. A crucial step in seeing this possibility, however, consists of adopting a more nuanced approach to defining platform data. The concept of “personal data” is far too broad to allow effective regulation of platforms because it mixes highly sensitive data such as health and financial information – subset (a) in Figure 1 – with data that are less prone to be used against users and that are, at the same time, of utmost importance for unleashing the expected benefits of applying new techniques such as machine learning to improve resource allocation across almost all sectors of the economy, namely the data we have defined as subset (b) in Figure 1.

Our proposal, however, is not meant to suggest that the definition of property rights in platform use data and their assignment to platform operators will suffice to curb the power of platforms and to protect platform users from harmful effects of the use of such data. For this to be the case, markets for platform use data must be successfully established and currently many obstacles remain in this regard, as shown by Abbas et al. (2021). Moreover, adequate market designs and governance structures have yet to be found. While some forms of data sharing services have been successfully established, so-called many-to-many markets with full control over ownership rights have yet to emerge (Koutroumpis et al., 2020). Rather, our aim was to show that, based on an institutional understanding of markets, it is possible to envision data markets that could be effective with regard to both curbing the power of platforms and adequately protecting platform users from the detrimental effects of the use of such data.

Our proposal also clarifies the concept of a data market in relation to similar terms, often used synonymously, such as “data marketplace” and “data exchange”. Abbas et al. (2021) define a data marketplace as a multi-sided platform that matches data providers and data buyers (p. 3321). Koutroumpis et al. (2020) discuss what “matching” may mean in this context, distinguish the “outright sale” of data from other forms of contractual arrangements such as licensing and relational contracts with implied exchange agreements (pp. 650–651), and conclude that “… within industrial organization economics such transactions do constitute ‘markets’ because they involve prices (monetary or otherwise) for (incompletely) substitutable goods or services that are affected by one another ...” (p. 651). However, legal scholarship on the issue of data markets clearly distinguishes between two legal options for commercializing data: property rights and contract law (Zech, 2017). The main difference between these legal options is that property rights constitute a so-called right in rem, i.e. a right in a thing, while contract law constitutes personal rights (Zech, 2017; Hazel, 2020). Rights in rem are much stronger than personal rights because they burden “the whole world” (erga omnes) with a duty to tolerate the actions of the owner granted by property rights, whereas personal rights established through contract law, such as licensing agreements, only bind the parties to the contract (Akkermans, 2015). The legal concept of rights in rem is strictly analogous to Commons’ concept of a market as a realm of liberty/exposure. Hence, our institutional perspective allows us to clearly distinguish between institutional arrangements that involve the exchange of valuable things based on contract law and those where ownership rights are transferred based on property rights and property law. The difference is far from negligible, contrary to what is implied by Koutroumpis et al. (2020), who view both as markets: an arrangement based on property rights allows a thing, in this case a data set, to be moved freely around in society so that novel uses for that thing can be discovered through trade. A market, understood as a site where ownership is transferred, thus performs a very different function from an arrangement where data are shared or exchanged based on contract law. We therefore propose to clearly distinguish between these two kinds of arrangements and do so terminologically by restricting the term “data market” to arrangements based on property rights as opposed to arrangements based on contract law, which we refer to as “data exchanges”. We suggest that the term “data marketplace” should be avoided because it tends to blur this distinction.

Conclusion

In this paper, we have addressed the question: How can data markets be conceptualized so as to simultaneously consider the conflicting demands of competition and privacy policy? We have shown that current control- and ownership-based approaches are insufficient in this regard. Control-based approaches tend to take the objective of data protection as absolute and then seek to enable data sharing, through defining further access rules, within the constraints of such protection. But even if data protection were seen as a variable objective that needs to be balanced with the conflicting objective of curbing the power of platforms, any effort at re-balancing would likely increase the complexity of the already complex rules governing access to platform data and thus ultimately, albeit unintendedly, strengthen the power of platforms because only they have the resources to cope with this complexity effectively. Current ownership-based approaches envision assigning property rights in platform data to platform users, not with the intention of making such data saleable but of increasing the factual control that users have over such data, either as an alternative to physical control possibilities or as an extra layer of legal control to reinforce the physical control users should be given through technical means. In this way, assigning property rights to platform users is seen as a countervailing power to the power of platform operators.

Our proposal, counter-intuitively, is to assign property rights in platform data to platform operators in order to curb their power. However, for this to be seen as an effective measure, two conceptual moves are necessary. On the one hand, a more nuanced approach to defining platform data is required. The current approach, based on the concept of “personal data”, is far too broad because it mixes highly sensitive data, which should not be shared under any circumstance, with data whose sharing is necessary in order to unleash the potential efficiency gains promised by the combination of big data and machine learning techniques. On the other hand, the liberty created through assigning property rights needs to be tempered, since to that liberty corresponds an exposure of “the whole world” to the actions of the owner. Thus, property rights should not be seen as absolute either but as limited by ongoing institutional work which adjusts the liberty granted to owners and the exposure of all others to changing circumstances. One clear candidate for such a limitation is a prohibition on using platform use data for purposes of extreme price discrimination. Such a prohibition would apply to platform operators and other businesses alike and thus effectively protect platform users from one of the most detrimental and insidious effects of exploiting platform use data, a level of protection which current privacy regimes fail to offer. Moreover, our approach would effectively allow, based on a due process of law, for the continuous re-balancing of the conflicting objectives of data protection on the one hand and of curbing market power on the other by adjusting the limits imposed on the liberty of data owners and thus limiting the exposure that all others, especially platform users, have to endure.

Our proposal also makes a conceptual contribution by clearly distinguishing between data markets and data exchanges. The word “exchange” has acquired a double meaning of a place where commodities are physically exchanged and of a site where ownership in things is transferred. We suggest that this difference needs to be made explicit by distinguishing physical control from ownership (legal control). On a data exchange, data, not ownership in data, are transferred, based on contract law, for example through licensing. Presently, platform data are already widely shared based on contract law. However, data markets, which allow data to be freely moved around in society so that novel uses for such data can be readily discovered, do not yet exist. Such markets can be established only if property rights in data are established. We suggest that making this distinction opens up promising routes for future research which, instead of understanding a market as a competitive algorithm for matching given supply and demand quantities, explores markets as realms of liberty/exposure that facilitate the discovery of new usages for extant resources. Such exploration seems to be particularly relevant for the case of platform use data, whose real value may lie in using them in novel ways for novel purposes.

Figures

Figure 1: Relationship between the EU definition of personal data and our definition of platform use data

Notes

1.

Thouvenin & Tamò-Larrieux (2021) have also called for a more differentiated approach to defining platform data. However, their proposal – to limit private property rights to those data that can be stored in a private cloud – is ultimately tautological.

2.

A contrary position is taken by Geiregat (2022), who argues that data ownership would not be compatible with Belgian law.

3.

Huq (2021) quotes another hundred-year-old court ruling which suggests that ownership of data is not recognized by US courts. However, this ruling refers to ownership of news information and thus concerns a special case. Price data generated on a stock exchange are a much better model for platform use data.

References

Abbas, A. E., Agahari, W., van de Ven, M., Zuiderwijk, A., & Reuver, M. D. (2021). Business data sharing through data marketplaces: A systematic literature review. Journal of Theoretical and Applied Electronic Commerce Research, 16(7), 3321–3339. doi: 10.18690/978-961-286-485-9.6.

Agogo, D. (2021). Invisible market for online personal data: An examination. Electronic Markets, 31(4), 989–1010. doi: 10.1007/s12525-020-00437-0.

Akkermans, B. (2015). The numerus clausus of property rights. In M. Graziadei, & L. Smith (Eds), Comparative Property Law: Global Perspectives. Cheltenham: Edward Elgar.

Cennamo, C. (2018). Building the value of next-generation platforms: The paradox of diminishing returns. Journal of Management, 44(8), 3038–3069. doi: 10.1177/0149206316658350.

Chen, A. (2021). The Cold Start Problem: Using network effects to scale your business. London: Random House Business Books.

Clough, D. R., & Wu, A. (2022). Artificial intelligence, data-driven learning, and the decentralized structure of platform ecosystems. Academy of Management Review, 47(1), 184–189. doi: 10.5465/amr.2020.0222.

Cofone, I. (2021). Beyond data ownership. Cardozo Law Review, 43(2), 501–572.

Commons, J. R. (1990). Institutional economics – its place in political economy. New Brunswick, London: Transaction.

Denemark, J. (2022). Google, antitrust, and digital market act: Is there new hope for the AdTech market? In V. Šmejkal (Ed.), EU Antitrust: Hot Topics & Next Steps: Proceedings of the International Conference held in Prague on January 24–25, 2022 (pp. 40–53).

Drexl, J., Hilty, R. M., Globocnik, J., Greiner, F., Kim, D., Richter, H., … Wiedeman, K. (2017). Position statement of the Max Planck Institute for Innovation and Competition of 26 April 2017 on the European Commission’s “public consultation on building the European data economy”. Max Planck Institute for Innovation & Competition Research Paper (pp. 17-108), München.

Drexl, J., Banda, C., González Otero, B., Hoffmann, J., Kim, D., Kulhari, S., … Wiedeman, K. (2022). Position statement of the Max Planck Institute for Innovation and Competition of 25 May 2022 on the commission’s proposal of 23 February 2022 for a regulation on harmonised rules on fair access to and use of data (data act). Max Planck Institute for Innovation and Competition research paper, (22-05). München.

European Commission (2020a). Proposal for a regulation of the European Parliament and the Council on contestable and fair markets in the digital sector (digital markets act). COM (2020) 842 final. Brussels.

European Commission (2020b). Proposal for a regulation of the European Parliament and the Council on European data governance (data governance act). Brussels.

European Commission (2022). Proposal for a regulation of the European Parliament and the Council on harmonized rules on fair access and use of data (data act). Brussels.

Expertenkommission Forschung und Innovation (2023). Gutachten zur Forschung, Innovation und technischer Leistungsfähigkeit Deutschlands: Gutachten 2022, Berlin.

Farrell, J., & Saloner, G. (1985). Standardization, compatibility, and innovation. Rand Journal of Economics, 16(1), 70–83. doi: 10.2307/2555589.

Geiregat, S. (2022). The data act: Start of a new era for data ownership? Discussion paper, submitted for publication in a law journal. FWO Research Foundation.

Gregory, R. W., Henfridsson, O., Kaganer, E., & Kyriakou, S. H. (2021). The role of artificial intelligence and data network effects for creating user value. Academy of Management Review, 46(3), 534–551. doi: 10.5465/amr.2019.0178.

Gregory, R. W., Henfridsson, O., Kaganer, E., & Kyriakou, H. (2022). Data network effects: Key conditions, shared data, and the data value duality. Academy of Management Review, 47(1), 189–192. doi: 10.5465/amr.2021.0111.

Guse, R., Thiebes, S., Hennel, P., Rosenkranz, C., & Sunyaev, A. (2022). Datenmarktplätze für Künstliche Intelligenz im Gesundheitswesen: Potenziale, Herausforderungen und Strategien zur Bewältigung. HMD Praxis der Wirtschaftsinformatik, 59(6), 1527–1544. doi: 10.1365/s40702-022-00922-4.

Hazel, S. H. (2020). Personal data as property. Syracuse Law Journal, 70, 1055–1113.

Höppner, T., & Westerhoff, P. (2021). Privacy by default, abuse by design: EU competition concerns about apple's new app tracking policy. Hausfeld Competition Bulletin, Spring 2021. doi: 10.2139/ssrn.3853981. Available from: https://ssrn.com/abstract=3853981 (accessed 26 May 2021).

Huq, A. Z. (2021). Who owns our data? We need a model of ownership that recognizes our collective interests. Boston Review. Available from: https://www.bostonreview.net/articles/who-owns-our-data/

Hylton, K. N. (2019). Digital platforms and antitrust law. Nebraska Law Review, 98, 272–296.

Jurcys, P., Donewald, C., Fenwick, M., Lampinen, M., Nekrošius, V., & Smaliukas, A. (2021). Ownership of user-held data: Why property law is the right approach. Harvard Journal of Law and Technology, JOLT Digest.

Katz, M. L., & Shapiro, C. (1985). Network externalities, competition, and compatibility. The American Economic Review, 75(3), 424–440.

Kerber, W. (2023). Governance of IoT data: Why the EU data act will not fulfill its objectives. GRUR International, 72(2), 120–135. doi: 10.1093/grurint/ikac107.

Koutroumpis, P., Leiponen, A., & Thomas, L. D. W. (2020). Markets for data. Industrial and Corporate Change, 29(3), 645–660. doi: 10.1093/icc/dtaa002.

Krugman, P. (1991). History versus expectations. Quarterly Journal of Economics, 106, 651–667. doi: 10.2307/2937950.

Leistner, M., & Antoine, L. (2022). Attention, here comes the EU Data Act! A critical in-depth analysis of the Commission’s 2022 Proposal. Journal of Intellectual Property, Information Technology and E-Commerce Law, 13(3), 339–349.

Lubyová, L. H. (2022). Digital markets act: A fair framework for the online world? In V. Šmejkal (Ed.), EU Antitrust: Hot Topics & Next Steps: Proceedings of the International Conference held in Prague on January 24–25, 2022 (pp. 54–64). Prague.

Macleod, H. D. (1875). The theory and practice of banking. London: Longmans et al.

Mansell, S. F., & Sison, A. J. G. (2020). Medieval corporations, membership and the common good: Rethinking the critique of shareholder primacy. Journal of Institutional Economics, 16(5), 579–595. doi: 10.1017/s1744137419000146.

Mulherin, J. H., Netter, J. M., & Overdahl, J. A. (1991). Prices are property: The organization of financial exchanges from a transaction cost perspective. Journal of Law and Economics, 34, 591–644. doi: 10.1086/467237.

The European Parliament and the Council of the European Union (2016). Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (general data protection regulation). Official Journal of the European Union, (4.5.2016).

Thouvenin, F., & Tamò-Larrieux, A. (2021). Data ownership and data access rights: Meaningful tools for promoting the European digital single market? In M. Burri (Ed.), Big Data and Global Trade Law (pp. 316–339). Cambridge University Press.

Zech, H. (2017). Data as a tradeable commodity – implications for contract law. In J. Drexl (Ed.), Proceedings of the 18th EIPIN Congress: The New Data Economy between Data Ownership, Privacy and Safeguarding Competition. Edward Elgar.

Acknowledgements

We gratefully acknowledge support by the German Federal Ministry for Research and Education (BMBF) (Grant 01DO17024A/B), the National Natural Science Foundation of China (Grant 72293561), and the Research Center for Interactive Technology Industry of Tsinghua University (RCITI2022T002).

Corresponding author

Kai Reimers can be contacted at: reimers@wi.rwth-aachen.de
