2023/11/28

Transparency and sharing requirements for data in the EU - stimulating or killing innovations?

Authors: Rosa Ballardini & Rob van den Hoven van Genderen

Research group: Law, Technology and Design Thinking

1. Introduction

Rosa Ballardini

The European Data Strategy[1] is directed at the creation of a European single open market for the use of all categories of data, both to enhance a competitive edge and to give users and data subjects access to such data. This creates the opportunity to put into active use data that would otherwise remain unused, e.g. data generated by Artificial Intelligence (AI) or other technologies.

The strategy is supported by a legal framework to make this possible. The problem is that while an open and transparent data society sounds positive, it can cause problems as well. If all data, personal as well as non-personal, are to be made available to third parties, the risk of intruding on privacy and trade secrets, as well as on intellectual property rights and security, also increases. Moreover, the incentive to invest in and create new products, services or other inventions could be seriously diminished.

2. An EU Strategy for Data

Rob van den Hoven van Genderen

Acknowledging the opportunities but also the high risks and challenges related to the use and processing of data, the European Union launched in 2020 a European data strategy[2] aimed at tackling existing barriers and creating a single European market for data, while fully respecting EU policies on fundamental rights such as privacy and data protection, as well as competition. A driving principle of the data strategy is striking an appropriate balance between protection, regulation and innovation, allowing data to flow freely within the EU and across sectors in accordance with the ‘free movement of data’, one of the five pillars of the European internal market.

To achieve the ambition of the EU data strategy, various regulatory actions have already been taken, and more legislative initiatives are underway. The Data Governance Act[3] and the upcoming Data Act[4] (often jointly referred to as the ‘Data Acts’) are the most recent pieces of legislation released as part of the European strategy for data. The Data Governance Act, which entered into force in 2022, aims to facilitate the voluntary sharing of data by individuals and businesses and harmonises the conditions for the use of certain public sector data.

The Data Act was approved by the European Parliament on 9 November 2023 and is expected to enter into force in autumn 2025 (except for Article 3(1), whose transition period is one year longer). It is a horizontal Regulation covering different regulatory aspects related to (personal and non-personal) data collected by connected products and related services (in the B2B, B2C and B2G contexts). The Data Act will complement the Data Governance Act by better harnessing the potential of data sharing, providing further opportunities for the reuse of data and tackling both the problem that most data remain unused and the fact that their value is concentrated in the hands of relatively few large companies. In addition to data sharing obligations and access rights, the Data Act contains rules on switching between data processing services and on international transfers of non-personal data.

The ‘Data Acts’, however, do not exist in a vacuum. Instead, they strategically complement the already existing EU legal framework for data governance. This comprises inter alia the General Data Protection Regulation (GDPR)[5], the Free Flow of Non-Personal Data Regulation[6], the Open Data Directive[7], the Database Directive[8], and the Platform to Business Regulation[9]. For instance, the ‘Data Acts’ are consistent with existing rules in the GDPR on the processing of personal data and on protecting private life and the confidentiality of communications, as well as any data stored in and accessed from terminal equipment.[10] The ‘Data Acts’ further complement these privacy-focused provisions, particularly with regard to personal and non-personal data generated by a user’s product connected to a publicly available electronic communications network.

The forthcoming Data Act, in particular, builds further on the Free Flow of Non-Personal Data Regulation in this regard. The Free Flow of Non-Personal Data Regulation aims at removing obstacles to the free movement of non-personal data between different EU countries and IT systems in Europe, by ensuring that every organisation is able to store and process data anywhere in the EU and that data remain available for regulatory control. It also introduces codes of conduct to facilitate switching data between cloud services, in order to tackle the problem of ‘vendor lock-in’. The Data Act builds on all this, further helping citizens and businesses to switch cloud providers and port their data.

Moreover, the Data Act also tackles some of the long-standing controversies in the context of the Database (DB) Directive. The DB Directive protects databases that have been created as a result of a ‘substantial investment’, even when the database itself is not ‘original’ in the sense of qualifying for copyright protection. A long-standing and highly debated issue here is whether databases containing data that are, e.g., machine-generated would be entitled to protection under the DB Directive.[11] The draft Data Act expressly provides that, in order not to hinder the exercise of the right of users to access and share data with third parties, "the sui generis right provided for in Article 7 of Directive 96/9/EC does not apply to databases containing data obtained from or generated by the use of a product or a related service” (Art. 35 of the draft Data Act).

Moreover, the Data Act provides that although, as a rule, trade secrets must be protected, they may be disclosed if the data holder and the user “take all necessary measures prior to the disclosure” to preserve confidentiality (Art. 5 of the Data Act). However, access may be refused only if the data holder, as a “trade secret holder”, can demonstrate, and duly substantiate, on a case-by-case basis that they are “highly likely to suffer serious economic damage” from the disclosure (Art. 4(3b) of the draft Data Act). The burden of proof thus lies with the data holder, who will bear all costs. Finally, the ‘Data Acts’ also complement both the Platform to Business Regulation, which imposed transparency obligations requiring platforms to describe for business users the data generated from the provision of the service, and the Open Data Directive, which defines minimum standards for re-using data held by the public sector and publicly funded research data made publicly available through repositories.[12]

As previously mentioned, there are also other recent regulations that will impact the current (personal and non-personal) data governance rules, primarily the Digital Markets Act[13], which requires certain providers of core platform services identified as ‘gatekeepers’ to provide more effective portability of data generated through business and end users’ activities. Also relevant is the so-called ‘Digital Services Act package’[14], comprising the Digital Services Act (DSA)[15] and the Digital Markets Act (DMA)[16], which prohibit in particular so-called “dark patterns”. So too is the so-called Artificial Intelligence Act (AI Act), i.e. the proposal for a regulation of the European Parliament and of the Council on harmonised rules on Artificial Intelligence[17], which is particularly relevant to data regulation in relation to AI technologies.

3. The Dark Side of a Policy about “Sharing Just for the Sake of Sharing”

While securing a sustainable data governance framework for data sharing is absolutely essential for the well-functioning of the data economy and for incentivising innovations such as those related to AI, thereby promoting progress and wellbeing, an open and transparent data society can also raise certain risks and dangers. First, when data, whether personal or non-personal, are made available to any third parties, the risk of intruding on privacy and trade secrets, as well as on intellectual property rights and security, also increases. At the same time, enforcing data sharing in a way that does not balance the interests of the data holder, who has invested effort and finances to develop and produce products and/or services and is forced to grant access to e.g. trade secrets in a not necessarily proportionate manner, might in the long term disincentivise investments in creating new products and services.

Although an open and transparent data society sounds positive, it can cause problems as well. If all data, personal as well as non-personal, are to be available to third parties, be they private, public, commercial or governmental institutions, the risk of intruding on privacy will increase. For example, from the point of view of privacy-related concerns, sensitive data are covered by several of the existing and forthcoming data sharing and data governance provisions in the EU, particularly those related to privacy and data protection such as the GDPR. This is because, due to the increasing possibility of identifying natural persons by means of AI technology, almost all data become personal data.

This is specifically reflected in the GDPR’s definition of personal data, which refers to “factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person” (Art. 4(1) GDPR). Furthermore, data protection applies to so-called “biometric data” as defined in Article 4(14) GDPR: personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data. The processing of such sensitive data is, according to Article 9 GDPR, forbidden except with the consent of the data subject or for other legitimate reasons listed in the article (e.g. protecting the data subject’s vital interests or reasons of national (security) interest). Sharing is also processing and will be subject to this provision.

Moreover, Article 22 GDPR states that: “The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her”. These requirements could certainly complicate the transparency and sharing requirements of the Data Act. With regard to the Data Act, some data, such as emotional, biometric and medical data, may not be processed by AI at all, as stated in Article 5 of the last accepted version of the AI Act (June 2023).

Moreover, from the perspective of legal instruments to incentivise innovations in data industries, the new mandatory data access requirement is a challenge, to say the least. As we know, IPRs are not well suited to protecting data as such, nor datasets.[18] One of the few exceptions is the sui generis database protection for datasets, which, however, does not have a glorious reputation either. Therefore, the most common way to secure protection of non-personal data is via contracts and trade secrets.

The Data Act forces companies to provide users (both natural and legal persons) with access to the data generated by their connected Internet of Things (IoT) devices, including in cases where trade secrets are involved. In other words, manufacturers and other data holders of connected products will have to open user data for free to users, and under fair, reasonable and non-discriminatory (FRAND) terms to third parties in the EU; third parties outside the EU, however, are not subject to the FRAND terms. Indeed, one of the key issues of the Data Act has been the protection of trade secrets and intellectual property rights included within such user data.

In particular, it is questionable whether a concept such as FRAND, which has been used successfully in the context of technical standards, where the pre-existing essential patent rights to be shared are actually clear, can work in a context of data sharing where no clearly pre-defined IPRs exist before sharing. Questions as to what constitutes, for example, a fair and reasonable compensation or price will certainly be difficult to answer in such unclear circumstances.

Ensuring transparency regarding the data to be generated and facilitating access for the user is of utmost importance for many reasons, including enhancing the possibility to access key information relevant to repairing items, thereby promoting the circular economy. However, one can question whether this is the right way to achieve the goal of stimulating the data economy and improving the functioning of the single market. Indeed, obligations and limitations regarding the use of the shared data are imposed in order to protect the data holder’s interests.

For example, users and third parties are forbidden from using the received data to develop products competing with the product from which the data originate. Also, trade secrets may only be disclosed if specific measures to preserve confidentiality are taken and, where the data are to be made available to third parties, if disclosure is strictly necessary to fulfil the purpose agreed with the user. But who will decide on proportionality, and how can this be done?

4. Conclusion

From all this, what can be concluded is that the EU data strategy seems to face some (disturbing) counterweights in other parts of the EU regulatory framework, especially in relation to important aspects such as privacy, IPR and trade secrecy. Although the Data Act gives priority to the GDPR when personal data are involved, the risk of intruding on the privacy of individuals will obviously increase with this widely open data sharing policy.

In addition, the strong push towards sharing data, possibly including data covered by trade secrecy, might indeed have the effect of disincentivising innovation in Europe, while offering even bigger opportunities to other markets, especially the USA and China. Certainly, interpreting the rules and exceptions included in the Data Act will require several rounds of decisions, likely increasing lawyering work while not necessarily increasing legal certainty. Indeed, it will be interesting to see which weight proves heaviest in this balancing act.

At this stage, we cannot but wonder: even if transparency is a good principle in a data-driven society, does it still stand if it endangers trade secrets, privacy and security, and decreases legal certainty?


[2] Ibid.

[4] https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:52022PC0068; adopted by the European Parliament on 9 November 2023, expected to enter into force in 2025.

[11] See e.g. Pihlajarinne, T., & Ballardini, R. M. (2019). Owning Data via Intellectual Property Rights: Reality or Chimera? In R. M. Ballardini, P. Kuoppamäki, & O. Pitkänen (Eds.), Regulating Industrial Internet through IPR, Data Protection and Competition Law. Kluwer Law International.

[12] For a comprehensive overview of issues related to non-personal data governance, see Olga Batura, Axel Wion, Sofia Noelle Gonzalez, J. Scott Marcus, Ilsa Godlovitch, Lukas Wiewiorra, Peter Kroon, Serpil Tas and Nico Steffen, “The emergence of non-personal data markets”, Policy Department for Economic, Scientific and Quality of Life Policies, Directorate-General for Internal Policies. Available at: https://www.europarl.europa.eu/RegData/etudes/STUD/2023/740098/IPOL_STU(2023)740098_EN.pdf.

[15] Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) (Text with EEA relevance).

[16] Regulation (EU) 2022/1925 of the European Parliament and of the Council of 14 September 2022 on contestable and fair markets in the digital sector and amending Directives (EU) 2019/1937 and (EU) 2020/1828 (Digital Markets Act) (Text with EEA relevance).

[17] Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (artificial intelligence act) and amending certain union legislative acts COM/2021/206 final, https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex%3A52021PC0206.

[18] See e.g. Pihlajarinne, T., & Ballardini, R. M. (2019). Owning Data via Intellectual Property Rights: Reality or Chimera? In R. M. Ballardini, P. Kuoppamäki, & O. Pitkänen (Eds.), Regulating Industrial Internet through IPR, Data Protection and Competition Law. Kluwer Law International.
