Author: Krisztina Rozgonyi


Content map

Executive Summary

1. Introduction

2. Context: International legal standards on freedom of expression and media freedom as a base for the modernization of media laws

3. Overview of the latest legal provisions and attempts to regulate digital communications that affect media and journalism in Europe

4. Summary of key policy debates on regulating digital online communication in the EU

5. Modernization of the EU’s media-related legal frameworks

Conclusions

Recommendations for media reform in Lebanon


Executive Summary

Modernizing media-relevant laws, including copyright law and data protection regulations, is a crucial response to the rapid digitalization of media content and the rise of online global distribution via various platforms. In the EU context, policy has fundamentally shifted from a liberal economic perspective to a constitution-oriented approach, with the Court of Justice of the European Union playing a leading role in countering platform power. Digital constitutionalism in the EU has delivered significant regulatory solutions to protect fundamental rights and democratic values while balancing the need for technological advancement.

In regulating online platforms with respect to media content (i), the EU’s policy objective was to make online platform service providers take more responsibility for the content they host and its effects on society. There was also a clear need for a level playing field for digital services, ensuring responsible online platform behavior and fostering transparency and fairness. Furthermore, the EU committed to advocating for the application of human rights and the promotion of unhindered, uncensored and non-discriminatory access to online services for all, in line with international legal standards. The two central pieces of EU legislation regulating online platforms that are directly relevant to media content are the Revised AVMSD and the DSA. With the Revised AVMSD, the EU extended the scope of the rules applicable to Video Sharing Platforms in the well-defined areas of protecting minors against harmful content online and combating hate speech and public provocation to commit terrorist offenses, and, in parallel, put great emphasis on ensuring that the national regulators which oversee the application of the new rules act as independent, professional and accountable public actors. The DSA is a remarkable piece of legislation regulating online platforms that can potentially advance the interests of media content providers and journalists; its requirements on platform transparency and accountability are of utmost relevance.

Copyright in the digital era (ii) was an area of serious contestation in the EU, and the new Copyright Directive (CDSMD) introduced a framework for digitally updated copyright protection and for the liability of online content-sharing service providers (platforms). A new right for press publishers was introduced to foster quality journalism vis-à-vis online platforms. Meanwhile, the policy objective of the so-called ‘upload-filters clause’ was to oblige online platforms to conclude license agreements with major copyright holders covering copyright in user-generated content; however, the new rules were heavily criticized as incentives for online censorship and possible restrictions of the right to information. The EU debates should prompt Lebanese policy-makers and legislators to carefully consider the impact of copyright as a potential barrier to freedom of expression.

Safeguarding media freedom and pluralism (iii) is an emerging area for new legislation in the EU, particularly through the draft European Media Freedom Act (EMFA), which proposes a new set of rules and mechanisms promoting media pluralism and independence across the EU. The EMFA proposal is currently being discussed in the European Parliament. At the same time, the underlying considerations on the need for new legal and regulatory safeguards for editorial independence, the transparency of media ownership and the independence of national media regulatory authorities are relevant to the Lebanese context. Furthermore, Lebanese stakeholders should consider the newly proposed legal responses to the rise of Strategic Lawsuits Against Public Participation (SLAPPs).

Data protection and privacy online (iv) in the EU entered a new legislative era with the rise of massive personal data processing by online digital platforms. The General Data Protection Regulation (GDPR) aims to foster transparency and accountability in data processing and protect individual rights to privacy vis-à-vis datafication and platformization. However, new privacy and personal data protection rules must be carefully balanced concerning their impact on freedom of expression; thus, appropriate and well-tailored exemptions for journalistic privileges must accompany the legal modernization process.

In sum, the recent and current period of EU legislation in the areas of (i) regulating online media content and platforms, (ii) copyright in the digital era, (iii) safeguarding media freedom and pluralism, and (iv) data protection and privacy offers essential and meaningful insights into the various aspects of, and tensions within, the modernization of the law, which Lebanese policy-makers and legislators should consider. Importantly, protecting fundamental freedoms online should be balanced with other legitimate public policy objectives, with utmost care in setting the boundaries of state intervention.

1. Introduction

The modernization of media law, including copyright law and data protection regulations, has been prompted by the rapid digitization of media content and the fast-growing tech platforms that altered the process of content distribution online. As technology alters how we consume and share content, legal frameworks need to be adapted to address the challenges and opportunities triggered by these changes. The fundamental transformation of digital content production and dissemination via online platforms has enabled media content to be accessed and distributed globally without frontiers, yet it significantly limited the states’ ability to achieve policy objectives under their jurisdiction. All these challenges require an update of media laws to ensure that citizens’ and creators’ rights are protected and that legal jurisdiction in the digital realm is clearly defined.

Similarly, as the rise of user-generated content on online platforms has blurred the lines between content creation and consumption, laws and regulations need to be modernized to achieve the right balance between protecting copyright holders’ interests and accommodating legitimate use of content without endangering freedom of expression. Thus, traditional copyright laws, which were designed for a pre-digital era and may not adequately address the challenges of digital content distribution, need to be updated. Modernization of these laws should spur innovation and support the development of sustainable business models while ensuring fair compensation of copyright holders and preventing monopolistic or restrictive practices.

Data protection and privacy regulations gained new momentum with the digitization of information consumption. As tech platforms collect and process vast amounts of user data, legal provisions are needed to safeguard user privacy, prevent data breaches, and ensure that personal information is handled responsibly, holding tech platforms accountable for their data processing practices.

In all these areas, (i) regulation of online media content and platforms, (ii) copyright in the digital era, (iii) safeguarding media freedom and pluralism, and (iv) data protection and privacy, modernization of legislation and accompanying regulations should align with the international legal standards on freedom of expression by carefully balancing legitimate claims of individuals and other rightsholders, and public policy objectives.

In practical terms, modernized laws should ensure that individuals have access to diverse and high-quality content while respecting their rights to freedom of expression, access to information, and privacy. They should also protect the public from harmful forms of communication, such as disinformation.

Media-related laws are not modernized in a vacuum but in the context of international legal standards on freedom of expression and media freedom. According to these standards, freedom of opinion and expression are fundamental rights of every human being and indispensable for individual dignity and fulfillment. They constitute essential foundations for democracy, the rule of law, peace, stability, sustainable and inclusive development, and participation in public affairs. States have the obligation to respect, protect and promote the rights to freedom of opinion and expression. All human rights protected offline, communication rights in particular, must equally be protected online.[1]

Article 19 of the International Covenant on Civil and Political Rights (ICCPR) includes the main provisions on the right to freedom of expression. This right applies to forms of expression regardless of the medium through which they are made, including digital platforms and online channels of distribution. The right to freedom of expression also includes the right to “impart”, “seek” and “receive” information.

Hence, freedom of expression enables everyone to contribute to the public sphere and access a wide range of information and viewpoints. These aspects of the right underpin policy concepts such as media pluralism and media diversity as well as the right to access information, all highly relevant in the digital context. International standards on audiovisual communication (see General Comment No. 34 concerning Article 19 of the ICCPR)[2] emphasize that media regulation should be nuanced and proportionate, according to the nature of each media segment, digital and online media content included.

In Europe, freedom of expression and information are protected by Article 10 of the European Convention on Human Rights (ECHR),[3] the flagship treaty for protecting human rights. It states that it is the role and the responsibility of each state to guarantee such freedoms and ensure media pluralism according to positive and negative obligations put forward by the article. The positive obligation is to create a communication environment that supports the free flow of information and ideas in society to allow free and independent media to flourish. As for the negative obligations, the right requires states not to interfere with exercising the right to seek, receive and impart information and ideas, except as permitted under international law. Restrictions on the exercise of freedom of expression may not jeopardize the right itself. The U.N. Human Rights Committee has repeatedly underscored that the relation between the right and the restriction, and between the norm and the exception, must not be reversed.[4] Notably, any such restrictions must pass the so-called three-part cumulative test.[5]

The modernization of media-related laws and regulations in Europe, particularly in the EU, is a balancing act between the state’s positive and negative obligations in ensuring freedom of expression and an enabling environment.

The innovations in information and communications technologies have not only created new opportunities for individuals to impart and disseminate information, but have also brought about new challenges. Social media platforms in particular have transformed all aspects of freedom of expression. Imparting information and the exposure of individuals to information have quantitatively exploded in recent years; however, the growing phenomenon of “filter bubbles”[6] might hinder qualitative diversity.[7] Meanwhile, content dissemination on a large scale allowed for increased participation of citizens in the public sphere but also boosted the threats stemming from online disinformation, specifically endangering the right to free elections.[8] At the same time, the media industry and news organizations have also become heavily reliant on social media platforms[9] and needed to adapt to the digital transformations of their news production and dissemination processes that fundamentally altered the traditional routines in the journalistic profession.[10]

These unprecedented changes had to be reflected in the overall legal provisions on protecting human rights, particularly freedom of expression and privacy, as protection online is as important as protection offline. In sum, the general standards regarding the balance between freedom of expression and privacy in Europe had to be reconsidered in light of the specific manifestations of individual autonomy as well as of the different interactions that took place in the digital, platformized environment, including the access to, and use of, social media by journalists and media actors.[11]

The EU is bound and committed to respect, protect and promote the freedom of opinion and expression, as guided by the relevant provisions of the Treaty on European Union (TEU) and the EU Charter of Fundamental Rights, in line with its international and European human rights obligations,[12] and with the universality, indivisibility, inter-relatedness and interdependence of all human rights, whether civil, political, economic, social or cultural.[13] However, although the EU’s media law rests as much on economic as on human rights foundations, the primary legal framework in which it operates is commercially driven and follows the rules on free movement and fair competition.[14] That being said, the economic and human rights frameworks of EU media law are coherent.[15]

4. Summary of key policy debates on regulating digital online communication in the EU

The digital transformation of communication and the media has challenged the law in several aspects, particularly with respect to protecting individuals’ fundamental rights, such as freedom of expression, privacy, and data protection. The traditional legal mechanisms of the state to protect its citizens, such as those stipulated in constitutional law, have been endangered by the tendency of private transnational corporations[16] operating in the digital environment, primarily tech platforms, to perform quasi-public functions in a transnational context, which has brought them into competition with public actors.

“From a constitutional law perspective, the notion of power has traditionally been vested in public authorities; a new form of (digital) private power has now arisen due to the massive capability of organizing content and processing data. Therefore, the primary challenge involves not only the role of public actors in regulating the digital environment but also, more importantly, the ‘talent of constitutional law’ to react against the threats to fundamental rights and the rise of private powers, whose nature is much more global than local.”[17] These unprecedented changes gave rise to a new phase of European constitutionalism (i.e., digital constitutionalism).[18] In the EU context, a fundamental turn of the policy from a liberal economic perspective to a constitution-oriented approach could be witnessed,[19] especially with regard to content and data, with the Court of Justice of the European Union taking a leading role in opposing platforms’ power. Arguably, digital constitutionalism in the EU has delivered “regulatory solutions to protect fundamental rights and democratic values” and promoted “the European model as a sustainable constitutional environment for the development of artificial intelligence technologies in the global context.”[20]

The latest regulatory proposals from the European Commission, on topics such as the regulation of online media content concerning hate speech and the protection of minors, the viral spread of fake news on social media, and the fight against copyright infringement on video-sharing platforms, have spawned heated debates among stakeholders. Real tension emerged between the responsibility of tech companies for illegal, harmful or misleading content hosted on their social media platforms and the role of the judiciary and state authorities within the newly emerging regulatory chains.[21]

It was the revision of the Audiovisual Media Services Directive (AVMSD) in 2018[22] that first took ground-breaking steps towards regulating online content platform services. At the time, European audiences were shocked by how the 2016 United States elections and the United Kingdom Brexit referendum had been influenced by hate speech and dis/misinformation, and policy-makers across Europe were keen to see new regulation of content platform services.[23]

Meanwhile, European audiovisual industries became eager to level the playing field as they competed with US-based tech giants for sources of income and increasingly fragmented audiences.[24] In response, the then newly adopted AVMSD for the first time held Video Sharing Platforms (VSPs) responsible for protecting users and adhering to advertising standards.

One year later, the Directive on Copyright in the Digital Single Market (CDSMD)[25] put an end to the regulatory impunity of internet intermediaries such as Online Content-Sharing Service Providers.[26] The relatively safe harbor they had enjoyed since 2000 under the liability exemptions of the EU E-Commerce Directive[27] for third-party content, which generally freed them from the obligation to monitor such content, thus came to an end. The underlying concepts of the two directives were similar in that both sought greater responsibility from online platforms for illegal and harmful user-generated content. The Commission’s goal was to promote co-regulatory and self-regulatory solutions to level the “playing field for comparable digital services” while expecting “responsible behavior of online platforms to protect core values.”[28]

Since then, the EU has progressively moved forward with legal, regulatory and policy measures aimed to respond to digital threats such as misinformation, the unethical exploitation of information asymmetry through advanced technological capabilities, lack of transparency and accountability, and risks to freedom of expression. It has put much emphasis on developing Europe-wide ethical standards, particularly on tackling disinformation,[29] and offered co-regulatory mechanisms through the Code of Practice on Disinformation (2018 and 2022).[30]

Most recently, the Digital Markets Act (DMA)[31] and the Digital Services Act (DSA)[32] were adopted in 2022, two pieces of legislation designed to counter platforms’ power through “hard law” provisions.

In the following sections, this paper will provide an overview of the EU’s latest and ongoing efforts to modernize media-related laws and regulations in the following areas:

  1. Regulating media content on online platforms
  2. Copyright in the digital era
  3. Safeguarding media freedom and pluralism, and
  4. Data protection and privacy

The analysis is focused on the new areas and issues introduced by these initiatives, which can be seen as part of the EU’s response to “the challenges to human dignity in an algorithmic society.”[33]

5. Modernization of the EU’s media-related legal frameworks

5.1 Regulating media content on online platforms

The EU’s policy objective in its latest media-related laws and regulations is to make online platform service providers take more responsibility for the content they host and their impact on society. The Communication on Online Platforms[34] sets out the policy principles guiding the EU’s actions according to the economic agenda of the Digital Single Market. It emphasizes the need for a level playing field for digital services, ensuring online platforms’ responsible behavior and fostering trust, transparency and fairness. Furthermore, as part of these efforts, the EU has claimed that it is committed to advocating for the application of human rights, including the right to freedom of opinion and expression and the promotion of unhindered, uncensored and non-discriminatory access to online services for all, under international law.[35] Two central pieces of EU legislation regulating online platforms are directly relevant to media content: the Revised AVMSD and the DSA.

5.1.1 The Audiovisual Media Services Directive (AVMSD)

The AVMSD, first adopted in 2010, is the centerpiece of media policy in the EU. Over multiple revisions during the last two decades, the scope of the AVMSD has been extended to cover first on-demand audiovisual services and, most recently, in 2018, Video-Sharing Platforms (VSPs) that disseminate user-generated content.

The Revised AVMSD aimed explicitly at:

(1) creating a level playing field for emerging audiovisual media and providing rules to shape technological developments;

(2) preserving cultural diversity and investments in European content;

(3) protecting users against hatred and children from online harms while regulating online platforms; and

(4) safeguarding media pluralism and guaranteeing the independence of national media regulators.

Regarding VSPs, the focus was on protecting minors against harmful content online, combating hate speech and public provocation to commit terrorist offenses on the internet. According to the new provisions, VSPs must comply with a series of obligations including preventing minors’ exposure to harmful content and ensuring that users are not exposed to unlawful content.[36] The Revised AVMSD introduced an obligation for VSPs to include measures to protect users within their terms of service. Thus, the onus is primarily on the VSP providers to implement rules to achieve the objectives of the AVMSD, under the oversight of the national regulators.[37]

The national media regulators are required to enforce the new obligations. This regulatory oversight should be a “systemic type of regulation,”[38] focused on procedures and processes. Media regulators are not expected to focus on individual content items; their task is only to assess the measures VSPs are taking. That means that the AVMSD does not invest regulators with investigative powers or impose transparency or access requirements on VSPs.

The Revised AVMSD represents a new approach to content regulation, which can be characterized as a systemic approach, under a minimum harmonization regime, with distinct transparency rules and with the active user seen as a regulatory actor.[39]

On the other hand, the Revised AVMSD introduced a new set of rules aimed at safeguarding the independence of media regulators. According to the EU’s guiding principles, regulators are, in theory, presumed to be appropriately insulated against political and commercial influences and can thus best perform their duties in the public interest. Therefore, the EU strives for all public authorities that exercise formal regulatory powers over the media to be protected against interference, particularly of a political or economic nature, including in the appointment of the members of the regulatory authority invested with decision-making power, a process that should be transparent, allow for public input and not be controlled by any particular interest group. Through this newly introduced set of obligations (Article 30), the Revised AVMSD has stepped up its expectations on EU national governments to ensure, through their respective legislation, both the de jure and de facto independence of their national regulator and its accountability towards the public.

5.1.2 The Digital Services Act (DSA)

Another response of the EU to the growing power of online platforms was the adoption of the Digital Services Act (DSA) package, which aimed to “create a safer digital space where the fundamental rights of users are protected and to establish a level playing field for businesses.”[40] The DSA and its sister regulation, the Digital Markets Act (DMA), adopted by the European Parliament in 2022, started to apply incrementally, with full enforcement required by February 17, 2024.

The DSA created a uniform regulatory framework for intermediary service providers,[41] including online platforms, directly applicable across the EU. The DSA does not impose any additional rules specific to media content or its dissemination online, apart from the provision stipulating that what is illegal offline should also be illegal online and thus not be made available. However, the DSA introduces mechanisms to counter the availability of illegal content online, safeguards for online users whose content is removed or restricted by an online intermediary, and wide-ranging transparency requirements applicable to online platforms, including those related to content moderation and recommender systems. In essence, the DSA sets asymmetric obligations on different types of intermediaries depending on the nature of their service, reach, and societal impact. The bigger and more socially significant a service is, the more stringent the obligations it must fulfill.

Consequently, the most prominent players in the market, the so-called very large online platforms (VLOPs) and very large online search engines (VLOSEs), must comply with all the rules, including the most far-reaching ones. By contrast, less consequential service providers (e.g., small and micro enterprises) must only comply with the essential and more general obligations. Notably, the supervision and enforcement of the DSA rules will be shared between the European Commission and the EU national governments, with assistance from a new European Board for Digital Services, whereas VLOPs and VLOSEs will be directly regulated by the Commission.

Even though the DSA is not media-specific legislation, it puts forward several provisions that can affect the relations between media outlets and online platforms.[42] The direct relevance of the DSA to media-related content and services is the requirement on online platforms to take more responsibility regarding illegal information offered on their services. The DSA also lays out a set of key transparency requirements on the terms and conditions and recommender systems employed by platforms to control the availability and findability of content. Furthermore, the DSA establishes enforceable rights for users to challenge platforms, particularly when their content is removed or otherwise restricted. The most media-content relevant provisions of the DSA are the following:[43]

a). Protection for the integrity of news media and journalistic content

According to Article 14 of the DSA, the providers of intermediary services, i.e., online platforms, “shall include information on any restrictions that they impose about the use of their service in respect of the information supplied by the recipients of the service, in their terms and conditions.” Furthermore, they are required to “act in a diligent, objective and proportionate manner in applying and enforcing (such) restrictions,” with due regard to … “the fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms.”

Under this provision, platforms must inform their users, including media outlets using platforms to disseminate news content, about possible restrictions based on their terms of service. While platforms could previously moderate (journalistic) content in a non-transparent manner, once the DSA’s provisions on VLOPs and VLOSEs are introduced, platforms will have to act transparently and in a non-discriminatory manner when applying any content moderation measures, which mainly affect the availability, visibility, and accessibility of content, such as demotion, demonetisation, disabling of access to, or removal thereof. These transparency obligations should alleviate the problem of media content availability online.

b). Risk assessment based on media pluralism objectives

The DSA pursues a so-called risk-based approach towards regulating platforms. In particular, VLOPs and VLOSEs will have to comply with strict risk assessment requirements (Article 34) and risk mitigation measures (Article 35), and provide for independent auditing of compliance (Article 37). Significantly, some of these risks relate specifically to media content. Risk assessments should therefore cover “any actual or foreseeable negative effects for the exercise of fundamental rights”, in particular freedom of expression and information (including freedom and pluralism of the media) (Article 34(1)(b)), but also “any actual or foreseeable negative effects on civic discourse” (Article 34(1)(b) and (c)). VLOPs and VLOSEs are thus expected to identify such risks, carry out risk assessments, and “put in place reasonable, proportionate and effective mitigation measures tailored to the specific systemic risks.” After the complete application of the DSA, the European Commission will monitor and, if necessary, enforce compliance with the new requirements.

c). Platforms’ Positive Obligation on Tackling Disinformation

The DSA’s policy objectives specifically refer to “ensuring a safe, predictable and trusted online environment, addressing the dissemination of illegal content online and the societal risks that the dissemination of disinformation or other content may generate” (Recital 8). Moreover, the DSA emphasizes the risk that “manipulative techniques can negatively impact entire groups and amplify societal harms, for example, by contributing to disinformation campaigns or by discriminating against certain groups” (Recital 69). Therefore, when VLOPs and VLOSEs prepare the risk assessments and mitigation strategies required by the law, they must pay particular attention to how their services disseminate or amplify misleading or deceptive content, including disinformation (Recital 84). This positive obligation will likely prompt online platforms to advance specific risk mitigation measures, such as the prioritization of media content, and to adjust their algorithmic systems to promote media freedom.[44]

5.2 Copyright in the digital era

Copyright is at the heart of freedom of expression, acting as both an enabler of and an obstacle to the enjoyment of that right. The relation between copyright law and freedom of expression remains ambiguous. On the one hand, “copyright law protects the free expression of creators by ensuring that they reap the benefits of their work”, allowing artists “to express themselves without worrying about the potential reproduction of their words, art or music.” On the other hand, “copyright law restricts the form of expression by forbidding the free use of copyrighted materials,”[45] and potentially collides with the right to information. With the advent of digital content production and dissemination, this conflict has highlighted several normative issues of copyright enforcement and has also called into question the fundamental, and to some extent moral, pillars of intellectual property protection in light of digital content abundance. Thus, copyright could become a major obstacle for the media in fulfilling its democratic role in society if legislation did not provide for robust exceptions from, and limitations on, the licensed use of protected works for the benefit of the media and journalistic activities.

EU copyright law, which consists of 13 directives and two regulations harmonizing the essential rights of authors, performers, producers and the media and reflecting the international legal framework,[46] also faces some controversy. It became apparent that EU copyright law does not accommodate the needs of new forms of media players that rely on digital technology to discover, gather and analyze information from online sources and to inform the public via digital channels. Therefore, in 2019, the EU introduced, for the first time in almost 20 years, a new framework for copyright protection in the digital era, specifically covering the liability of online content-sharing service providers (platforms). The new Copyright Directive (CDSMD)[47] is also to be interpreted as a significant element of the rise of digital constitutionalism in the EU[48] and an attempt to counter platforms’ power while ensuring users’ and copyright holders’ rights. Two specific provisions of the CDSMD are relevant for the modernization of media-related legislation, namely the protection of press publications concerning online uses (Article 15) and the rules on the use of protected content by online content-sharing service providers (Article 17).

The newly introduced right for press publishers (ancillary copyright for press publishers; Article 15) aimed to foster plural, independent and quality journalism in the publishers’ competition with online platforms and to “increase their legal certainty, strengthen their bargaining position and have a positive impact on their ability to license content and enforce the rights on their press publications” (Explanatory Memorandum to the CDSMD).

The rationale of the new rules was the power imbalances and the difficulties that press publishers faced when seeking to license the use of their publications and prevent unauthorized uses by online platforms. Thus, the CDSMD presented the newly introduced press publishers’ right as a form of support for a “free and pluralist press” in its function “to ensure quality journalism and citizens’ access to information” (Recital 54) and to allow for better licensing of press content, asserting that the “organisational and financial contribution of publishers in producing press publications needs to be recognised and further encouraged” (Recital 55).

The underlying assumption was that generating additional revenues for the publishers of news content would strengthen their business model, which in turn would safeguard quality journalism and their role in a democratic society. In essence, the new legal provisions ensure that platforms enter into licensing agreements with news publishers setting out the conditions for the publication of their news content. Moreover, the new right ensures that publishers receive compensation when their content is used as short summaries and headlines.

Currently, EU Member States are in the process of implementing and enforcing the new right;[49] hence, it is too early to draw a conclusion as to whether the legislative provision has achieved the stated policy objectives and whether copyright law is a suitable solution for the protection of media freedom and pluralism. During the summer of 2023, Google announced that it had signed agreements with publishers of all sizes, publisher associations and collecting societies, covering over 1,500 publications across 15 countries.[50]

The other provision relevant for the media introduced by the CDSMD is Article 17, known as the “upload-filters clause,” the directive’s most debated new provision. Article 17 stipulates that the dissemination of user-generated content by online content-sharing service providers (OCSSPs, i.e., platforms) is considered an “act of communication to the public or making it available to the public” under EU copyright law. Thus, platforms are legally required to obtain authorization from the rightsholders for such uses of copyrighted works, for instance, by concluding a licensing agreement.

“If no authorisation is granted, online content-sharing service providers shall be liable for unauthorised acts of communication to the public” unless they have “demonstrated that they made best efforts to obtain an authorisation, in accordance with high industry standards of professional diligence, and in any event, acted expeditiously, upon receiving a sufficiently substantiated notice from the rightsholders, to disable access to, or to remove the notified works from their websites, and made best efforts to prevent their future uploads”.[51]

The policy aim of the CDSMD is to oblige online platforms, particularly YouTube, to conclude license agreements with major copyright holders (such as the music industry) and collective rights management organizations (CMOs). However, the key dilemma that emerged is that platforms’ primary liability for user-generated content massively increases their legal risks, forcing them to actively check all content before publication and block the content they consider illegal. In practice, they do that via the use of “upload filters,” which, some experts argue, can be an incentive for online censorship and a possible restriction of the right to information. In other words, it is argued that Article 17 of the CDSMD has created a potential conflict between the obligation of platforms to make their “best efforts” to prevent infringements of exclusive rights and their duty not to harm the freedom of expression and the right to information of users.[52]

To support national governments in implementing and enforcing the new legal provisions, the European Commission provided non-binding Guidance[53] for harmonization purposes. The Guidance states that for platforms to comply with their best-efforts obligation under Article 17(4), they must agree to licenses that are “offered on fair terms” and maintain “a reasonable balance between the parties.” The Guidance also sets a minimum threshold for the obligation on platforms to engage proactively with rightsholders that can be easily identified and located, notably those with broad catalogs (e.g., CMOs). Some scholars have also evaluated the compatibility of upload filters with human rights principles and legal standards, recommending that for Article 17 to be a human rights-compliant response, upload filters must be explicitly targeted at online infringement of copyright on a commercial scale.[54]

5.3 Safeguarding media freedom and pluralism

Media freedom and pluralism have come under attack in recent years all over the world,[55] including in the EU. Media pluralism is usually evaluated through a three-dimensional lens: (1) the plurality of sources for news and information; (2) the accessibility, availability and affordability of the physical infrastructure of communication; and (3) the diversity of perspectives and opinions. In the EU, the first and the third dimensions are most at risk.

The latest Media Pluralism Monitor report, an EU-financed assessment of the state of media pluralism in the EU, found that no European country is risk-free in terms of media pluralism.[56] In its Rule of Law Mechanism,[57] the EU also included a section on media freedom and pluralism where it analyzes media regulatory authorities, transparency of media ownership, government interference and the framework for protecting journalists. Since 2020, with the publication of its first Rule of Law report,[58] the EU has continuously monitored and assessed the rule of law situation, focusing on the justice system, the anti-corruption framework, media pluralism, and other institutional checks and balances. 

In response to the growing concerns about the worsening state of media freedom and pluralism, in September 2022, the EU proposed a new Regulation, known as the European Media Freedom Act (EMFA), now available as a draft.[59] It introduces safeguards against political interference in editorial decisions and against state-mandated surveillance, with a focus on the independence and stable funding of public service media and on the transparency of media ownership and of the allocation of state advertising. The proposed EMFA builds on the revised Audiovisual Media Services Directive (see above), introducing a new set of rules and mechanisms that aim to promote media pluralism and independence across the EU. Its main pillars are:

  • Declaration of the rights of recipients of media services (Article 3) for a plurality of news and current affairs content, produced with respect for editorial freedom
  • Promulgation of the rights of media service providers (Article 4) for effective editorial freedom and the protection of journalistic sources, including solid safeguards against the use of spyware against media, journalists and their families
  • Legislative, EU-wide safeguards for the independent functioning of public service media providers (Article 5) [60]
  • Meaningful provision on enhancing media ownership transparency (Article 6)[61]
  • Additional safeguards and compliance, as well as accountability mechanisms for the independence of national regulators (Article 7)
  • Protection of media content online and specifically on very large online platforms (Article 17) [62]
  • New user right to the customisation of audiovisual media offer (Article 19)[63]
  • New safeguards on the transparent and fair allocation of economic resources (Section 6), including audience measurement (Article 23)[64] and the transparent allocation of state advertising (Article 24).[65]

The other area of concern related to journalism in the EU is the deterioration of journalists’ professional conditions,[66] particularly physical attacks, increasing online harassment, and the rising number of Strategic Lawsuits Against Public Participation (SLAPPs), a particular form of harassment used primarily against journalists and human rights defenders to prevent, inhibit or penalize speaking up on issues of public interest.

The EU has expressed its commitment to “promoting and protecting the freedom of opinion and expression worldwide, condemning the increasing level of intimidation and violence that journalists, media actors and other individuals face in many countries across the world for exercising the right to freedom of opinion and expression online and offline.” The EU called on states to “take active steps to prevent violence and promote a safe environment for journalists and other media actors, enabling them to carry out their work independently, without undue interference or fear of violence or persecution.”[67]

In 2022, the European Commission published a Proposal for a Directive on strategic lawsuits against public participation (SLAPPs).[68] The proposed directive is to provide courts and targets of SLAPPs with the tools to fight back against manifestly unfounded or abusive court proceedings. The proposed safeguards will apply in civil matters with cross-border implications. At the time of writing, the proposed anti-SLAPP Directive was being discussed by the European Parliament and is set to be presented for negotiations between the EU co-legislators.[69]

5.4 Data protection, privacy online and the media

“Freedom of expression and privacy are mutually reinforcing rights – all the more so in the digital age.” … “At the same time, one person’s right to freedom of expression may influence someone else’s right to privacy and vice versa. Digital technologies exacerbate this tension. Whilst they have been central to the facilitation of the exercise of freedom of expression and the sharing of information, digital technologies have also greatly increased the opportunity for violations of the right to privacy on a scale not previously imaginable. In particular, digital technologies present serious challenges to enforcing the right to privacy and related rights because personal information can be collected and made available across borders on an unprecedented scale and at minimal cost for both companies and states. At the same time, the application of data protection laws and other measures to protect the right to privacy can have a disproportionate impact on the legitimate exercise of freedom of expression”.[70]

These inherent tensions between freedom of expression and the right to privacy, including personal data protection, also surface in the EU legislation. The EU has explicitly warned that the right to freedom of expression, the right to privacy and the protection of personal data “may suffer violations as a result of unlawful or arbitrary surveillance, interception of communications or collection of personal data, in particular when carried out on a mass scale.” [71] It expressed its commitment to promote “measures for the protection of the right to privacy and data protection including by calling on and supporting third countries to bring their relevant national legislation regarding transparency and proportionality of government access to personal data in conformity with international human rights law, where applicable.”[72]

Data protection has been recognised as a fundamental right in the EU.[73] With the adoption of Directive 95/46/EC (Data Protection Directive),[74] the EU embarked on regulating the processing of personal data as a response to the challenges that emerged in the internet age, particularly the increase of data usage and processing spurred by digital technologies. The Data Protection Directive provided for the free movement of data within the EU, emphasizing the economic approach of the EU policy while also guaranteeing the fundamental rights of EU citizens. However, since online platforms increasingly rely on automated decision-making technologies to moderate online content and capture users’ attention, their massive use of personal data is a key component of their power.[75] To address this issue, the EU switched to a more proactive approach to the protection of personal data by introducing positive obligations in the General Data Protection Regulation (GDPR),[76] whose main aim was to foster a degree of transparency and accountability in data processing.

The GDPR introduced regulatory requirements for the processing of personal data, including the collection, analysis, storage, and other processing activities. Under the GDPR, organizations must have a legal basis when they process personal data, and they must adhere to specific retention periods, conduct various assessments, facilitate individual rights, maintain a documented record of processing activities, and report data breaches, among other obligations introduced by the regulation. Two provisions in the GDPR are most relevant to the modernisation of media-related legislation: (1) protection of individuals’ rights to privacy in the context of datafication and platformization and (2) reporting by journalists and the media acting as watchdogs of public interest in democratic conditions.

When it comes to individuals’ rights, the GDPR’s focus is on “personal data”, defined as “any information relating to an identified or identifiable natural person (“data subject”), whereby this relation can be direct or indirect.”[77] As long as data relates to a data subject, the GDPR applies automatically and regulates the processing thereof. Media reporting must also protect data subjects, whether they are the subjects of news reporting or otherwise involved in the reporting process, and comply with the GDPR. The controller of the data is the journalist or the media outlet that determines the purposes (e.g., publication) and means (e.g., channel and platform of publication) of processing personal data.[78]

Furthermore, the GDPR strictly regulates the principles of personal data processing, requiring the controller to ensure that the personal data is processed lawfully, fairly and in a transparent manner and collected only for specified, explicit and legitimate purposes (“purpose limitation”); limited to what is necessary for such purposes (“data minimisation”); and kept up to date (“accuracy”).[79]

The most relevant GDPR provision for the media is the right to erasure, better known as the right to be forgotten.[80] In a notable case involving Google Spain,[81] which was decided under the Data Protection Directive but on similar legal grounds as those enshrined in the GDPR, the European Court of Justice considered the demands of accurate and relevant journalistic reporting to be at odds with the rights of individuals, the subjects of news reporting, to the protection of their data. The right to be forgotten turned out to be central to individuals’ control over media publishing and to be in potential conflict with freedom of expression and journalistic freedoms. This right can be invoked by individuals to prevent publication or to have content about themselves (as data subjects) removed in case of a breach of their data protection right.

In the case involving Google Spain, a Spanish citizen requested the tech company Google to remove or conceal certain information about him from the search results, information that was lawfully published in a newspaper several years earlier. The European Court of Justice considered the competing legal claims and decided that the individuals’ rights to privacy and data protection were “overriding” both the economic interest of Google in processing the data as well as the interest of the public in having access to the information since no “preponderant” public interest was demonstrated in the case by the media outlet.

Notably, the European Court of Justice has played a crucial role in setting the EU’s protection standards and enforcing the fundamental rights enshrined in the EU Charter of Fundamental Rights. Google Spain was the first case in which the EU judiciary attempted to counter the power of online platforms.

The perspective of journalists and the media in reporting, acting as watchdogs in a democracy, is most clearly addressed through the journalism-related exemption reconciling the GDPR rules on data processing with freedom of expression and information.[82] The exemption was not a new concept in EU data protection legislation: the Data Protection Directive included a similar provision, “updated” to some extent in the GDPR.

According to the journalistic exemption, national legislators within the EU were obliged to reconcile the right to data protection with freedom of expression and information, mainly when personal data is processed for journalistic purposes, and to provide for necessary and legitimate exceptions. These rules were tailored to situations in which the data controller, the journalist or the media outlet, reasonably assumes that a publication would be in the public interest. Leaving the journalistic exemption to be regulated by national legislators and enforced by national authorities and the judiciary has led to the proliferation of divergent, unaligned national regulatory approaches.[83]

For non-EU countries, it is equally relevant to closely study the application of the GDPR, namely its territorial scope.[84] The GDPR applies to the processing of personal data in the context of the activities of organizations, including media outlets, established in the EU, regardless of whether the processing itself takes place there. At the same time, the GDPR also applies to the processing of personal data by organizations not established in the EU if the media company offers goods and services to data subjects located in the EU or monitors their behavior.

“The consequence of such a rule is twofold. On the one hand, this provision involves jurisdiction. The GDPR’s territorial scope of application overrides the doctrine of establishment developed by CJEU case law since even those entities not established in the EU will be subject to the GDPR. On the other hand, the primary consequence of such an extension of territoriality is to extend EU constitutional values to the global context”.[85]

Conclusions

The recent and ongoing legal developments in EU legislation in the areas of regulating online media content and platforms, copyright in the digital era, safeguarding media freedom and pluralism, and data protection and privacy offer key insights into the various aspects, and even tensions, that non-EU countries should consider when modernizing their media laws (see the Lebanon-focused brief accompanying this report).

With the advent of digitization and platformization, whose impact is felt globally, legislators and regulators have had to keep pace by addressing the challenges these trends pose to fundamental human rights and policy objectives. A decade ago, digitization promised “more freedoms” on all levels of communication. Yet it is now clear that this potential can only be harnessed if it is protected through adequate safeguards.

The rise of modern digital constitutionalism in the EU was one of the major legalistic responses to the growing concerns regarding the technology-driven power amassed by platforms, “transnational corporations operating in the digital environment to perform quasi-public functions on a global scale,” which has been challenging fundamental rights and democratic values.[86]

This article has summarized the key underlying assumptions and considerations of EU legal acts, highlighting the positive impact that these laws and regulations have or can have on freedom of expression and media independence in the digital space and warning about a potential negative impact, which might further limit fundamental freedoms.

Attempts at modernizing media law have to consider the corresponding international standards on freedom of expression against the tenet that what is protected offline should enjoy the same level of protection online. The role of the state, the EU and the national legislators was analyzed in the framework of those standards, which also emphasized the limits of state intervention.

Recommendations for media reform in Lebanon

Since the Lebanese Constitution guarantees freedom of expression in a similar manner as in Europe,[87] the modernization of the law in Lebanon, including the Publication Law and the TV and Radio Broadcasting Law, should adhere to international standards and take stock of the ongoing legal arguments and debates around revamping the legislative framework to make it fit for the digital era.

The DSA can be considered a remarkable piece of legislation regulating online platforms that can potentially advance the interests of media content providers and journalists; however, the DSA is far from solving all the problems of quality media, given the considerable power exerted by platforms in the digital communication space. For Lebanon’s media sector, the DSA’s requirements on platforms’ transparency and accountability are of utmost relevance as they provide a systemic, risk-based regulatory approach towards online intermediaries.

On a different note, the impact of the newly introduced EU legal provisions on the availability of copyrighted content, with fair remuneration of rightsholders on the one hand and the freedom of expression of users of online content-sharing platforms on the other, is not yet known, since implementation in the EU countries is still ongoing. However, Lebanese policymakers and legislators should follow and consider future legal developments in this area.

Copyright is a delicate matter that can affect freedom of expression and the media through its impact on access to and use of information. Copyright erects communication barriers, usually through exclusive rights, whereas exceptions from and limitations to such rights could help strike the right balance between competing claims. That being said, copyright should not be used as a barrier to the activities of journalists related to retrieving information material that they have access to. Such access should be weighed also against the public interest and not exclusively against the proprietary interests of rightsholders. Thus, well-targeted exceptions and limitations are crucial regulatory tools to ensure the free access of the media and users to the public sphere, which should not be endangered in any way by technical protection measures. Copyright is also central to shaping a digital constitutional framework for access to information and, therefore, must be considered a critical instrument of media freedom.

When it comes to personal data protection, the EU regulation is highly relevant to journalistic activities and media reporting. Lebanese stakeholders should strive, among other things, to balance the protection of individuals’ right to privacy with journalistic privileges in the form of exceptions from, and limitations to, that right, so that the public interest is well served.

Finally, the EMFA proposal, expected to be adopted in October 2023,[88] is of utmost relevance for the Lebanese context, mainly because of the process underlying the proposal, including the important issue of activating media pluralism tests within the EU. Assessing the impact of media market concentrations on media pluralism and editorial independence is highly recommended in Lebanon before any steps towards modernizing the law are taken, in order to duly justify any future laws and regulations. Lebanese policymakers should thus follow and study the EMFA’s progress closely and consider the necessity of similar legal safeguards for Lebanese journalists’ safety and protection.

In conclusion, drawing on the EU experience in modernizing media law, including both its advances and shortcomings, Lebanese lawmakers and policymakers should consider the following recommendations:

  1. Regulating online media content and platforms should focus on well-defined areas such as the protection of minors against harmful content online, combating hate speech online, and requirements on platforms for transparency and accountability. In parallel, emphasis should be put on ensuring that national regulators, which oversee the application of the new rules, act as independent, professional and accountable public actors;
  2. Copyright rules for the digital era should carefully balance the protection of rightsholders against the risk of copyright becoming a barrier to freedom of expression;
  3. Safeguarding media freedom and pluralism should take advantage of new legal and regulatory instruments ensuring editorial independence, the transparency of media ownership and the independence of national media regulatory authorities;
  4. Data protection and privacy in the digital era must ensure the protection of individual privacy and balance such legal provisions with appropriate and well-tailored exemptions for journalistic privileges.

References

  1. U.N. Human Rights Council, ‘Resolution L13 – The Promotion, Protection and Enjoyment of Human Rights on the Internet’, 6 July 2012.

  2. General Comment No. 34 concerning Article 19 of the ICCPR adopted on 29 June 2011, by the UN Human Rights Committee, states the following (para 39):

    “States parties should ensure that legislative and administrative frameworks for the regulation of the mass media are consistent with the provisions of paragraph 3. Regulatory systems should take into account the differences between the print and broadcast sectors and the internet, while also noting the manner in which various media converge. … States parties must avoid imposing onerous licensing conditions and fees on the broadcast media, including on community and commercial stations. The criteria for the application of such conditions and licence fees should be reasonable and objective, clear, transparent, non-discriminatory and otherwise in compliance with the Covenant. Licensing regimes for broadcasting via media with limited capacity, such as audiovisual terrestrial and satellite services, should provide for an equitable allocation of access and frequencies between public, commercial and community broadcasters. It is recommended that States parties that have not already done so should establish an independent and public broadcasting licensing authority, with the power to examine broadcasting applications and to grant licenses.”

    Paragraph 40 of the same document also establishes that “The State should not have monopoly control over the media and should promote plurality of the media. Consequently, States parties should take appropriate action, consistent with the Covenant, to prevent undue media dominance or concentration by privately controlled media groups in monopolistic situations that may be harmful to a diversity of sources and views.”

  3. Article 10 ECHR reads as follows: “1. Everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers. This Article shall not prevent states from requiring the licensing of broadcasting, television or cinema enterprises; 2. The exercise of these freedoms, since it carries with it duties and responsibilities, may be subject to such formalities, conditions, restrictions or penalties as are prescribed by law and are necessary in a democratic society, in the interests of national security, territorial integrity or public safety, for the prevention of disorder or crime, for the protection of health or morals, for the protection of the reputation or rights of others, for preventing the disclosure of information received in confidence, or for maintaining the authority and impartiality of the judiciary.”

  4. See CCPR/C/21/Rev.1/Add.9, paras. 11 – 16.

  5. 1. They must be provided for by law, transparent and accessible to everyone (principle of legal certainty, predictability and transparency). 2. They must pursue one of the purposes set out in article 19.3 ICCPR, i.e., to protect the rights or reputations of others; to protect national security, public order or public health or morals (principle of legitimacy). 3. They must be proven necessary, as the least restrictive means required, and commensurate with the purported aim (principles of necessity and proportionality).

  6. Eli Pariser. (2011). The Filter Bubble: What the Internet Is Hiding From You. London: Penguin.

  7. Balázs Bodó, Natali Helberger, Sarah Eskens, & Judith Möller. (2019). Interested in Diversity. Digital Journalism, 7(2), 206–229. DOI: https://doi.org/10.1080/21670811.2018.1521292

  8. Krisztina Rozgonyi. (2020). Disinformation Online: Potential Legal and Regulatory Ramifications to the Right to Free Elections – Policy Position Paper. In F. Loizides, M. Winckler, U. Chatterjee, J. Abdelnour-Nocera, & A. Parmaxi (Eds.), Human Computer Interaction and Emerging Technologies: Adjunct Proceedings from the INTERACT 2019 Workshops. Ubiquity Press, 57–66. DOI: https://doi.org/10.18573/book3

  9. Reuters Institute for the Study of Journalism (RISJ). (2022). Reuters Institute Digital News Report 2022. Oxford: Reuters Institute for the Study of Journalism. https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2022-06/Digital_News-Report_2022.pdf

  10. Reuters Institute for the Study of Journalism (RISJ). (2023). Journalism, Media, and Technology Trends and Predictions. https://reutersinstitute.politics.ox.ac.uk/sites/default/files/2023-01/Journalism_media_and_technology_trends_and_predictions_2023.pdf.

  11. Joan Barata Mir. Freedom of Expression and Privacy on Social Media: the Blurred Line Between the Private and the Public Sphere. 1 August 2023. MediaLaws. https://www.medialaws.eu/freedom-of-expression-and-privacy-on-social-media-the-blurred-line-between-the-private-and-the-public-sphere/.

  12. Articles 2, 6, 21, 49 of TEU and articles 7, 8, 10, 11, 22 of the EU Charter of Fundamental Rights.

  13. EU External Action. (2021). EU guidelines on freedom of expression online and offline. https://www.eeas.europa.eu/sites/default/files/09_hr_guidelines_expression_en.pdf.

  14. Perry Keller. (2011). The Media in European and International Human Rights Law. In P. Keller (Ed.), European and International Media Law: Liberal Democracy, Trade, and the New Media. Oxford: Oxford University Press. DOI: https://doi.org/10.1093/acprof:oso/9780198268550.003.0007.

  15. Keller. (2011). The Media in European…, cit.

  16. Matthias C. Kettemann (2020). The Normative Order of the Internet: A Theory of Rule and Regulation Online. Oxford, New York: Oxford University Press. DOI: https://doi.org/10.1093/oso/9780198865995.001.0001

  17. Giovanni De Gregorio. (2021). The Rise of Digital Constitutionalism in the European Union. International Journal of Constitutional Law, 19(1), 41–70. DOI: https://doi.org/10.1093/icon/moab001.

  18. Dennis Redeker, Lex Gill, & Urs Gasser. (2018). Towards digital constitutionalism? Mapping attempts to craft an Internet Bill of Rights. International Communication Gazette, 80(4), 302-319. DOI: https://doi.org/10.1177/17480485187571

  19. Edoardo Celeste. (2019). Digital Constitutionalism: A New Systematic Theorisation. International Review of Law, Computers & Technology, 33(1), 76–99. DOI: https://doi.org/10.1080/13600869.2019.1562604

  20. De Gregorio, The Rise of Digital Constitutionalism…, cit., p. 67, 70.

  21. Krisztina Rozgonyi. (2018). A New Model for Media Regulation. Intermedia, 46(1), 18–23.

  22. Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) in view of changing market realities, OJ L 303, 28.11.2018, pp. 69–92.

  23. Krisztina Rozgonyi and Sally Broughton Micova. (2021). Editorial. Journal of Digital Media & Policy, 12(3), 337–44. DOI: https://doi.org/10.1386/jdmp_00069_2.

  24. Sally Broughton Micova, Felix Hempel, & Sabine Jacques. (2018). Protecting Europe’s Content Production from US Giants. Journal of Media Law 10(2), 219–43. DOI: https://doi.org/10.1080/17577632.2019.1579296

  25. Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC, OJ L 130, 17.5.2019, p. 92–125.

  26. CDSMD, Article 17.

  27. Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’), OJ L 178, 17.7.2000, p. 1–16.

  28. Commission Recommendation of 1.3.2018 on measures to effectively tackle illegal content online.

  29. Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, Tackling online disinformation: a European Approach, COM/2018/236 final.

  30. See at: https://digital-strategy.ec.europa.eu/en/library/2022-strengthened-code-practice-disinformation.

  31. Regulation (EU) 2022/1925 of the European Parliament and of the Council of 14 September 2022 on contestable and fair markets in the digital sector and amending Directives (EU) 2019/1937 and (EU) 2020/1828 (Digital Markets Act), OJ L 265, 12.10.2022, p. 1–66.

  32. Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act), OJ L 277, 27.10.2022, p. 1–102.

  33. Giovanni De Gregorio. (2022). Digital Constitutionalism in Europe: Reframing Rights and Powers in the Algorithmic Society. Cambridge Studies in European Law and Policy. Cambridge: Cambridge University Press. DOI: https://doi.org/10.1017/9781009071215

  34. Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, Online Platforms and the Digital Single Market Opportunities and Challenges for Europe, COM/2016/0288 final.

  35. EU External Action. EU guidelines on freedom of expression online and offline.

  36. According to Article 28b (1) of the AVMSD, national legislation should introduce rules to hold VSPs responsible for ensuring any VSPs under their jurisdiction put in place appropriate measures to protect: minors from harmful content (which may impair their physical, mental or moral development), access to which shall be restricted; the general public from programs, user-generated videos and audiovisual commercial communications containing incitement to violence or hatred directed against a group of persons or a member of a group based on any of the grounds referred to in Article 21 of the Charter of Fundamental Rights of the European Union; the general public from programs, user-generated videos and audiovisual commercial communications containing content which is a criminal offense under European Union law (for example provocation to commit a terrorist offense or offenses concerning child pornography).

  37. Art. 28b (3) contains the set of measures on VSPs that each EU country has to introduce in the national legislation: (a) including and applying in the terms and conditions of the VSP services the requirements for protections; (b) including and applying in the terms and conditions of the VSP services the requirements for audiovisual commercial communications that are not managed by the VSP providers; (c) having functionality for users to indicate whether videos contain commercial communication; (d) establishing and operating transparent and user-friendly mechanisms for users of a VSP to report or flag to the VSP provider the content falling within one of the protected areas described in the previous section; (e) establishing and operating systems through which VSP providers explain to users what effect has been given to the reporting and flagging referred to in point (d); (f) establishing and operating age verification systems for users concerning content which may impair the development of minors; (g) establishing and operating easy-to-use systems allowing users to rate the content; (h) providing for parental control systems that are under the control of the end-user concerning content which may impair the development of minors; (i) establishing and operating transparent, easy-to-use and effective procedures for the handling and resolution of users’ complaints to the video-sharing platform provider about the implementation of the measures referred to in points (d) to (h); (j) providing effective media literacy measures and tools and raising users’ awareness of those measures and tools.

  38. Lubos Kuklis. (2019). Video-Sharing Platforms in AVMSD – A New Kind of Content Regulation. In Research Handbook on EU Media Law and Policy (forthcoming). Cheltenham: Edward Elgar Publishing. DOI: https://doi.org/10.2139/ssrn.3527512.

  39. Kuklis. (2019). Video-Sharing Platforms…, cit.

  40. See the DSA text at https://digital-strategy.ec.europa.eu/en/policies/digital-services-act-package.

  41. According to Article 3 (g) of the DSA, the definition of an “intermediary service” refers to “mere conduit”, “caching” and “hosting” services, which include the provision of services on online platforms, such as social media and similar services.

  42. See the EBU (2023) Digital Services Act – A Handbook for Public Service Media.

  43. Based on ‘The Digital Services Act and the Implications for News Media and Journalistic Content (Part 1)’ by the DSA Observatory at: https://dsa-observatory.eu/2022/09/29/digital-services-act-implications-for-news-media-journalistic-content-part-1/.

  44. The Digital Services Act…, cit.

  45. Alexandra Couto. (2008). Copyright and Freedom of Expression: A Philosophical Map. In A. Gosseries, A. Marciano, & A. Strowel (Eds.), Intellectual Property and Theories of Justice. London: Palgrave Macmillan UK, 160–87. DOI: https://doi.org/10.1057/978-0-230-58239-2_9.

  46. Many of the EU directives reflect Member States’ obligations under the Berne Convention and the Rome Convention, as well as the obligations of the EU and its Member States under the World Trade Organisation ‘TRIPS’ Agreement and the two 1996 World Intellectual Property Organisation (WIPO) Internet Treaties (the WIPO Copyright Treaty and the WIPO Performances and Phonograms Treaty). In recent years, the EU has signed two other WIPO Treaties: the Beijing Treaty on the Protection of Audiovisual Performances and the Marrakesh Treaty to Facilitate Access to Published Works for Persons who are Blind, Visually Impaired or otherwise Print Disabled.

  47. Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC.

  48. De Gregorio, The Rise of Digital Constitutionalism in the European Union, cit.

  49. See, in Italy, Resolution no. 3/23/CONS of 19 January 2023 of the Italian Communications Authority (AGCOM), which approved the Regulation on fair compensation as a first step in protecting copyright and related rights in the digital single market as envisaged by Directive 2019/790 (in particular Article 15).

  50. Google. (2023). Google licenses content from news publishers under the EU Copyright Directive. Author: Sulina Connal. https://blog.google/around-the-globe/google-europe/google-licenses-content-from-news-publishers-under-the-eu-copyright-directive/

  51. Article 17 (4) CDSMD.

  52. Article 17(7) CDSMD.

  53. Communication from the Commission to the European Parliament and the Council Guidance on Article 17 of Directive 2019/790 on Copyright in the Digital Single Market, COM(2021) 288 final. The Guidance is a 27-page document that is divided into seven sections: Introduction (I); a specific copyright authorization and liability regime (II); Service providers covered (III); art. 17(1) and (2) authorizations (IV); art. 17(4) specific liability mechanisms (V); safeguards for legitimate uses of content and complaint and redress mechanisms (VI); and transparency and information obligations (VII).

  54. Felipe Romero Moreno. (2020). “Upload Filters” and Human Rights: Implementing Article 17 of the Directive on Copyright in the Digital Single Market. International Review of Law, Computers & Technology 34(2), 153–82. DOI: https://doi.org/10.1080/13600869.2020.1733760

  55. UNESCO. (2022). World Trends in Freedom of Expression and Media Development: Global Report 2021/2022. https://www.unesco.org/reports/world-media-trends/2021/en

  56. European University Institute, Konrad Bleyer-Simon, Elda Brogi, Roberta Carlini, Iva Nenadić, et al. (2023). Monitoring media pluralism in the digital era: application of the Media Pluralism Monitor in the European Union, Albania, Montenegro, the Republic of North Macedonia, Serbia and Turkey in the year 2022. Centre for Media Pluralism and Media Freedom, European University Institute. DOI: https://doi.org/10.2870/087286

  57. See more at http://tinyurl.com/5yj33nme.

  58. See more at https://ec.europa.eu/info/publications/2020-rule-law-report-communication-and-country-chapters_en.

  59. Proposal for a Regulation of the European Parliament and of the Council establishing a common framework for media services in the internal market (European Media Freedom Act) and amending Directive 2010/13/EU, COM/2022/457 final.

  60. Where public service media exist, their funding should be adequate and stable to ensure editorial independence. The head and the governing board of public service media will have to be appointed in a transparent, open and non-discriminatory manner. Public service media providers shall provide a plurality of information and opinions, in an impartial manner, in accordance with their public service mission.

  61. Media service providers will have to ensure transparency of ownership by publicly disclosing such information and take measures with a view to guaranteeing the independence of individual editorial decisions.

  62. Building on the DSA, the EMFA put forward additional safeguards against the unjustified removal of media content produced according to professional standards. In cases not involving systemic risks such as disinformation, very large online platforms that intend to take down certain legal media content considered to be contrary to the platform’s policies will have to inform the media service providers about the reasons before such takedown takes effect. Any complaints lodged by media service providers will have to be processed with priority by those platforms.

  63. The EMFA proposes to introduce a right of customisation of the media offer on devices and interfaces, such as connected TVs, enabling users to change the default settings to reflect their own preferences.

  64. Audience measurement systems and methodologies shall comply with principles of transparency, impartiality, inclusiveness, proportionality, non-discrimination and verifiability.

  65. The EMFA is to establish new requirements for the allocation of state advertising to media so that it is transparent and non-discriminatory.

  66. CMPF, Media Pluralism Monitor (MPM), 2023, cit.

  67. EU External Action, EU guidelines on freedom of expression online and offline.

  68. Proposal for a Directive of the European Parliament and of the Council on protecting persons who engage in public participation from manifestly unfounded or abusive court proceedings (“Strategic lawsuits against public participation”); COM/2022/177 final.

  69. See the EU Legislation in Progress at http://tinyurl.com/ue3dctc6.

  70. Article 19. (2017). The Global Principles on Protection of Freedom of Expression and Privacy. https://www.article19.org/resources/the-global-principles-on-protection-of-freedom-of-expression-and-privacy/

  71. EU External Action, EU guidelines on freedom of expression online and offline.

  72. EU External Action, EU guidelines…, cit.

  73. Article 8, Protection of personal data of the EU Charter of Fundamental Rights.

  74. Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data OJ L 281, 23.11.1995, p. 31–50.

  75. Damian Tambini and Martin Moore. (2018). Digital Dominance: The Power of Google, Amazon, Facebook, and Apple. New York, NY, USA: Oxford University Press.

  76. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (Text with EEA relevance) OJ L 119, 4.5.2016, p. 1–88.

  77. GDPR, Article 4(1).

  78. GDPR, Article 4(7).

  79. GDPR, Article 5.

  80. GDPR, Article 17.

  81. Case C-131/12, Google Spain SL and Google Inc. v. Agencia Española de Protección de Datos (AEPD) and Mario Costeja González, ECLI:EU:C:2014:317 (May 13, 2014).

  82. GDPR, Article 85.

  83. Natalija Bitiukova. The GDPR’s Journalistic Exemption and its Side Effects: GDPR anniversary – what does it mean for the media?. June 16, 2023, VerfBlog, https://verfassungsblog.de/the-gdprs-journalistic-exemption-and-its-side-effects/. DOI: 10.17176/20230616-111120-0.

  84. GDPR, Article 3.

  85. De Gregorio, The Rise of Digital Constitutionalism in the European Union, cit.

  86. De Gregorio, The Rise of Digital Constitutionalism in the European Union, cit.

  87. Article 13 of the Lebanese Constitution stipulates that “The freedom of opinion, expression through speech and writing, the freedom of the press, the freedom of assembly, and the freedom of association, are all guaranteed within the scope of the law.”

  88. See the Legislative Train Schedule at https://www.europarl.europa.eu/legislative-train/theme-a-new-push-for-european-democracy/file-european-media-freedom-act.

Author

Dr Krisztina Rozgonyi is Senior Scientist at the Institute for Comparative Media and Communication Studies (CMC) of the Austrian Academy of Sciences (ÖAW) and a senior international media, telecommunication and IP legal and policy expert. She works with international and European organizations (such as the ITU/UN, UNESCO, Council of Europe, European Commission, World Bank InfoDev, OSCE and BBC MA), with national governments, and regulators as an adviser on media freedom, spectrum policy and digital platform governance.

Editors

Marius Dragomir and Judit Szakács

Published by

Media and Journalism Research Center (MJRC)

MJRC is an independent media research and policy think tank that seeks to improve the quality of media policymaking and the state of independent media and journalism through research, knowledge sharing and financial support. The center’s main areas of research are regulation and policy, media ownership and funding, and the links between tech companies, politics and journalism.

Maharat Foundation

Maharat Foundation is a women-led freedom of expression organization based in Beirut dedicated to campaigns grounded in research and strengthening connections between journalists, academics, and policy makers.

It advances and enables freedom of expression and quality information debate, and advocates for information integrity online and offline. Maharat promotes innovation and engages the journalistic community and change agents within Lebanon and the wider MENA region to promote inclusive narratives and debates and to counter misinformation, disinformation, and harmful content.

Project brief

This publication is within the project entitled “Media Reform to Enhance Freedom of Expression in Lebanon”, implemented by Maharat Foundation, Legal Agenda and the Media and Journalism Research Center (MJRC) with the support of the Delegation of the European Union to Lebanon.

The project aims at enhancing Freedom of Expression in Lebanon through the promotion of media law reform as a priority on the national agenda and the improvement of the environment for media coverage of the transparency and accountability of the election process.

The project supports the publication of background papers produced by Maharat Foundation on the local Lebanese context and by MJRC on European standards and best-fit recommendations for Lebanon.

The papers cover 6 main themes: Protection of journalists and their sources, Associations of journalists, Decriminalization, Incentives, Innovation, and Regulation, co-regulation and self-regulation opportunities for the media.

See more on our project page.

Disclaimer

This publication was funded by the European Union. Its contents are the sole responsibility of Media and Journalism Research Center (MJRC) and do not necessarily reflect the views of the European Union.

Cite this article

Krisztina Rozgonyi. (2024). How to Modernize Media Laws to Cope With Digital Change. Tallinn/London/Santiago de Compostela: Media and Journalism Research Center (MJRC).
