Discuto
0 days remaining (ends 27 Apr)
Description
Less than a year ago, the European Commission announced its plan for the Digital Services Act. This is a legislative tool, either a Directive or a Regulation, that will recast the current E-Commerce Directive. The E-Commerce Directive dates back to 2000 and is the framework on which Member States have based the adjustment of their national laws as regards the function, responsibilities and rights of “information society services” in the internal market. More specifically, it addresses issues such as the “country of origin” of information society services, the tackling of “unsolicited commercial communication”, electronic contracts, the limited liability of online intermediaries and the restriction of any general obligation to monitor the information transmitted or stored by them.
Ahead of this legislative proposal, which is expected by the end of this year, three Committees in the European Parliament have been assigned the drafting of own-initiative Reports, which will serve as a first opinion of the Parliament and a valuable guide for the Commission’s proposal. These Reports are “Digital Services Act: Improving the functioning of the Single Market” in the Internal Market and Consumer Protection Committee (IMCO), “Digital Services Act and fundamental rights issues posed” in the Civil Liberties, Justice and Home Affairs Committee (LIBE) and “Digital Services Act: adapting commercial and civil law rules for commercial entities operating online” in the Legal Affairs Committee (JURI). On top of these Reports, the other relevant Committees of the Parliament are assigned Opinions on them, so that the final text responds to the needs of EU citizens, taking all the relevant angles into account.
In this process, I have been appointed as Rapporteur in JURI for an Opinion on the Report of IMCO, which is the subject of this consultation. Apart from that, I am Shadow Rapporteur for the Report in JURI, the Report in LIBE, and the LIBE Opinion on IMCO’s Report.
The Opinion in JURI, for which I am the Rapporteur, focuses solely on the Report prepared in IMCO. This means that, taking into account the competences of the JURI Committee, the Opinion should provide useful and efficient proposals that should be taken into account by the IMCO Rapporteur. With the above in mind, I have addressed very specific points in my draft Opinion. Also, due to the limit set by the administration on the length of the first draft, the text cannot be extended at the moment; however, your input is valuable and could be taken into consideration in the future, when I consider the addition of Amendments.
As you will see, the Opinion is structured as follows:
- Right to privacy online and anonymous use of digital services: requests anonymity online where technically possible and reasonable; privacy and data protection laws should be respected
- General monitoring and automated tools: asks for a ban on any general monitoring obligation and on mandatory upload-filters
- Contract terms and conditions: demands that terms and conditions be fair and comply with fundamental rights standards; otherwise those terms shall be non-binding
- Addressing illegal content: sets out a “notice and action” procedure based on the duty of care of the intermediaries. Keeps the current “limited liability” regime, under which intermediaries are deemed liable and can be requested to remove content only when they have actual knowledge of the illegal activity. Any “notice and action” procedure shall remain clear, proportionate and respectful of fundamental rights, especially the freedom of expression and the right to privacy.
- Addressing the spread of unwanted content: clarifies the difference between “illegal” and “harmful” content and calls for alternative methods to tackle what intermediaries would deem “harmful”. Platforms shall not act as the “internet police”, and content shall be removed on the basis of existing laws and judicial orders in order to respect the rights of both users and businesses online.
- Interconnectivity of platforms: calls for giving users the opportunity to choose which platform to use while still staying connected with users who decide to use different platforms, which could be achieved via API access and cross-platform interaction.
As mentioned above, the aim is to have all Reports adopted in plenary by September 2020. The European Commission will evaluate these Reports and will issue its legislative proposal by December 2020. After that, the European Parliament and the Council will negotiate their positions towards a final legislative text to be adopted and implemented across the EU. This means that this is only the beginning of one of the most significant “digital” files of this mandate. We would therefore like to invite you to be part of this fruitful discussion through your feedback on my draft. My team and I remain at your disposal for any further questions or concerns.
LATEST COMMENTS
- As much as we support the content provider’s right to a counter-notice, there are reasonable exceptions in which it might be legitimate to refrain from informing the provider about blocking/removing their content. After the sentence ‘that the content provider shall be heard before disabling access to their content’, it could thus be considered to add something along these lines: ‘unless this would risk impeding criminal investigations in exceptional cases (e.g., the sharing of child sexual abuse material)’.
- Given the ongoing development of automated technologies, and their already established application in some areas (e.g., recognising child sexual abuse material), we would suggest the wording ‘automated tools struggle to differentiate’ instead of ‘are unable to differentiate’ in the first sentence, to make the paragraph more future-proof. In addition, we support the comment of Access Now.
P2
The Committee on Legal Affairs calls on the Committee on the Internal Market and Consumer Protection to incorporate the following suggestions:
P3
A. Whereas the rules enshrined in Directive 2000/31/EC on electronic commerce have allowed for the development of the Internet and of digital services in the EU for two decades, and are key to protecting fundamental rights as well as to safeguarding an innovative business environment; whereas their revision should not be envisaged without thorough scrutiny and utmost caution.
P5
- Stresses that wherever it is technically possible and reasonable, intermediaries shall be required to enable the anonymous use of, and payment for, their services, as anonymity effectively prevents unauthorized disclosure, identity theft and other forms of abuse of personal data collected online; where the Directive on Consumer Rights requires commercial traders to communicate their identity, providers of major marketplaces could be obliged to verify their identity, while in other cases the right to use digital services anonymously shall be upheld.
P7
- Notes that since the online activities of an individual allow for deep insights into their personality and make it possible to manipulate them, the collection and use of personal data concerning the use of digital services shall be subject to a specific privacy framework and limited to the extent necessary to provide and bill for the use of the service. Public authorities shall be given access to Internet subscriber data and metadata only to investigate suspects of serious crime, with prior judicial authorisation.
P9
- Reiterates that hosting service providers or other technical intermediaries shall not be obliged to generally monitor user-generated content.
P10
- Notes that automated tools are unable to differentiate illegal content from content that is legal in a given context; highlights that a review of automated reports by service providers, their staff or their contractors does not solve this problem, as private staff lack independence, qualification and accountability; therefore stresses that the Digital Services Act shall explicitly prohibit any obligation on hosting service providers or other technical intermediaries to use automated tools for content moderation; content moderation procedures used by providers shall not lead to any ex-ante control measures or upload-filtering of content;
P12
- Underlines that the fairness, and compliance with fundamental rights standards, of terms and conditions imposed by intermediaries on the users of their services shall be subject to judicial review. Terms and conditions unduly restricting users’ fundamental rights, such as the right to privacy and to freedom of expression, shall not be binding.
P14
- Highlights that, in order to constructively build upon the rules of the e-Commerce Directive and to ensure legal certainty, the Digital Services Act shall exhaustively and explicitly spell out the obligations of digital service providers rather than imposing a general duty of care; highlights that the existing legal regime for digital providers’ liability should not depend on uncertain notions such as the ‘active’ or ‘passive’ role of providers.
P15
- Stresses that the responsibility for enforcing the law, deciding on the legality of speech online and ordering hosting service providers to remove or disable access to content as soon as possible shall rest with independent and democratically accountable public authorities; only a hosting service provider that has actual knowledge of illegal content and its illegal nature shall be subject to content removal obligations.
P17
- Suggests that major commercial hosting service providers should provide a publicly and anonymously accessible notice and action mechanism for reporting allegedly illegal content published on their platform; that notices should be examined by qualified staff based on clear criteria and within reasonable time-frames; that the content provider shall be heard before access to their content is disabled; and that adequate redress mechanisms, both via dispute settlement bodies and judicial authorities, should be made available; highlights that persons who systematically and repeatedly submit wrongful or abusive notices shall be sanctioned; underscores that smaller commercial and non-commercial providers shall not be subject to these obligations.
P18
- Calls on the Commission to consider obliging major hosting service providers to report serious crime to the competent law enforcement authority, upon obtaining actual knowledge or awareness of such a crime.
P19
- Stresses that proportionate sanctions should be applied to violations of criminal and civil law, and that such sanctions shall not include excluding individuals from digital services.
P20
- Highlights that, in order to protect freedom of speech standards, to avoid conflicts of laws, to avert unjustified and ineffective geo-blocking and to aim for a harmonised digital single market, hosting service providers shall not be required to remove or disable access to information that is legal in their country of origin.