Description
Less than a year ago, the European Commission announced its plan for the Digital Services Act, a legislative instrument, either a Directive or a Regulation, that will recast the current E-Commerce Directive. The E-Commerce Directive dates back to 2000 and is the framework on the basis of which Member States have adapted their national laws on the functioning, responsibilities and rights of “information society services” in the internal market. More specifically, it addresses issues such as the “country of origin” principle for information society services, the tackling of “unsolicited commercial communication”, electronic contracts, the limited liability of online intermediaries and the prohibition of any general obligation on them to monitor the information they transmit or store.
Ahead of this legislative proposal, which is expected by the end of this year, three Committees in the European Parliament have been tasked with drafting own-initiative Reports that will serve as the Parliament’s first opinion and a valuable guide for the Commission’s proposal. These Reports are “Digital Services Act: Improving the functioning of the Single Market” in the Internal Market and Consumer Protection Committee (IMCO), “Digital Services Act and fundamental rights issues posed” in the Civil Liberties, Justice and Home Affairs Committee (LIBE), and “Digital Services Act: adapting commercial and civil law rules for commercial entities operating online” in the Legal Affairs Committee (JURI). On top of these Reports, the other relevant Committees of the Parliament have been asked to draft Opinions on them, so that the final text responds to the needs of EU citizens and takes all relevant angles into account.
In this process, I have been appointed Rapporteur in JURI for the Opinion on the IMCO Report, which is the subject of this consultation. In addition, I am Shadow Rapporteur for the Report in JURI, the Report in LIBE and the LIBE Opinion on the IMCO Report.
The Opinion in JURI, for which I am the Rapporteur, focuses solely on the Report prepared in IMCO. This means that, taking into account the competences of the JURI Committee, the Opinion should provide useful and efficient proposals to be taken into account by the IMCO Rapporteur. With this in mind, I have addressed very specific points in my draft Opinion. Also, due to the length limit set by the administration for the first draft, the text cannot be extended at the moment; however, your input is valuable and could be taken into consideration later, when I consider tabling Amendments.
As you can see, the Opinion is structured as follows:
- Right to privacy online and anonymous use of digital services: requests anonymity online where technically possible and reasonable; privacy and data protection laws should be respected.
- General monitoring and automated tools: asks for a ban on any general monitoring obligation and on mandatory upload filters.
- Contract terms and conditions: demands that terms and conditions be fair and comply with fundamental rights standards; otherwise those terms shall not be binding.
- Addressing illegal content: sets out a “notice and action” procedure based on the duty of care of intermediaries. It keeps the current “limited liability” regime, under which intermediaries are deemed liable and can be requested to remove content only when they have actual knowledge of the illegal activity. Any “notice and action” procedure shall remain clear and proportionate and shall respect fundamental rights, especially the freedom of expression and the right to privacy.
- Addressing the spread of unwanted content: clarifies the difference between “illegal” and “harmful” content and calls for alternative methods to tackle what intermediaries would deem “harmful”. Platforms shall not act as the “internet police”, and content shall be removed on the basis of existing laws and judicial orders, in order to respect the rights of both users and businesses online.
- Interconnectivity of platforms: calls for giving users the opportunity to choose which platform to use while still staying connected with users who decide to use different platforms, which could be achieved via API access enabling cross-platform interaction.
As mentioned above, the aim is to have all Reports adopted in plenary by September 2020. The European Commission will evaluate these Reports and will issue its legislative proposal by December 2020. After that, the European Parliament and the Council will establish their positions on a final legislative text to be adopted and implemented across the EU. This means that this is only the beginning of one of the most significant “digital” files of this mandate. We would therefore like to invite you to be part of this fruitful discussion by giving your feedback on my draft. My team and I remain at your disposal for any further questions or concerns.
LATEST COMMENTS
-
As much as we support the content provider's right to a counter-notice, there are reasonable exceptions in which it might be legitimate to refrain from informing the provider about blocking/removing their content. After the sentence 'that the content provider shall be heard before disabling access to their content', one could thus consider adding something along these lines: 'unless it would risk impeding criminal investigations in exceptional cases (e.g., sharing child sexual abuse material)'.
-
Given the ongoing development of automated technologies, and their already established application in some areas (e.g., recognizing child sexual abuse material), we would suggest the wording 'automated tools struggle to differentiate' instead of 'are unable to differentiate' in the first sentence, to make the paragraph more future-proof. In addition, we support the comment of Access Now.
-
We recommend adding "this is without prejudice to the consumer contracts derogation to the country of origin principle, which shall be maintained." at the end of this paragraph. In line with the Rome I Regulation, the consumer contracts derogation to the country of origin principle must be preserved (cf. Article 3 and the Annex of the current eCommerce Directive). The country of origin principle covers the Directive’s information obligations but does not apply to contractual obligations concerning consumer contracts. For the latter, the consumer’s home country law prevails if goods, services or digital content are targeted to that country and the consumer protection level is higher. Any push to expand the scope of the Internal market clause to e.g. consumer protection/contracts would be an explosive paradigm change of EU consumer law. While consumer law has been more harmonised since 2000, EU law still leaves margin of manoeuvre to Member States to act so countries can address national, regional and local issues.
-
It is important not to narrow the scope of cooperation between providers and competent authorities to criminal activities. From a consumer perspective, for example, it’s of utmost relevance that online marketplaces (which are a type of hosting services) collaborate with market surveillance, consumer protection and customs authorities with regard to illegal products sold or promoted online. Therefore, we recommend replacing "serious crime" with "illegal activities" and "crime" with "illegality". Similarly, we recommend deleting "law" from "law enforcement authorities" to cover other competent authorities.
-
- Protecting consumers against illegal activities should not be dependent on the size of a company. Another question is whether enforcement could focus on major players and whether more liability or obligations could be imposed on players with a dominant influence on the value chain. Therefore, we recommend deleting "major" and the last sentence of this paragraph.
- We recommend adding "in complex areas of law" right after "should be examined by qualified staff based on clear criteria, that...". See BEUC’s position on notice and action: https://www.beuc.eu/publications/2012-00543-01-e.pdf
-
- Changing “speech” to “activities” is important, as the scope of the eCommerce Directive covers EU law areas that go well beyond “speech”, such as product safety or consumer protection rules. See also the wording of Article 14(1)(a) of the eCommerce Directive.
- Of course, it is not up to hosting service providers to define what is legal or not, but hosting service providers must comply with legal obligations to tackle alleged illegal activities. Hence why we suggest adding "primarily" before "rest".
- We suggest eliminating "only". This is because hosting service providers are not the only category of online intermediaries set forth in the eCommerce Directive. Even within the category of “hosting service providers”,
- “Or awareness” should be added after "knowledge", in line with Article 14 of the eCommerce Directive.
- In line with research by the European Law Institute, BEUC asks for a special liability regime for those digital services that facilitate the conclusion of contracts, as per the definition of online marketplaces set forth in the Omnibus Directive. Online marketplaces are not mere “hosting providers”. Therefore, we recommend adding the following: "without prejudice to the exercise of a right to redress towards the party at fault, online marketplaces must be liable for damages, contract performance and guarantees, and consumers must be able to exercise the same rights and remedies available against the supplier or producer for failure to inform about the supplier of the goods or services; for providing misleading information, guarantees, or statements; or where the platform has a predominant influence over suppliers. Online marketplaces must also be liable upon obtaining credible evidence of illegal activities, without incurring a general duty to monitor the activity of platform users."
-
Would recommend adding "the law" after "compliance with" and considering "not be applicable and shall be deleted if they are deemed illegal or unfair clauses" instead of "not be binding". EU law ensures possibilities to condemn companies, including online platforms, for unfair or illegal contractual terms. See, for example, our French member organisation UFC-Que Choisir’s actions against Google, Twitter and Facebook after several years of court proceedings:
• Twitter was obliged to delete 250 unfair or illegal clauses from its terms and conditions: https://www.quechoisir.org/action-ufc-que-choisir-reseaux-sociaux-et-clauses-abusives-l-ufc-que-choisir-obtient-la-suppression-de-centaines-de-clauses-des-conditions-d-utilisation-de-twitter-n57621/
• The court ruled that Google resorted to 209 unfair/illegal contract clauses, including on the use of location data: https://www.quechoisir.org/action-ufc-que-choisir-donnees-personnelles-l-ufc-que-choisir-obtient-la-condamnation-de-google-n63567/
• The court ruled Facebook had 430 unfair or illegal contractual clauses: https://www.quechoisir.org/action-ufc-que-choisir-donnees-personnelles-l-ufc-que-choisir-obtient-la-condamnation-de-facebook-n65523/
-
It is important not to rule out ex-ante measures to reduce the scale of illegal activities online. For example, it is important to ensure more transparency and accountability of advertisements. Prior checks can avoid, or at least reduce, illegal ads (e.g. promoting unsafe products) circulating on the web. Therefore, we suggest deleting "any ex-ante control measures or".
-
- It is unclear what is meant by “a specific privacy framework”. There are existing rules to help address this problem, notably the General Data Protection Regulation (GDPR). We recommend changing "specific privacy framework" to "the application and enforcement of privacy and data protection legislation".
- Data access does not only happen in the context of criminal matters. Companies should also share relevant data (e.g. when illegal activity is flagged to them) with competent authorities. Sharing data with researchers also helps to bring more accountability. See, for example:
• European Data Protection Supervisor, A Preliminary Opinion on data protection and scientific research, https://edps.europa.eu/sites/edp/files/publication/20-01-06_opinion_research_en.pdf
• Ben Wagner, Krisztina Rozgonyi, et al., Regulating transparency?: Facebook, Twitter and the German Network Enforcement Act, https://dl.acm.org/doi/abs/10.1145/3351095.3372856
Therefore, we suggest adding: "The Digital Services Act should ensure platforms share relevant data with competent authorities and independent researchers, redacting and anonymising personal data, as appropriate."
-
The Digital Services Act should establish a robust business user authentication obligation and a service and product verification process obligation, while preserving the right to anonymity of consumers. The anonymity of users should not be confused with anonymity of professional traders or corporations, particularly when aiming to sell or promote products online. As mentioned during the description of the Opinion, “privacy and data protection laws should be respected”, but these do not extend to the anonymity of legal persons. To reduce the scale of illegal activities, bring more accountability and better redress, platforms should enhance legal checks before including suppliers on their sites and applications. This requirement should complement Article 5 of the current eCommerce Directive and Article 5 of the Consumer Rights Directive, ensuring that service providers, sellers, manufacturers and products are identifiable. An example of why these requirements are important: it only took our member organisation Which? a few minutes to list an unsafe car seat on Amazon MarketPlace. This is despite Which? flagging the sale of killer child car seats to the platform in 2014, in 2017 and in 2019, followed by a BBC investigation in 2020; and despite Amazon only allegedly allowing approved sellers to list child car seats on its Marketplace. Which? managed to do this testing by duplicating the listing of another seller. This means that a business user authentication and a product or service verification process to list them needs to be robust. Which? staff was not asked for any proof of compliance at any stage, and their illegal car seat listing stayed live for two weeks until Which? itself chose to remove it in view of Amazon’s lack of action.
-
I am not very acquainted with this subject and am mostly a concerned citizen passing by. As long as the social network decides what to show at all (i.e. curates content), the problem of them manipulating the order in which it is shown seems irrelevant to me. So for the chronological order provision to be useful, we would need not only that curation requires consent, but that it requires "meaningful consent". In this case, by "meaningful consent" I mean that the service cannot depend on consent to curation, i.e. no "take it or leave it" schemes.
-
Single sign-on is another instrument that can be used by the big platforms to track users in all their online activities, and it is yet another walled garden - you can only pick among Google, Facebook or Apple. There are technical efforts to define open, federated standards for Internet-wide single sign-on, allowing users to pick any provider or even self-host their identity, but they will never be supported by websites because there are no users, and users will not use them because they are not supported. So it would be important to require that, whenever the closed single sign-on systems by some of the big players are supported, at least one open and federated identity system must be supported as well, having a regulatory agency pick one of the open, non-proprietary frameworks (e.g. OpenID Connect + ID4me).
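For illustration, here is a minimal sketch of what such an open, federated standard looks like in practice: under OpenID Connect, any relying party can construct the same standard authorization request against whichever provider the user picks, including a self-hosted one. The issuer, client ID and redirect URI below are hypothetical placeholders, not real services.

```python
# Minimal sketch: a standard OpenID Connect authorization request.
# Any provider implementing the open specification publishes its endpoints
# in a discovery document at /.well-known/openid-configuration.
import secrets
from urllib.parse import urlencode

# Hypothetical, possibly self-hosted, identity provider chosen by the user.
issuer = "https://id.example.org"
discovery_url = f"{issuer}/.well-known/openid-configuration"
authorization_endpoint = f"{issuer}/authorize"  # normally read from discovery

params = {
    "response_type": "code",                    # authorization-code flow
    "client_id": "example-relying-party",       # placeholder client
    "redirect_uri": "https://rp.example.net/callback",
    "scope": "openid profile email",
    "state": secrets.token_urlsafe(16),         # CSRF protection
    "nonce": secrets.token_urlsafe(16),         # replay protection
}

login_url = f"{authorization_endpoint}?{urlencode(params)}"
print("Discovery document:", discovery_url)
print("Login URL:", login_url)
```

Because the request format is the same for every conforming provider, a website that supports the open standard can, in principle, accept any conforming provider rather than hard-coding Google, Facebook or Apple.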
-
We would recommend removing 'awareness' of criminality as such awareness could be obtained through companies' automated systems. This could have the perverse effect of disincentivising them to invest in automated tools lest they are fixed with knowledge of illegality and are therefore held liable for it or are found in breach of their reporting obligations in circumstances where the content was flagged and removed without a human in the loop.
-
This paragraph seems to suggest that platforms would be required to decide the legality of content in the first instance, which, in our view, would be undesirable. It is unclear what would happen if a notice is adequately filled out but the content being notified is borderline, so that it may well be lawful. What would be the consequences for the intermediary if they do not remove the content? We have set out the kinds of notice and action procedures we think would be appropriate, depending on the content at issue, in the following document: https://www.article19.org/wp-content/uploads/2020/04/ARTICLE-19s-Recommendations-for-the-EU-Digital-Services-Act-FINAL.pdf
-
ARTICLE 19 would generally recommend making explicit reference to the courts or judicial authority in relation to the determination of the legality of content. Beyond reasons of principle, courts are arguably not 'democratically accountable', at least not in the same way as a public body (e.g. by laying annual reports before Parliament). Judges are also not elected in many countries.
-
We suggest the following changes to the text: Calls on the Commission to consider obliging major hosting service providers to report manifestly illegal content irrespective of its context [DELETE 'serious crime' and REPLACE WITH 'manifestly illegal content irrespective of its context'] to the competent law enforcement authority, upon obtaining actual knowledge [DELETE 'or awareness of such a crime' and ADD 'established by a valid notice']. For clarifications, see the attachment to this comment.
-
The state is a primary duty bearer in protecting the fundamental rights of online users. Notice-and-action is a broad term that comprises several mechanisms with different types of responses to illegal content. They are all, however, initiated by a notice. Hence, the Digital Services Act needs to establish the most suitable notice-and-action mechanisms. Different types of illegal online content and activities may require different responses specifically tailored to the type of user-generated content that they are supposed to tackle. However, the law has to clearly define the procedures and provide appropriate safeguards for their application. Our recommendations to improve the paragraph:
1) We suggest modifying the sentence as follows: “The Digital Services Act should establish the details of notice-and-action procedures. Based on requirements and minimum safeguards imposed by the formal legislative framework, major commercial hosting service providers should provide a publicly and anonymously accessible notice and action mechanism for reporting allegedly illegal content published on their platform. Notices should be examined by qualified staff based on clear criteria, and the content provider shall be informed and heard before access to their content is disabled. States must regularly evaluate the possible unintended effects of any restrictions before and after applying particular notice-and-action procedures.” [See the attachment to this comment.]
2) We recommend adding the following sentence at the end of the paragraph: “The Digital Services Act should abstain from imposing notice-and-stay-down mechanisms that establish an obligation to prevent the content from ever being available in the future, usually through automated measures that imply general monitoring.” [See the attachment to this comment.]
3) In order for a notice to be valid, it has to contain sufficient information for platforms to act upon. We propose including a requirement for a legal definition of what constitutes a valid notice: “The Digital Services Act should establish conditions that a notice needs to meet in order to be valid and acted upon. However, concrete requirements for a valid notice should be tailored to the type of content in question.” [See the attachment to this comment.]
4) While abusive or ‘wrongful’ notices are a serious concern, the legal provision establishing consequences for such conduct should not intimidate users. A typical example is the US Digital Millennium Copyright Act, which allows for counter-notices against alleged copyright violations but is rarely used in practice due to users’ fear of financial sanctions.
-
We recommend adding the following sentences for better understanding of what constitutes the actual knowledge: Only a valid notification can constitute actual knowledge. The Digital Services Act needs to establish clear standards to determine when and how intermediaries obtain “actual knowledge” of illegal content on their platforms.
-
We suggest adding the following sentence: In order to prevent “encouragement to deploy proactive measures” on a semi-voluntary basis, state-supported application of proactive automated measures by online platforms for content recognition cannot result in actual knowledge and, consequently, legal liability. Finally, we recommend including a sentence that underlines the need for robust and meaningful transparency: Any use of automated tools has to be based on clear and transparent policies, including transparency mechanisms for the independent assessment of their creation, functioning and evaluation, to be established by the Digital Services Act.
-
We suggest adding the following paragraph: Specific monitoring obligations should not be mandated by law. Specific monitoring of online content is largely performed using automated tools such as content recognition technologies. In order to be effective, these technologies would have to be applied to all user-generated content hosted by online platforms, regardless of the different context for the content in question. Therefore, specific monitoring may lead to imposing an obligation on online platforms to prevent the upload of illegal content and thus to actively monitor all content on their platforms in order to achieve that end. This would enable the circumvention of the general monitoring prohibition.
-
On the suggestion for P7, we abstain from voting for the following reasons. We recommend an explicit mention of the ePrivacy Regulation as the privacy framework providing an additional layer of protection for the online activities of data subjects. Additional rules on ad-tech should also be considered by the EU Commission, while ensuring that they work together with the General Data Protection Regulation and the future ePrivacy Regulation. We also recommend that the final sentence be modified as follows: “Public authorities shall be given access to a user’s [DELETE 'Internet subscriber and' and REPLACE with 'a user's'] metadata only for the purpose of an investigation of a crime and with prior judicial authorisation [DELETE 'investigate suspects of serious crime' and REPLACE with 'for the purpose of an investigation of a crime'].” The proposed reformulation of the paragraph is clarified in the attachment to this comment. With this edit, we aim to address a number of issues:
1) There is no official EU definition of “subscriber data” and this proposal is not the appropriate avenue to address this issue.
2) Access Now holds the position that the concept of “serious crimes” is not well defined and regulated under EU law to serve as a reference for this provision. Instead, we argue for a standard that consists of 1) a formal valid request from the relevant authorities, 2) the existence of an investigation of a crime and 3) a prior judicial authorisation. We recognise that this does not address the problem that Member States can have different and overbroad penal codes defining the notion of “crimes”, but that question goes beyond the DSA proposal.
-
I think it's right that providers do not have to remove content that is legal in their country only because it is illegal somewhere else - this would set very dangerous precedents. However, this only works in practice if the other countries have another way to prevent access to that illegal (for them) content to their citizens, e.g. geoblocking or ISP content filters - otherwise this principle will not be workable in practice and will harm the other countries' rights.
-
Excellent point; however, I am not sure that EU Member States would accept the platform's country of origin as a free speech standard, as in most cases it would be the US standard, or some other country's. A way to tackle this is to introduce an additional criterion of the geographic scope of content restrictions as an integral part of the proportionality test applied by the judiciary in determining an appropriate course of action. The @IJurisdiction has developed this concept with input from over 40 senior stakeholders from governments, internet companies, technical operators, civil society, leading universities and international organizations from around the world. Attached is a two-page brief on the issue. I'm happy to provide more info or discuss further.
-
Even if nobody likes content blocking, in Europe it is generally used for positive purposes, to block content that harms the rights and safety of other people, such as hate speech, racist propaganda, counterfeit pharmacies, illegal gambling and so on. There are cases in which removal at the source is not possible, because the hosting country is not cooperating promptly, or because the content is legal in the hosting country. So European democracies should still have the right to block content if they democratically decide to do so; otherwise, the only result would be to give the power to block content to private companies in non-European jurisdictions, which in my opinion would be even less democratic. However, given how this potentially affects freedom of expression, it would be useful to establish a clear European framework for content blocking (while leaving to each member state the decision on what to block). Currently the blocking scenario is extremely fragmented both in terms of technologies used and of rules applied by each European country. A uniform framework could establish clear constraints (e.g. require appeal and redress procedures, promote transparency on what is being blocked...) and also, by making the technological choices uniform and open, avoid disadvantaging the small ISPs and non-profit providers that have to implement non-standard national blocking mechanisms on their own. It would also help to clarify the responsibilities of ISPs, that often have to cope with unclear legal responsibilities, and are criticized both by digital rights orgs (for blocking too much) and by governments and citizens groups (for not blocking enough).
-
Thanks for making this very important point - a few comments:
1. The term of art for this feature usually is not "interconnectivity" but "interoperability".
2. To prevent obsolescence, this provision should be aimed not just at "social networks and messaging services" but at any dominant Internet communication service. A regulating agency could be tasked with maintaining a list of which services are dominant and publicly relevant.
3. APIs are just one of the standard interfaces through which interoperability can happen; often (e.g. in email and instant messaging) it is not a matter of APIs but of communication protocols; in other cases it is a matter of data formats. Again, the wording should be generic and just mention "open interfaces" (or, if you want, "open protocols, standards and programming interfaces"); a regulating agency can work out together with the industry which open interfaces are to be used.
4. It is however important that the interfaces are open, public and free from any licensing fees or other legal and commercial constraints; otherwise the provision can be circumvented just by not documenting the interface, or by patenting/copyrighting some of its elements (at least in jurisdictions that allow this) and requesting extortionate compensation for the intellectual property, or by imposing impossible terms and conditions of use for access to the interface.
5. In addition to interoperability, which allows horizontal interconnection between competing similar services, there is also the need to prevent vertical lock-in, i.e. a dominant platform in one sector exploiting its market power to impose the use of a specific provider in an adjacent sector. A typical example would be desktop operating systems promoting the use of a specific browser over others, or mobile operating systems promoting the use of a specific app over others. The concept of "device neutrality", similar to "network neutrality" but for platforms rather than ISPs, has been proposed for this, and I would recommend taking it into consideration.
-
Besides demanding API access for *cross-platform* connectivity, it could be suggested separately(?) to demand API access for interacting with the platform using client software of the user’s choice (‘interface neutrality’). This does not enable interconnectivity of platforms, but does allow for easier multi-homing, customising the experience, etcetera. To compare: one can both send emails to users of a different email provider; *and* choose which email client (outlook, thunderbird, etc.) one would like to use. Note that an overlapping demand is mentioned above, in the suggestion to let users use an alternative curation system of their choice. An alternative client would also help customise one’s curation.
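To make the email comparison concrete, here is a minimal sketch. Because the message format and the submission protocol (SMTP) are open standards, the same message can be produced by any client the user chooses and addressed to a recipient on a different provider. All hosts, addresses and credentials below are placeholders; no connection is attempted unless a real server is configured.

```python
# Minimal sketch of the email analogy: open formats and protocols let any
# client talk to any provider. All values below are placeholders.
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@provider-a.example"
msg["To"] = "bob@provider-b.example"      # a user on a different provider
msg["Subject"] = "Cross-provider interoperability"
msg.set_content("Sent from a client of the sender's own choosing.")

SMTP_HOST = None  # e.g. "smtp.provider-a.example"; None keeps the sketch offline

if SMTP_HOST:
    # Any standards-compliant client (Outlook, Thunderbird, this script, ...)
    # can perform the same submission over the open protocol.
    with smtplib.SMTP(SMTP_HOST, 587) as smtp:
        smtp.starttls()
        smtp.login("alice@provider-a.example", "app-password-placeholder")
        smtp.send_message(msg)
else:
    print(msg)
```

An analogous requirement for platforms would mean publishing the interface, whether an API, a protocol or a data format, openly enough that alternative clients and competing services can implement it in the same way.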
-
What is a ‘social network’, is it different from other platforms? Would we want such a distinction to enter into legal definitions at all? Is e.g. eBay or Tinder a social network (each with a specific purpose)? Likewise, does ‘messaging’ include video chats? It may help to stick with a more generic term (e.g. ‘communication-enabling platform/service’?) to not end up with legislation that is outdated again as soon as the popular means of communication change shape. Moreover, ‘dominant’ may be taken to mean the concept from competition law; which may or may not be desirable. The current formulation could be sufficient for the current opinion, but it may be worth scrutinising the definitions in the resulting report(s).
-
While recognising the importance of the rights of the content provider, such a setup would need to be very carefully thought through. For instance, what is the time frame given for a reply and how are possible delays justified in balance to the rights of other users, for instance in cases of racial slurs or incitement to violence? Mandatory intermediary steps before full disabling could be an option here, i.e. adding a disclaimer that the content in question has been marked as potentially illegal or temporarily imposing a reduction in visibility.
-
What would these responsibilities be? Given that this field is changing rapidly, it seems difficult to have such a list of obligations rather than a statement of duties that is then clarified through the courts. The latter is not the ideal situation - see the GDPR - but might be easier to implement. Alternatively, the obligations would have to be reviewed and updated on a regular basis, e.g. every 2 years.
P2
The Committee on Legal Affairs calls on the Committee on the Internal Market and Consumer Protection to incorporate the following suggestions:
P3
A. Whereas the rules enshrined in Directive 2000/31/EC on electronic commerce have allowed for the development of the Internet and of digital services in the EU for two decades, and are key in protecting fundamental rights as well as in safeguarding an innovative business environment; considering that their revision should not be envisaged without thorough scrutiny and utmost caution.
P5
- Stresses that wherever it is technically possible and reasonable, intermediaries shall be required to enable the anonymous use of their services and payment for them, as anonymity effectively prevents unauthorized disclosure, identity theft and other forms of abuse of personal data collected online; where the Directive on Consumer Rights requires commercial traders to communicate their identity, providers of major market places could be obliged to verify their identity, while in other cases the right to use digital services anonymously shall be upheld.
P7
- Notes that since the online activities of an individual allow for deep insights into their personality and make it possible to manipulate them, the collection and use of personal data concerning the use of digital services shall be subjected to a specific privacy framework and limited to the extent necessary to provide and bill the use of the service. Public authorities shall be given access to Internet subscriber and metadata only to investigate suspects of serious crime with prior judicial authorisation.
P9
- Reiterates that hosting service providers or other technical intermediaries shall not be obliged to generally monitor user-generated content.
P10
- Notes that automated tools are unable to differentiate illegal content from content that is legal in a given context; highlights that a review of automated reports by service providers, their staff or their contractors does not solve this problem as private staff lacks independence, qualification and accountability; therefore stresses that the Digital Services Act shall explicitly prohibit any obligation on hosting service providers or other technical intermediaries to use automated tools for content moderation; content moderation procedures used by providers shall not lead to any ex-ante control measures or upload-filtering of content;
P12
- Underlines that the fairness and compliance with fundamental rights standards of terms and conditions imposed by intermediaries to the users of their services shall be subject to judicial review. Terms and conditions unduly restricting users’ fundamental rights, such as the right to privacy and to freedom of expression, shall not be binding.
P14
- Highlights that, in order to constructively build upon the rules of the e-Commerce Directive and to ensure legal certainty, the Digital Services Act shall exhaustively and explicitly spell out the obligations of digital service providers rather than imposing a general duty of care; highlights that the existing legal regime for digital providers liability should not depend on uncertain notions such as the ‘active’ or ‘passive’ role of providers.
P15
8. Stresses that the responsibility for enforcing the law, deciding on the legality of speech online and ordering hosting service providers to remove or disable access to content as soon as possible shall rest with independent and democratically accountable public authorities; only a hosting service provider that has actual knowledge of illegal content and its illegal nature shall be subject to content removal obligations.
P17
10. Suggests that major commercial hosting service providers should provide a publicly and anonymously accessible notice and action mechanism for reporting allegedly illegal content published on their platform, and that notices should be examined by qualified staff based on clear criteria, that the content provider shall be heard before disabling access to their content, and that adequate redress mechanisms, both via dispute settlement bodies and judicial authorities, should be made available; while applying reasonable time-frames; highlights that persons who systematically and repeatedly submit wrongful or abusive notices shall be sanctioned; underscores that smaller commercial and non-commercial providers shall not be subject to these obligations.
P18
- Calls on the Commission to consider obliging major hosting service providers to report serious crime to the competent law enforcement authority, upon obtaining actual knowledge or awareness of such a crime.
P19
- Stresses that proportionate sanctions should be applied to violations of criminal and civil law, which shall not encompass excluding individuals from digital services.
P20
- Highlights that in order to protect freedom of speech standards, to avoid conflicts of laws, to avert unjustified and ineffective geo-blocking and to aim for a harmonised digital single market, hosting service providers shall not be required to remove or disable access to information that is legal in their country of origin.