The Digital Services Act (DSA) has the potential to be a cornerstone of digital services regulation. While we can all agree that the current e-Commerce Directive can be greatly improved, upgrading it is a great hurdle for those tasked with the job.
In the past months, we have seen disinformation rise in the digital sphere, aided by the lack of regulation surrounding social media platforms. The lack of accountability of intermediary service providers poses serious dangers at all levels: the effects range from the ability to influence buyers, thus creating the monopolies of tomorrow, to security threats during elections.
These effects, stemming from the current e-Commerce Directive, grow with a company's size and reach. Large companies are at higher risk of being targeted by these large-scale, black-market marketing strategies. For startups, the new legislation could impede growth without significantly improving the status quo. However, without further regulation, startups are also at risk: the social media giants currently have the power to unfairly grow or shrink market shares by promoting fake news, without any liability for it.
Harmful vs. Illegal vs. Within Limits
An interesting point here is that one cannot always objectively distinguish what is harmful from what is illegal. Similarly, without properly educating the public, one risks increasing polarization out of fear of free-speech censorship. We see this now, during a world crisis, as people protest because they do not believe the coronavirus exists. How would this mass of people react if their social media posts were deleted for disinformation?
And this sensitive content is not limited to the global pandemic; there is a lot of harmful content online. I am referring here to violence against women, minorities, people with disabilities, and so on. Moreover, most of the time the words are not harmful in themselves; the hate speech arises from the context. This is easy for a person to determine, but not so much for an AI. An AI has to analyze text and pictures, historical evidence, personal features of the victim, pop-culture references, and many more parameters at the same time. As a software engineer, I can attest that we are not close to having this technology. The roadblocks would be even greater if a startup needed to develop or buy such filtering technology.
Digital Services Act Loopholes
As things stand, it is dangerous to consider automatic filtering sufficient, since there are obvious loopholes that can be exploited. Filtering adds extra complexity for startups. Large companies, however, have the resources to implement it – and most of them already have. But they can also turn a blind eye and claim they did not have enough data to deem something harmful, or even illegal: “I have no knowledge, therefore no liability.”
Furthermore, we can point to more ways to exploit the Digital Services Act’s loopholes. Groups can organize around the filters and avoid them by simply replacing words. As long as they are in a closed group and no one reports the spread of misinformation, they can operate without threat. Examples include Facebook groups, or even dedicated platforms whose rules could encourage hate speech (such as the infamous incel communities, or terrorist organizations).
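The word-replacement evasion is easy to demonstrate. Below is a minimal sketch (the filter, its block list, and the sample posts are all hypothetical, not any real platform’s system) of a naive keyword-based filter being defeated by a single character substitution:

```python
# Hypothetical block list for illustration only.
BLOCKED_WORDS = {"scam"}

def naive_filter(post: str) -> bool:
    """Return True if the post contains a blocked word (and would be removed)."""
    words = post.lower().split()
    return any(word.strip(".,!?") in BLOCKED_WORDS for word in words)

original = "This vaccine is a scam!"
evasive = "This vaccine is a sc4m!"  # trivial character substitution

print(naive_filter(original))  # True  -> caught by the filter
print(naive_filter(evasive))   # False -> slips through, meaning unchanged
```

Real systems are of course more sophisticated, but the arms race is the same: each new filter invites a new spelling, and a closed group can converge on its own vocabulary faster than any filter can follow.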
This brings me to the one thing I want to leave you with.
We do not want apparently well-regulated, expensive platforms that fail to deliver on their promise of increased accountability and that hide legislative loopholes.
We are confident that, as long as the regulators take into account diverse opinions, they will be able to construct a future-proof solution. And we hope this solution will also serve to close the divide among people and generate social good. However, without public discussions on the matter, our main concern is that proper safeguards will be hard to find. We speak up for safeguards that would not require a tremendous amount of resources; otherwise, startups would be at a disadvantage, and the Single Market would risk strengthening monopolies. We do hope regulators listen and put people at the center of the Digital Services Act.
This perspective was presented by the author, Silvia Cristina Stegaru, at the DSA 4 Startups webinar organized by Codette, Allied for Startups and BESCO – the Bulgarian Startup Association. For more information about the topic, check out the concept startup community statement, signed by all three entities.