Response to European Union Digital Services Act

The Digital Services Act has been drafted, and the European Commission is currently accepting feedback from the public. This is the response I submitted on October 31, 2024.

I’m an American citizen who has been engaged in tech regulation efforts since 2019, and I’m providing my feedback on the EU Digital Services Act as an individual member of civil society.

The DSA claims that it exists to facilitate research on the systemic risks or mitigation measures available to protect the European Union, but the very existence of this system would create unacceptable risk to EU citizens. It does not yet include safety mechanisms for Data Subjects to provide informed consent, control their data (this may not even be possible), dispute the use of their data, or receive reports on how their data is being used, although all of these rights are owed to them as citizens of the EU through the Charter of Fundamental Rights of the European Union.

The existence of the DSA system would create unacceptable risk to a Data Subject’s business information and operations, to the security of their intellectual property, to their personal and professional relationships, and to their well-being.

The DSA implies that researchers will not be granted access to the data if they have commercial interests, but it also states that they should disclose their funding sources. It’s not possible for researchers to possess the highly specialized skillset of data analysis without having commercial interests or problematic sources of funding. The most likely funding sources are militaries, seeking to develop weapons and tools of psychological warfare, and corporations, seeking to develop addiction mechanisms and dark patterns. The DSA does not state that it would protect Data Subjects from these unacceptable use cases.

The DSA would facilitate international data transfers, which would put the citizens of one country at risk of being targeted and subjected to the military and corporate interests of another country.

If a Data Subject were paying for an account with a very large online platform and/or very large online search engine, then the existence of the DSA’s service would mean they were paying to be exploited by the corporations and militaries whose researchers accessed the DSA system.

The DSA will not ask Data Subjects for permission to use their data. Even if it did, there’s no way Data Subjects would be able to remove data related to the protected categories of information in EU Charter Article 21 on Non-Discrimination from their files or from their behavior, nor stop researchers from making inferences (which may or may not be accurate) related to these categories based on their online behavior, or from torturing them because of those inferences. In the past, manipulation of this type of online data has led to grave consequences for countries, communities, and individuals.

DSA Article 13 on the Dispute Settlement Procedure provides highly detailed information on how to settle disputes between Data Providers and Researchers, disputes which in my estimation are highly unlikely to occur, but no procedure for settling disputes between Data Subjects on one side and Data Providers and Researchers on the other. The current mediation procedure would consolidate power against Data Subjects.

Digital Services Coordinators would have no incentive to deny a request to access the data, so they’d be likely to grant access to as many people and organizations as possible, which would create chaos in the lived experience of Data Subjects.

Since there would be no restrictions on the analytic tools researchers could use with the data, they could create cybernetic loops with it: rather than passively analyzing the data, they’d be actively controlling the behavior of a person or a group of people, opening the door to exploitation, human trafficking, targeted bioterrorism, and other risks.

The section on personal data does not acknowledge that the data is high risk. It fails to outline in plain English the actual steps it would take to protect the data.
