Telegram CEO Pavel Durov accused of ‘crimes’: Crypto lawyers ask, is it legal?
Each month our panel of Crypto Lawyers looks at the legal implications of some of the thorniest issues facing the industry in different jurisdictions around the world.
The arrest of Telegram CEO Pavel Durov in France continues a global debate over the rights and responsibilities of social media platforms.
Is it right to arrest a founder over criminal behavior by unrelated users on his platform? Critics liken it to arresting a phone company chief because criminals talked about crime on the phone.
The EU has enacted increasingly restrictive laws through the Digital Services Act (DSA) and the General Data Protection Regulation (GDPR).
The DSA places strict obligations on online platforms to tackle illegal content and ensure transparency. Meanwhile, GDPR is a comprehensive law that governs how personal data is collected, processed and stored.
As user-generated content (UGC) flows across global platforms, where do we draw the line between free speech, internet security and privacy?
Magazine spoke to a group of legal experts to learn more: Catherine Smirnova, founder of Digital & Analog Partners in Europe, Joshua Chu, co-chairman of the Hong Kong Web3 Association from Asia, and Charlene Ho, managing partner of Rica Law, from the United States.
The discussion has been edited for clarity and brevity.
Magazine: Durov has been charged in France with allowing criminal activity and illegal content on his social media and messaging platform. We don't often see tech executives held directly accountable for what happens on their platforms. Why do you think this case is different?
Ho: I'm surprised something like that would get a CEO arrested. Often, cases of encouraging or allowing illegal activity on the platform can be highly publicized, but typically do not result in the CEO being arrested. There are many platforms that allow the same types of communication that Telegram allows. But arresting the CEO is very interesting.
Smirnova: The jurisdiction was surprising, I'd say. We could expect this in any country without clear regulation of digital platforms, but not in France.
From the very beginning, I did not think that this arrest and detention was in any way related to Telegram as a platform or to the DSA. That speculation was premature, even though the DSA is now in force. The DSA is about corporate liability, not personal liability.
Chu: When the news broke, it was easy to quickly take sides, because the French police did a poor job of disclosing information, drip-feeding it out. We didn't know why he was arrested, and many people assumed it had to do with Telegram messages. Over time, it became apparent that one of the main issues was illegal material published on a public channel, essentially a blog.
If you're a tech platform and law enforcement has alerted you that, for example, child pornography is being shared on your platform, you can't simply ignore it.
Magazine: There is a growing tension between platform responsibility and user freedoms. How do you see regulatory frameworks such as the DSA or the Digital Markets Act making platforms accountable for user content?
Smirnova: The DSA may not be as well known as its counterpart, the DMA (Digital Markets Act), but it applies to all online platforms, not just the large companies targeted by the DMA.
Initially, internet regulation in the EU and the UK was based on the principle that no online platform could be held responsible for content posted by others. But the internet has changed dramatically since its inception, and it's only fair and reasonable to find a balance. On the one hand, we have the internet and freedom of speech; on the other, we need to make the internet a safe place, comparable to a city street.
You can see the same trend in the US. While there is no federal law yet, several states have introduced laws aimed at protecting minors online. This mirrors the approach of the European Union, where the predecessors of the DSA were national laws focused on internet safety, especially for minors.
Ho: As Catherine said, there aren't very many specific internet safety laws at the federal level [in the US]. There are some laws that are broader and may affect internet safety, especially for children.
There is pressure for laws at the state level. In California, you have an age-appropriate design code modeled on the UK's Age-Appropriate Design Code, but it has faced legal challenges in the courts and has yet to take full effect.
Internet safety is a very complex topic. Content moderation can be covered under the Communications Decency Act. One of the main points is that unless you are the publisher of the content, you are generally not liable. But a few years ago, federal legislation called SESTA was introduced to address child exploitation material: regardless of who actually published the content, certain liabilities could apply to the platform.
Magazine: What limitations do local governments face when enforcing their laws in international arenas?
Chu: Data privacy in Hong Kong is governed by the Personal Data (Privacy) Ordinance (PDPO), which is often criticized as archaic. It dates back to before the handover, and even the UK has since moved on with the introduction of the GDPR. Furthermore, Hong Kong has privacy provisions that, although passed, have not been brought into force in over 20 years. This situation is of interest to companies because the cross-border data transfer provisions have never been implemented, and that lack of regulatory change, shaped by political and commercial factors, makes Hong Kong an attractive business center.
Tying this to the topic of publishing platforms, the issue of content removal comes into play. For example, if you want content hosted in the US removed from YouTube, the Hong Kong government can only enforce the law within its own jurisdiction. The most it can achieve is to have the content geo-blocked so it is not accessible in Hong Kong, rather than removed from the internet entirely.
A police officer is a tourist outside his home jurisdiction unless he has permission from another jurisdiction.
Smirnova: GDPR has had a significant impact on the market. In fact, I would say not only the European market, but all markets worldwide.
[It's similar to the] SEC. We all know that the SEC investigates whatever it wants around the world, even companies that are not headquartered in the United States. The same applies to the GDPR.
The GDPR affects every company, regardless of where it is headquartered or whether it has legal representatives in the EU. What matters is whether the company holds the personal data of European citizens. The GDPR has also influenced US regulation, because lawmakers there are always trying to harmonize their approach to data. It has affected companies in many ways, forcing European users' data to be localized in the EU and imposing stricter rules on cross-border data transfers.
Ho: The way the SEC works and the way privacy laws work aren't really comparable. The SEC is an enforcement agency in the US, and its jurisdictional boundaries are arguably vague. As we have seen, there is much debate as to whether it has exceeded its authority.
An executive agency in the US must be authorized by federal law to exercise certain powers, and if it exceeds that authority, it is essentially acting outside its legal boundaries. I think the SEC is not a model for how society should be governed.
Laws are made by elected legislatures, at least in Europe and America. Regardless of one's political stance, this is how laws are made.
In terms of privacy law, and in particular the GDPR, Articles 2 and 3 clearly indicate who is responsible for compliance: a company established in the EU, or a company outside the EU that processes EU residents' data or provides goods and services to them.
Magazine: Platforms are being held responsible for controlling harmful or illegal content. What do you see as the limits of this responsibility, and how should we balance privacy, security, and freedom of speech?
Chu: These platforms are not law enforcement agencies and have no obligation to police the internet by pre-approving content. They are more reactive, and it is up to the authorities to flag content as a problem. Even then, the authorities have to go through the proper channels to resolve these problems. For example, because the internet is largely borderless, what an overseas tech company can do in response to a court order is geo-block certain content. To have content properly removed, one has to go through the relevant jurisdictions to obtain the necessary court orders.
Smirnova: I agree that they are not the police, and their primary duty is to respond when they receive information about illegal content. I am not saying they should receive this information only from the police. Even before the DSA, the EU's e-commerce directive of 2000 had a similar rule: you, as a platform, are not liable unless you have been notified that the content is illegal. So there were no pre-moderation obligations.
However, given the amount of data we produce and consume every day, society needs new control tools, used positively, of course, although like anything else they can be used negatively. Especially with AI-generated content, it would be unrealistic to expect a specific department within the police or the FBI to be responsible for deciding, after a complaints process, what content is allowed and what is not. It doesn't work like that anymore. In some countries it still works this way, like Brazil, where a judge, [Alexandre] de Moraes, has special responsibility for the internet in a country of 200 million people.
Ho: Depending on who uses the platform, there are First Amendment issues in the United States. We have had instances where political parties pressured social media companies like Meta to take down posts related to COVID. If the government directs a private company to block messages, it may raise constitutional issues.
A point that confuses the average person is that the platforms themselves are under no obligation to guarantee freedom of speech, because they are not the government. Only the government is bound by the Bill of Rights. Platforms reserve the right to introduce content moderation policies and can decide how much or how little they want to police content.
Yohan Yun
Yohan Yun is a multimedia journalist who has been reporting on blockchain since 2017. He has contributed as an editor to crypto media outlet Forkast and covered Asian technology stories as an assistant reporter for Bloomberg BNA and Forbes. He spends his free time cooking and experimenting with new recipes.