39 Matching Annotations
  1. Dec 2020
    1. The EU will protect these principles and respond effectively to disinformation. We often talk about silos that inhibit an effective approach to tackling problems in Europe. In the field of disinformation, we are working every day to increase cooperation across different EU Institutions and member states and develop the EU’s Rapid Alert System (RAS) against disinformation. A network of officials in the EU institutions and the EU member states that are dealing with disinformation related issues to enable common situational awareness and threat assessment and to strengthen coordination with researchers, civil society organisations and our international partners. 

      Overcoming silos in dealing with disinformation.

    2. The EU has been working on tackling disinformation for many years now. The European External Action Service (EEAS) has been a pioneer in monitoring pro-Kremlin disinformation, and has since expanded its focus and toolbox. Today, EEAS taskforces focus on three different regions: the East, the Southern Neighbourhood and the Western Balkans. We have recently published our 5th Special Report on COVID-19 disinformation, which shows again how these activities can cause considerable damage during a global health crisis.

      Do we have any coverage of disinformation in the EU? How does it work?

      ||StephanieBP||||MarcoLotti||

    1. “fear” (arrests of netizens/bloggers/activists and censorship laws), “friction” (content filtering and blocking websites), and “flooding” (manipulation of public opinion via propaganda and disinformation).

      Three ways authoritarian regimes exert control over the internet.

  2. Nov 2020
    1. Reasons

      It is interesting to observe how the reasons behind content removal appeals have changed over time. This could add a qualitative note to our rather quantitative overview of countries' data. ||Jovan||

  3. Oct 2020
    1. Relevant points in this document

      • the Five Eyes (Australia, Canada, USA, UK, New Zealand) are joined by Japan and India
      • against end-to-end encryption for two reasons: a) companies cannot enforce their internal rules on content moderation; b) law enforcement agencies cannot enforce laws
      • problems should be addressed in the way applications are designed
      • the encryption problem should be solved via standards
      • they argue that they can both protect privacy and limit encryption

      ||VladaR||||AndrijanaG||||Jovan||||MarcoLotti||||AndrijanaG||

    1. To balance the need to tackle illegal content and activity online with that of protecting users' rights and freedoms, while supporting innovation in Europe.

      Reading between the lines: to balance the need to tackle any illegal content we come across (vs the illegal content we're informed about, as is currently the law), without being exposed to liability over infringements of freedom of speech, as is the case in the US (New York Post and all that s*). In order for you to support innovation, you need to make sure we are not exposed to the same BS situations

    2. It is framed more as a matter of the freedom of speech of the service provider than as a tool to protect them from liability for illegal content

      This is the difference between the US and the EU's approach, according to Big Tech. And they're right.

    3. under Section 230(c) of the Communications Decency Act

      Ha! Ha! Ha! Ok, we give up. Here goes the reference to Section 230. Happy?

    4. it is not entirely clear if these measures confer ‘actual knowledge’ on the service provider. The possibility that a service provider might be considered to have ‘actual knowledge’ due to the introduction of go

      PRECISELY. Here's the crux of the problem. This creates a legal uncertainty for platforms, esp with the new DSA which will put a spotlight on the behaviour of social platforms

    5. a safeguard for service providers,

      But add a safeguard. Call it whatever you like. In this paper, we've called it 'necessary safeguard', or the EU equivalent of the US' Good Samaritan principle. We haven't mentioned Section 230, but that's what we also mean: the DSA's equivalent of Section 230

    6. The current liability regime and the prohibition on general monitoring

      Notice-and-takedown stays. Prohibition of filters stays.

    7. want to do more to voluntarily and proactively remove illegal content from their services, and society wants the same

      Reading between the lines: We are sick and tired of hearing governments complain that we don't do enough. Do you want us to go beyond notice-and-takedown? Give us the necessary safeguards from liability that we may face due to content removal

    8. by a governance structure.

      Reading between the lines: We agree to oversight by a dedicated body. Better than being hauled to court.

    9. US ‘Good Samaritan’ principle but should be distinct in its basis in EU law and respect of European values and fundamental rights. This should be balanced with minimum information levels in notices and a human review of removal appeals, an

      Reading between the lines: We do have similar limited liability rules. They're called Section 230. But, Trump wants to kill Section 230. So does Biden. And in Europe, you're moving very fast. So, let's get Europe settled, by introducing a 230-like clause into DSA. (Choice of words in their article is amazing)

    10. a European-centric principle, based on EU law, and subject to European oversight

      Reading between the lines: Since the EU is moving fast with DSA, we want updated limited liability rules (which we consider 'necessary safeguards') ASAP. It would take long to have global rules, and we're not sure that's even possible. So, EC, push these into DSA ahead of 2 December, please

    1. Comprehensive coverage of content policy and practices by social media platforms

      ||MarcoLotti||||Katarina_An||||StephanieBP||||AndrijanaG||

    2. chart 3

      ||Katarina_An|| This is even more interesting with the categorisation of requests.

    3. Where both camps agree is in their unease that it is falling to social networks to decide what speech is acceptable.

      agreement across political spectrum

    4. as misinformation about covid-19 flourished, Facebook took a harder line on fake news about health, including banning anti-vaccination ads. And this month it banned both Holocaust denial and groups promoting QAnon, a crackpot conspiracy.

      banning at Facebook

    5. YouTube, a video platform owned by Google with about 2bn monthly users, removed 11.4m videos in the past quarter, along with 2.1bn user comments, up from just 166m comments in the second quarter of 2018. Twitter, with a smaller base of about 350m users, removed 2.9m tweets in the second half of last year, more than double the amount a year earlier. TikTok, a Chinese short-video upstart, removed 105m clips in the first half of this year, twice as many as in the previous six months (a jump partly explained by the firm’s growth).

      ||Katarina_An|| Can we find statistics about other companies (preferably with some API that can be updated)?

    6. see chart 1

      ||Katarina_An|| Could we find these statistics for Facebook?

    1. The Framework would acknowledge that the service provider is best placed to determine how to adapt their systems quick-ly, incorporate the necessary safeguards for both users and service providers and meet their obligation of proportional re-sponsibility.

      What companies are saying:

      We have two problems:

      1. The first relates to 'harmful' content. Illegal content is not an issue in itself, as the law offers a legal basis to act: if it's illegal, it will be removed. Rather, the issue is how to define 'harmful-but-not-illegal'. This should not be left to companies to decide, as is now the case, with companies setting terms of use and having to update them often.
      2. Especially in the EU, laws vary greatly on which content is illegal vs 'harmful-but-not-illegal'. Example: blasphemy.

      For the EU and the upcoming DSA, this is how to solve it:

      • The DSA should first, for now, tackle content which is already defined as illegal across the EU. The DSA would acknowledge that companies are best placed to determine how to incorporate the "necessary safeguards" (a point of contention, as MEPs do not want ex-ante filters) and how to meet their obligation of "proportional responsibility" (e.g. Google being 20% at fault, while the author is 80% at fault).
      • Then, policymakers should consider aspects related to 'harmful-but-not-illegal' content, taking into account the issues which service providers have so far faced, and the methods which service providers already have in place.
      • Until then, companies will continue to develop "best practices to tackle harmful content" and maintain a strong procedure for appeals.
    1. A few points from this article

      • the internet is Facebook for many people in Myanmar (digital monopoly)
      • how 'Free Basics' created Facebook's monopoly in Myanmar
      • how social media amplify societal weaknesses (lack of critical thinking, etc.)
      • limits of AI-driven filtering due to the use of different languages
    2. Critical thinking is absent from school curricula, depriving pupils of the skills necessary to distinguish fact from fiction. Myanmar is a hierarchical society, and divisive posts are often written by powerful figures, from senior monks to generals. Islamophobia also taps into an anxiety, unleashed by the opening up of the economy, that Muslims will enrich themselves at the expense of others. That fear harks back to Myanmar’s past as a British colony, when hundreds of thousands of Indians migrated to Burma, as it was known, and often prospered there.

      a few reasons why Facebook adds to misinformation in Myanmar:

      • lack of critical thinking in school curricula
      • a hierarchical society relying on the authority of religious and military leaders
      • strong Islamophobia
    1. This article links content policy, anti-monopoly regulation and human rights in an effective way. These issues are closely interlinked. Here are the main points from the article:

      • explains the positions of the main actors on fake and false news and the right to freedom of expression;
      • growing censorship in Western societies (UK, France);
      • can tech executives be in charge of freedom of expression?;
      • how to avoid monopolies in content: the most reasonable approach is to develop interoperability, with users able to switch between platforms (moving their data) without losing their network and contacts (an analogy to porting a mobile phone number); in this scenario social networks would become utilities;
      • international human rights law provides a solid framework for dealing with content (a reasonable balance between freedom of speech and relevant, proportionate restrictions).

      ||MarcoLotti||||Pavlina||||StephanieBP||||AndrijanaG||

    2. Biden decision

      what is this Biden decision? ||Pavlina|| ||MarcoLotti||

    3. In London the Metropolitan Police requests that they take down legal, but troubling, posts. In June France’s Constitutional Council struck down a deal between the government and the tech companies because it curbed free speech—an initiative that is sure to be revisited after Mr Paty’s murder.

      examples of emerging censorship in Western societies.

    4. The left says that, from the conspiracy theories of QAnon to the incitement of white supremacists, social media are drowning users in hatred and falsehood. The right accuses the tech firms of censorship, including last week of a dubious article alleging corruption in the family of Joe Biden, the Democratic presidential nominee.

      Here are the main reasons why the right and the left attack tech companies.

    1. The US has seized 92 web domains used by Iran, including four which purported to be genuine English language news sites, the Justice Department announced Wednesday.

      The US authorities seized 92 domains used by Iran, including English-language news websites.

      The seizure shifts content removal from tech platforms (Facebook, Twitter) to domains, which are part of the core functioning of the internet.

      This practice could shift ICANN and the domain name industry into the new policy realm of content policy, which they have been avoiding for a long time.

      In addition, the justification of the seizure treats a domain name as property, which ICANN and the domain name community have also long avoided.

      ||Pavlina|| It seems that the seizure is not based on the sanctions regime but on the regular criminal code. Could you double-check?

  4. Sep 2020
    1. New developments in the TikTok saga: a US court blocked President Trump's executive order banning TikTok.

      ||djordjej|| Please look at the reference for websites. ||AndrijanaG||

    1. The EU's Digital Services Act (DSA) is given contours by EU Commissioner Thierry Breton in an interview with the Financial Times. The main elements are:

      1. Tech companies with monopolistic power could be:
      • forced to break up or sell off their European operations (the 'TikTok clause');
      • excluded from the EU market.
      2. Introduction of a rating system to allow the public and stakeholders to monitor the behaviour of tech companies (tax compliance, content control).

      3. The EU will retain limited liability of tech platforms for content (staying with the Section 230 logic).

      Mr Breton compares tech companies to big banks before the 2008 financial crisis. Thus, one can expect that the EU's tech policy will be inspired by analogy with the regulation of the financial sector.

      ||StephanieBP|| ||Jovan||

    1. the European Union’s Digital Services Act (DSA)

      The EU's Digital Services Act (DSA) is gaining momentum.

      ||StephanieBP||

  5. Jul 2020
    1. TikTok, the hugely popular video platform owned by Chinese tech giant Bytedance, has an estimated 120 million users in India, making the country one of its biggest markets.

      Is ||MarcoLotti|| in charge of content? If yes, Marco, please have a look at an interesting text by The Economist arguing that TikTok was a tool for the masses (including untouchables) in India. TikTok managed to bypass the dominance of Bollywood. https://www.economist.com/asia/2020/07/04/indias-ban-on-tiktok-deprives-the-country-of-a-favourite-pastime

  6. Jun 2020
    1. Twitter and Facebook have differing business models

      The article explains how their business models shape the content policies of Twitter and Facebook.

    1. semi-independent self-regulatory groups, such as Facebook’s “oversight board”, also known as its “supreme court”

      Are we following Facebook on content policy?

    2. In May 2016 the European Union (EU) agreed to a “Code of Conduct” with Facebook, Microsoft, Twitter and YouTube to counter online hate speech.

      Do we have this document?

      ||Katarina_An||

    1. the Harvard Kennedy School’s Misinformation Review.

      Misinformation Review

      ||Katarina_An||

    2. The “infodemic” around covid-19, declared by the World Health Organisation (WHO) in February, is not the world’s first outbreak of misinformation.

      The WHO introduced the term 'infodemic' for an outbreak of misinformation.

      ||Jovan|| ||Katarina_An||
