Google, Meta, and others will have to explain their algorithms under new EU legislation

6 comments
  1. > The final text of the DSA has yet to be released, but the European Parliament and European Commission have detailed a number of obligations it will contain:

    >* “Dark patterns” aside, targeted advertising based on an individual’s religion, sexual orientation, or ethnicity is banned. Minors cannot be subject to targeted advertising either.
    >* “Dark patterns” — confusing or deceptive user interfaces designed to steer users into making certain choices — will be prohibited. The EU says that, as a rule, cancelling subscriptions should be as easy as signing up for them.
    >* Large online platforms like Facebook will have to make the working of their recommender algorithms (e.g. used for sorting content on the News Feed or suggesting TV shows on Netflix) transparent to users. Users should also be offered a recommender system “not based on profiling.” In the case of Instagram, for example, this would mean a chronological feed (as it introduced recently).
    >* Hosting services and online platforms will have to explain clearly why they have removed illegal content, as well as give users the ability to appeal such takedowns. The DSA itself does not define what content is illegal, though, and leaves this up to individual countries.
    >* The largest online platforms will have to provide key data to researchers to “provide more insight into how online risks evolve.”
    >* Online marketplaces must keep basic information about traders on their platform to track down individuals selling illegal goods or services.
    >* Large platforms will also have to introduce new strategies for dealing with misinformation during crises (a provision inspired by the recent invasion of Ukraine).

    This is sooo fucking good..
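
     Worth noting that the recommender item is less exotic than it may sound: a feed “not based on profiling” can be as simple as ordering posts by recency instead of by a per-user relevance score. A minimal sketch of the difference, using hypothetical field names rather than any platform’s actual ranking code:

     ```python
     from dataclasses import dataclass
     from datetime import datetime


     @dataclass
     class Post:
         author: str
         text: str
         posted_at: datetime
         predicted_engagement: float  # stand-in for a personalised relevance score


     posts = [
         Post("alice", "old but 'engaging' post", datetime(2022, 4, 20, 9, 0), 0.92),
         Post("bob", "newest post", datetime(2022, 4, 23, 8, 0), 0.10),
         Post("carol", "middling post", datetime(2022, 4, 22, 12, 0), 0.55),
     ]

     # Profiling-based feed: rank by a per-user engagement prediction.
     profiled_feed = sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

     # "Not based on profiling": plain reverse-chronological order.
     chronological_feed = sorted(posts, key=lambda p: p.posted_at, reverse=True)

     print([p.author for p in profiled_feed])       # ['alice', 'carol', 'bob']
     print([p.author for p in chronological_feed])  # ['bob', 'carol', 'alice']
     ```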

  2. > New obligations include removing illegal content and goods more quickly, explaining to users and researchers how their algorithms work, and taking stricter action on the spread of misinformation.

    It’s the algorithm for content detection (e.g. detecting illegal content), not the algorithm that gives them a competitive advantage. The title is clickbait. But developing an algorithm and then having to share it with outsiders will kill some competition: e.g. if LinkedIn develops an algorithm to detect propaganda in professional social networks and shares it, Xing can simply copy it without investing much. Without that algorithm, Xing would have lost 2% (assuming a German company, so there is always preferential treatment and a discount of 4%).

    > Margrethe Vestager, the European Commissioner

    Isn’t she the former prime minister of England? How did she become the EU competition commissioner?
