Canadian Sikh Facebook users receive notifications that their posts are being taken down because they’re in violation of Indian law

  • @[email protected]
    link
    fedilink
    1071 year ago

    In other words, “Our Indian moderation subcontractor applied Indian law to Canadian posts in Canada. Oops.”

    • @[email protected]
      link
      fedilink
      1
      edit-2
      1 year ago

      Wow, I didn’t think “Facebook sucks” would be such a controversial statement on a place like Lemmy or kbin

    • tripfiend
      -35 · 1 year ago

      I wonder if people would have the same reaction if the person in question was a known Islamic Terrorist. If there were Facebook groups praising the legacy of Osama Bin Laden, would Meta be then justified to carry out similar censorship?

        • @[email protected]
          link
          fedilink
          -181 year ago

          This guy is a wanted terrorist in India. I agree he wanted a place for Sikhs, but at what cost? He wanted to split the nation, and that is unacceptable. He created so much violence inside the country over it; many people were killed. It’s like if the Proud Boys wanted to split America and make Texas its own country.

      • @coach
        1 · 1 year ago

        deleted by creator

    • @[email protected]
      link
      fedilink
      -371 year ago

      Yeah, a random minimum-wage Indian worker applied policy wrong. Absolutely fantastic reason to not use the platform!!! 🤦‍♂️

      This type of shit is exactly why nobody listens to privacy advocates. People see this reaction and just laugh.

      • @12345
        3 · 1 year ago

        What does this person’s comment have to do with being a privacy advocate? Maybe they’re just vehemently anti-monopoly and hate Facebook for that.

        Judging by the downvotes, everyone saw your reaction and laughed, rather than at the one you’re replying to…

  • @[email protected]
    link
    fedilink
    English
    46
    edit-2
    1 year ago

    So they received a takedown request from the Indian government, mistook the users for being in India, followed the law they’re required to follow in India, and, when it was brought to their attention that those users were actually based in Canada, went back and restored the posts. This doesn’t seem as malicious as people are making it out to be. They should probably work on their geo-blocking, but with 3 billion users in 150+ countries, each with its own local laws, it’s probably safer to be aggressive about removing content when requested.
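
    To make the country-scoping concrete, here is a minimal sketch of how geo-scoped takedowns are commonly applied: content named in a legal request is hidden only for viewers geolocated in the requesting country, so the failure described above comes down to mis-geolocating the viewer. This is a hypothetical illustration, not Meta’s actual system; the names (TakedownRequest, is_visible) and country codes are made up for the example.

    ```python
    # Hypothetical sketch of country-scoped takedown enforcement
    # (illustration only, not Meta's actual system).
    from dataclasses import dataclass


    @dataclass(frozen=True)
    class TakedownRequest:
        post_id: str
        requesting_country: str  # country whose government filed the request


    def is_visible(post_id, viewer_country, takedowns):
        """Hide a post only for viewers located in the requesting country."""
        return not any(t.post_id == post_id and
                       t.requesting_country == viewer_country
                       for t in takedowns)


    takedowns = [TakedownRequest("p123", "IN")]

    # Correctly geolocated Canadian viewer: the post stays up.
    print(is_visible("p123", "CA", takedowns))  # True

    # Viewer mis-geolocated as being in India: the post disappears,
    # which is the mistake described in the article.
    print(is_visible("p123", "IN", takedowns))  # False
    ```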

    • @[email protected]
      link
      fedilink
      31 year ago

      I think this comes under “Never attribute to malice that which is adequately explained by stupidity”: Hanlon’s razor.

      • @[email protected]
        link
        fedilink
        English
        11 year ago

        Well, they don’t, which is why they’re taking down posts as required by the countries they operate in and willing to accept a noticeable false-positive rate to do it.

          • @[email protected]
            link
            fedilink
            English
            11 year ago

            and willing to accept a noticeable false-positive rate to do it.

            It’d probably help if you fully read the comments you’re replying to lol

              • @[email protected]
                link
                fedilink
                English
                0
                edit-2
                1 year ago

                Your first comment was incredibly vague… I was responding to this part:

                Glad corporations get the power to make these decisions.

                However, a high false-positive rate is different from assuming every post is “guilty until proven innocent”, and the two aren’t mutually exclusive either. A current example would be the automated removal of CSAM on Lemmy: a model was built to remove CSAM, and it has a high rate of false positives. Does this mean it assumes everything is CSAM until it can confirm it isn’t? No. It could work that way, but that’s an implementation detail I don’t know the specifics of, and it doesn’t necessarily follow (see the sketch below).

                But really, who cares? The false positive rate matters for site usability for sure, but the rest is an implementation detail in an AI model, it isn’t the court of law. Nobody’s putting you in Facebook prison because they accidentally mistook your post for rule breaking.
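
                To make the flag-and-review distinction concrete, here is a rough sketch; it is hypothetical, not Lemmy’s actual CSAM tooling, and the names (classify, moderate) and scores are invented for the example. The point is that posts are published by default and only high-scoring ones are pulled pending review, which is different from presuming every post guilty.

                ```python
                # Hypothetical flag-and-review sketch (not Lemmy's actual tooling).
                def classify(post):
                    """Stand-in for an ML model: probability the post violates policy."""
                    return 0.97 if "violation" in post else 0.02  # toy heuristic


                def moderate(post, threshold=0.9):
                    # Default outcome is "published"; only posts scored above the
                    # threshold are removed pending human review, and some of those
                    # removals will be false positives.
                    if classify(post) >= threshold:
                        return "removed_pending_review"
                    return "published"


                print(moderate("an ordinary post"))             # published
                print(moderate("a clear violation of policy"))  # removed_pending_review
                ```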

  • magnetosphere
    39 · 1 year ago

    If anyone needs another example of why centralized social media is dangerous, here’s one.

    • @[email protected]
      link
      fedilink
      -361 year ago

      And what would you call the child porn that keeps popping up here?

      You people are absurd and just throwing spaghetti at the wall.

      This is clearly an admin mistake over at Facebook but you people want Facebook to fail so badly you run with the most inconsequential dumb shit.

        • @[email protected]
          link
          fedilink
          -221 year ago

          By paying teams of people, which the fediverse has an aversion to.

          I also like how you so confidently declare it’s easily dealt with while the instances are currently fighting and blocking child porn attacks. It’s going on right now in plain sight, and you’re calling it mission accomplished 🤣.

          That’s some Bush-on-an-aircraft-carrier level of reality wrangling.

            • @[email protected]
              link
              fedilink
              -151 year ago

              Yet it’s back. You will be fighting a battle of attrition. There is a reason social media companies hire entire teams of people for this. And you know what? That job sucks balls, requires full-time therapy and discussions, and has an insanely high burnout rate. You aren’t going to just ML it away, and services offered by folks like CF aren’t 100% effective. And the problem never, ever goes away.

              There will be burnout, and someone is going to be lazy and get in legal trouble, which is going to scare the shit out of anyone crazy enough to be running an instance that isn’t behind a nonprofit or something.

      • @[email protected]
        link
        fedilink
        31 year ago

        Part of freedom is that some people will abuse it.

        I hope you’re not suggesting we should give up freedoms so those abusing it can’t.

        Just go after the abusers, which instance admins do.

        If you’re trying to suggest a solution to stop all sharing of CP on the internet without restricting everyone else, I’d love to hear what you come up with.

        waits patiently

        • @[email protected]
          link
          fedilink
          -21 year ago

          No, you should employ a team and put in place procedures for handling it according to the law. Like literally everyone else out there doing social media.

          • @[email protected]
            link
            fedilink
            -21 year ago

            What do you mean?

            I don’t think mods are paid on most Lemmy servers, and I’ve never seen CP.

            Not saying it doesn’t happen, but it’s pretty clear that paying people to moderate is not necessary.

            • @[email protected]
              link
              fedilink
              -11 year ago

              It’s all new. Folks are happy to moderate it now, but it will keep coming. They won’t have the support of a team, and they’ll miss things and get tired of it. Somebody is bound to get in trouble with legal issues, and that’s going to spook people. The current state isn’t sustainable long term.

      • @[email protected]
        link
        fedilink
        31 year ago

        Woah dude, where did that come from? One thing being worthy of criticism doesn’t mean other things aren’t.