• @[email protected]
    49 points · 2 months ago

    They’re already ignoring robots.txt, so I’m not sure why anyone would think they won’t just ignore this too. All they have to do is get a new IP and change their user agent.
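
    (For context: robots.txt is purely advisory; nothing on the server enforces it. Below is a minimal Python sketch of the difference between a crawler that checks it and one that simply spoofs a browser User-Agent. The site URL and UA strings are placeholders, not any real crawler.)

    ```python
    # Minimal sketch: robots.txt only matters if the client chooses to check it.
    from urllib import robotparser, request

    SITE = "https://example.com"       # placeholder site
    UA_HONEST = "MyCrawler/1.0"        # placeholder bot identity
    UA_SPOOFED = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"  # looks like a browser

    # A polite crawler asks first...
    rp = robotparser.RobotFileParser(SITE + "/robots.txt")
    rp.read()
    print("honest UA allowed?", rp.can_fetch(UA_HONEST, SITE + "/private/"))

    # ...a rude one just fetches the page with a browser-like User-Agent,
    # and rotating IPs (proxies, cloud ranges) defeats per-IP blocks the same way.
    req = request.Request(SITE + "/private/", headers={"User-Agent": UA_SPOOFED})
    html = request.urlopen(req).read()
    ```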

    • @[email protected]
      5 points · 2 months ago

      Cloudflare is protecting a lot of sites from scraping with their proof-of-work (PoW) captchas. They could allow through the people who pay.
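
      (Rough shape of a proof-of-work challenge, as a generic sketch rather than Cloudflare’s actual scheme: the server hands out a random challenge and the client must find a nonce whose hash has enough leading zero bits. That is cheap for one page view but expensive at scraping volume. The difficulty value below is arbitrary.)

      ```python
      # Generic proof-of-work sketch, not Cloudflare's real protocol.
      import hashlib, os, itertools

      DIFFICULTY_BITS = 20                     # server-chosen cost knob

      def meets_target(challenge: bytes, nonce: int) -> bool:
          digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
          # Interpret the digest as a big integer; require DIFFICULTY_BITS leading zeros.
          return int.from_bytes(digest, "big") >> (256 - DIFFICULTY_BITS) == 0

      def solve(challenge: bytes) -> int:
          """Client side: brute-force a nonce (cost grows ~2**DIFFICULTY_BITS hashes)."""
          for nonce in itertools.count():
              if meets_target(challenge, nonce):
                  return nonce

      challenge = os.urandom(16)               # server issues this per request
      nonce = solve(challenge)                 # client pays the CPU cost
      assert meets_target(challenge, nonce)    # server verifies with a single hash
      ```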

  • @scarabine
    24 points · 2 months ago

    I have an idea. Why don’t I put a bunch of my website stuff in one place, say a pdf, and you screw heads just buy that? We’ll call it a “book”

    • Rikudou_Sage
      20 points · 2 months ago

      Put a page on your website saying that scraping it costs [insert amount], and block the bots otherwise.
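
      (A bare-bones sketch of that idea using only Python’s standard library; the User-Agent list is illustrative and incomplete, and a real deployment would do this at the reverse proxy or CDN. As the top comment notes, a bot that spoofs a browser User-Agent walks straight past it.)

      ```python
      # Sketch only: charge identified AI crawlers, serve everyone else.
      from http.server import BaseHTTPRequestHandler, HTTPServer

      PAYWALLED_BOTS = ("GPTBot", "CCBot", "ClaudeBot")   # illustrative, not exhaustive

      class PayToScrape(BaseHTTPRequestHandler):
          def do_GET(self):
              ua = self.headers.get("User-Agent", "")
              if any(bot in ua for bot in PAYWALLED_BOTS):
                  # Tell the bot that scraping costs money.
                  self.send_response(402)          # 402 Payment Required
                  self.send_header("Content-Type", "text/plain")
                  self.end_headers()
                  self.wfile.write(b"Scraping this site costs [insert amount].\n")
                  return
              self.send_response(200)
              self.send_header("Content-Type", "text/html")
              self.end_headers()
              self.wfile.write(b"<h1>Regular page for regular visitors</h1>")

      if __name__ == "__main__":
          HTTPServer(("", 8080), PayToScrape).serve_forever()
      ```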

  • @[email protected]
    19 points · 2 months ago

    As someone who uses Invidious daily, I’ve always been of the belief that if you don’t want something scraped, then maybe don’t upload it to a public web page/server.

    • @[email protected]
      5 points · 2 months ago

      There probably aren’t many people here who understand the connection between Invidious and scraping.

    • Justas🇱🇹
      1 point · 2 months ago

      Imagine a company that sells a lot of products online. Now imagine a scraping bot arriving at peak sales hours and crawling every product list and product page separately. Now realise that some genuine users will have a worse buying experience because of that extra load.

      • @[email protected]
        1 point · 2 months ago

        Yeah, there are way easier ways to combat that without trying to prevent scraping.

        Maybe don’t ship 20 units to the same address.
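
        (One of those easier ways is plain per-client rate limiting, which protects peak-hour capacity without caring whether the traffic is a scraper or a shopper. A minimal in-memory token-bucket sketch; the limits and IP are made up, and real setups would enforce this at the load balancer.)

        ```python
        # Minimal per-IP token-bucket rate limiter (illustrative numbers).
        import time
        from collections import defaultdict

        RATE = 5.0        # tokens added per second
        BURST = 20.0      # maximum bucket size

        _buckets = defaultdict(lambda: {"tokens": BURST, "last": time.monotonic()})

        def allow(ip: str) -> bool:
            """Return True if this request fits the client's budget, False to reject (HTTP 429)."""
            b = _buckets[ip]
            now = time.monotonic()
            b["tokens"] = min(BURST, b["tokens"] + (now - b["last"]) * RATE)
            b["last"] = now
            if b["tokens"] >= 1.0:
                b["tokens"] -= 1.0
                return True
            return False

        # Example: a bot hammering from one IP gets throttled once its burst runs out.
        print(sum(allow("203.0.113.7") for _ in range(100)))   # ~20 allowed
        ```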