Is there a simple way to severely impede webscraping and LLM data collection of my website?
  • Why not add basic HTTP auth to the site? Most web hosts provide a simple way to password-protect a site or directory.

    You can have a simple username and password for humans, but it will stop scrapers, as they won't get past the auth challenge unless they know the details. I'm pretty sure you can even show the login details in the auth dialog itself, if you wanted to, rather than pre-sharing them.
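    A minimal sketch of the idea using only Python's standard library — the credentials and the trick of publishing them in the challenge's realm text are illustrative assumptions, not a production setup (real hosts would do this in the web server or control panel instead):

    ```python
    import base64
    from http.server import BaseHTTPRequestHandler

    # Hypothetical shared credentials for illustration only.
    USER, PASSWORD = "friend", "letmein"
    EXPECTED = "Basic " + base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()

    class AuthHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.headers.get("Authorization") != EXPECTED:
                # Scrapers that don't know the credentials stop here with a 401.
                # The realm text can even spell out the login for human visitors.
                self.send_response(401)
                self.send_header("WWW-Authenticate",
                                 'Basic realm="user: friend / pass: letmein"')
                self.end_headers()
                return
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"hello, human")
    ```

    Note Basic Auth sends credentials base64-encoded, not encrypted, so it only makes sense over HTTPS — but for keeping bulk scrapers out (rather than determined attackers), the 401 challenge alone does the job.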

  • 1 month of Linux Mint and some thoughts.
  • I'm not a gamer, but I've tried all types of distros, and have gradually always come back to LM. Like you said, stuff works as expected, and you're usually pleasantly surprised at how easy things are to do, set up, change, and configure. It's still a breath of fresh air if you have to use Windows for your day job. 😀

    FactualPerson @lemmy.world