Chad scraper
@[email protected] to Lemmy [email protected] · 1 year ago · sh.itjust.works · 96 comments · 1.05K upvotes
minus-square@[email protected]linkfedilink159•1 year agoEveryone loves the idea of scraping, no one likes maintaining scrapers that break once a week because the CSS or HTML changed.
minus-square@[email protected]linkfedilink22•1 year agoThis one. One of the best motivators. Sense of satisfaction when you get it working and you feel unstoppable (until the next subtle changes happens anyway)
minus-square@[email protected]linkfedilink27•1 year agoI loved scraping until my ip was blocked for botting lol. I know there’s ways around it it’s just work though
Pennomi · 41 points · 1 year ago
I successfully scraped millions of Amazon product listings simply by routing through Tor and cycling the exit node every 10 seconds.
ferret · 5 points · 1 year ago
lmao, yeah, get all the exit nodes banned from Amazon.
Pennomi · 12 points · 1 year ago
That's the neat thing: it wouldn't, because traffic only spikes for 10 seconds on any particular node. It blends perfectly into the background noise.
minus-square@[email protected]linkfedilinkEnglish3•1 year agoQueue Office Space style error and scrape for 10 hours on each node.
minus-square@[email protected]linkfedilink7•1 year agoI’m coding baby’s first bot over here lol, I could probably do better
minus-square@[email protected]linkfedilink11•1 year agoOr in the case of wikipedia, every table on successive pages for sequential data is formatted differently.
deleted by creator
I feel this
That’s a good idea right there, I like that
This guy scrapes
You guys use IPs?
Token ring for me baybeee
Just use AI to make changes ¯_(ツ)_/¯
Here take these: \\
¯_(ツ)_/¯\\ Thanks