11-19-2016, 01:30 PM
Yup, it sounds like you're killing the Squid proxies way too quickly - probably due to your settings.
I have 100 back-connect proxies, so my proxies rotate every 10 minutes.
Here is how I have things set up:
8 threads when running the harvester (100 proxies added), 17 sec wait time per proxy - harvester wait time is set to 30 sec - I also have retry proxies set to the max, which is 10 I think. (I'm wondering if Scrapebox could add an option for unlimited retries with delays, or some type of back-connect proxy checkbox that auto-applies these settings for us.)
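The "unlimited retries with delays" idea above can be sketched in a few lines. This is just an illustration, not anything Scrapebox actually does: the hypothetical helper below keeps retrying through the same back-connect endpoint, waiting between attempts, on the theory that the IP behind the endpoint will rotate within the 10-minute window and eventually get through.

```python
import time

def fetch_with_retries(fetch, url, retry_delay=17, max_wait=600):
    """Retry through a back-connect proxy endpoint with delays.

    fetch is any callable returning (ok, result). Instead of capping
    retries at 10, keep retrying until a full rotation window
    (max_wait seconds, i.e. 10 minutes here) has elapsed, since the
    IP behind the endpoint changes on its own.
    """
    waited = 0.0
    while True:
        ok, result = fetch(url)
        if ok:
            return result
        if waited >= max_wait:
            return None          # gave it a full rotation; give up
        time.sleep(retry_delay)  # per-proxy wait between attempts
        waited += retry_delay
```

The 17-second `retry_delay` and 600-second `max_wait` just mirror the settings quoted above; they'd need tuning to whatever provider and thread count you're running.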
From your screenshots I can see you were able to scrape Google, but I see all those errors too.
Squid Proxies are an excellent provider, but I'm not sure they're the right choice for something like this. I personally use Squid proxies for my social accounts such as Google and YouTube; I don't scrape with them.