Tried searching for an answer for this, couldn't find any, so here's my situation.
I've been using ScrapeBox for a while now and I keep running into this problem. I'm not sure what's causing it, but I have a hunch it has to do with proxies.
I usually scrape URLs with Google and Yahoo: 1000 results per keyword, a custom footprint, keyword lists of at least several thousand, and proxies. What I keep seeing is that Google will be scraping along fine, then the longer it runs the slower it gets, until eventually it just stops scraping. It's not that ScrapeBox freezes or crashes; the number of URLs harvested just stops going up.
I usually just abort after it's been stuck like that for a long time, and find that most of the keywords haven't even been scraped yet.
This happened today. I was using 28 free proxies I harvested with ScrapeBox (the only working ones I got from all sources), plus 10 private proxies of my own. I had a keyword list of 20,000 and a good custom footprint. Google harvested about 6500 URLs, then just stopped. After aborting and checking the keyword list, it had only gone through about 5% of the keywords.
Does the scraping process stop because the proxies get burnt out and banned from the search engines? That's the only explanation I can think of.
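To test that hunch, here's a rough sketch (outside ScrapeBox, using only the Python standard library) of how one might check whether a proxy still gets a normal Google response or is being served a block page. The proxy addresses are hypothetical placeholders, and the "unusual traffic" string check is just one heuristic for Google's rate-limit page, not an official API:

```python
import urllib.request

def proxy_alive(proxy, timeout=10):
    """Return True if a Google query through `proxy` comes back
    without an error status (e.g. 429/503) or an obvious block page."""
    handler = urllib.request.ProxyHandler({"http": proxy, "https": proxy})
    opener = urllib.request.build_opener(handler)
    opener.addheaders = [("User-Agent", "Mozilla/5.0")]
    try:
        resp = opener.open("https://www.google.com/search?q=test",
                           timeout=timeout)
        body = resp.read(2048).decode("utf-8", errors="ignore")
        # A banned proxy typically gets an HTTP error (raised as an
        # exception above) or a captcha/"unusual traffic" page.
        return resp.status == 200 and "unusual traffic" not in body
    except Exception:
        return False

# Hypothetical proxy list -- replace with your harvested proxies.
proxies = ["http://127.0.0.1:9999"]
alive = [p for p in proxies if proxy_alive(p)]
print(f"{len(alive)}/{len(proxies)} proxies still usable")
```

Running something like this mid-harvest would show whether the stall coincides with the proxy pool dropping to zero usable proxies.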
What do you think? And what is the remedy? More proxies?
Thanks for your input!