(07-30-2014, 01:25 AM)loopline Wrote: There is no way to be 100% sure except to scrape and post and see what sticks. You could hone your footprint a bit, perhaps, but otherwise that's what mass posting is about. Besides, if the site changed since Google's last crawl, or SER can't submit to it for some reason, or the submission goes to moderation, etc., there is just no way to know 100%. The best approach is to look at the successful sites and build your footprint from there. Then look at the failed sites and their URL structure, see if there is a URL pattern you can feed into ScrapeBox's "remove urls containing" filter, and strip some of them off that way.
Otherwise you will always have to slog through some failures.
I've got the point. I just wanted to be clear about that. Thank you very much.
Maybe you also know the answer to this question: is it better to verify our URLs before the blast, using GSA's identify-platform tool? We could start with a list of 30k URLs and end up with 10k, because GSA doesn't recognize some of them and won't post to them. Am I right?
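As a side note, the "remove urls containing" step loopline describes can also be done outside ScrapeBox with a few lines of script. Here is a minimal sketch in Python; the example URLs and patterns are made up for illustration, not taken from any real list:

```python
# Sketch of ScrapeBox's "remove urls containing" filter in plain Python.
# Idea: collect substrings from the URL structure of sites that failed,
# then strip any scraped URL containing one of those substrings.

def remove_urls_containing(urls, patterns):
    """Keep only URLs that contain none of the given substrings."""
    return [u for u in urls if not any(p in u for p in patterns)]

# Hypothetical scraped list
scraped = [
    "http://example.com/guestbook/sign.php",
    "http://example.org/blog/post-1/",
    "http://example.net/member/register.aspx",
]

# Hypothetical patterns observed in previously failed submissions
bad_patterns = ["/guestbook/", "register.aspx"]

cleaned = remove_urls_containing(scraped, bad_patterns)
print(cleaned)  # only the blog URL survives the filter
```

This won't tell you in advance whether SER can actually post to a site, but like the ScrapeBox filter it trims obvious dead weight from the list before the blast.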