ScrapeBox Forum
Need A Footprint For Local Sites...Help! - Printable Version

+- ScrapeBox Forum (https://www.scrapeboxforum.com)
+-- Forum: ScrapeBox Main Discussion (https://www.scrapeboxforum.com/Forum-scrapebox-main-discussion)
+--- Forum: Scrapebox Footprints (https://www.scrapeboxforum.com/Forum-scrapebox-footprints)
+--- Thread: Need A Footprint For Local Sites...Help! (/Thread-need-a-footprint-for-local-sites-help)



Need A Footprint For Local Sites...Help! - BeyondPro - 12-26-2013

Hello,

So I sell Internet marketing services to local businesses all over the USA. I typically visit each business's website, find the company email address manually, and compile a huge list. It takes forever.

I bought ScrapeBox to help speed up the process; however, I feel like writing footprints is an art that I have not mastered.

Let's say I want to pull all the URLs of lawyers in New York, New York, the largest city in the US. How would I do it?

What footprint would be best to specifically pull just the URLs of law firms in New York, New York?

Can anyone help?


RE: Need A Footprint For Local Sites...Help! - loopline - 12-30-2013

Well, you could do something like

"new york new york lawyers"

Then take the domains, trim to root, and do a site:domain.com search.

Then run the email grabber on the resulting list.
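
If it helps to see the logic outside of ScrapeBox, here is a minimal Python sketch of the trim-to-root / site: step. The URLs and names are just placeholders for the example, not anything ScrapeBox itself produces:

from urllib.parse import urlparse

scraped_urls = [
    "https://www.example-law-firm.com/attorneys/jane-doe",
    "https://www.example-law-firm.com/contact",
    "http://another-nyc-lawyer.com/practice-areas/injury",
]

# "Trim to root": reduce every URL to its domain and de-duplicate.
roots = sorted({urlparse(u).netloc for u in scraped_urls})

# Build one site: query per domain so the next scrape pulls internal pages.
site_queries = ["site:" + root for root in roots]
print(site_queries)
# ['site:another-nyc-lawyer.com', 'site:www.example-law-firm.com']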


Next up, you could take the results from "new york new york lawyers" before you trim to root and load them into the Link Extractor addon. If you have the Automator, this can be done automatically.
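
For reference, what the Link Extractor does for each result is roughly this. A sketch only, assuming you already have the page HTML in hand; it is not ScrapeBox's actual code:

import re
from urllib.parse import urljoin, urlparse

def internal_links(page_url, html):
    """Return internal links found in the page, as absolute URLs."""
    links = set()
    for href in re.findall(r'href=["\'](.+?)["\']', html, re.IGNORECASE):
        absolute = urljoin(page_url, href)
        # Keep only links that stay on the same domain.
        if urlparse(absolute).netloc == urlparse(page_url).netloc:
            links.add(absolute.split("#")[0])
    return links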

That's only going to give you the first 1,000 results per query, though.

So you would want to expand your keywords.

So you could try zip codes of New York, like

lawyers 45678

You could also use street names or sections of town, like

lawyers bronx

etc...
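
A quick way to blow one footprint out into lots of variations is to cross the base keyword with a list of zip codes and boroughs, then import the resulting file as your keyword list. A rough sketch, where the zip codes are just a partial example list:

base = "lawyers"
zips = ["10001", "10002", "10003"]   # partial list, just for the example
areas = ["bronx", "brooklyn", "queens", "manhattan", "staten island"]

keywords = [base + " " + z for z in zips] + [base + " " + a for a in areas]

# One keyword per line, ready to paste or import into the keyword list.
with open("keywords.txt", "w") as f:
    f.write("\n".join(keywords))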

So phase 1 is to put together a decent set of footprints.


Phase 2 is to scrape them all and then use the site: operator, the Link Extractor, or both to pull in as many internal pages of the sites as you can.

Then run them all through the Email Grabber. You can go to Options and turn on the option to save the URL with the email so you know where each email came from.
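
The grabbing itself boils down to something like this, with each address kept next to the page it came from. A sketch only; ScrapeBox's own Email Grabber handles the fetching for you:

import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def grab_emails(pages):
    """pages maps URL -> fetched HTML; returns (url, email) pairs."""
    found = []
    for url, html in pages.items():
        for email in sorted(set(EMAIL_RE.findall(html))):
            found.append((url, email))
    return found

# Example row: ("https://www.example-law-firm.com/contact",
#               "info@example-law-firm.com")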