Sort of. There is indeed a blacklist. However, it doesn't work in real time, so when the harvest is done you go to remove/filter >> apply users local blacklist.
You can manage the blacklist from the main menu at the top of ScrapeBox, under blacklist.
You can also simply maintain this in a text file and go to remove/filter >> remove urls containing entries from - then select that text file. This works with more than just domains.
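If you ever need to reproduce that filter outside ScrapeBox, the logic is just a substring match: any harvested URL that contains any entry from the file gets dropped. A minimal sketch in Python (the function name and sample data are mine, not ScrapeBox's):

```python
def filter_urls(harvested, entries):
    """Keep only URLs that contain none of the blacklist entries as a substring."""
    return [u for u in harvested if not any(e in u for e in entries)]

urls = ["http://spam.com/page", "http://example.com/blog?spam=1", "http://keep.me/ok"]
entries = ["spam"]  # entries can be domains, words, paths - anything, not just domains

print(filter_urls(urls, entries))  # -> ['http://keep.me/ok']
```

Note that because it is a plain substring match, "spam" also knocks out example.com's URL here - which is exactly why this option works with more than just domains.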
Or you could maintain a file of just domains and go to import >> import and compare on domain level - this will remove any domains that are in your list from the domains you harvested.
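The domain-level compare behaves differently from the substring filter: it matches on the URL's hostname rather than anywhere in the string. A rough Python equivalent of that idea, assuming ordinary http/https URLs (exact subdomain handling in ScrapeBox may differ):

```python
from urllib.parse import urlparse

def remove_domains(harvested, blacklist_domains):
    """Drop any URL whose hostname appears in the domain blacklist."""
    blocked = {d.lower() for d in blacklist_domains}
    return [u for u in harvested if urlparse(u).hostname not in blocked]

urls = ["http://spam.com/page", "http://example.com/spam-article"]

# example.com survives even though its URL contains "spam",
# because only the hostname is compared:
print(remove_domains(urls, ["spam.com"]))  # -> ['http://example.com/spam-article']
```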
None of it is real time, though. ScrapeBox will harvest everything, and you just remove the unwanted results afterward. It used to be real time, but that created multiple issues.