11-25-2016, 05:43 AM
I know SB goes a bit flaky over 1M URLs, or at least it did before. We have a list of 20M+ URLs and want to clear it of duplicate domains. SB is probably best for this, but can it cope? No need to harvest, just clean a big list. So my question is whether SB can handle extremely large lists, and is there a limit?
I would like suggestions on strategies for dealing with extremely large lists like this, please.
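Not SB-specific, but as a fallback if the tool chokes on a list this size: deduping by domain can be done as a single streaming pass that keeps the first URL seen per domain. Below is a minimal Python sketch of that idea; `dedupe_by_domain` is a made-up helper name, and the www-stripping normalisation is an assumption about what counts as the "same" domain.

```python
from urllib.parse import urlparse

def dedupe_by_domain(urls):
    """Yield the first URL seen for each domain (hypothetical helper).

    Only the set of seen domains is held in memory, not the URLs
    themselves, so a 20M-line file can be processed line by line.
    """
    seen = set()
    for url in urls:
        # Normalise: lower-case the host and strip a leading "www."
        # (assumption: www.example.com and example.com are duplicates)
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]
        if host and host not in seen:
            seen.add(host)
            yield url

if __name__ == "__main__":
    sample = [
        "http://example.com/page1",
        "https://www.example.com/page2",  # duplicate domain, dropped
        "http://other.org/x",
    ]
    print(list(dedupe_by_domain(sample)))
```

For a real 20M-line file you would pass an open file handle instead of a list and write the surviving lines straight out, so memory use stays proportional to the number of unique domains rather than the number of URLs.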
Also, has version 2 (I think it is) improved on this? I still haven't upgraded, as the old version was doing a good job.