07-30-2014, 11:58 PM
I'm not sure it's "better" to verify them first, but it would make for a "cleaner" posting in SER.
I mean SER will have to load each of the 30K URLs, match each one against a known good footprint, and then give you back the 10K good ones. Then when you go to post to those 10K, it has to load them again, match against the same set of footprints, and then post. So you loaded the 30K URLs once and the 10K URLs again, resulting in 40K URL loads.
Vs. just loading them in as targets: the 30K all get loaded, the good ones get posted to, and the bad ones get discarded, so 30K total URL loads. This would be quicker and simpler.
Unless you plan on loading the resulting list to, say, 10 projects and you are not using any of the global lists to accomplish this. Then you are loading 30K URLs to 10 projects, so 300K URL loads. In that case it's better to filter it down to 10K URLs first, since you would only need 130K total URL loads (30K from the filter process plus 10K x 10 projects).
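The arithmetic above can be sketched quickly; this is just a back-of-the-envelope comparison using the example numbers from the post (30K scraped, 10K verified, 10 projects), not anything SER itself does:

```python
# Example figures from the discussion above
scraped = 30_000    # raw scraped URLs
verified = 10_000   # URLs that match a known good footprint
projects = 10       # hypothetical number of projects reusing the list

# Single project: verify first vs. load the raw list directly as targets
verify_first = scraped + verified       # load 30K to filter, then 10K to post
direct = scraped                        # load 30K once, post/discard as you go

# Ten projects, no global list: raw list everywhere vs. filter once
no_filter = scraped * projects          # every project chews through all 30K
with_filter = scraped + verified * projects  # filter once, then 10K per project

print(verify_first)   # 40000
print(direct)         # 30000
print(no_filter)      # 300000
print(with_filter)    # 130000
```

So filtering first only pays off once the list is reused across enough projects that the one-time 30K filtering cost is outweighed by the savings per project.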