


Forum Statistics
» Members: 3,641
» Latest member: Basestian
» Forum threads: 2,763
» Forum posts: 34,943


Latest Threads
{Tisocks.com} - Socks5 Pr...
Forum: ScrapeBox Proxies
Last Post: tisocks
5 minutes ago
» Replies: 3,022
» Views: 1,062,264
[Dichvusocks.us] Service ...
Forum: ScrapeBox Proxies
Last Post: dichvusocks.us
12 minutes ago
» Replies: 7,913
» Views: 3,210,796
SocksHub.net-400,000+ Pri...
Forum: Sell Services
Last Post: SocksHub
4 hours ago
» Replies: 45
» Views: 12,660
wildcards in user black l...
Forum: General ScrapeBox Talk
Last Post: loopline
5 hours ago
» Replies: 1
» Views: 18
[Vn5socks.net] Service Se...
Forum: ScrapeBox Proxies
Last Post: DavidRock99999
7 hours ago
» Replies: 8,180
» Views: 2,625,441
[Shopsocks5.com] Service ...
Forum: ScrapeBox Proxies
Last Post: shopsocks5.com
8 hours ago
» Replies: 2,806
» Views: 359,195
ScrapeBox Email Scraper
Forum: General ScrapeBox Talk
Last Post: hummersport
9 hours ago
» Replies: 0
» Views: 14
PremSocks.com - SOCKS Pro...
Forum: ScrapeBox Proxies
Last Post: PremSocks
9 hours ago
» Replies: 22
» Views: 8,714
[Hostpoco]★SSD Linux Rese...
Forum: Sell Services
Last Post: Pocomaster
Today, 11:02 AM
» Replies: 0
» Views: 17
The essence of perfect sh...
Forum: Off Topic
Last Post: georgejtoliver
Today, 07:05 AM
» Replies: 0
» Views: 20

 
ScrapeBox Email Scraper
Posted by: hummersport - 9 hours ago - Forum: General ScrapeBox Talk - No Replies

Hello!

Please tell me:
    - how to limit the email search to a single section of a site (for example, so that I collect emails only from the "Jobs" section);
    - how to export the emails from that section ("Jobs") together with the URLs where they were found (for example as an "Email | URL" table or an "email, url" list).

In short, I need a list or table containing only the emails from the "Jobs" section, paired with the URL where each email was found.
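For illustration, the section-limited export described above boils down to two steps: keep only pages whose URL falls under the target section, then pair every email found on those pages with its page URL. This is a standalone sketch, not ScrapeBox's own feature; the page contents and addresses below are made up.

```python
import re

# Hypothetical harvested pages: {url: page_text}. In practice these would
# come from a crawl; they are hard-coded here to keep the sketch runnable.
pages = {
    "https://example.com/jobs/1": "Contact hr@example.com to apply.",
    "https://example.com/news/2": "Press: press@example.com",
    "https://example.com/jobs/3": "Send CVs to jobs@example.com",
}

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def emails_in_section(pages, section="/jobs/"):
    """Return (email, url) pairs for pages whose URL contains `section`."""
    rows = []
    for url, text in pages.items():
        if section not in url:
            continue  # skip pages outside the target section
        for email in EMAIL_RE.findall(text):
            rows.append((email, url))
    return rows

for email, url in emails_in_section(pages):
    print(f"{email} | {url}")
```

The output is exactly the "email | URL" pairing asked for, and swapping the separator for a comma gives the "email, url" list variant.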



[Attached thumbnail]

  Wildcards in user blacklist - possible?
Posted by: walkallone - Today, 09:39 AM - Forum: General ScrapeBox Talk - Replies (1)

Hello,

Is it possible to use wildcards in the user blacklist?

Why?

I want to filter out domains by pattern, like all domains ending in ".ru" or ".tk" or... (many more).

There is the option to remove URLs containing a given string, but I have quite a long list of strings to filter out, and doing it manually takes a lot of time. Is there a way to automate it, maybe through the user blacklist?

Thank you
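For what it's worth, the kind of wildcard filtering asked about here can be sketched in a few lines outside ScrapeBox. This is only an illustration (the patterns and URLs are made up): `fnmatch`-style wildcards matched against each URL's hostname.

```python
from fnmatch import fnmatch
from urllib.parse import urlparse

# Hypothetical blacklist of wildcard patterns, matched against hostnames.
blacklist = ["*.ru", "*.tk", "*spam*"]

def is_blacklisted(url, patterns=blacklist):
    """True if the URL's hostname matches any wildcard pattern."""
    host = urlparse(url).hostname or ""
    return any(fnmatch(host, pat) for pat in patterns)

urls = [
    "https://example.ru/page",
    "https://example.com/page",
    "http://foo.tk/",
    "https://buyspamnow.com/",
]
kept = [u for u in urls if not is_blacklisted(u)]
print(kept)  # only the example.com URL survives
```

One pattern per line in a text file would cover a long list without any manual filtering.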


  Perfectly Beautiful, Running Is Your Life
Posted by: georgejtoliver - Today, 06:57 AM - Forum: General ScrapeBox Talk - No Replies

To kick off their re-entry into the basketball category, PUMA honored basketball legend Walt "Clyde" Frazier with a lifetime contract today in Brooklyn, presented by PUMA Global Director of Brand and Marketing, Adam Petrick. puma basketball shoes mens has a long history in the basketball space with Walt "Clyde" Frazier and the first-ever endorsed sneaker, the PUMA Clyde, launched in 1973.

In addition to the puma cell shoes womens, a "Carnage" version of the model will also be released, rightfully dubbed the BAIT x MARVEL x Puma Cell Venom Carnage. The Black (Venom) pair will not be easy to cop, but this Red (Carnage) pair will be even harder to pick up.

Introducing the all-new reebok nano x sale, Reebok's newest and most versatile training shoe. For more than a decade, Reebok has been dedicated to creating unrivaled competitive training shoes as part of the Nano family. For its 10th iteration, Reebok is launching the Nano X, engineered to be the most versatile Nano yet, with added features that make the shoe a go-to option not only for seasoned gym-goers and world-class athletes, but also for those new to functional fitness or who enjoy a range of cross-training/high-intensity workouts.

adidas and Reebok have come together to create something revolutionary, the reebok instapump fury boost. The ultimate statement shoe, the Instapump Fury Boost is built for breaking free and creating the unexpected. The trailblazing Instapump Fury has long been the choice of influencers and iconoclasts. Now, with the unrivalled comfort of adidas Boost technology, it's back to push the limits of design and performance.

[Image: cllful642.jpg]


  Page Scanner is not recording Results
Posted by: goozleology - Yesterday, 02:41 AM - Forum: General ScrapeBox Talk - Replies (2)

Basically, I've created some footprints and the Page Scanner scans and says it's completed but there are no results.

Here's a video with more details: https://share.getcloudapp.com/7KupZAge

As far as I can tell, I've set everything up correctly.
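For anyone debugging this, it may help to see what a footprint scan reduces to conceptually: a substring check of each footprint against each page's source. This standalone sketch (pages and footprints are hypothetical, and this is not ScrapeBox's actual implementation) also shows one classic cause of "completed but zero results" in home-grown scanners, namely case-sensitive matching.

```python
# Hypothetical footprints and fetched page sources; a real run would
# download the HTML over the network.
footprints = ["Powered by WordPress", "Leave a Reply"]
pages = {
    "https://example.com/a": "<html>... powered by WordPress ...</html>",
    "https://example.com/b": "<html>nothing matching here</html>",
}

def scan(pages, footprints):
    """Map each footprint to the URLs whose source contains it."""
    results = {fp: [] for fp in footprints}
    for url, html in pages.items():
        for fp in footprints:
            # Lower-casing both sides avoids silent misses on case.
            if fp.lower() in html.lower():
                results[fp].append(url)
    return results

print(scan(pages, footprints))
```

If a strict `fp in html` check were used instead, the first page above would not match at all, which is exactly the "scan completes, results empty" symptom.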


  Possibility of adding meta description to harvester engine settings?
Posted by: walkallone - 03-03-2021, 09:54 AM - Forum: General ScrapeBox Talk - No Replies

Hello,

I have a new question.

Is it possible to use the "Additional headers / settings" option in the harvester settings to also grab the meta description or meta title when harvesting URLs?

And can I display it (the description) in the main window?

Why? In my experience it is MUCH easier to judge whether a site is relevant from the URL AND the meta description than from the URL alone, so it's much faster to filter out bad results.

Thanks.
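Whatever the harvester itself supports, extracting a title and meta description from a page's raw HTML is cheap to do in post-processing. A minimal sketch (the HTML snippet is made up; a real pipeline would use a proper HTML parser rather than regexes):

```python
import re

def title_and_description(html):
    """Pull <title> and the meta description out of raw HTML.

    Regexes keep the sketch short; they are fine for well-formed
    head sections but a parser is more robust in general.
    """
    title = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    desc = re.search(
        r'<meta[^>]+name=["\']description["\'][^>]+content=["\'](.*?)["\']',
        html, re.I | re.S)
    return (title.group(1).strip() if title else "",
            desc.group(1).strip() if desc else "")

html = ('<html><head><title>Jobs in Berlin</title>'
        '<meta name="description" content="Latest job ads."></head></html>')
print(title_and_description(html))  # ('Jobs in Berlin', 'Latest job ads.')
```

Running each harvested URL through a function like this yields exactly the "URL + description" view described above.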


  Link Extractor saved files are blank
Posted by: goozleology - 03-03-2021, 05:59 AM - Forum: General ScrapeBox Talk - Replies (1)

This is really weird because I just scraped around 1,800 URLs and found 20x that number of internal links. All of those links are in the TXT file.

Then I try to scrape ONE URL, https://www.zipcodestogo.com/ZIP-Codes-by-State.htm, and it finds 70 internal links but when I open the TXT file, there's nothing there.

I also disabled my proxies (they were getting a "socket error # 10054") and then I tried to scrape the homepage of The Onion as a test.

Same problem, it says that 33 internal links were found but nothing saved to the file.

And I tried using my three ProxyMesh proxies, my StormProxies that I just bought today, and no proxies. Every time it says it's saved but the file is empty.

Any ideas?

Also, is there a way to save scraped data with a more descriptive name? It's really difficult to find files when the file name is just a long string of numbers.

Thanks!
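On the side question about descriptive names: regardless of what the tool itself offers, a small rename step after each export gives readable filenames. A sketch under assumed conventions (the host + task + timestamp pattern here is just one reasonable choice):

```python
from datetime import datetime
from urllib.parse import urlparse

def export_name(source_url, task="internal-links"):
    """Build a readable export filename: host + task + timestamp."""
    host = urlparse(source_url).hostname or "unknown"
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    return f"{host}_{task}_{stamp}.txt"

name = export_name("https://www.zipcodestogo.com/ZIP-Codes-by-State.htm")
print(name)  # e.g. www.zipcodestogo.com_internal-links_20210304-101500.txt
```

That keeps exports sortable by time while still telling you at a glance which site and which job produced each file.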


  Limit search engine results to first 100 or less
Posted by: walkallone - 03-02-2021, 07:14 PM - Forum: General ScrapeBox Talk - Replies (2)

Hello,

I want to limit search engine results to the first 100 results or fewer.

How can I do this?

Thank you.

The reason: I'm using ScrapeBox for non-English content, and after the first 40-60 or so results the scraped sites are no longer relevant, so it's much better and faster to use long-tail keywords.
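If the engine settings can't be capped directly, trimming after the harvest achieves the same effect. A trivial sketch (the harvested list is hypothetical):

```python
def cap_results(urls, limit=100):
    """Keep only the first `limit` harvested results per query."""
    return urls[:limit]

# Hypothetical harvested list of 250 result URLs for one keyword.
harvested = [f"https://example.com/page{i}" for i in range(250)]
print(len(cap_results(harvested)))  # 100
```

Applying the cap per keyword (rather than to the merged list) preserves the "first N results per query" behaviour described above.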


  Urgent help
Posted by: mohamedabdu - 02-23-2021, 05:51 PM - Forum: General ScrapeBox Talk - Replies (1)

Hi ScrapeBox Team,

Please don't waste time; I am very interested in purchasing now.
I received and studied the payment methods you sent, but I have an issue with my PayPal account.

My ScrapeBox purchase goes through my PayPal account, so when I click the ScrapeBox purchase plugin icon I am taken to the PayPal window, and there I cannot see the "pay by card" option, as in the photo below.



I want to pay with my own credit card, not through PayPal.


  beginner problem with proxies
Posted by: walkallone - 02-09-2021, 12:02 AM - Forum: General ScrapeBox Talk - Replies (1)

Hello,
I just started using ScrapeBox today and I have a problem setting up proxies.
If I use the proxies as HTTP, I can harvest and use search engines, but (as far as I can tell) I cannot scrape HTTP sites, only HTTPS.
If I use the proxies as SOCKS, I can scrape any site (as far as I can tell), but I cannot harvest or use search engines.

Any idea why?
It's a bummer, as I have to manually switch from HTTP to SOCKS.

(I use these proxies with other tools and they work very well; they are not banned on Google or elsewhere.)


  Understanding Connection To Proxy Ratio
Posted by: jim - 02-06-2021, 06:20 PM - Forum: General ScrapeBox Talk - Replies (1)

I am watching loopline's video on safely scraping Google in 2020 and I have a fundamental misunderstanding of the terminology.

It says that there should be one connection for every 5 proxies, or that the connection ratio can vary.

How can more than one IP address make a single connection? When a page loads, does it not load from a single IP?

How would it be possible for 50 IP addresses to load one connection?

What does "connection" mean in this case? And what is a thread?
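One reading of the ratio (a sketch of the usual interpretation, not a statement of what the video or ScrapeBox actually implements): each request, i.e. each "connection", still goes through exactly one proxy; "1 connection per 5 proxies" only caps how many requests run simultaneously relative to the size of the proxy pool, so that each proxy is reused less often. All numbers below are made up.

```python
from itertools import cycle

# Hypothetical pool of 50 proxies.
proxies = [f"10.0.0.{i}:8080" for i in range(1, 51)]

RATIO = 5  # proxies per concurrent connection (the rule of thumb)
max_connections = len(proxies) // RATIO  # 50 proxies -> 10 threads at once

rotation = cycle(proxies)  # successive requests take successive proxies

def next_proxy():
    """One request = one connection = one proxy; rotation spreads the load."""
    return next(rotation)

print(max_connections)  # 10
```

Under this reading, "thread" and "connection" are interchangeable here: a worker that issues one request at a time. Fifty IPs never share one connection; rather, ten concurrent workers draw from fifty IPs, so each IP only carries about a fifth of the request rate it would at a 1:1 ratio.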
