Looplines Scrapebox List


Forum Statistics
» Members: 3,747
» Latest member: LisaKulik
» Forum threads: 834
» Forum posts: 18,668


Latest Threads
[Dichvusocks.us] Service ...
Forum: ScrapeBox Proxies
Last Post: dichvusocks.us
19 minutes ago
» Replies: 4,884
» Views: 1,346,685
[Vn5socks.net] Service Se...
Forum: ScrapeBox Proxies
Last Post: DavidRock99999
51 minutes ago
» Replies: 4,519
» Views: 935,329
[Shopsocks5.com][New shop...
Forum: ScrapeBox Proxies
Last Post: shopsocks5.com
1 hour ago
» Replies: 546
» Views: 21,714
Writing a dissertation is...
Forum: Lounge (Off-Topic Discussions)
Last Post: LisaKulik
2 hours ago
» Replies: 0
» Views: 19
Custom date scraper
Forum: General ScrapeBox Talk
Last Post: blitzme
Yesterday, 02:10 PM
» Replies: 0
» Views: 36
Help with custom crawler
Forum: General ScrapeBox Talk
Last Post: Nosh
Yesterday, 12:12 PM
» Replies: 1
» Views: 132
Model Making
Forum: Buy & Sell Services
Last Post: pammodels
Yesterday, 07:10 AM
» Replies: 0
» Views: 47
403 Error when harvesting...
Forum: General ScrapeBox Talk
Last Post: loopline
02-20-2018, 10:54 PM
» Replies: 5
» Views: 376
Fast HTTP/Socks5 proxy fo...
Forum: Buy & Sell Services
Last Post: Nosok
02-20-2018, 10:24 AM
» Replies: 11
» Views: 6,527
How do you scrape for web...
Forum: General ScrapeBox Talk
Last Post: loopline
02-17-2018, 12:42 AM
» Replies: 2
» Views: 359

 
  Custom date scraper
Posted by: blitzme - Yesterday, 02:10 PM - Forum: General ScrapeBox Talk - No Replies

Hi, I'd like to find a video on how to scrape by custom date via Google and Bing, but I can't find one. Any help?
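For reference, Google's custom date range filter can be reproduced by hand. A minimal Python sketch that builds a search URL with the `tbs=cdr` parameter (the same mechanism behind the browser's "Custom range..." tool; Bing uses a different scheme not covered here):

```python
from datetime import date
from urllib.parse import urlencode

def google_daterange_url(query, start, end):
    # Google's "Custom range..." tool passes dates via the tbs parameter
    # as cdr:1,cd_min:M/D/YYYY,cd_max:M/D/YYYY.
    tbs = (f"cdr:1,cd_min:{start.month}/{start.day}/{start.year},"
           f"cd_max:{end.month}/{end.day}/{end.year}")
    return "https://www.google.com/search?" + urlencode({"q": query, "tbs": tbs})

url = google_daterange_url("scrapebox tutorial", date(2018, 1, 1), date(2018, 1, 31))
```

Such URLs can then be fed to any harvester that accepts custom search engine URLs.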


  Help with custom crawler
Posted by: Nosh - 02-19-2018, 01:01 PM - Forum: General ScrapeBox Talk - Replies (1)

Hi everybody,
I'm trying to scrape German Amazon sellers with the Custom Crawler.

I get the CEO names and the VAT ID, but not the company name or the email.
Can anybody help me?

One example URL:
https://www.amazon.de/sp?_encoding=UTF8&...vasStoreID=

Thanks!
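In case it helps while waiting for an answer: the missing fields can often be pulled with plain regexes over the raw HTML. A minimal sketch, assuming the seller page tags the company name with a "Firmenname:" label (a guess; match it to the actual markup) and exposes a plain-text email:

```python
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def extract_fields(html):
    # Candidate emails anywhere in the page, plus whatever follows a
    # "Firmenname:" label (hypothetical -- adjust to the real markup).
    emails = EMAIL_RE.findall(html)
    m = re.search(r"Firmenname:\s*([^<\n]+)", html)
    return emails, (m.group(1).strip() if m else None)

emails, company = extract_fields("Firmenname: Beispiel GmbH<br>Kontakt: info@beispiel.de")
```

Note that Amazon renders some contact details via JavaScript or images, in which case a raw-HTML regex will come back empty no matter what.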



[Attached thumbnail]

  How do you scrape for webpages with your backlink efficiently?
Posted by: kevinmac - 02-13-2018, 04:44 PM - Forum: General ScrapeBox Talk - Replies (2)

I have a problem. I'm a little new to ScrapeBox, but I'm learning. I'm trying to scrape for all web pages that have a backlink to my website. I tried the "check links" method, but I notice that many of the results I get are not the page where my link actually is; for example, I get a lot of results like "www.somesite.com/signup". How do I get results only for the actual pages that contain my link?
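One way to post-filter such results: re-fetch each harvested URL and keep only pages whose `<a href>` attributes actually point at your domain. A minimal sketch of that check (the domain shown is a placeholder):

```python
import re

def page_links_to(html, target_domain):
    # True only if an <a href> attribute (not just the page text) points
    # at the domain, which filters out pages like /signup that merely
    # mention it without linking to it.
    hrefs = re.findall(r'href=["\']([^"\']+)', html, re.IGNORECASE)
    return any(target_domain in h for h in hrefs)
```

Running this over the fetched HTML of each "check links" result would leave only the pages that genuinely carry the backlink.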


  403 Error when harvesting anything
Posted by: Simonboy - 02-11-2018, 08:41 PM - Forum: General ScrapeBox Talk - Replies (5)

Hello,

I'm not sure if anyone has posted an answer to this (if so, please redirect me), but for the past day I can't seem to harvest anything. SB was working great for me for the past couple of months, and now when I try to use the Rapid Indexer, grab emails, or pull anything from my list of URLs, I get 403 and Timed Out errors. I tried 10 new proxies, which didn't help. I tried maxing out the timeout; that didn't help either.

Can anyone assist? Why am I suddenly getting these errors with all my harvesting?
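A common culprit for blanket 403s is the client's request headers rather than the proxies. A small stdlib-Python sketch, for illustration only, of sending a browser-like User-Agent so that possibility can be ruled out:

```python
from urllib.request import Request

# A desktop-browser User-Agent string; many sites answer the default
# Python-urllib User-Agent with 403.
BROWSER_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
              "AppleWebKit/537.36 (KHTML, like Gecko) "
              "Chrome/64.0.3282.140 Safari/537.36")

def browser_request(url):
    # Build a request that presents itself with browser-like headers.
    return Request(url, headers={"User-Agent": BROWSER_UA})

req = browser_request("https://example.com/")
```

`urllib.request.urlopen(req)` would then perform the fetch; if that succeeds where a default-header request fails, the 403 is header-based, not proxy-based.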


  Autosave passed proxies to file issue
Posted by: ClintN - 02-09-2018, 10:12 AM - Forum: General ScrapeBox Talk - Replies (1)

Hey there,

The Proxy Manager configuration's "Autosave Proxies File" function actually deletes the designated (empty) file when I run the test proxies function.

I can trick ScrapeBox into not deleting the file by having ScrapeBox first save all of the proxies to be tested into that file. However, when running the test proxies function, the file size does not change (increase) every 60 seconds, so I can only assume this is not working.

Are there any other settings that could be preventing this from working properly?


  ScrapeBox Name Feature
Posted by: Santiago.TDI - 02-08-2018, 01:34 AM - Forum: General ScrapeBox Talk - Replies (1)

Hi, I'm new to the software and to scraping; I'm just learning about it today. I have a web page from which I want to extract the phone number, email, and name of the person listed on the page, but I don't see any options for how to do it.

I hope it's not a big trouble; if someone has done this in the past, please let me know whether I'm using the right software or not.

thanks very much
best regards
STM


  I'm pulling my hair out trying to find an older version; email scraping on Craigslist is not working
Posted by: Majesticsword - 01-31-2018, 04:55 AM - Forum: General ScrapeBox Talk - Replies (1)

There is a nice socket error 10054.

I'm wasting loads of time looking on spammy websites for free downloads of the old version, to no avail. What do I do?

Does anyone have a link or a previous-version download? There doesn't seem to be a way to revert in the Help section of the software.


  BUG - Automator plugin
Posted by: skibbbi - 01-31-2018, 02:35 AM - Forum: General ScrapeBox Talk - Replies (3)

If I check "Append to file", the Automator adds only a : (colon) to the file and nothing more. If I uncheck this option, everything works OK, but of course the Automator then overwrites the file.

ScrapeBox and the plugin are both up to date.

Regards,



[Attached thumbnail]

  Scrape Mobile number Data
Posted by: Ammie Chng - 01-30-2018, 06:21 AM - Forum: General ScrapeBox Talk - Replies (1)

Hi, does anyone know how to scrape contact numbers from different countries, e.g. Singapore, Malaysia, or Indonesia? Or could someone provide regex patterns to suit the phone number formats of different countries?
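As a starting point, here are rough, illustrative regexes for the three countries mentioned; real-world numbers vary in spacing, dashes, and landline vs mobile prefixes, so treat these as sketches to refine rather than authoritative patterns:

```python
import re

# Rough, illustrative patterns only -- refine them before relying on them.
PHONE_PATTERNS = {
    "SG": re.compile(r"(?:\+65[ -]?)?[689]\d{3}[ -]?\d{4}"),          # Singapore
    "MY": re.compile(r"(?:\+60|0)1\d[ -]?\d{3,4}[ -]?\d{4}"),         # Malaysia (mobile)
    "ID": re.compile(r"(?:\+62|0)8\d{1,2}[ -]?\d{3,4}[ -]?\d{3,5}"),  # Indonesia (mobile)
}

def find_numbers(text, country):
    # Return every substring matching the country's pattern.
    return PHONE_PATTERNS[country].findall(text)
```

These can be dropped into any tool that accepts a custom regex for harvesting, or run over scraped page text directly.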


  How to grab links by crawling more than one site at the same time?
Posted by: skibbbi - 01-27-2018, 10:14 PM - Forum: General ScrapeBox Talk - Replies (2)

Hi,

Of course I know about Grab / Check -> Grab links by crawling a site, but it works for only one site at a time.
I need a way to do this simultaneously for many URLs.
Any suggestions?

Regards,
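Outside ScrapeBox itself, a one-site link grab can be fanned out across many sites with a thread pool. A minimal Python sketch (the `fetch` callable is a placeholder for whatever performs the HTTP request):

```python
import re
from concurrent.futures import ThreadPoolExecutor

LINK_RE = re.compile(r'href=["\']([^"\']+)')

def grab_links(url, fetch):
    # Extract every href target from one page; fetch(url) returns its HTML.
    return LINK_RE.findall(fetch(url))

def grab_links_many(urls, fetch, workers=10):
    # Run the single-site grab for many sites in parallel threads and
    # map each start URL to the links found on it.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(zip(urls, pool.map(lambda u: grab_links(u, fetch), urls)))
```

Because the work is I/O-bound, threads give near-linear speedup here; a real crawler would also follow the grabbed links recursively per site.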
