Crawl Domains to Find Expired Domains -- 2
$10-15 USD
Paid on delivery
We are looking for a crawler that visits every page of a website, looking for external links that point to expired domains.
The user should define a list of sites to crawl via a text file. The crawler should work logically, crawling all pages of a site by following links rather than depending on a sitemap. Only unique external domains should be logged, to prevent duplicate domain-availability lookups.
The user should also be able to define a list of domains to skip when checking availability, e.g. [login to view URL], etc.; these domains should be listed in a user-defined blacklist text file.
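The link-extraction and blacklist-filtering steps above could be sketched as follows. This is a minimal illustration using only the Python standard library; the function and class names (`ExternalLinkParser`, `external_domains`) are illustrative, not part of the brief.

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class ExternalLinkParser(HTMLParser):
    """Collects unique external domains found in anchor tags on one page."""

    def __init__(self, base_domain, blacklist):
        super().__init__()
        self.base_domain = base_domain
        self.blacklist = set(blacklist)
        self.external_domains = set()

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href") or ""
        # Relative links have an empty netloc and are skipped automatically.
        domain = urlparse(href).netloc.lower().removeprefix("www.")
        if domain and domain != self.base_domain and domain not in self.blacklist:
            self.external_domains.add(domain)

def external_domains(html, base_domain, blacklist=()):
    """Return the set of unique external domains linked from one HTML page."""
    parser = ExternalLinkParser(base_domain, blacklist)
    parser.feed(html)
    return parser.external_domains
```

Deduplicating into a set, as the brief requires, means each external domain triggers at most one availability lookup no matter how many pages link to it.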
Results should be written to a CSV file listing the linking domain and the available domain.
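The CSV output could look like the sketch below. The column names are assumptions, since the brief only says "linking domain and available domain"; the actual availability check (via WHOIS or a registrar API) is out of scope here and would feed the `rows` argument.

```python
import csv
import io

def write_results(rows, fileobj):
    """Write (linking_domain, available_domain) pairs as CSV with a header row."""
    writer = csv.writer(fileobj)
    writer.writerow(["linking_domain", "available_domain"])
    writer.writerows(rows)

# Demonstrated with an in-memory buffer; in practice you would use
# open("results.csv", "w", newline="") instead.
buf = io.StringIO()
write_results([("mysite.com", "old-blog.net")], buf)
```

Using the `csv` module rather than hand-joining strings keeps domains containing commas or quotes from corrupting the file.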
Project No.: #19765484
About the project
6 freelancers are bidding an average of $28 for this job
Hi, I have gone through your requirement to scrape lots of websites. I am an EXPERT in building scraping tools/scripts, so I can SURELY work on your project. I have 4 YEARS of EXPERIENCE in developing PHP-PYTHO …
Dear Prospective Hiring Manager, thank you for giving me a chance to bid on your project. I am a serious bidder here and I have already worked on a similar project before, and can deliver as you have mentioned. "I can do th …
I have experience crawling data using bs4, Scrapy, etc. with Python, and extracting data to XML, JSON, etc. Contact me!
Hi there, JUNA here. I understand that you need a crawler or spider for scraping expired domains, but my question is: will you provide the list of domains that you need to check for availability? Otherwise this sho …