dcrawl is a simple, but smart, multithreaded web crawler for randomly gathering huge lists of unique domain names.
How does dcrawl work?
dcrawl takes one site URL as input and detects all a href= links in the site's body. Each link found is put into the queue. Each queued link is then crawled in the same way, branching out to more URLs found in the links on each page's body.
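As a rough illustration of that loop, here is a minimal, single-threaded sketch in Go. It is not dcrawl's actual code: the seed URL, the page cap, and the use of golang.org/x/net/html for parsing are assumptions for the example, and dcrawl itself adds multithreading and domain filtering on top of this idea.

```go
// Minimal sketch of the crawl loop: fetch a page, pull out all <a href=...>
// links, queue them, and repeat. Assumes golang.org/x/net/html for parsing.
package main

import (
	"fmt"
	"net/http"
	"net/url"

	"golang.org/x/net/html"
)

// extractLinks returns the resolved href values of all <a> tags in the body.
func extractLinks(base *url.URL, resp *http.Response) []string {
	var links []string
	doc, err := html.Parse(resp.Body)
	if err != nil {
		return links
	}
	var walk func(*html.Node)
	walk = func(n *html.Node) {
		if n.Type == html.ElementNode && n.Data == "a" {
			for _, a := range n.Attr {
				if a.Key == "href" {
					if u, err := base.Parse(a.Val); err == nil {
						links = append(links, u.String())
					}
				}
			}
		}
		for c := n.FirstChild; c != nil; c = c.NextSibling {
			walk(c)
		}
	}
	walk(doc)
	return links
}

func main() {
	queue := []string{"https://example.com"} // seed URL (placeholder)
	seen := map[string]bool{}
	const maxPages = 100 // arbitrary cap so the sketch terminates

	for len(queue) > 0 && len(seen) < maxPages {
		next := queue[0]
		queue = queue[1:]
		if seen[next] {
			continue
		}
		seen[next] = true

		resp, err := http.Get(next)
		if err != nil {
			continue
		}
		base, _ := url.Parse(next)
		for _, link := range extractLinks(base, resp) {
			if !seen[link] {
				queue = append(queue, link)
			}
		}
		resp.Body.Close()
		fmt.Println(next)
	}
}
```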
dcrawl Web Crawler Features
- Branching out only to a predefined number of links found per hostname.
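One way such a per-hostname limit can be enforced (a sketch under assumed names, not dcrawl's actual implementation; maxPerHost is an illustrative value) is to count how many links have already been accepted from each hostname and drop the rest:

```go
// Sketch of a per-hostname branching limit; maxPerHost is an assumed cap.
package main

import (
	"fmt"
	"net/url"
)

const maxPerHost = 5 // assumed cap on links accepted per hostname

// hostCount tracks how many links have been queued for each hostname.
var hostCount = map[string]int{}

// shouldQueue reports whether a newly found link should be added to the
// crawl queue, enforcing the per-hostname cap.
func shouldQueue(rawURL string) bool {
	u, err := url.Parse(rawURL)
	if err != nil || u.Hostname() == "" {
		return false
	}
	host := u.Hostname()
	if hostCount[host] >= maxPerHost {
		return false // already branched out enough from this host
	}
	hostCount[host]++
	return true
}

func main() {
	links := []string{
		"https://example.com/a",
		"https://example.com/b",
		"https://example.org/x",
	}
	for _, l := range links {
		fmt.Println(l, shouldQueue(l))
	}
}
```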