Web Application Analysis
Tags: web crawler, endpoints, urls, javascript, golang, discovery

hakrawler

A fast Golang web crawler designed for easy, quick discovery of endpoints and assets, gathering URLs and JavaScript file locations.

Description

hakrawler is a simple web crawler built on the Gocolly library, focused on rapid discovery of web endpoints and assets. It helps security professionals and penetration testers identify URLs and JavaScript files during the reconnaissance phase of web application testing.

Use cases include automated HTTP endpoint discovery as part of web application enumeration methodologies. It is particularly useful for quickly mapping out a web application's structure without manual browsing.

The tool is lightweight with an installed size of 9.37 MB and depends on libc6. It integrates into Kali Linux workflows for efficient asset gathering.

How It Works

hakrawler operates as a Golang-based web crawler using the Gocolly library to traverse websites. It follows links to a configurable depth, extracts URLs and JavaScript file locations, and supports custom headers and TLS options for flexible crawling.

Installation

sudo apt install hakrawler
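Outside of Kali, hakrawler can also be installed with the Go toolchain (this assumes Go is installed and $(go env GOPATH)/bin is on your PATH):

```shell
# Install the latest hakrawler release into $(go env GOPATH)/bin
go install github.com/hakluke/hakrawler@latest
```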

Flags

-d int        Depth to crawl. (default 2)
-h string     Custom headers separated by two semi-colons. E.g. -h "Cookie: foo=bar;;Referer: http://example.com/"
-insecure     Disable TLS verification.
-json         Output as JSON.

Examples

Display usage information and available flags.
hakrawler --help
Crawl example.com to depth 3 to discover endpoints (hakrawler reads target URLs from stdin).
echo https://example.com | hakrawler -d 3
Crawl with custom headers for authenticated endpoint discovery.
echo https://target.com | hakrawler -h "Cookie: session=abc123;;User-Agent: Mozilla/5.0"
Crawl an HTTPS site with TLS verification disabled.
echo https://target.com | hakrawler -insecure
Crawl and output discovered URLs in JSON format.
echo https://target.com | hakrawler -json
Crawl to depth 2 with JSON output and a custom Referer header.
echo https://example.com | hakrawler -d 2 -json -h "Referer: http://example.com"
Updated 2026-04-16 · kali.org ↗