Web Application Analysis
Tags: web, scanner, security, reconnaissance, vulnerability, crawl, dictionary

Skipfish

Skipfish is a fully automated active web application security reconnaissance tool that performs recursive crawls and dictionary-based probes to generate an interactive sitemap annotated with security checks.

Description

Skipfish prepares an interactive sitemap for the targeted site by carrying out a recursive crawl and dictionary-based probes. The resulting map is then annotated with the output from a number of active but hopefully non-disruptive security checks. The final report generated by the tool is meant to serve as a foundation for professional web application security assessments.

It is designed for fully automated security reconnaissance of web applications, providing comprehensive coverage through crawling and probing. The tool outputs detailed statistics on scan progress, HTTP requests, database pivots, and issues found categorized by impact level.

How It Works

Skipfish builds its sitemap by crawling the target recursively and probing for hidden content using dictionary wordlists. It then runs active security checks to annotate the map and generates a report containing crawl trees, summary views, and issue details. Along the way it handles HTTP requests, TCP handshakes, and compression, tracks pivots, nodes, and signatures, and saves output (including static resources and pivot data) for use by third-party tools.
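The crawl-and-probe workflow can be sketched as a short shell session. The target URL and dictionary path below are assumptions (Kali typically ships skipfish wordlists under /usr/share/skipfish/dictionaries); the -W wordlist must be writable, so the sketch works on a copy rather than the shipped file.

```shell
TARGET="http://192.168.1.202/wordpress"   # assumed example target
WORDLIST="./scan.wls"
OUTDIR="./skipfish-202"

# The -W wordlist is read-write, so copy a shipped dictionary (assumed
# Kali path) instead of scanning with the original:
if [ -f /usr/share/skipfish/dictionaries/minimal.wls ]; then
  cp /usr/share/skipfish/dictionaries/minimal.wls "$WORDLIST"
else
  touch "$WORDLIST"   # empty fallback so the sketch stays runnable
fi

# Recursive crawl + dictionary probes (run only against authorized targets):
CMD="skipfish -W $WORDLIST -o $OUTDIR $TARGET"
echo "$CMD"
```

Printing the command instead of executing it keeps the sketch safe to paste; drop the echo and run the command directly once you have authorization for the target.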

Installation

sudo apt install skipfish

Flags

-o dir           write output to specified directory (required)
-W wordlist      use a specified read-write wordlist (required)
-A user:pass     use specified HTTP authentication credentials
-F host=IP       pretend that 'host' resolves to 'IP'
-C name=val      append a custom cookie to all requests
-H name=val      append a custom HTTP header to all requests
-b (i|f|p)       use headers consistent with MSIE / Firefox / iPhone
-N               do not accept any new cookies
--auth-form url  form authentication URL
-d max_depth     maximum crawl tree depth (default: 16)
-c max_child     maximum children to index per node (default: 512)
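Several of the flags above can be combined in one invocation. In the sketch below, the target URL, credentials, and cookie value are placeholders, not real defaults; the command is printed rather than executed because it needs a live, authorized target.

```shell
# Combine flags from the table above: Firefox-like headers (-b f),
# no new cookies (-N), HTTP auth, a session cookie, and crawl limits.
set -- skipfish \
  -b f -N \
  -A admin:secret \
  -C "PHPSESSID=abc123" \
  -d 10 -c 256 \
  -W ./scan.wls -o ./out \
  http://192.168.1.202/app

CMDLINE="$*"
echo "$CMDLINE"   # print instead of running; remove to execute
```

Lowering -d and -c as shown bounds the crawl on large sites, trading coverage for scan time.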

Examples

Scan the web application at http://192.168.1.202/wordpress, writing output to the directory 202 (-o 202)
skipfish -o 202 http://192.168.1.202/wordpress
Display help and usage information for skipfish
skipfish -h
Basic usage pattern: specify options, required wordlist, output directory, and start URL(s)
skipfish [ options ... ] -W wordlist -o output_dir start_url
Scan with HTTP authentication credentials
skipfish -A user:pass -W wordlist -o output_dir http://example.com
Append a custom cookie to all requests during scan
skipfish -C name=val -W wordlist -o output_dir http://example.com
Limit maximum crawl tree depth to 10
skipfish -d 10 -W wordlist -o output_dir http://example.com
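After any of the scans above finishes, skipfish writes an interactive HTML report into the -o output directory, with index.html as the entry point. A small sketch for locating it (the output directory name below is an assumption):

```shell
OUTDIR="./skipfish-202"           # assumed output directory name
REPORT="$OUTDIR/index.html"       # skipfish's report entry point
if [ -f "$REPORT" ]; then
  echo "report ready: $REPORT"    # open it, e.g. xdg-open "$REPORT"
else
  echo "no report yet -- run a scan first"
fi
```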
Updated 2026-04-16 · kali.org