Crawling is a technique that recursively follows links on a target website to map out its architecture. This map sometimes reveals interesting endpoints (admin login pages, APIs...) that testers can focus on.
echo $URL | hakrawler
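Under the hood, a crawler extracts the links from each page it visits and follows them recursively until no new pages are found. A minimal sketch of that loop, using only the standard library (the in-memory `site` map is purely illustrative; a real crawler would fetch pages over HTTP):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href targets of every <a> tag in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(site, start, visited=None):
    """Recursively follow links, recording every page reached."""
    if visited is None:
        visited = set()
    if start in visited or start not in site:
        return visited
    visited.add(start)
    parser = LinkExtractor()
    parser.feed(site[start])
    for link in parser.links:
        crawl(site, link, visited)
    return visited

# Hypothetical site map: paths mapped to their HTML content.
site = {
    "/": '<a href="/about">About</a> <a href="/admin/login">Admin</a>',
    "/about": '<a href="/">Home</a>',
    "/admin/login": "",  # the kind of path a tester would flag
}

print(sorted(crawl(site, "/")))
# → ['/', '/about', '/admin/login']
```

Dedicated tools like hakrawler add HTTP fetching, scope control, and JavaScript/robots.txt parsing on top of this basic loop.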
Burp Suite's graphical interface is a great alternative: open Dashboard > New scan, select the "Crawl" scan type, and enter the target URL.
Once the crawl is complete, testers should inspect the website architecture and look for admin paths, unusual redirections, and anything else that could lead to a potential vulnerability.
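A quick way to triage a large crawl is to filter the discovered paths for sensitive-looking keywords. A minimal sketch (the path list and keyword set below are illustrative, not exhaustive):

```python
import re

# Hypothetical list of paths produced by a crawl.
paths = ["/", "/about", "/admin/login", "/api/v1/users", "/blog/post-1"]

# Keywords that often mark sensitive functionality.
interesting = re.compile(r"admin|login|api|redirect|backup", re.IGNORECASE)

print([p for p in paths if interesting.search(p)])
# → ['/admin/login', '/api/v1/users']
```

Hits from a filter like this are starting points for manual review, not vulnerabilities in themselves.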