Marial's Notes
recon_00 (/robots.txt)

Last updated 7 months ago

Was this helpful?

View the exercise here: PentesterLab: Recon 00

OBJECTIVE

For this challenge, your goal is to retrieve the robots.txt from the main website for hackycorp.com.

THE ROBOTS.TXT FILE

The robots.txt file tells web spiders how to crawl a website. To keep confidential pages from being indexed and searchable, webmasters often use this file to tell spiders which paths to skip; this is done with the Disallow keyword. You can find more about the robots.txt file by reading the Robots exclusion standard.
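As an illustration of the Disallow keyword, a minimal robots.txt might look like the following (the paths here are made up, not taken from hackycorp.com):

```
User-agent: *
Disallow: /admin/
Disallow: /backup/
```

Each Disallow line names a path that compliant crawlers should avoid; note that this is only a polite request, so the file itself is readable by anyone, which is exactly what makes it useful during recon.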

SOLUTION

Checking the /robots.txt file reveals paths that are restricted from search engine indexing. These paths may contain sensitive or hidden information that attackers can exploit, making it essential to review during reconnaissance.

We'll find the flag for this challenge in hackycorp.com/robots.txt.
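One quick way to grab the file and list its restricted paths is a short script. This is a sketch using Python's standard library; the `disallowed_paths` helper is our own name, not part of the exercise, and the fetch line is commented out since it needs network access:

```python
from urllib.request import urlopen

def disallowed_paths(robots_text: str) -> list[str]:
    """Extract the paths listed after 'Disallow:' directives."""
    paths = []
    for line in robots_text.splitlines():
        line = line.strip()
        if line.lower().startswith("disallow:"):
            paths.append(line.split(":", 1)[1].strip())
    return paths

# Fetch the target's robots.txt (requires network access):
# text = urlopen("http://hackycorp.com/robots.txt").read().decode()
# print(disallowed_paths(text))

# Offline demonstration with a made-up file:
sample = "User-agent: *\nDisallow: /admin/\nDisallow: /backup/\n"
print(disallowed_paths(sample))  # → ['/admin/', '/backup/']
```

Visiting each disallowed path in a browser (or with curl) is then the natural next step, since those are the locations the site owner did not want indexed.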
