- Web recon is the foundation of a thorough security assessment
- Primary goals of web recon:
- Identifying assets
- Discovering hidden info
- Analyzing attack surface
- Gathering intel
## Types of Recon
- Active recon: the attacker interacts directly with the target to gather info
- Usually provides a more comprehensive view of the target's infrastructure and security posture, but carries a higher risk of detection; example commands are sketched after the table below
| Technique | Description | Example | Tools | Risk of Detection |
| ------------------------ | ------------------------------------------------------------------------ | ------------------------------------------------------------------------------------------------------------------------------------- | ---------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------ |
| `Port Scanning` | Identify open ports and services | Using Nmap to scan a web server for open ports like 22, 80, 443, 445 | Nmap, Masscan, Unicornscan | High: Direct interaction with the target can trigger IDS/IPS and FWs |
| `Vulnerability Scanning` | Probe target for known vulns, e.g., outdated SW or misconfigurations | Running Nessus against a web application to check for SQL injection flaws or XSS vulns | Nessus, OpenVAS, Nikto | High: Vulnerability scanners send exploit payloads that security solutions can detect. |
| `Network Mapping` | Map target's topology, including connected devices and relationships | Using traceroute to determine the path packets take to reach the target server, revealing potential network hops and infrastructure. | Traceroute, Nmap | Medium to High: Excessive or unusual network traffic can raise suspicion. |
| `Banner Grabbing` | Retrieve service banners | Connecting to a web server on port 80 and examining the HTTP banner to identify the web server SW and version info | Netcat, curl | Low: Banner grabbing typically involves minimal interaction but can still be logged. |
| `OS Fingerprinting` | Identify target's OS | Using Nmap's OS detection capabilities (`-O` flag) | Nmap, Xprobe2 | Low: OS detection probes are relatively subtle, but advanced monitoring can still detect them. |
| `Service Enumeration` | Determine the specific versions of open services | Using Nmap's service version detection (`-sV` flag) | Nmap | Low: Similar to banner grabbing, service enumeration can be logged but is less likely to trigger alerts. |
| `Web Spidering` | Crawl the target website to identify web pages, directories, files, etc. | Running a web crawler like Burp Suite Spider or OWASP ZAP Spider to map out the structure of a website and discover hidden resources. | Burp Suite Spider, OWASP ZAP Spider, Scrapy (customisable) | Low to Medium: Can be detected if the crawler's behaviour is not carefully configured to mimic legitimate traffic. |
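
A minimal sketch of the Nmap-based rows above (port scanning, service enumeration, and OS fingerprinting), assuming `target.example.com` is a placeholder host that is in scope:

```bash
# Port scanning: SYN scan of a few common ports (raw packets require root)
sudo nmap -sS -p 22,80,443,445 target.example.com

# Service enumeration: probe open ports for the software name and version behind them
nmap -sV -p 80,443 target.example.com

# OS fingerprinting: send crafted probes and compare responses against Nmap's OS database
sudo nmap -O target.example.com
```

All three talk to the target directly, so expect them to appear in firewall and IDS logs.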
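
Vulnerability scanning, network mapping, and banner grabbing can be sketched the same way; again, `target.example.com` is a placeholder:

```bash
# Vulnerability scanning: Nikto checks a web server for known issues and misconfigurations
nikto -h http://target.example.com

# Network mapping: trace the hops between you and the target
traceroute target.example.com

# Banner grabbing: the Server response header often reveals the web server software and version
curl -I http://target.example.com

# The same banner grab done manually with netcat (HTTP/1.0, so no Host header is required)
printf 'HEAD / HTTP/1.0\r\n\r\n' | nc target.example.com 80
```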
- Passive recon: the attacker gathers info about the target without interacting with it directly
- Typically relies on publicly available info and resources
- Stealthier but generally less comprehensive; example commands are sketched after the table below
| Technique | Description | Example | Tools | Risk of Detection |
| ----------------------- | ------------------------------------------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------- | ----------------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------ |
| `Search Engine Queries` | Utilize search engines to uncover information, e.g., websites, social media profiles, and news articles | Searching Google for "`[Target Name] employees`" to find employee information or social media profiles. | Google, DuckDuckGo, Bing, and specialized search engines (e.g., Shodan) | Very Low: Search engine queries are normal internet activity and unlikely to trigger alerts. |
| `WHOIS Lookups` | Query WHOIS databases to retrieve domain registration info | Performing a WHOIS lookup on a target domain to find the registrant's name, contact information, and name servers. | whois command-line tool, online WHOIS lookup services | Very Low: WHOIS queries are legitimate and do not raise suspicion. |
| `DNS` | Analyze DNS records to identify subdomains, mail servers, etc. | Using `dig` to enumerate subdomains of a target domain. | dig, nslookup, host, dnsenum, fierce, dnsrecon | Very Low: DNS queries are essential for internet browsing and are not typically flagged as suspicious. |
| `Web Archive Analysis` | Examine historical snapshots of the target's website to identify changes, vulns, or hidden info | Using the Wayback Machine to view past versions of a target website to see how it has changed over time. | Wayback Machine | Very Low: Accessing archived versions of websites is a normal activity. |
| `Social Media Analysis` | Gather info from social media platforms, e.g., LinkedIn | Searching LinkedIn for employees of a target organisation to learn about their roles, responsibilities, and potential social engineering targets. | LinkedIn, Twitter, Facebook, specialised OSINT tools | Very Low: Accessing public social media profiles is not considered intrusive. |
| `Code Repositories` | Analyze publicly accessible code repositories (e.g., GitHub) for exposed creds or vulns | Searching GitHub for code snippets or repositories related to the target that might contain sensitive information or code vulnerabilities. | GitHub, GitLab | Very Low: Public code repositories are meant to be browsed, and searching them is not suspicious. |
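
A minimal sketch of the WHOIS and DNS rows, using `example.com` as a stand-in domain (`ns1.example.com` is a hypothetical name server you would take from the NS results):

```bash
# WHOIS: registrant, contact, and name server details for the domain
whois example.com

# DNS: name servers, mail servers, and individual host records
dig +short example.com NS
dig +short example.com MX
dig +short www.example.com A

# Zone transfer attempt (borderline active: this queries the target's name server directly
# and is almost always refused, but it is cheap to try)
dig axfr example.com @ns1.example.com
```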
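
Search engine and web archive lookups are normally done in a browser, but both can be sketched from the command line; the Wayback Machine availability endpoint used below is assumed to be reachable as shown:

```bash
# Search engine queries: example Google dorks typed into the search box
#   site:example.com filetype:pdf
#   site:example.com inurl:admin
#   "Example Corp" employees site:linkedin.com

# Web archive analysis: ask the Wayback Machine whether a snapshot exists near a given date
curl -s "https://archive.org/wayback/available?url=example.com&timestamp=20200101"
```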
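
Code repository searches can also be scripted against the GitHub API; this is a sketch assuming an unauthenticated repository search is enough (authenticated code search gives far better coverage):

```bash
# Code repositories: look for public repos that mention the target domain
# (unauthenticated requests are heavily rate-limited; add a token for real work)
curl -s "https://api.github.com/search/repositories?q=example.com" | head -n 40
```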