Recon

Recon is an essential part of any red teaming activity, and every professional penetration test begins with this step. The goal of this phase is to collect as much information about the target as possible. This includes technical details about the network's topology and systems, but also information about employees and the company itself that may be useful later in the engagement. The more information you collect during the reconnaissance phase, the more likely you are to succeed later in the penetration test. There are two types of reconnaissance:
#Passive Recon
Passive recon is the process of gathering information about a target without directly interacting with it. You don't send any kind of request to the target, so it has no way of knowing you're gathering information. In general, passive information gathering relies on publicly available resources that contain information about the target; using such public resources is known as open-source intelligence (OSINT). OSINT can be used to gather IP addresses, domain names, email addresses, names, hostnames, DNS records, and even the software running on a website and the CVEs associated with it. Here are the steps used in passive recon:
#1. Target and its Acquisitions Information Gathering
We can use websites like LinkedIn and BuiltWith to gather information about the target company and its acquisitions.
Tools for Acquisition Information
Crunchbase
Wikipedia
#2. Email Gathering
Organization email addresses can be used for brute-force attacks, phishing campaigns, etc.; a tiny extraction sketch follows the tool list below.
Tools:-
Phonebook
Crosslinked
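A minimal sketch of the idea, assuming we already saved some public page (a search result, a team page, etc.) as page.html; the file name and the example.com domain are placeholders:

```python
import re

TARGET_DOMAIN = "example.com"  # placeholder target domain

# Match addresses on the target's domain only
email_re = re.compile(r"[A-Za-z0-9._%+-]+@" + re.escape(TARGET_DOMAIN))

# Assumes the page source was saved to page.html beforehand (hypothetical file)
with open("page.html", encoding="utf-8", errors="ignore") as f:
    text = f.read()

for address in sorted(set(email_re.findall(text))):
    print(address)
```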
#3. Cloud Asset Enumeration
Most organizations use cloud assets to store their data. S3 buckets and EC2 instances are the two cloud asset types most commonly targeted by attackers. Both manual and automated approaches are used to gather cloud assets for better results; a small automated sketch follows the lists below.
For Automation:-
Slurp
S3Scanner
AWSBucketDump
CloudBrute
For Manual:-
GitHub Dorking
Google Dorking
Bitbucket Dorking
GitLab Dorking
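As a rough illustration of the automated approach, the sketch below probes a few candidate S3 bucket names derived from the company name and interprets the HTTP status code; the bucket names are made-up placeholders, not real targets:

```python
import requests

# Candidate bucket names guessed from the company name (made-up examples)
candidates = ["example-corp", "example-corp-backups", "example-corp-assets"]

for bucket in candidates:
    url = f"https://{bucket}.s3.amazonaws.com"
    try:
        r = requests.get(url, timeout=5)
    except requests.RequestException:
        continue
    if r.status_code == 200:
        print(f"[open]    {bucket} (listing allowed)")
    elif r.status_code == 403:
        print(f"[private] {bucket} (exists, access denied)")
    elif r.status_code == 404:
        print(f"[none]    {bucket} (no such bucket)")
```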
#4. CIDR Ranges
Classless Inter-Domain Routing (CIDR) notation describes a range of IP addresses that a network uses. A CIDR address looks like a normal IP address, except that it ends with a slash followed by a number (the prefix length), e.g. 10.0.0.0/16.
We can enumerate the target's CIDR ranges to widen the scope of the engagement; see the sketch after the tool list below.
Tools:-
Hardcidr
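A quick sketch of expanding a CIDR range into individual host addresses with Python's standard ipaddress module (the range below is a documentation example network, not a real target):

```python
import ipaddress

# Expand a CIDR range into its usable host addresses
network = ipaddress.ip_network("192.0.2.0/28")

print(f"{network.num_addresses} addresses, netmask {network.netmask}")
for host in network.hosts():  # excludes the network and broadcast addresses
    print(host)
```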
#5. Subdomain Enumeration
A subdomain is additional information added to the beginning of a website's domain name. It allows a site to separate and organize content for a specific function, such as a blog or an online store, from the rest of the website. Automation is the best way to do subdomain enumeration.
Tools:-
Amass
Subfinder
Sublist3r
Findomain
SubBrute
Anubis
AORT (All-In-one-recon tool)
Rengine
Dismap
After collecting the raw subdomains, we can use tools to validate them and identify the live subdomains for further steps; a minimal validation sketch follows the tool list below.
Tools:-
HTTPX
DNSX
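A minimal sketch of the validation step, assuming a list of raw subdomains from the tools above (the names here are placeholders): resolve each name over DNS, then probe it over HTTPS/HTTP.

```python
import socket
import requests

# Raw subdomains collected by the enumeration tools (placeholder names)
raw_subdomains = ["www.example.com", "dev.example.com", "mail.example.com"]

for sub in raw_subdomains:
    # Step 1: does the name resolve at all?
    try:
        ip = socket.gethostbyname(sub)
    except socket.gaierror:
        continue

    # Step 2: does it answer over HTTPS or HTTP?
    for scheme in ("https", "http"):
        try:
            r = requests.get(f"{scheme}://{sub}", timeout=5)
            print(f"{sub} ({ip}) -> {scheme.upper()} {r.status_code}")
            break
        except requests.RequestException:
            continue
```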
#6. Parameter Gathering
URL parameters expose endpoints that can be used to attack the target, and hidden or forgotten endpoints often lead to bugs; a sketch of mining archived URLs follows the tool list below.
Tools:-
Waybackurls
Gau
Gauplus
Paramspider
Gospider
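One common passive source for parameters is the Wayback Machine's CDX API, which the tools above also query. A small sketch, with example.com as a placeholder target:

```python
from urllib.parse import urlparse, parse_qs
import requests

# Pull archived URLs for the target domain from the Wayback Machine CDX API
cdx = "http://web.archive.org/cdx/search/cdx"
params = {"url": "example.com/*", "fl": "original", "collapse": "urlkey"}
resp = requests.get(cdx, params=params, timeout=30)

seen = set()
for line in resp.text.splitlines():
    seen.update(parse_qs(urlparse(line).query).keys())

# Parameter names observed anywhere in the site's archived history
for name in sorted(seen):
    print(name)
```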
#7. Javascript File Enumeration
JavaScript files are essential for the client-side functioning of a website, so they often contain things like API keys, endpoints, parameters, and tokens.
We can enumerate JavaScript files to find sensitive data inside them; a sketch of collecting a page's script files follows the tool list below.
Tools:-
JS Parser
LinkFinder
jsendpoints
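A tiny sketch of the collection step: fetch a page and pull out every script src it references (example.com is a placeholder):

```python
import re
from urllib.parse import urljoin
import requests

base = "https://example.com/"  # placeholder target page
html = requests.get(base, timeout=10).text

# Collect the JavaScript files referenced by the page
js_files = {urljoin(base, src)
            for src in re.findall(r'<script[^>]+src=["\']([^"\']+)', html, re.I)}
for url in sorted(js_files):
    print(url)
```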
#8. Github Recon
GitHub is a version control and collaboration platform for programming. Developers, hackers, analysts, and others worldwide use GitHub to store their code projects. A lot of sensitive data ends up in GitHub repositories, allowing attackers to gather knowledge about the company and discover security flaws. Other websites, such as GitLab and Bitbucket, are used for the same purpose, but GitHub is the most popular.
Tools:-
GitHound
Gitrob
GitHarvester
TruffleHog
Manual GitHub dorking is the most effective way to do GitHub recon; a minimal API-based sketch follows.
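For context, a hedged sketch of dorking through GitHub's code search API; the query and the GITHUB_TOKEN environment variable are placeholders, and a personal access token is required:

```python
import os
import requests

token = os.environ["GITHUB_TOKEN"]        # assumed env var holding a personal access token
query = '"example.com" password'          # classic dork: target domain + keyword

resp = requests.get(
    "https://api.github.com/search/code",
    params={"q": query},
    headers={"Authorization": f"token {token}",
             "Accept": "application/vnd.github+json"},
    timeout=30,
)
for item in resp.json().get("items", []):
    print(item["repository"]["full_name"], item["path"])
```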
#9. Organisation Tech-stack
Mapping the organization's tech stack means identifying which technologies (web servers, frameworks, CMSs, and so on) the organization uses; a rough header-based sketch follows the tool list below.
Tools:-
wappy
Wappalyzer
whatweb
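A very rough header-based sketch of the idea; real tools such as Wappalyzer rely on large fingerprint databases, and example.com is a placeholder:

```python
import requests

r = requests.get("https://example.com", timeout=10)

# Response headers often leak the server, language, or framework in use
for header in ("Server", "X-Powered-By", "X-Generator", "Via"):
    if header in r.headers:
        print(f"{header}: {r.headers[header]}")

# Crude body hint; fingerprinting tools check hundreds of such signatures
if "wp-content" in r.text:
    print("Hint: WordPress")
```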
#Active Recon
Active recon is when you interact directly with a computer system to gather system-specific information about the target. Unlike passive information gathering, which relies on publicly available data, active information gathering uses tools that send various types of requests to the target. The goal is to gather information about that device or other devices connected to the same network. Active recon can be used to discover open/closed ports, the operating system of a machine, the services that are running, banners, new hosts, or vulnerable applications on a host. The main disadvantage of active reconnaissance compared to passive reconnaissance is that direct interaction with the host can trigger the target's IDS/IPS and alert people to your activity. Here are the steps used in active recon:
#1. Port Scanning
In the port scanning phase, the CIDR ranges are expanded into individual IP addresses, and every subdomain and IP address is scanned with a port scanner to determine which ports are open and which services are running on them.
This is a crucial step in active recon because its results form the base for the later active recon processes; a minimal connect-scan sketch follows the tool list below.
Tools:-
Nmap
Rustscan
Nuclei
skanuvaty
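A minimal TCP connect-scan sketch of the idea (the host and port list are placeholders; real scanners like Nmap are far faster and also fingerprint services):

```python
import socket

host = "198.51.100.10"                      # placeholder target address
ports = [21, 22, 25, 80, 110, 143, 443, 3306, 8080]

for port in ports:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1)
        if s.connect_ex((host, port)) == 0:  # 0 means the TCP handshake succeeded
            print(f"{host}:{port} open")
```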
After port scanning, tools are also used to take screenshots of every web-facing host to understand the target better.
Tools:-
Aquatone
#2. Vulnerability Scanning
After the port scan, assets are scanned for vulnerabilities with automated tools. This mostly involves CVE-based and regex/template-based scanning; a toy version-check sketch follows the tool list below.
Tools:-
Nessus
Acunetix
Nikto
Nuclei
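A toy sketch of what a CVE-style check boils down to: read the advertised server version and compare it against a known-vulnerable cutoff. The product/version pair below is a made-up placeholder; real scanners use large, maintained template and signature databases.

```python
import re
import requests

KNOWN_BAD = {"nginx": "1.16.0"}   # hypothetical "vulnerable up to" version

r = requests.get("https://example.com", timeout=10)   # placeholder target
banner = r.headers.get("Server", "")
match = re.match(r"([A-Za-z-]+)/([\d.]+)", banner)

if match:
    product, version = match.group(1).lower(), match.group(2)
    if product in KNOWN_BAD:
        current = tuple(int(x) for x in version.split("."))
        cutoff = tuple(int(x) for x in KNOWN_BAD[product].split("."))
        if current <= cutoff:
            print(f"{product} {version} may be affected by known CVEs")
```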
#3. Fuzzing
Fuzz testing, or fuzzing, is an automated software testing method that injects invalid, malformed, or unexpected inputs into a system to reveal software defects and vulnerabilities.
For web-based assets, fuzzing is used to find hidden or sensitive endpoints that can lead to bugs; a tiny sketch follows the tool list below.
Tools:-
Ffuf
dirsearch
DirBuster
Gobuster
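A tiny directory-fuzzing sketch with a hard-coded wordlist (placeholder target; real tools use large wordlists and filter on status code, response size, and word count):

```python
import requests

base = "https://example.com"                       # placeholder target
wordlist = ["admin", "backup", "config", ".git", "api", "old"]

for word in wordlist:
    url = f"{base}/{word}"
    try:
        r = requests.get(url, timeout=5, allow_redirects=False)
    except requests.RequestException:
        continue
    if r.status_code != 404:                       # anything but "not found" is interesting
        print(f"{r.status_code}  {url}")
```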
#4. JS Scan
JavaScript files are very important to how a website functions on the client side, and they often contain sensitive information such as endpoints, API keys, passwords, usernames, and tokens.
Tools can be used to automate crawling websites for JavaScript files. After gathering the JavaScript files, we can use regex-based tools to find sensitive data inside them; a small sketch follows the tool list below.
Tools:-
JSFScan.sh
JS Parsing
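A small regex-based sketch of scanning collected JavaScript files for secrets and endpoints; the URL and patterns are illustrative placeholders, not an exhaustive rule set:

```python
import re
import requests

js_urls = ["https://example.com/static/app.js"]    # placeholder JS file list

patterns = {
    "aws_key":  re.compile(r"AKIA[0-9A-Z]{16}"),               # AWS access key ID format
    "api_path": re.compile(r"[\"'](/api/[A-Za-z0-9_/.-]+)[\"']"),
    "bearer":   re.compile(r"Bearer\s+[A-Za-z0-9\-_.]+"),
}

for url in js_urls:
    body = requests.get(url, timeout=10).text
    for name, pattern in patterns.items():
        for hit in sorted(set(pattern.findall(body))):
            print(f"{url}: {name}: {hit}")
```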
#Other Tools
Metabigor
spoofcheck
certSniff
dnsrecon
shodan.io