The Art of Attack Surface Mapping
A Guide to Advanced Bug Bounty Reconnaissance
Introduction: Reconnaissance is More Than Subdomains
In the world of bug bounty hunting, there's a saying that's repeated so often it has become a cliché, yet it remains profoundly true: "The bigger the attack surface you map, the more bugs you will find."
Beginners often equate reconnaissance with running a simple subdomain enumeration tool like subfinder, piping the results to httpx, and then immediately firing up a vulnerability scanner. This is the 101 course. It will find you low-hanging fruit and place you in direct competition with thousands of other hunters running the exact same methodology.
Advanced reconnaissance, however, is a different beast entirely. It's a philosophy. It's an iterative, patient, and creative process of digital cartography. It's about understanding a target not just as a collection of web servers, but as a living, breathing digital ecosystem. It involves peeling back layers of history, corporate structure, human error, and technological debt to uncover forgotten assets, misconfigured services, and hidden entry points that automated tools miss.
This guide will take you beyond the basics. We won't just list tools; we will explore the why behind the techniques. We will build a methodology that combines passive data collection, active probing, content analysis, and intelligent automation to create a comprehensive map of your target's digital footprint. Prepare to dive deep, because this is where the most impactful vulnerabilities are found.
Part 1: The Recon Philosophy - Thinking Like a Digital Archaeologist
Before we type a single command, we must adjust our mindset. An advanced recon specialist is part detective, part historian, and part sysadmin.
1. Recon is Iterative, Not Linear: Your recon process should be a loop, or a flywheel. Every piece of new information you discover—an IP address, an employee's name, a technology stack, a company acquisition—should be fed back into the beginning of your process to uncover even more. A new set of subdomains might reveal a new ASN. That ASN might contain IP ranges with forgotten servers. Those servers might be running old software, which you can then search for across all other known assets.
2. The Goal is a Complete Digital Footprint: Don't stop at subdomains. Your goal is to map:
Domains & Subdomains: The obvious starting point.
IP Addresses: Both IPv4 and IPv6.
ASNs (Autonomous System Numbers): The network blocks owned by the company.
Cloud Assets: S3 buckets, Azure blobs, Google Cloud Storage, DigitalOcean Spaces.
Acquired Companies: Their old infrastructure is often a goldmine of vulnerabilities.
Public Code & Documents: GitHub, GitLab, pastebins, leaked documents.
Employee Information: Names, emails, and roles can help with social engineering context or password spraying on non-production environments (always check program scope).
Technology Stack: What frameworks, web servers, and third-party services are they using?
3. Connect Findings to Vulnerability Classes: Always be thinking, "How can this piece of information lead to a bug?"
Finding: An old copyright year in a footer (© 2015). -> Hypothesis: This page might be running outdated JS libraries vulnerable to XSS.
Finding: A subdomain like jenkins-dev.target.com. -> Hypothesis: Potential for exposed credentials, RCE via the script console.
Finding: An S3 bucket URL in a JavaScript file. -> Hypothesis: Potential for misconfigured permissions (public read/write).
This mindset transforms recon from a chore into an active hunt.
Part 2: Advanced Passive Reconnaissance - Gathering Intelligence Without Touching the Target
Passive reconnaissance is about collecting publicly available information without sending a single packet directly to the target's primary infrastructure. This is the stealth phase, and it's where we build our foundational dataset.
2.1. Mastering Subdomain Enumeration
Basic subdomain enumeration relies on a few sources. Advanced enumeration uses all of them.
Tool of Choice: amass. subfinder is excellent for speed, but amass is the undisputed king of comprehensive passive enumeration. It queries dozens of sources.
The Basic Command:
Bash
amass enum -passive -d target.com -o subdomains.txt
The Advanced Approach: Configuring and Expanding Sources
API Keys are Non-Negotiable: Many services that amass queries (like VirusTotal, SecurityTrails, Shodan) offer vastly more data if you provide an API key. Take the time to create free accounts on all supported services and add the keys to the amass config file (~/.config/amass/config.ini). This is the single biggest step-up you can make.
Certificate Transparency (CT) Logs: Every time an SSL/TLS certificate is issued for a domain, it's logged in public Certificate Transparency logs. This is a goldmine for finding subdomains, including internal hostnames that were accidentally included in a wildcard certificate. amass searches CT logs by default, but you can also use tools like ctfr or crt.sh for manual inspection.
Example: Go to https://crt.sh/?q=%.target.com to see a live feed of certificates issued for your target. You'll often find staging, UAT, and development servers.
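If you prefer the terminal to the crt.sh web UI, the same data is available as JSON via its output=json parameter. Here's a rough sketch using only curl and standard Unix tools; the parsing is a grep/sed fallback for when jq isn't installed:

```shell
# Extract unique hostnames from crt.sh JSON (read on stdin).
# Multi-name certificates pack several hosts into one name_value
# field separated by literal "\n" escapes, so we split on those too.
extract_ct_names() {
  grep -o '"name_value":"[^"]*"' \
    | sed -e 's/"name_value":"//' -e 's/"$//' -e 's/\\n/\n/g' \
    | sort -u
}

# Live usage (needs network):
#   curl -s 'https://crt.sh/?q=%25.target.com&output=json' | extract_ct_names
```

Feed the resulting hostnames straight back into your resolution pipeline, the recon flywheel in action.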
Permutation Scanning: What if a company uses a predictable pattern for its subdomains, like dev-app1.target.com, qa-app1.target.com, prod-app1.target.com? Permutation scanning generates a massive wordlist based on known subdomains and common patterns.
Tool of Choice: gotator
Bash
# First, get a list of known subdomains
amass enum -passive -d target.com -o subdomains.txt
# Now, generate permutations
gotator -sub subdomains.txt -depth 2 -numbers 10 -mindup -adv -silent > permutations.txt
This creates variations like dev-01, test-02, etc., which you can then resolve to see if they exist.
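To see what a permutation tool does under the hood, here's a toy version in plain bash. The prefix list and patterns are made up for illustration; gotator's real mutation logic is far richer:

```shell
# Toy permutation generator: combine each known subdomain with
# common environment prefixes, two patterns per prefix.
permute() {
  # $1: a known subdomain, e.g. app1.target.com
  local sub="$1"
  local label="${sub%%.*}"   # left-most label: "app1"
  local domain="${sub#*.}"   # the rest: "target.com"
  local p
  for p in dev qa staging uat prod test; do
    echo "${p}-${label}.${domain}"   # dev-app1.target.com
    echo "${p}.${sub}"               # dev.app1.target.com
  done
}

# permute app1.target.com | dnsx -silent   # resolve the guesses
```

The point isn't the code; it's that predictable naming conventions turn one discovered host into dozens of candidate hosts.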
2.2. ASN & IP Space Mapping
This is a core advanced technique. Instead of just looking for names that point to IPs, we find the IP ranges the company owns and then look for live hosts within them. This uncovers servers that have no DNS records pointing to them—truly hidden assets.
What is an ASN? An Autonomous System Number is a globally unique identifier for a large network or group of networks managed by a single entity, like a corporation or an ISP.
The Process:
Find the Target's ASNs:
Bash
amass intel -org "Target Corporation"
This command will search public records for organizations matching the name and return their associated ASNs (e.g., AS12345). You can also use whois on a known IP address.
Find IP Ranges from the ASN: Once you have the ASN, you can find all the CIDR blocks (IP address ranges) associated with it.
Bash
amass intel -asn 12345
This will output a list of IP ranges like 192.0.2.0/24.
Scan the IP Ranges for Live Web Servers: Now we can use a fast port scanner to find HTTP/HTTPS servers within these ranges. This is technically active, but we're starting from passively-gained information. Tools: masscan (for speed) + nmap (for detail) or httpx.
Bash
# Scan the entire range for common web ports FAST
masscan -p80,443,8080,8443 --rate 10000 -iL ip_ranges.txt -oG masscan_results.grep
# Extract live IPs and probe them with httpx
grep "Host:" masscan_results.grep | cut -d' ' -f2 | httpx -silent -title -tech-detect -status-code > live_web_servers.txt
This workflow often reveals forgotten login pages, old infrastructure, and internal applications accidentally exposed to the internet.
2.3. Following the Acquisition Trail
Large companies grow by acquiring smaller ones. When this happens, the acquired company's infrastructure is often absorbed, sometimes hastily. This legacy infrastructure can be running outdated software and have different security policies.
How to Find Acquisitions:
Crunchbase: The best resource for M&A data. Search for your target and look at their "Acquisitions" tab.
Wikipedia: The target's company page often has a "History" section listing major acquisitions.
News Articles: Search for "Target Corporation" acquires or "Target Corporation" merges.
What to Do with this Info: Once you have a list of acquired companies (e.g., "OldApp Inc."), you treat them as a primary target.
Run your full subdomain enumeration process on oldapp.com.
Look for subdomains like oldapp.target.com.
Check if OldApp Inc.'s original ASNs and IP ranges are still active and now managed by Target Corp.
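One way to fold acquisitions into your pipeline is a small wrapper that re-runs your passive enumeration over a hand-built list of acquired domains. The acquired.txt filename here is hypothetical; you'd build it from your Crunchbase and Wikipedia research:

```shell
# Run passive amass enumeration over every acquired-company domain.
enum_acquired() {
  # $1: file with one acquired domain per line, e.g. oldapp.com
  local dom
  while IFS= read -r dom; do
    [ -n "$dom" ] || continue   # skip blank lines
    amass enum -passive -d "$dom" -o "subs_${dom}.txt"
  done < "$1"
}

# enum_acquired acquired.txt
```

Each acquired company gets the same full treatment as the primary target, one output file per domain.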
2.4. Public Code Repository Mining (GitHub Dorking)
Developers often make mistakes, and one of the most common is committing sensitive information to public code repositories.
Tool of Choice: gitleaks or manual searching.
Manual GitHub Dorking (Advanced Queries): Go beyond just searching for target.com. Use GitHub's advanced search filters.
Find credentials within the organization's repositories:
org:TargetCorp password
org:TargetCorp api_key
org:TargetCorp "BEGIN PRIVATE KEY"
Find sensitive files:
org:TargetCorp filename:.env
org:TargetCorp filename:config.js
org:TargetCorp filename:secrets.yml
Find hardcoded credentials for third-party services:
"target.com" "s3.amazonaws.com" access_key
"target.com" "Authorization: Bearer"
Find subdomain leaks in code: Search for ".target.com" to find code that references subdomains. This can reveal internal or API domains.
Automating the Search: Tools like trufflehog and gitleaks can scan repositories or GitHub organizations automatically using regex patterns for secrets.
Bash
git clone https://github.com/target-corp/leaky-repo
gitleaks detect --source ./leaky-repo --verbose
Part 3: Advanced Active Reconnaissance - Probing the Attack Surface
Now that we have a massive dataset of potential assets, it's time to interact with them. We will do this intelligently to gather as much information as possible.
3.1. Strategic Port Scanning and Service Identification
A simple nmap scan on a single host is basic. An advanced approach involves scanning thousands of hosts with a strategy.
The Workflow:
Mass Discovery (Speed): Use masscan or naabu to quickly identify open ports across all your discovered IPs. Don't scan for service versions here; just find what's open. We'll focus on the most common web ports and other interesting ones.
Bash
# Use your compiled list of all IPs from subdomains and ASN scans
cat all_ips.txt | naabu -p 80,443,8000,8080,8443,21,22,25,3306,5432 -c 50 -rate 1000 -o open_ports.txt
Deep Service Analysis (Detail): Now, take the list of hosts with open ports and run a detailed nmap scan on them. This is where we identify services, versions, and run basic vulnerability scripts. (naabu writes host:port pairs, so strip the ports before feeding nmap.)
Bash
cut -d: -f1 open_ports.txt | sort -u > open_hosts.txt
nmap -iL open_hosts.txt -sV -sC -O --script=default,vuln -oN nmap_detailed.txt
-sV: Probe open ports to determine service/version info.
-sC: Run default safe scripts.
-O: Enable OS detection.
--script=vuln: A powerful but noisy option. Runs all scripts in the vuln category to check for known vulnerabilities. USE WITH CAUTION AND CHECK SCOPE. Some scripts can be intrusive.
What to Look For:
Uncommon Ports: A web server on port 3000 might be a NodeJS app; on 8080, a Tomcat server.
Outdated Versions: nmap will report Apache httpd 2.4.29. A quick search reveals this is vulnerable to several CVEs.
Exposed Services: FTP (21), SMTP (25), SMB (445), RDP (3389), databases (3306 for MySQL, 5432 for PostgreSQL). These are high-impact targets if misconfigured.
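When triaging port-scan output by hand, even a throwaway lookup table helps keep your hypotheses straight. A sketch; the labels are first guesses to investigate, not identifications:

```shell
# Map an open port number to a first-guess hypothesis worth testing.
port_hypothesis() {
  case "$1" in
    21)   echo "ftp: try anonymous login" ;;
    25)   echo "smtp: open relay? user enumeration via VRFY?" ;;
    445)  echo "smb: null sessions, old CVEs" ;;
    3000) echo "likely a NodeJS app" ;;
    3306) echo "mysql: should this be internet-facing?" ;;
    3389) echo "rdp: high-impact if weakly protected" ;;
    5432) echo "postgresql: should this be internet-facing?" ;;
    8080) echo "tomcat/proxy: check default creds" ;;
    *)    echo "unknown: grab the banner and look it up" ;;
  esac
}
```

Pipe your open-ports list through something like this and you have an instant prioritized to-do list.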
3.2. Visual Reconnaissance - A Picture is Worth a Thousand Requests
You have thousands of web servers. How do you prioritize? By looking at them. Visual recon tools take screenshots of web pages, allowing you to quickly scan hundreds of sites and identify interesting targets.
Tools of Choice: httpx (with the screenshot flag), gowitness, or Aquatone.
The Workflow:
Bash
# Take your list of live web servers from the previous steps
cat live_web_servers.txt | httpx -screenshot -o screenshots_output
This will create a folder with an HTML report containing screenshots of all websites.
What to Look For in the Screenshots:
Login Panels: Default login pages for services like Jenkins, Grafana, vSphere, etc.
403 Forbidden / 401 Unauthorized: These pages might be bypassable.
Directory Listings: An instant indicator of information disclosure.
Error Pages (Stack Traces): Reveal frameworks, internal paths, and other valuable data.
Old/Dated UI: A website that looks like it's from 2005 probably has 2005-era vulnerabilities.
Parked Domains / Default Pages: Nginx/Apache default pages can often be ignored, helping you filter out noise.
"Under Construction" Pages: Often forgotten and poorly secured.
Visual recon allows you to use your human intuition to spot anomalies that automated scanners would miss.
Part 4: Deep Diving into Web Assets - Uncovering Hidden Functionality
Once we've prioritized our targets, we need to dig into the applications themselves. The goal here is to map out the application's functionality, especially the parts that aren't immediately visible.
4.1. JavaScript File Analysis - The Treasure Map
Modern web applications are built on JavaScript. The client-side code contains a wealth of information about how the application works, including API endpoints, hidden parameters, and sometimes even credentials.
The Process:
Gather all JS files: Use a tool like subjs to crawl your list of live websites and extract all the .js file URLs.
Bash
cat live_web_servers.txt | subjs -c 10 > js_files.txt
Analyze for Secrets and Endpoints:
Tool: LinkFinder
LinkFinder is a Python script that uses regex to find endpoints and other interesting things in JavaScript files.
Bash
python linkfinder.py -i https://target.com/main.js -o cli
Manual Analysis: Nothing beats opening the JS file (prettified, of course) and searching for keywords:
api/, /v1/, /v2/ -> API endpoints.
key, token, secret, auth -> Potential credentials.
path:, endpoint:, url: -> Hardcoded paths.
s3.amazonaws.com -> Cloud storage buckets.
"dev", "staging" -> Keywords indicating environment-specific logic.
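Before reading a big prettified bundle line by line, a quick grep pass over keywords like these tells you whether the file is worth your time at all. A minimal sketch:

```shell
# Print numbered lines from a JS file that hit high-signal keywords.
triage_js() {
  # $1: path to a downloaded, prettified JavaScript file
  grep -nE '/api/|/v[0-9]+/|api_key|token|secret|s3\.amazonaws\.com' "$1"
}

# triage_js main.js
```

The line numbers (-n) let you jump straight to the interesting spots in your editor.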
Example: Imagine you find this snippet in a JS file:
JavaScript
function getUserData(id) {
  var apiUrl = "/api/v3/users/get?id=" + id;
  // ... fetch data from apiUrl ...
}
You've just discovered the /api/v3/users/get endpoint and the id parameter. Now you can test this endpoint directly for IDORs, SQLi, etc.
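To act on a find like that, a crude first check for IDOR is simply: request two different ids under the same session and see whether the bodies differ in a way worth inspecting. Everything below (URL, cookie value) is hypothetical, and a difference only flags a candidate for manual review, not a confirmed bug:

```shell
# Fetch the same endpoint with two ids under one session and
# report when the responses differ (a *candidate* IDOR, not proof).
idor_probe() {
  # $1: endpoint url  $2,$3: two ids  $4: session cookie string
  local a b
  a="$(curl -s -H "Cookie: $4" "$1?id=$2")"
  b="$(curl -s -H "Cookie: $4" "$1?id=$3")"
  if [ "$a" != "$b" ]; then
    echo "ids $2 and $3 return different bodies: review manually"
  fi
}

# idor_probe https://target.com/api/v3/users/get 1001 1002 "session=..."
```

Always confirm manually that the second response actually contains another user's data before reporting; differing bodies alone can just be timestamps or CSRF tokens.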
4.2. Advanced Content Discovery - Fuzzing for Gold
We know about directory busting with tools like dirb. The advanced approach uses more powerful tools with better techniques.
Tool of Choice: ffuf (Fuzz Faster U Fool)
ffuf is incredibly fast and flexible. The key to using it effectively is in the wordlists and the options.
Wordlists are Everything: Don't use a single small wordlist. Use a collection of high-quality lists. The best source is SecLists.
Discovery/Web-Content/common.txt (for basics)
Discovery/Web-Content/raft-large-directories.txt (for directories)
Discovery/Web-Content/api/ (for API endpoints)
An Advanced ffuf Command:
Bash
ffuf -u https://target.com/FUZZ -w /path/to/SecLists/Discovery/Web-Content/directory-list-2.3-medium.txt -e .php,.html,.js,.txt -c -fc 404,403 -recursion -recursion-depth 1
Let's break that down:
-u https://target.com/FUZZ: The URL, with FUZZ as the injection point.
-w ...: The wordlist.
-e .php,.html,.js,.txt: Try these extensions for every word (e.g., admin.php).
-c: Colorized output.
-fc 404,403: Filter (hide) responses with these status codes. This cleans up the noise.
-recursion: If it finds a directory (e.g., /admin/), it will start a new scan inside it.
-recursion-depth 1: Only go one level deep to avoid infinite loops.
Fuzzing for Virtual Hosts (VHosts): Sometimes multiple websites are hosted on the same IP address. You can discover hidden domains by fuzzing the Host header.
Bash
# Use your list of permuted full hostnames as the wordlist
ffuf -u http://192.0.2.10 -w permutations.txt -H "Host: FUZZ" -fs 12345
Here, -fs 12345 filters out responses of a specific size (the size of the default page), showing you only the VHosts that return a different page.
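Where does that -fs number come from? Measure the default page first: request a Host value that certainly doesn't exist and take the response length as your baseline. A small sketch (the target IP is the same placeholder as above):

```shell
# Byte length of the response for a guaranteed-nonexistent vhost --
# use this as ffuf's -fs filter value.
vhost_baseline() {
  # $1: target, e.g. http://192.0.2.10
  curl -s -H "Host: definitely-not-real-$RANDOM.target.com" "$1" | wc -c
}

# ffuf -u http://192.0.2.10 -w permutations.txt -H "Host: FUZZ" \
#   -fs "$(vhost_baseline http://192.0.2.10)"
```

If the default page varies in size (dynamic content), filter on word count (-fw) or line count (-fl) instead.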
4.3. Parameter Mining
Many vulnerabilities (IDOR, XSS, SSTI) exist in URL parameters that aren't linked anywhere in the application's UI. We need to find them.
Tool of Choice: Arjun. Arjun is a tool specifically designed to find query parameters. It has a large, built-in wordlist of common parameter names.
Bash
arjun -u https://target.com/profile.php
Arjun will test thousands of parameters like id, user, debug, redirect, next, url, view, etc., against the endpoint and report any that seem to have an effect on the page response. Finding a hidden debug=true or admin=true parameter can be a critical discovery.
Part 5: Automation, Methodology, and Data Management
Advanced recon generates a ton of data. Without a system to manage it, you'll drown. This is where we bring everything together.
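A trivial but effective habit is giving every target the same on-disk layout, so each stage of your tooling knows where to read and write. The directory names here are just one suggested convention:

```shell
# Create a predictable per-target workspace.
init_workspace() {
  # $1: target name, e.g. target.com
  local t="$1"
  mkdir -p "$t"/subdomains "$t"/ips "$t"/screenshots "$t"/js "$t"/scans
  echo "$t"
}

# init_workspace target.com && cd target.com
```

With a fixed layout, the automation script below (and any future tooling you bolt on) never has to guess where yesterday's data lives.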
5.1. The Recon Flywheel - An Automated Toolchain
The goal is to create a simple script that automates the initial phases of recon. This allows you to point it at a new program and have a solid base of data within hours.
A Simple Bash Script Example:
Bash
#!/bin/bash
TARGET=$1
if [ -z "$TARGET" ]; then
  echo "Usage: ./recon.sh <domain.com>"
  exit 1
fi
Forget the Scanners: How to Think and Hunt Like a Top-Tier Bug Bounty Pro
Hey there.
Let's be real for a second. You've probably seen a hundred blog posts that tell you to run subfinder | httpx | nuclei and call it a day. And sure, that'll get you… something. It'll get you the same duplicate bug that 500 other people just submitted.
If you want to find the bugs that lead to massive payouts, the kind that make companies sit up and take notice, you have to go deeper. Much deeper. Reconnaissance isn't a checklist you run through. It's an art form. It’s the single biggest difference between a good hunter and a great one.
This isn't just another list of tools. This is a new way to think. We're going to explore how to map a target's entire digital universe, from forgotten servers collecting dust in a data center to secrets accidentally left in public code. Forget being a scanner monkey. It's time to become a digital archaeologist.
Part 1: The Recon Mindset - It's All in Your Head 🧠
Before you open your terminal, you need a mental upgrade. Top-tier hunters don't just run commands; they think like a mix of a paranoid system admin, a curious developer, and a cunning detective.
Think in Loops, Not Lines: Your recon should be a flywheel. Every new piece of info you find—an employee's name, an old company they bought, a weird IP address—isn't the end of a trail. It's a clue you feed back into the start of your process to find even more. A new subdomain reveals a new cloud provider. That provider's patterns might help you find more assets. See? It's a loop.
Map Everything: Don't just hunt for subdomains. You're building a complete intelligence portfolio. We're talking IP address ranges, cloud storage buckets, code repositories on GitHub, the tech stacks they use, and especially the skeletons in their closet—the companies they acquired years ago.
Connect Every Clue to a Kill Chain: Don't just collect data. Weaponize it in your mind.
You find: A copyright date of © 2016 on a forgotten page.
You think: "This page is ancient. The JavaScript on it is probably vulnerable as hell. Prime target for XSS."
You find: A subdomain like vpn-test.target.com.
You think: "Someone was testing their VPN here. Are there default credentials? Is the management interface exposed?"
This mindset changes everything. Recon stops being a chore and starts being the most exciting part of the hunt.
Part 2: Passive Recon - The Art of Spying 🕵️
This is the James Bond phase. We gather a mountain of intelligence without making any direct contact with the target's main servers. We're a ghost in the public records.
Beyond Basic Subdomains
Everyone finds blog.target.com. You need to find the servers they forgot they even owned.
Your main tool here should be amass. It's slower than other tools for a reason: it's incredibly thorough. But to unlock its real power, you must feed it API keys. Go to sites like SecurityTrails, Shodan, VirusTotal, etc., create free accounts, and plug those keys into the amass config file. Not doing this is like trying to explore the ocean in a kiddie pool.
Another goldmine? Certificate Transparency (CT) logs. Every time a company gets an SSL certificate, it's recorded publicly. You can literally go to crt.sh and search for %.target.com to see a history of hostnames they've secured. You'll find gems like dev-feature-x.target.com that were never meant to be public.
Mapping Their Digital Real Estate (ASNs)
This is a true power move. Instead of just finding domain names, you find the actual blocks of the internet that your target owns. These are their Autonomous System Numbers (ASNs).
Find their ASNs: Pop open your terminal and run amass intel -org "Target Corporation". This will spit out the ASN numbers for their networks.
Find their IP ranges: Take an ASN you found (e.g., AS12345) and run amass intel -asn 12345. This gives you the raw IP address blocks they own.
Find the forgotten servers: Now you can scan these massive IP ranges for open web ports. This is how you find servers with no domain names pointing to them. These are often development machines, backup servers, or internal tools accidentally left open to the world.
Digging for Corporate Skeletons (Acquisitions)
Big companies are like sharks; they're always eating smaller fish. When they acquire a small startup, they inherit all of its old, messy, and often insecure technology.
Hit up Crunchbase and Wikipedia. Look for your target's acquisition history. Found out they bought "OldApp Inc." back in 2018? Fantastic. Now you have a new target. Hunt for subdomains on oldapp.com, look for things like oldapp.target.com, and check if OldApp's original IP ranges are still alive. This is where you find legacy systems that time forgot.
Eavesdropping on Developers (GitHub Recon)
Developers are human. They get tired, they get rushed, and they make mistakes. Sometimes, that mistake is committing passwords, API keys, or other secrets to a public GitHub repository.
Don't just search for target.com. Use GitHub's advanced search like a pro:
org:TargetCorp password
org:TargetCorp filename:.env (finds environment files, often full of secrets)
"target.com" "Authorization: Bearer" (finds authentication tokens)
Part 3: Active Recon - Knocking on Doors 🚪
Okay, spy time is over. We have our map. Now it's time to put on our boots and go knock on some digital doors to see who's home.
Scanning with a Purpose
Don't just nmap one host. Your strategy should be two-fold:
Go Wide & Fast: Use a quick scanner like masscan or naabu on your entire list of IPs. Your only goal here is to find open ports, especially web ports (80, 443, 8080, etc.).
Go Deep & Slow: Take the list of live hosts you just found and now run a detailed nmap scan: nmap -sV -sC -iL live_hosts.txt. This will grab service versions (-sV) and run safe scripts (-sC) to identify what's really running. An old version of Apache? A weird FTP server? An exposed database login? Bingo.
Visual Recon: The CSI Zoom-and-Enhance
You have a list of 2,000 live websites. Are you going to check them all manually? Hell no. You're going to use visual recon. Tools like httpx (with the -screenshot flag) or gowitness will visit every single site and take a screenshot.
You then get a beautiful HTML report with pictures of every homepage. In minutes, your brain can do what a computer can't: spot anomalies.
That weird "403 Forbidden" page? Might be bypassable.
That default login panel for Jenkins or Grafana? Huge finding.
That page that looks like it was designed in 1999? It probably has vulnerabilities from 1999.
That directory listing showing all the files? Jackpot.
Part 4: Deep Dive - Cracking Open the Application
Now we zoom in on the most promising targets we found. We're no longer mapping the city; we're trying to find the keys to a specific building.
JavaScript Files are Treasure Maps
Modern websites are powered by JavaScript, and developers leave clues all over the place. JS files are your treasure maps to the application's hidden workings.
Use a tool like subjs to crawl your target sites and collect all the .js files.
Feed those files into LinkFinder. It will rip them apart and find API paths, links, and potential endpoints.
But don't forget to look yourself! Open the files and search for words like api, token, secret, key, and s3.amazonaws.com. You'll be shocked at what you find. A single line of code can reveal a hidden API endpoint that leads to a critical vulnerability.
Fuzzing with ffuf - The Digital Crowbar
Forget basic directory busters. You need a tool that's fast, smart, and relentless. You need ffuf. The key to ffuf is using great wordlists (get them from SecLists on GitHub) and smart filtering.
Here's a beast of a command:
Bash
ffuf -u https://api.target.com/FUZZ -w /path/to/SecLists/Discovery/Web-Content/raft-large-words.txt -c -fc 404,403
This tells ffuf to hammer the URL (-u) with a great wordlist (-w), but to shut up and hide all the useless 404 and 403 responses (-fc). This lets you focus only on what's actually there. You can use it to find hidden files, directories, and API endpoints.
Parameter Mining with Arjun
What if a vulnerability is hiding in a parameter that isn't used anywhere on the site, like ?admin=true or ?debug=true? You find these by "parameter mining." A tool called Arjun is built for exactly this. Point it at a URL, and it will test thousands of common and obscure parameter names to see if they have any effect on the page. Finding one of these is like finding a hidden key.
Part 5: Building Your Recon Machine 🤖
This all sounds like a lot, right? That's because it is. You can't do this all by hand every time. You need to automate it. You need to build your own personal recon machine.
A simple bash script can tie these tools together into a powerful workflow.
Your Starter Recon Script:
Bash
#!/bin/bash
TARGET=$1
if [ -z "$TARGET" ]; then
echo "Usage: ./recon.sh <domain.com>"
exit 1
fi
echo "[+] Starting reconnaissance for $TARGET"
mkdir -p $TARGET
cd $TARGET || exit
# 1. Passive Subdomain Enumeration
echo "[*] Running Amass for passive subdomains..."
amass enum -passive -d $TARGET -o passive_subs.txt
# 2. Resolve and find live hosts
echo "[*] Probing for live web servers with httpx..."
cat passive_subs.txt | httpx -silent -threads 200 -o live_hosts.txt
# 3. Take screenshots for visual recon
echo "[*] Taking screenshots of live hosts..."
cat live_hosts.txt | httpx -screenshot -silent -o screenshots_output
# 4. Grab JS files for analysis
echo "[*] Grabbing all JavaScript files..."
cat live_hosts.txt | subjs -c 20 > js_files.txt
# 5. Scan for low-hanging fruit with Nuclei
echo "[*] Running Nuclei for initial vulnerability scanning..."
nuclei -l live_hosts.txt -t ~/nuclei-templates/cves/ -t ~/nuclei-templates/technologies/ -o nuclei_results.txt
echo "[+] Reconnaissance complete for $TARGET. Happy hunting!"
This is just a starting point. You can add ASN lookups, port scanning, content discovery—anything you want. The goal is to have a script that does the boring work for you, leaving you free to do the fun part: thinking, exploring, and finding bugs.
Conclusion: It's a Craft, Not a Command
Recon is what separates the script kiddies from the legends. It’s a game of patience, curiosity, and creativity. The tools will change, but the mindset of a digital archaeologist will always be your most valuable asset.
So go out there. Map the world. Dig up the secrets. And find the bugs that no one else can.
Happy hunting.



