Speedy Notes - Information Gathering

The document outlines processes for passive information gathering about a target's infrastructure and business. It describes searching search engines and social media to find information like the target's web presence, locations, employees, partners, job postings, finances, and networked infrastructure through techniques like DNS enumeration, identifying IP addresses and netblocks, and investigating WHOIS records. The goal is to understand the target's business and map their technical systems before potentially escalating to active testing.

Uploaded by Ben

Two categories of information that you should have, at a minimum, by the end of the information gathering phase:

Infrastructure:
• Network maps
• Network blocks
• IP addresses
• Ports
• Services
• DNS
• Operating systems
• Alive machines
• Systems

Business:
• Web presence (domains)
• Physical locations
• Employees / Departments
• Emails
• Partners and third parties
• Press / news releases
• Documents
• Financial information
• Job postings

The process we will follow is:

Business
• Search engines
• Social media

Infrastructure
• Full scope test
• Narrowed scope

Through passive information gathering, we want to find business information and also information
related to infrastructure;

Search Engines

• Web presence
(1) → what they do;
→ What is their business purpose;
→ Physical and logical locations;
→ Employees and departments;
→ Email and contact information;
→ Alternative web sites and sub-domains;
→ Press releases, news, comments, opinions;
(2) Create a mind map of the target, with the organisation (e.g.
“elearnsecurity”) in the middle and subcategories around it such as “Business”, “Projects”,
“Website”, “Location”, “Employees”, etc.
(3) Move on to analysing information that is publicly available on the internet, using
techniques like Google dorks, etc.
Examples of Google dorks:
→ cache:www.website.com
→ link:www.website.com – will display websites that link to the specified website (this operator has since been deprecated by Google).
→ site:www.website.com
→ filetype:pdf
(4) Review alternative websites to Google:
→ Bing
→ Yahoo
→ Ask
→ AOL
→ Pandastats.net
→ Dogpile.com
(5) Find corporate accounts on third party services. E.g. a company page on LinkedIn
can reveal sensitive information such as employees, events, products, contact
info, locations and more. Just be careful with LinkedIn footprinting: by default,
the people and companies whose profiles you visit can see that you viewed them;

(6) Organisations that operate globally and have a desire to sell to the US government or
government agencies are required to possess two codes:
→ DUNS number (issued by Dun & Bradstreet)
→ CAGE code (or NCAGE for a non-US business)
→ These two codes allow us to retrieve even more information, such as
contacts, product lists, active / inactive contracts with the government and much
more; search here: https://www.sam.gov/SAM/pages/public/searchRecords/search.jsf
→ Organisations belonging to different industries can be investigated through searches
in different publicly available databases; compliance and regulation may force
companies to publish different kinds of information publicly. E.g. publicly listed US
companies have to file their financial documents with the SEC (EDGAR database);
(7) When you find new organization projects, websites and sub domains, you
have to repeat the whole investigation process. This will widen the attack
surface thereby increasing the chances of a successful outcome of the pen
test;
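The dork operators from step (3) are plain query strings, so they can be generated for each newly discovered domain or sub-domain. A minimal sketch in Python ("build_dorks" is a hypothetical helper, not part of any tool in these notes):

```python
# Sketch: compose Google dork query strings for one target domain.
# build_dorks is an illustrative helper, not a real tool.

def build_dorks(domain: str) -> list[str]:
    """Return common dork queries scoped to one domain."""
    return [
        f"cache:{domain}",              # Google's cached copy of the site
        f"site:{domain}",               # restrict results to the domain
        f"site:{domain} filetype:pdf",  # documents hosted on the domain
        f"site:{domain} -www",          # surface sub-domains other than www
    ]

for query in build_dorks("www.website.com"):
    print(query)
```

Re-running the same generator against each new sub-domain found in step (7) keeps the widening search systematic.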

• Partners and third parties


→ deduce what type of tech and systems are used internally;
→ Can also use it to perform more effective social engineering attacks with a higher chance
of success;

• Job postings
→ can deduce internal hierarchies, vacancies, projects, responsibilities, weak departments,
financed projects, tech implementations and more;
→ Can also use sites like LinkedIn, Monster, Careerbuilder, Glassdoor, Simplyhired, Dice;

• Financial information
→ find out if organisation is going to invest in a specific tech;
→ might be merging with another;
→ has critical assets and business services;
→ E.g. crunchbase – find info about companies, people, investors / financial information;
→ E.g. Inc. offers lists (the Inc. 500 / 5000) of the fastest-growing private companies, showing very
useful information and stats on them;
→ E.g. google finance;
→ E.g. yahoo finance;
• Harvesting
→ gathering company documents such as charts, database files, diagrams, papers,
documentation spreadsheets, etc.
→ Harvesting emails, accounts (Twitter, Facebook, etc.), names, roles and more.
→ Harvest metadata from documents retrieved online (e.g. filetype google dorks)
→ A Windows tool that lets us harvest metadata from files is “FOCA” – it queries sites
like Google and Bing (and also allows us to extract infrastructure information);
→ “theHarvester” - able to enumerate email accounts, user names, domains and hostnames;

theharvester -d elearnsecurity.com -l 100 -b google
(mnemonic for the switches: “Thirsty Harry drank large beer” → theHarvester -d … -l … -b …)

– -d sets the target domain;
– -l limits the results to the value specified (in this case 100);
– -b selects the data source to search (in this case Google);

– at the end of this phase, we should have a list of names, email addresses, documents,
telephone numbers, usernames and so on;
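The core of what theHarvester automates is pattern extraction over scraped text; the email part can be sketched in a few lines of Python (the sample text and regex are illustrative, not theHarvester's actual implementation):

```python
import re

# Sketch: pull email addresses out of already-downloaded page text,
# the kind of extraction theHarvester automates. Sample text is invented.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def harvest_emails(text: str) -> set[str]:
    """Return the unique email addresses found in text."""
    return set(EMAIL_RE.findall(text))

sample = "Contact jane.doe@example.com or sales@example.com for details."
print(sorted(harvest_emails(sample)))
```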

• Cached and archival sites


→ See stuff like job postings that have been taken down to mine useful information;
→ archive.org
→ cache:URL

Social Media
→ People are the weakest link;
→ Gather an employee's personal info: phone numbers, addresses, history, CV, opinions,
responsibilities, projects, etc.
→ Select most appropriate target for a social engineering attack;
→ Search for the people gathered in the previous phase on social media;
→ Use LinkedIn search feature to look up people or see who occupies a certain position – e.g CEO;
→ Can also search someone on LinkedIn via google: “name” / “position” / “company”
site:linkedin.com
→ Build a network of people and put it in the mind map;
→ Figuring out trust relationships is an essential part of the information gathering process;
→ Look at linkedin relationships and then use Twitter and Facebook as well to infer the level of
relationship between two people: i.e. how trusted the relationship is;

→ Social media people search: e.g. pipl.com, spokeo (seems to only be US based), peoplefinders

So information to gather from social media:

→ age
→ Phone number
→ Business
→ Addresses
→ Occupation
→ Interests
→ Email addresses
→ Websites owned
→ Related documents
→ Financial info

→ Usenet and Google groups;

Infrastructure information gathering:

Full scope engagement:


• Domains
• DNS servers in use
• Netblocks or IP addresses
• Mail servers
• ISPs used
• Any other technical information
To find netblocks, query each of the five internet registries. e.g.
(1) Given a domain name, query WHOIS; e.g. whois -h whois.apnic.net Microsoft [APNIC covers the Asia-Pacific region];
→ owner of a domain name;
→ IP address or range;
→ Autonomous system;
→ Technical contacts;
→ Five main providers of WHOIS information: AFRINIC, ARIN, APNIC, LACNIC, RIPE NCC;
→ Can find out: number resource records, network numbers (IP addresses) referred to as NETs,
autonomous system numbers referred to as ASNs, organisation records referred to as ORGs, point
of contact records referred to as POCs;
→ Authoritative WHOIS database;
→ Be sure to try different search techniques: i.e. target, target.com, target.net, etc.
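WHOIS output is line-oriented "key: value" text, so the fields listed above are easy to pull out of a saved response. A sketch over an invented sample (real field names vary between the five registries):

```python
# Sketch: parse key/value fields out of raw WHOIS output.
# The sample response below is invented; field names differ per registry.

def parse_whois(raw: str) -> dict[str, str]:
    """Return the first value seen for each key, lowercased keys."""
    fields = {}
    for line in raw.splitlines():
        if line.startswith("%"):            # '%' lines are comments (RIPE style)
            continue
        if ":" in line:
            key, _, value = line.partition(":")
            if value.strip():
                fields.setdefault(key.strip().lower(), value.strip())
    return fields

sample = """\
% Sample WHOIS response (invented)
inetnum: 192.0.2.0 - 192.0.2.255
netname: EXAMPLE-NET
org: ORG-EX1-RIPE
tech-c: JD1234-RIPE
"""
info = parse_whois(sample)
print(info["inetnum"], info["netname"])
```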

(2) DNS Enumeration

→ DNS queries produce listings called Resource Records;


→ For maximum effect, do DNS lookup, MX lookup and Zone transfers;
→ DNS lookup:

nslookup targetorganisation.com

→ Reverse DNS lookup: retrieve the domain name associated with a given IP address; use network-
tools.com/nslook/ (only IPs with a PTR record set will respond to this);

→ MX Lookup:
nslookup -type=MX domain

Comparison of nslookup and dig commands (Linux). Should utilise both:

nslookup target.com                                dig target.com +short
nslookup -type=PTR target.com [domain for an IP]   dig target.com PTR
nslookup -type=MX target.com                       dig target.com MX
nslookup -type=NS target.com                       dig target.com NS

Zone transfer:
nslookup                                           dig axfr @target.com target.com
> server target.com
> ls -d target.com

(Mnemonics: “Neverlookup Sam is on LSD” → nslookup, server, ls -d; “An Axe From France @” → axfr @)
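A reverse lookup actually queries a PTR record under in-addr.arpa, built from the IP's octets in reverse order; Python's stdlib ipaddress module can show the exact name such a query uses:

```python
import ipaddress

# The name a reverse DNS (PTR) lookup queries: the IP's octets reversed,
# under in-addr.arpa (ip6.arpa for IPv6).
ip = ipaddress.ip_address("192.0.2.54")
print(ip.reverse_pointer)   # 54.2.0.192.in-addr.arpa
```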
IP addresses

→ Once found a number of host names related to organisation, can move on with both determining
their relative IP addresses and potentially any netblocks associated with the organisation;

(1) Resolve all the host names we have in order to determine what IP addresses are used;

nslookup ns.targetorganisation.com

(2) Once we retrieve one or more IP addresses corresponding to domains, we have to consider the
following:
→ Is this IP address hosting only that given domain?
→ Who does this IP address belong to?

→ Bing offers a query filter that returns all the websites hosted on a given IP address;
ip:199.193.116.231

→ Also, a few websites that allow sub domain enumeration from a specific IP address if you
believe bing results are inaccurate or incomplete: https://dnslytics.com/reverse-ip,
http://reverseip.domaintools.com/, https://www.robtex.com/
→ Might have to go back and re-enumerate once found subdomains;
→ map IP addresses and related domains using mind mapping tools;
→ netblocks of smaller corporations will normally show up as belonging to the ISP rather than to the
smaller corp leasing them;
→ An Autonomous System is made up of one or more netblocks under the same administrative
control; big corps and ISPs have an autonomous system, while smaller companies will usually have
at most a netblock;
→ To find the owner of a netblock, you can also use tools such as: hostmap, maltego, foca, fierce,
dmitry;
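The netblock arithmetic above (does a discovered IP fall inside a given range, and which "class C" /24 surrounds a host) can be checked directly with Python's stdlib ipaddress module:

```python
import ipaddress

# Sketch: check whether discovered IPs fall inside a netblock taken from
# WHOIS, and derive the surrounding /24 for a single host.
netblock = ipaddress.ip_network("192.0.2.0/24")

for host in ("192.0.2.17", "198.51.100.5"):
    inside = ipaddress.ip_address(host) in netblock
    print(host, "in" if inside else "not in", netblock)

# /24 ("class C" in these notes' terms) containing an arbitrary host:
print(ipaddress.ip_network("192.0.2.17/24", strict=False))   # 192.0.2.0/24
```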

Live hosts
→ With pool of IP addresses, we have to identify the devices and roles played by each IP in the
target organisation: is it a server or workstation? And which IP's are alive?

→ Perform ICMP Ping Sweep: fping, nmap, hping;

fping -a -g 192.168.1.0/24
(mnemonic: “Fping alive giants” → -a -g)

-a shows systems that are alive; -g generates the target list from the subnet;

nmap -sn 10.0.0.0/24


The “-sn” option is a ping sweep: by default it sends an ICMP echo request, a TCP SYN packet to port 443, a TCP ACK packet to port 80 and an ICMP timestamp request;

If you run the scan from a machine within the same network, Nmap runs an ARP scan instead of
ICMP packets. To avoid this behaviour:
--disable-arp-ping OR --send-ip
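fping's -g flag expands a subnet into individual sweep targets; the same expansion in Python shows exactly what a sweep of a /24 iterates over:

```python
import ipaddress

# Expand a subnet into sweep targets, like fping -g 192.168.1.0/24 does.
# hosts() skips the network and broadcast addresses.
targets = [str(h) for h in ipaddress.ip_network("192.168.1.0/24").hosts()]
print(len(targets), targets[0], targets[-1])   # 254 192.168.1.1 192.168.1.254
```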
Find further DNS within the target network:

• Use nmap to scan the entire netblock for TCP / UDP port 53 to find hosts that have this port
open;

nmap -sS -p53 [netblock]

nmap -sU -p53 [netblock]

• Once more DNS servers have been retrieved, can perform a reverse lookup to find out if
they are serving any particular domain;
• Can try zone transfer and other DNS techniques on them, etc.

Maltego

Other tools

• DNSdumpster (www.dnsdumpster.com) doesn’t appear to be working;


→ Can discover hosts related to a specific domain;

• DNSEnum
→ Gather as much info as possible about a domain;
→ Can get the host's address (A record)
→ Get the name servers (threaded)
→ Get the MX record (threaded)
→ Perform AXFR queries on the name servers (threaded)
→ Get extra names and sub domains via Google dorks (allinurl:-www site:domain)
→ Brute force sub domains from file, can also perform recursion on sub domain that have
NS records
→ Calculate C class domain network ranges and perform WHOIS queries on them;
→ Perform reverse lookups on net ranges (C class or / and WHOIS net ranges);

dnsenum.pl [options] <domain>
e.g. dnsenum.pl -p 100 -s 100 google.com

--private – show and save private IPs at the end of the file domain_ips.txt
--subfile <file> Write all valid sub domains to this file
--threads <value>
-p <value> – number of google search pages to process when scraping names, default is 20, -s
switch must be specified;
-s <value>– the maximum number of sub domains that will be scraped from Google;
-f <file> - Read sub domains from this file to perform brute force
-u update any file that may already exist
-r recursive brute force on any discovered domains;

• Comes with a wordlist file containing most common DNS and sub domain names for brute
force attacks – find it in /usr/share/dnsenum;
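Sub-domain brute forcing is just prefixing each wordlist entry onto the domain and resolving the result; the generation step can be sketched in Python (the wordlist here is a tiny invented sample, not the one shipped in /usr/share/dnsenum):

```python
# Sketch: generate candidate sub-domain names from a wordlist, which is
# what dnsenum/dnsmap do before resolving each name (NXDOMAIN = not there).
# The wordlist below is a toy sample, not a real tool's list.

def candidates(domain: str, words: list[str]) -> list[str]:
    """Prefix each wordlist entry onto the target domain."""
    return [f"{w}.{domain}" for w in words]

wordlist = ["www", "mail", "vpn", "dev"]   # invented sample entries
for name in candidates("target.com", wordlist):
    print(name)
```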

• Fierce
• Dnsmap
→ sub domain enum and brute forcing;
→ uses dictionary file that comes with the tool or word list file that the user makes;
dnsmap targetdomain
- sub domain brute forcing using dnsmaps built in word-list;

dnsmap targetdomain -w wordlist.txt


- sub domain brute forcing using a user-supplied wordlist

dnsmap targetdomain -r /tmp/


- Example of sub domain brute forcing using built in wordlist and saving the results to /tmp/

dnsmap-bulk.sh domain.txt /tmp/results


- For brute forcing a list of target domains in a bulk fashion using the bash script provided;

• Metagoofil
• Foca
• Dmitry
• Recon-ng

Videos

Whois Lookups

→ whois.icann.org → enter domain and lookup;


→ namecheap whois lookup service (may not be able to get full info about a domain)
→ Use the whois server shown in the Namecheap result to find out more info (i.e. the target
registrar's own whois server – in this case it is at who.godaddy.com/whoischeck.aspx);

→ command line service: whois elsfoo.com (will try to find proper whois server to use for
elsfoo.com)
→ whois -h whois.godaddy.com elsfoo.com [instructs to use a specific or best whois server for a
specific registrar].

Information gathering DNS

→ nslookup site.com
→ nslookup -query=mx site.com [maps domain to a list of available mail servers for domain];
→ nslookup -query=ns site.com [maps to available name servers]
→ nslookup -query=any site.com
→ nslookup -query=cname site.com

→ dig site.com
→ dig site.com A [query A records]
→ dig site.com +nocmd MX +noall +answer [limits output to just MX records]
→ dig site.com +nocmd NS +noall +answer
→ dig site.com +nocmd A +noall +answer
→ for a zone transfer you have to specify a specific (misconfigured) DNS server that belongs to the target:
dig +nocmd site.com AXFR +noall +answer @site.com
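The +short output of the MX query above is one "priority host" pair per line; a sketch parsing a saved response into sorted records (the sample output below is invented):

```python
# Sketch: parse `dig site.com MX +short` style output into (priority, host)
# pairs, lowest priority (most preferred) first. Sample output is invented.

def parse_mx(short_output: str) -> list[tuple[int, str]]:
    records = []
    for line in short_output.strip().splitlines():
        prio, host = line.split()
        records.append((int(prio), host.rstrip(".")))   # drop trailing root dot
    return sorted(records)

sample = "20 alt1.mail.example.com.\n10 mail.example.com.\n"
print(parse_mx(sample))   # [(10, 'mail.example.com'), (20, 'alt1.mail.example.com')]
```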

Fierce perl tool:


→ fierce -dns site.dns.com [scan a specific domain] – does things like zone transfers, brute forces sub-
domains, scans IP addresses, etc.
→ fierce -dns site.dns.com -dnsserver ns1.site.com [dns server to use specifically]
→ fierce is one of the best tools to use for bruteforcing sub domains

Dnsenum
→ dnsenum site.com
→ gets A records, name servers, mail servers, attempts zone transfer, can also brute;
→ dnsenum site.com --dnsserver site.dns.com [specify which DNS server to use]
→ If internal dns server is exposed can use dnsenum to map entire internal network
→ brute force option: dnsenum site.com -f /root/hosts.txt

Dnsmap
→ dnsmap site.com
→ uses preloaded wordlists to brute sub domains;

Dnsrecon – all in one tool;


→ dnsrecon
→ DNSSEC (DNS Security Extensions) adds security to DNS; SRV records add flexibility to DNS, such as the port for a
specific service; BIND is DNS server software;
→ to find out the above, use: dnsrecon -d site.com

Host discovery with nmap, Hping and Fping

→ fping 10.1.1.1 [single ICMP echo request to target]
→ fping -a 10.1.1.1 [-a only shows hosts that are alive]
→ fping -a 10.1.1.1 -r 0 [-r sets the number of retries, in this case zero]
→ fping -a 10.1.1.1 -e [-e shows the elapsed time of each response]
→ fping -q -a -g 10.1.1.1/24 -r 0 -e [-q forces quiet output and -g generates the target list from a subnet]

Hping
→ if no port is specified, the default port is 0;
→ ICMP echo request: hping3 -1 10.1.1.1 -c 3
→ hping3 --icmp-ts 10.1.1.1 -c 3 -V [send an ICMP timestamp request]
→ the -2 option is UDP: hping3 -2 10.1.1.1 -p 53 -c 3
→ SYN: hping3 -S 10.1.1.1 -c 3
→ FIN: hping3 -F 10.1.1.1 -c 3
→ URG: hping3 -U 10.1.1.1 -c 3
→ FIN+PUSH+URG: hping3 -F -P -U 10.1.1.1 -c 3
→ hping3 -1 10.1.1.x --rand-dest -I eth2 [to scan a class C network; --rand-dest replaces the x
with random addresses within the class C network]

Maltego
→ gather and link info on sites, servers, networks, etc.
→ Auto and manual options;
→ machines – maltego scripts to extract info, domains websites, database etc.
→ transforms – convert info to other info linked to previous info – get netblocks, Ips, etc.
→ White icon –> new project with palette on left –> choose website → double click on object to set
target site “target.com” → right click on website icon to look for transforms –> to domain dns
transform → once transform is done new object appears on graph → right click on site again and
select IP transform → right click again and use email address transform (utilises a google dork
query) → right click on dns discovered before → choose transform to do e.g. whois, zone transfer,
sub-domains, etc.
→ e.g. see where emails were found; on the right you can click and view or download the
file for further investigation, then use a tool such as FOCA to extract useful metadata and the like;
→ file search is another useful transform;
→ on IP object, look at netblock transform;
→ you can change graph by clicking on the different icons;
→ can also list as an entity list;
→ Select multiple items to do a transform on;

FOCA and Shodan

→ Foca is a windows tool – can extract metadata, dns queries, search engines to find interesting
files on target server, etc.
→ Stuff that you can do is in the left pane. Select things like metadata and customise what you want
and then search;
→ right click on stuff in metadata after searching and select “analyse metadata”;

→ Shodan: good to use!


→ exploits.shodan.io – very cool!
