Speedy Notes - Information Gathering
Information gathering phase:
Infrastructure:
• Network maps
• Network blocks
• IP addresses
• Ports
• Services
• DNS
• Operating systems
• Alive machines
• Systems
Business:
• Web presence (domains)
• Physical locations
• Employees / departments
• Emails
• Partners and third parties
• Press / news releases
• Documents
• Financial information
• Job postings
Business
• Search engines
• Social media
Infrastructure
• Full scope test
• Narrowed scope
Through passive information gathering, we want to find both business information and information
related to the infrastructure;
Search Engines
• Web presence
(1) → What they do;
→ What is their business purpose;
→ Physical and logical locations;
→ Employees and departments;
→ Email and contact information;
→ Alternative web sites and sub-domains;
→ Press releases, news, comments, opinions;
(2) Create a mind map of the target with the website in the middle (e.g.
“elearnsecurity”) and subcategories around the central node, such as “Business”, “Projects”,
“Website”, “Location”, “Employees”, etc.;
(3) Move on to analyzing information that is publicly available on the internet, using
techniques such as Google dorks;
Examples of Google dorks:
→ cache:www.website.com – shows Google's cached copy of the page;
→ link:www.website.com – will display websites that have links to the specified website;
→ site:www.website.com – restricts results to the specified site;
→ filetype:pdf – restricts results to the specified file type;
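These operators can be combined; a quick hypothetical sketch (www.targetorganisation.com is a placeholder) that finds PDF documents hosted on the target's site:
site:www.targetorganisation.com filetype:pdf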
(4) Review alternative websites to Google:
→ Bing
→ Yahoo
→ Ask
→ AOL
→ Pandastats.net
→ Dogpile.com
(5) Find corporate accounts on third-party services, e.g. a company page on LinkedIn,
which can reveal sensitive information such as employees, events, products, contact
info, locations and more. Just be careful with LinkedIn footprinting: when
you visit another person's or company's profile, your visit is revealed and they can see that you viewed them;
(6) Organisations that operate globally and have a desire to sell to the US government or
government agencies are required to possess two codes:
→ DUNS number (issued by Dun & Bradstreet)
→ CAGE code (or NCAGE for a non-US business)
→ The above two codes allow us to retrieve even more information, such as
contacts, product lists, active / inactive contracts with the government and much
more; search here: https://www.sam.gov/SAM/pages/public/searchRecords/search.jsf
→ Organisations belonging to different industries can be investigated through searches
in different publicly available databases; compliance and regulation may force
companies to publish different kinds of information publicly. E.g. publicly listed
companies have to file their financial documents in the SEC database;
(7) When you find new organisation projects, websites and sub-domains, you
have to repeat the whole investigation process. This widens the attack
surface, thereby increasing the chances of a successful outcome of the pen
test;
• Job postings
→ Can deduce internal hierarchies, vacancies, projects, responsibilities, weak departments,
financed projects, tech implementations and more;
→ Can also use sites like LinkedIn, Monster, Careerbuilder, Glassdoor, Simplyhired, Dice;
• Financial information
→ Find out if the organisation is going to invest in a specific tech;
→ Might be merging with another company;
→ Has critical assets and business services;
→ E.g. Crunchbase – find info about companies, people, investors / financial information;
→ E.g. Inc. offers a list of the 500 / 5000 fastest-growing private companies, showing very
useful information and stats on them;
→ E.g. google finance;
→ E.g. yahoo finance;
• Harvesting
→ Gathering company documents such as charts, database files, diagrams, papers,
documentation, spreadsheets, etc.
→ Harvesting emails, accounts (Twitter, Facebook, etc.), names, roles and more.
→ Harvest metadata from documents retrieved online (e.g. filetype google dorks)
→ A Windows tool that lets us harvest metadata from files is “FOCA” – it queries sites
like Google and Bing (and also allows us to extract infrastructure information);
→ “theHarvester” – able to enumerate email accounts, user names, domains and hostnames; see the example command after this list;
– -l limits the results to the value specified (e.g. 100);
– At the end of this phase, we should have a list of names, email addresses, documents,
telephone numbers, usernames and so on;
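A hypothetical theHarvester run against a placeholder domain, using Google as the data source and limiting results to 100 (the binary name and flags can vary between versions):
theharvester -d targetorganisation.com -b google -l 100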
Social Media
→ People are the weakest link;
→ Gather an employee's personal info: phone numbers, addresses, history, CV, opinions,
responsibilities, projects, etc.
→ Select most appropriate target for a social engineering attack;
→ Search social media for the people gathered in the previous phase;
→ Use LinkedIn's search feature to look up people or see who occupies a certain position, e.g. CEO;
→ Can also search someone on LinkedIn via google: “name” / “position” / “company”
site:linkedin.com
→ Build a network of people and put it in the mind map;
→ Figuring out trust relationships is an essential part of the information gathering process;
→ Look at LinkedIn relationships and then use Twitter and Facebook as well to infer the level of
relationship between two people, i.e. how trusted the relationship is;
→ Social media people-search services: e.g. pipl.com, Spokeo (seems to be US-based only), PeopleFinders; these can reveal:
→ Age
→ Phone number
→ Business
→ Addresses
→ Occupation
→ Interests
→ Email addresses
→ Websites owned
→ Related documents
→ Financial info
DNS
nslookup targetorganisation.com
→ Reverse DNS lookup: retrieve the host name associated with a given IP address (only IPs with a
PTR record set will respond); can use network-tools.com/nslook/ or the command line, as sketched below;
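A minimal command-line sketch (the IP is the example address used later in these notes; running nslookup on an IP performs a PTR lookup):
nslookup 199.193.116.231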
→ MX Lookup:
nslookup -type=MX domain
→ Once we have found a number of host names related to the organisation, we can move on to
determining their respective IP addresses and, potentially, any netblocks associated with the organisation;
(1) Resolve all the host names we have in order to determine what IP addresses are used;
nslookup ns.targetorganisation.com
(2) Once we retrieve one or more IP addresses corresponding to domains, we have to consider the
following:
→ Is this IP address hosting only that given domain?
→ Who does this IP address belong to?
→ Bing offers a query filter that returns all the websites hosted on a given IP address;
ip:199.193.116.231
→ Also, a few websites allow sub-domain enumeration from a specific IP address, if you
believe the Bing results are inaccurate or incomplete: https://dnslytics.com/reverse-ip,
http://reverseip.domaintools.com/, https://www.robtex.com/
→ Might have to go back and re-enumerate once new sub-domains are found;
→ Map IP addresses and related domains using mind mapping tools;
→ Netblocks of smaller corporations will normally show up as belonging to the ISP rather than to the
smaller corp leasing them;
→ An Autonomous System is made up of one or more netblocks under the same administrative
control; big corps and ISPs have an autonomous system, while smaller companies will rarely have
their own netblock;
→ To find the owner of a netblock, you can run whois on an IP address (see the sketch below) or use tools such as hostmap, maltego, foca, fierce,
dmitry;
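A quick sketch using the example IP from above; whois on an IP address returns the containing netblock and its registered owner:
whois 199.193.116.231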
Live hosts
→ With the pool of IP addresses, we have to identify the devices and the roles played by each IP in the
target organisation: is it a server or a workstation? And which IPs are alive?
If you run the scan from a machine within the same network, Nmap runs an ARP scan instead of
sending ICMP packets. To avoid this behaviour:
--disable-arp-ping OR --send-ip
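A hypothetical ping sweep of a placeholder /24 from inside the same LAN, forcing IP-level probes instead of ARP (-sn performs host discovery only, no port scan):
nmap -sn --disable-arp-ping 10.1.1.0/24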
Find further DNS within the target network:
• Use nmap to scan the entire netblock for TCP / UDP port 53 to find hosts that have this port
open (see the sketch after this list);
• Once more DNS servers have been retrieved, can perform a reverse lookup to find out if
they are serving any particular domain;
• Can try zone transfer and other DNS techniques on them, etc.
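A minimal sketch against the placeholder netblock from above (-sS is a TCP SYN scan, -sU a UDP scan; --open shows only hosts with the port open):
nmap -sS -sU -p 53 --open 10.1.1.0/24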
Maltego
Other tools
• DNSEnum
→ Gather as much info as possible about a domain;
→ Can get the host's address (A record)
→ Get the name servers (threaded)
→ Get the MX record (threaded)
→ Perform AXFR queries on the name servers (threaded)
→ Get extra names and sub domains via Google dorks (allinurl:-www site:domain)
→ Brute force sub domains from a file; can also perform recursion on sub domains that have
NS records
→ Calculate C class domain network ranges and perform WHOIS queries on them;
→ Perform reverse lookups on net ranges (C class or / and WHOIS net ranges);
--private – show and save private IPs at the end of the file domain_ips.txt
--subfile <file> – write all valid sub domains to this file
--threads <value> – number of threads to use
-p <value> – number of Google search pages to process when scraping names; default is 20; the -s
switch must be specified
-s <value> – the maximum number of sub domains that will be scraped from Google
-f <file> – read sub domains from this file to perform brute force
-u – update any file that may already exist
-r – recursive brute force on any discovered domains
• Comes with a wordlist file containing the most common DNS and sub domain names for brute
force attacks – find it in /usr/share/dnsenum; see the example run below;
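A hypothetical run combining the options above against a placeholder domain (the wordlist filename is an assumption; check your install under /usr/share/dnsenum):
dnsenum --threads 10 -p 5 -s 15 -f /usr/share/dnsenum/dns.txt targetorganisation.com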
• Fierce
• Dnsmap
→ Sub domain enumeration and brute forcing;
→ Uses the dictionary file that comes with the tool or a word-list file that the user supplies;
dnsmap targetdomain
- sub domain brute forcing using dnsmap's built-in word-list; see the word-list sketch below;
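A sketch with a user-supplied word-list (the path is a placeholder; -w selects the word-list file):
dnsmap targetdomain -w /path/to/wordlist.txt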
• Metagoofil
• Foca
• Dmitry
• Recon-ng
Videos
Whois Lookups
→ command line service: whois elsfoo.com (will try to find proper whois server to use for
elsfoo.com)
→ whois -h whois.godaddy.com elsfoo.com [instructs whois to query a specific server – here the
registrar's own whois server];
→ nslookup site.com
→ nslookup -query=mx site.com [maps domain to a list of available mail servers for domain];
→ nslookup -query=ns site.com [maps to available name servers]
→ nslookup -query=any site.com
→ nslookup -query=cname site.com
→ dig site.com
→ dig site.com A [query A records]
→ dig site.com +nocmd MX +noall +answer [limits output to just MX records]
→ dig site.com +nocmd NS +noall +answer
→ dig site.com +nocmd A +noall +answer
→ zone transfer: have to specify a specific misconfigured DNS server that belongs to the target:
dig +nocmd site.com AXFR +noall +answer @site.com
Dnsenum
→ dnsenum site.com
→ gets A records, name servers, mail servers, attempts zone transfer, can also brute;
→ dnsenum site.com --dnsserver site.dns.com [specify the DNS server to use]
→ If an internal DNS server is exposed, dnsenum can be used to map the entire internal network;
→ brute force option: dnsenum site.com -f /root/hosts.txt
Dnsmap
→ dnsmap site.com
→ uses preloaded wordlists to brute sub domains;
Hping
→ If no port is specified, the default port is 0;
→ icmp echo ping request: hping3 -1 10.1.1.1 -c 3
→ hping3 --icmp-ts 10.1.1.1 -c 3 -V [send an ICMP timestamp request]
→ -2 option is udp: hping3 -2 10.1.1.1 -p 53 -c 3
→ syn: hping3 -S 10.1.1.1 -c 3
→ fin: hping3 -F 10.1.1.1 -c 3
→ urg: hping3 -U 10.1.1.1 -c 3
→ hping3 -F -P -U 10.1.1.1 -c 3 [FIN, PUSH and URG set together, i.e. an Xmas-style probe]
→ hping3 -1 10.1.1.x --rand-dest -I eth2 [to scan a class C network; --rand-dest is important: each x
is replaced with a random value, so the whole class C range gets probed]
Maltego
→ Gather and link info on sites, servers, networks, etc.
→ Automatic and manual options;
→ Machines – Maltego scripts to extract info: domains, websites, databases, etc.
→ Transforms – convert info into other info linked to the previous info – get netblocks, IPs, etc.
→ White icon → new project with palette on left → choose website → double click on the object to set
the target site “target.com” → right click on the website icon to look for transforms → “to domain DNS”
transform → once the transform is done a new object appears on the graph → right click on the site again and
select the IP transform → right click again and use the email address transform (utilises a Google dork
query) → right click on the DNS discovered before → choose a transform to run, e.g. whois, zone transfer,
sub-domains, etc.
→ E.g. see where emails were found, and on the right you can click and view or download the
file for further investigation; then use a tool such as FOCA to extract useful metadata and the like;
→ file search is another useful transform;
→ on IP object, look at netblock transform;
→ you can change graph by clicking on the different icons;
→ can also list as an entity list;
→ Select multiple items to do a transform on;
Foca / Shodan
→ Foca is a Windows tool – it can extract metadata, perform DNS queries, and use search engines to find
interesting files on the target server, etc.
→ The available actions are in the left pane. Select things like metadata, customise what you want,
and then search;
→ Right click on items in the metadata results after searching and select “analyse metadata”;