Footprinting - The First Step in Hacking - Recon


One way to begin planning an ethical hack on your business is through a process often called footprinting, or information-gathering reconnaissance. Footprinting means gathering information about a target system that can be used to execute a successful cyber-attack. An ethical hacker spends most of their time profiling an organization: gathering information about the hosts, network, and people related to it.

Information that can be collected includes:

  • IP addresses
  • Domain name information
  • Technologies used
  • Other websites on the same server
  • DNS records
  • Unlisted files, subdomains, and directories

    Two types of footprinting:

    • Passive Footprinting: Passive footprinting means collecting information about the target without interacting with it directly, for example through search engines, public records, and other freely available sources.
    • Active Footprinting: Active footprinting means collecting information by getting in direct touch with the target machine, which yields more detail but carries a higher risk of detection.

      Possible Ways:

      • Whois: The best starting point is to perform a Whois lookup using any of the Whois tools available on the Internet. Whois databases and servers are operated by the Regional Internet Registries (RIRs). A Whois query can return information such as the IP address block, domain name, location, email addresses, phone numbers, and the domain owner. Website for a Whois lookup query: Whois Lookup
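        The Whois protocol itself (RFC 3912) is simple enough to query directly: the client opens TCP port 43 on a Whois server and sends the domain name. A minimal sketch using only the Python standard library; note that whois.verisign-grs.com answers for .com/.net domains, so other TLDs would need a different registry server:

        ```python
        # Minimal Whois query over TCP port 43 (RFC 3912).
        # whois.verisign-grs.com serves .com/.net domains;
        # other TLDs are handled by other registry Whois servers.
        import socket

        def whois_query(domain, server="whois.verisign-grs.com", port=43):
            with socket.create_connection((server, port), timeout=10) as sock:
                sock.sendall((domain + "\r\n").encode())  # protocol: name + CRLF
                response = b""
                while True:
                    chunk = sock.recv(4096)
                    if not chunk:  # server closes the connection when done
                        break
                    response += chunk
            return response.decode(errors="replace")

        print(whois_query("example.com"))
        ```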
      • Google Hacking: Google hacking refers to collecting information using Google dorks, keywords and operators that let you search Google for a target in an optimized way. These searches can help uncover sensitive information such as exposed passwords, default credentials, competitor information, and material related to a specific topic. Website for commands and keywords: Google Hacking Database.
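        A few illustrative dorks (example.com is a placeholder; site:, filetype:, intitle:, inurl:, and intext: are standard Google search operators):

        ```
        site:example.com filetype:pdf         # PDF documents indexed on the domain
        site:example.com inurl:admin          # pages with "admin" in the URL
        intitle:"index of" site:example.com   # open directory listings
        site:example.com intext:"password"    # pages that mention passwords
        ```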
      • Organization's Website: The target's own website is another good place to begin. You can find open-source information there that is freely provided to clients, customers, or the public.
      • DNS Lookup: DNS is the Internet's system for converting alphabetic names into numeric IP addresses. For example, when a URL is typed into a browser, DNS servers return the IP address of the web server associated with that name. A DNS lookup retrieves the resource records (A, MX, NS, TXT, and so on) associated with a domain. Websites for DNS lookups:
        • Robtex (Shows comprehensive info about the target website)
        • DNS Dumpster (Enumerates a domain and pulls back up to 40K subdomains; results are available as an XLS for easy reference)
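        As a quick programmatic check, a hostname can be resolved with the Python standard library alone; a minimal sketch (for full record types such as MX or TXT you would use a dedicated DNS library or a tool like dig):

        ```python
        # Resolve a hostname to its IP addresses.
        # gethostbyname_ex returns (canonical name, alias list, address list).
        import socket

        hostname = "example.com"  # placeholder target
        canonical, aliases, addresses = socket.gethostbyname_ex(hostname)
        print("Canonical name:", canonical)
        print("Aliases:", aliases)
        print("IP addresses:", addresses)
        ```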
      • Job Websites: Organizations can unknowingly share confidential data on job websites. For example, if a company posts “Job Opening for Apache 2.0 Server Administrator”, we can infer that the organization uses the Apache 2.0 web server.
      • Social Engineering: Social media sites like Twitter and Facebook are searched to collect information such as personal details, user credentials, and other sensitive data. Most people tend to publish a great deal about themselves online, and hackers take full advantage of this.
      • Competitive Intelligence: Competitive intelligence gathering is the process of collecting information about competitors from resources such as the Internet. Examples: the company website, search engines, online databases, press releases, annual reports, and trade journals.
      • Useful Websites:
        • Netcraft Site Report: tells which server-side or client-side technologies are in use.
        • Archive.org: It is like a time machine for any website. Archive.org collects snapshots of websites at regular intervals, so an archived version shows a site as it existed at an earlier point in time.
        • WhatWeb: It is a tool available in Kali Linux. WhatWeb identifies websites. Its goal is to answer the question, “What is that Website?”. WhatWeb recognizes web technologies including content management systems (CMS), blogging platforms, statistic/analytics packages, JavaScript libraries, web servers, and embedded devices. WhatWeb can be stealthy and fast, or thorough but slow.
        • HTTrack: It allows you to download a World Wide Web site from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer.
        • BuiltWith: It displays information like widgets, analytics, frameworks, content management systems, advertisers, content delivery networks, web standards, and web servers.
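        Typical invocations of two of the tools above (example.com is a placeholder, and flags may vary between versions):

        ```
        # Fingerprint the technologies behind a website with WhatWeb
        whatweb example.com

        # Mirror a website into a local directory with HTTrack
        httrack "http://example.com/" -O ./example-mirror
        ```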
      • Subdomains: One server can serve several websites, and gaining access to one can help in gaining access to the others. For example, google.com is the main domain, while mail.google.com, smtp.google.com, etc. are subdomains of google.com. Useful tools for finding subdomains:
        • KNOCK
          • Clone the repository
          • Change into the knockpy directory
          • Run the program using python knockpy.py <target_website>.
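            The steps above as shell commands (the repository URL and directory name reflect the project's usual GitHub location and may have changed):

            ```
            git clone https://github.com/guelfoweb/knock.git
            cd knock
            python knockpy.py example.com
            ```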
        • DIRB: DIRB is a web content scanner. Its main purpose is to help with professional web application auditing, especially security-related testing, and it covers some holes not covered by classic web vulnerability scanners. DIRB looks for specific web objects that other generic CGI scanners cannot find; it does not search for vulnerabilities or for web content that may be vulnerable. It works by launching a dictionary-based attack against a web server and analyzing the responses.
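          Typical usage (the wordlist path shown is DIRB's default location on Kali Linux):

          ```
          # Dictionary-based scan for hidden files and directories
          dirb http://example.com/ /usr/share/dirb/wordlists/common.txt
          ```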
        • VirusTotal: VirusTotal is also a quick and easy way to get the subdomains of a website.
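          The core idea behind these tools can be sketched in a few lines of Python: try candidate names from a wordlist and keep the ones that resolve in DNS. This is a toy illustration, not a replacement for the tools above (note that wildcard DNS records can produce false positives):

          ```python
          # Toy subdomain enumeration: prepend candidate names to the target
          # domain and keep the ones that resolve.
          import socket

          domain = "example.com"  # placeholder target
          wordlist = ["www", "mail", "smtp", "ftp", "dev", "vpn", "admin"]

          for name in wordlist:
              candidate = f"{name}.{domain}"
              try:
                  ip = socket.gethostbyname(candidate)
                  print(f"{candidate} -> {ip}")
              except socket.gaierror:
                  pass  # name did not resolve; skip it
          ```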
      • Geolocation: IP geolocation and domain information can also be helpful. Website for getting the geolocation: ipinfo.io.
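        ipinfo.io exposes a simple JSON endpoint, so a lookup can also be scripted with the Python standard library; a minimal sketch (8.8.8.8 is just an example address, and unauthenticated requests are rate-limited):

        ```python
        # Query ipinfo.io's JSON API for geolocation data about an IP address.
        import json
        import urllib.request

        ip = "8.8.8.8"  # example address
        with urllib.request.urlopen(f"https://ipinfo.io/{ip}/json", timeout=10) as resp:
            info = json.load(resp)

        print(info.get("city"), info.get("region"), info.get("country"), info.get("org"))
        ```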


      We hope this helps. If you have any suggestions or doubts, you can add a comment and we will reply as soon as possible.
