Website Footprinting - Part 1


Website footprinting is the technique of monitoring and analyzing a target organization's website to collect information. An organization's website can expose sensitive details such as the names and contact information of its leaders and descriptions of forthcoming projects.


This topic is divided into two articles. Continue reading in Part 2.


Without setting off an intrusion detection system or raising the suspicions of any system administrator, an attacker can build an in-depth schematic of the architecture and structure of a website. Simply browsing the target website generally reveals the following (a minimal crawling sketch follows this list):

  • The software in use and its version
  • The operating system in use
  • Sub-directories and parameters, revealed by noting the URLs while browsing the target website
  • Anything after a query string that resembles a filename, path, or database field name, which can be analyzed for SQL injection opportunities
  • The scripting platform and technologies in use, indicated by file extensions such as .php, .asp, or .jsp
  • Contact pages, which usually list the names, phone numbers, email addresses, and locations of admin and support personnel, all of which can be used in social engineering attacks
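
As a rough illustration of these first steps, the short Python sketch below (standard library only, with a hypothetical TARGET URL) fetches a single page, collects its links, and reports each path's extension and query parameters, which is exactly the raw material for spotting scripting platforms and injectable parameters:

```python
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse, parse_qs

# Hypothetical target used purely for illustration.
TARGET = "http://example.com/"

class LinkCollector(HTMLParser):
    """Collect every href/src attribute found in the page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.links.append(urljoin(TARGET, value))

with urllib.request.urlopen(TARGET) as resp:
    parser = LinkCollector()
    parser.feed(resp.read().decode(errors="replace"))

for link in parser.links:
    parts = urlparse(link)
    path = parts.path
    # A dot in the last path segment usually means a file extension.
    ext = path.rsplit(".", 1)[-1] if "." in path.rsplit("/", 1)[-1] else ""
    params = list(parse_qs(parts.query))
    print(f"{path:40} ext={ext or '-':5} params={params or '-'}")
```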

An attacker views response headers using dedicated footprinting tools such as Burp Suite, OWASP ZAP (Zaproxy), WhatWeb, BuiltWith, Wappalyzer, Netcraft, and Website Informer, or even the basic utilities that ship with the operating system. These headers provide (see the sketch after this list):
  • Connection status and content type
  • Accept-Ranges and Last-Modified information
  • X-Powered-By information
  • The web server in use and its version
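
The same headers can also be checked manually with a few lines of standard-library Python. The sketch below issues a HEAD request against a hypothetical URL and prints the fingerprint-relevant headers; note that servers are free to omit or falsify any of them:

```python
import urllib.request

# Hypothetical target; any reachable site works for the demonstration.
url = "http://example.com/"

req = urllib.request.Request(url, method="HEAD")
with urllib.request.urlopen(req) as resp:
    print("Status:", resp.status)
    for header in ("Server", "X-Powered-By", "Content-Type",
                   "Last-Modified", "Accept-Ranges", "Connection"):
        print(f"{header}: {resp.headers.get(header, '(not sent)')}")
```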

Examining the HTML Source Code

Attackers can obtain sensitive information by reading through the HTML source code and paying attention to manually inserted comments. Comments may reveal what is going on in the background and may even contain the contact details of the web developer or administrator.

Observing all the link and image tags helps map the file system structure and can reveal the existence of hidden directories and files.

Entering fake information into forms shows how the underlying scripts behave; sometimes it is even possible to tamper with the client-side source code.
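
As a minimal sketch of comment harvesting (hypothetical url, standard library only), the parser below prints every HTML comment found in a page's source:

```python
import urllib.request
from html.parser import HTMLParser

# Hypothetical target page, examined for leftover developer comments.
url = "http://example.com/"

class CommentScanner(HTMLParser):
    """Print every HTML comment found in the page source."""
    def handle_comment(self, data):
        print("COMMENT:", data.strip())

with urllib.request.urlopen(url) as resp:
    CommentScanner().feed(resp.read().decode(errors="replace"))
```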

Examining Cookies

Cookies set by the server can be examined to find out what software is running and how it behaves. Session cookies and other supporting cookies often betray the scripting platform: for example, PHPSESSID suggests PHP, JSESSIONID a Java servlet container, and ASP.NET_SessionId ASP.NET. It is also possible to obtain each cookie's name, value, and the domain and scope for which it is set.
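
A quick way to see this in practice is to read the Set-Cookie headers directly. The sketch below (hypothetical url; the platform-hint table reflects common default session-cookie names, not a guarantee) parses each cookie and prints its name, value, domain, and a guessed platform:

```python
import urllib.request
from http.cookies import SimpleCookie

# Hypothetical target; default session-cookie names hint at the platform.
url = "http://example.com/"
PLATFORM_HINTS = {
    "PHPSESSID": "PHP",
    "JSESSIONID": "Java (servlet container)",
    "ASP.NET_SessionId": "ASP.NET",
    "CFID": "ColdFusion",
}

with urllib.request.urlopen(url) as resp:
    for raw in resp.headers.get_all("Set-Cookie") or []:
        cookie = SimpleCookie(raw)
        for name, morsel in cookie.items():
            hint = PLATFORM_HINTS.get(name, "unknown")
            print(f"{name}={morsel.value} domain={morsel['domain'] or '-'} "
                  f"path={morsel['path'] or '-'} -> platform hint: {hint}")
```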

Mirroring an Entire Website

Website mirroring is the practice of making a duplicate, or clone, of an entire website. Mirroring tools such as HTTrack Web Site Copier and NCollector Studio allow users to duplicate websites.

By recursively downloading the website to a local directory, such a tool rebuilds the complete directory structure, including all folders with their HTML, images, Flash, video, and other files, from the web server on a different machine. This lets an attacker take as much time as needed to browse and analyze the site offline for vulnerabilities and loopholes.
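
In practice an attacker would reach for HTTrack, but the underlying recursion is simple enough to sketch. The toy mirror below (standard-library Python, hypothetical START URL, capped at MAX_PAGES as a safety limit) downloads same-host pages and recreates the server's path structure under a local directory:

```python
import os
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse, urldefrag

START = "http://example.com/"   # hypothetical target
OUT, MAX_PAGES = "mirror", 50   # local directory and a safety limit

class Links(HTMLParser):
    """Collect every href/src value found in a page."""
    def __init__(self):
        super().__init__()
        self.found = []
    def handle_starttag(self, tag, attrs):
        self.found += [v for k, v in attrs if k in ("href", "src") and v]

seen, queue = set(), [START]
while queue and len(seen) < MAX_PAGES:
    url = queue.pop(0)
    # Stay on the same host and skip anything already fetched.
    if url in seen or urlparse(url).netloc != urlparse(START).netloc:
        continue
    seen.add(url)
    try:
        with urllib.request.urlopen(url) as resp:
            body = resp.read()
    except OSError:
        continue
    # Recreate the server's path structure under the local mirror directory.
    path = urlparse(url).path
    local = os.path.join(OUT, path.lstrip("/") or "index.html")
    if local.endswith("/"):
        local += "index.html"
    os.makedirs(os.path.dirname(local) or OUT, exist_ok=True)
    with open(local, "wb") as f:
        f.write(body)
    # Queue links discovered in anything that looks like an HTML page.
    if "." not in path.rsplit("/", 1)[-1] or path.endswith((".html", ".htm")):
        parser = Links()
        parser.feed(body.decode(errors="replace"))
        queue += [urldefrag(urljoin(url, link))[0] for link in parser.found]
```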

Monitoring Web Pages for Updates and Changes

Monitoring target websites enables attackers to detect changes in login pages, extract password-protected pages, track changes in software versions and driver updates, extract and save images from updated web pages, and so on. Attackers then examine the collected data to identify underlying weaknesses in the target website and exploit the web application based on those vulnerabilities.
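
A basic change monitor needs nothing more than periodic hashing. The sketch below (hypothetical url; a real monitor would persist state across runs) re-fetches a page every hour and reports when its content hash changes:

```python
import hashlib
import time
import urllib.request

# Hypothetical page to watch for changes.
url = "http://example.com/login"
last_hash = None

while True:
    with urllib.request.urlopen(url) as resp:
        digest = hashlib.sha256(resp.read()).hexdigest()
    if last_hash and digest != last_hash:
        print("Page changed at", time.ctime())
    last_hash = digest
    time.sleep(3600)  # re-check hourly
```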
