80,443 - WEB
Enumeration
Ports: 80 / 443 (TCP)
A web application consists of domains, subdomains, directories, APIs, endpoints, files... In this section, the attacker will find some of the main steps to gather information on a web application for future web exploitation.
General Enumeration
Running Nmap's default scripts should provide the attacker with enough information to continue enumerating the web service.
Nonetheless, there is a huge number of Nmap NSE scripts dedicated to web enumeration.
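A minimal sketch of both approaches, with TARGET as a placeholder for the host:

```bash
# Default scripts and version detection on the web ports
nmap -p 80,443 -sC -sV TARGET

# Some of the many HTTP-related NSE scripts
nmap -p 80,443 --script "http-enum,http-title,http-headers,http-methods" TARGET
```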
Well-known files and directories
At the beginning of web enumeration, you should check for standard web files that can provide you with extra information.
.git
Some websites accidentally expose their source code via this endpoint. If a .git directory exists, you can obtain the repository's contents with git-dumper.
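A minimal example with git-dumper (the output directory name is arbitrary):

```bash
# Dump the exposed repository and inspect its history locally
git-dumper http://TARGET/.git/ ./target-repo
cd target-repo && git log --oneline
```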
robots.txt
The file http://TARGET/robots.txt lists the paths the site asks web crawlers to exclude, which often reveal new endpoints.
.well-known
The endpoint http://TARGET/.well-known/ can contain a lot of URIs with valuable details.
You can check the standard well-known URIs in the IANA Well-Known URIs registry.
sitemap.xml
The file http://TARGET/sitemap.xml will help you find content pages.
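A quick way to check the standard files above is a handful of curl requests (TARGET is a placeholder; security.txt is just one example of a well-known URI):

```bash
# Paths the site excludes from crawlers, often interesting endpoints
curl -s http://TARGET/robots.txt

# Standard well-known URIs and the sitemap
curl -s http://TARGET/.well-known/security.txt
curl -s http://TARGET/sitemap.xml
```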
Communication layer
Most website traffic travels over an encrypted channel thanks to TLS, so another thing to check in a web audit is the supported encryption protocols and ciphers, the certificates, and their expiration dates.
Domain certificate
If the web page communication is protected with HTTPS, you can inspect its certificate looking for subdomains or wildcards.
To do so, you can execute the following command.
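A minimal sketch with openssl, printing the certificate subject and the Subject Alternative Names (TARGET and port 443 are placeholders):

```bash
# DNS entries in the SAN extension reveal extra (sub)domains and wildcards
echo | openssl s_client -connect TARGET:443 -servername TARGET 2>/dev/null \
  | openssl x509 -noout -text | grep -E "Subject:|DNS:"
```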
As an alternative, use sslscan.
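For example:

```bash
# sslscan prints the certificate details along with the supported protocols and ciphers
sslscan TARGET:443
```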
Automated tools
There are tools that check the supported ciphers and protocols as well as some cryptographic flaws.
Testssl.sh (Offline) is a free command-line tool which checks a server's service on any port for the support of TLS/SSL ciphers and protocols as well as some cryptographic flaws (see the example after this list).
Qualys - SSL Labs (Online) performs a deep analysis of the configuration of any SSL web server on the public Internet.
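A typical testssl.sh run against the web port (the path to the script depends on where you cloned it):

```bash
# Checks protocols, ciphers and known TLS vulnerabilities
./testssl.sh https://TARGET/
```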
Directories/Files enumeration
Another essential step in web enumeration is looking for hidden files and directories that do not appear on the web page. Directory brute-forcing can be achieved with the following tools (see the examples after this list).
Ffuf
Feroxbuster
Gobuster
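Minimal sketches for the three tools; the wordlist path is an assumption and may differ on your system:

```bash
WORDLIST=/usr/share/wordlists/dirb/common.txt

# Ffuf: FUZZ marks the injection point
ffuf -u http://TARGET/FUZZ -w "$WORDLIST"

# Feroxbuster: recurses into discovered directories by default
feroxbuster -u http://TARGET/ -w "$WORDLIST"

# Gobuster: dir mode, optionally with file extensions of interest
gobuster dir -u http://TARGET/ -w "$WORDLIST" -x php,txt,html
```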
Virtual hosts
A single web server can be configured to run multiple websites at once under different (sub)domain names, so-called virtual hosts (vhosts). Finding vhosts is important because each website might contain vulnerabilities, allowing the attacker to compromise the server and gain unauthorised access to the other websites.
You can enumerate virtual hosts with the following tools (see the examples after this list).
Ffuf
Gobuster
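Hedged examples, assuming TARGET is the base domain and a SecLists subdomain wordlist (adjust the paths and the size filter to your target):

```bash
VHOSTS=/usr/share/seclists/Discovery/DNS/subdomains-top1million-5000.txt

# Ffuf: fuzz the Host header; replace 10918 with the size of the default response to filter it out
ffuf -u http://TARGET/ -H "Host: FUZZ.TARGET" -w "$VHOSTS" -fs 10918

# Gobuster: dedicated vhost mode
gobuster vhost -u http://TARGET/ -w "$VHOSTS" --append-domain
```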
Web Application Scanners
There are also automated tools that will facilitate your web enumeration.
Wappalyzer
Wappalyzer is a web browser extension that identifies technologies on websites, such as JavaScript libraries, web servers, operating systems, CMS, Analytics...
Nikto
Nikto is a CLI scanner that checks for vulnerabilities and configuration problems.
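A basic Nikto run (add -ssl or adjust the port for HTTPS targets):

```bash
nikto -h http://TARGET/
```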
Wapiti
Wapiti is a general-purpose web application scanner that crawls the pages of the deployed webapp, looking for scripts and forms where it can inject data. Once it has the list of URLs, forms and their inputs, Wapiti acts like a fuzzer, injecting payloads to see if a script is vulnerable.
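A basic Wapiti scan; by default it crawls the given scope and generates a report when it finishes:

```bash
wapiti -u http://TARGET/
```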
WPScan
WPScan is a WordPress scanner that checks the WordPress version and the installed plugins and themes, looking for vulnerabilities. Furthermore, it looks for backed-up wp-config.php files and database dumps. Finally, it also does user enumeration and password brute-forcing.
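A typical WPScan invocation; the API token (from wpscan.com) is optional but enables the vulnerability data:

```bash
# Enumerate users, vulnerable plugins and vulnerable themes
wpscan --url http://TARGET/ --enumerate u,vp,vt --api-token <TOKEN>

# Password brute-forcing against a discovered user
wpscan --url http://TARGET/ -U admin -P /usr/share/wordlists/rockyou.txt
```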
Joomscan
Joomscan is a Joomla vulnerability scanner that already comes preinstalled with Kali.
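A basic JoomScan run:

```bash
joomscan --url http://TARGET/
```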
Drupal
Drupal is another CMS like WordPress and Joomla with associated scanners such as drupwn and droopescan.
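A hedged example with droopescan (drupwn has a similar purpose but its own CLI):

```bash
droopescan scan drupal -u http://TARGET/
```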
Badmoodle
Badmoodle is an unofficial community-based vulnerability scanner for Moodle that checks for both canonical and non-canonical vulnerabilities.
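A sketch of a badmoodle run; the -u flag and the script name are taken from the project's README, and the /moodle/ path is only a common default, so verify both against the repository:

```bash
# URL of the Moodle instance (path is a placeholder)
python3 badmoodle.py -u http://TARGET/moodle/
```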