Web
Nmap discovered a web server on target port 80.
The running service is Apache httpd 2.4.6 ((CentOS) PHP/7.3.22).
Request methods other than GET and POST (OPTIONS and HEAD below) return a 500 error:
┌──(kali㉿kali)-[~/PEN-200/PG_PRACTICE/sybaris]
└─$ curl -I -X OPTIONS http://$IP/
HTTP/1.0 500 Only GET and POST are supported
Date: Sat, 29 Mar 2025 10:29:55 GMT
Server: Apache/2.4.6 (CentOS) PHP/7.3.22
X-Powered-By: PHP/7.3.22
Set-Cookie: PHPSESSID=2vt82a2tom8fdgfd8v953ut2dc; path=/
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate
Pragma: no-cache
Content-Length: 31
Connection: close
Content-Type: text/html; charset=UTF-8
┌──(kali㉿kali)-[~/PEN-200/PG_PRACTICE/sybaris]
└─$ curl -I http://$IP/
HTTP/1.0 500 Only GET and POST are supported
Date: Sat, 29 Mar 2025 10:30:00 GMT
Server: Apache/2.4.6 (CentOS) PHP/7.3.22
X-Powered-By: PHP/7.3.22
Set-Cookie: PHPSESSID=9duo8jjrdes4l801d5me8u8ah8; path=/
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate
Pragma: no-cache
Connection: close
Content-Type: text/html; charset=UTF-8
Webroot
It’s a blog powered by HTMLy.
HTMLy is an open-source, databaseless PHP blogging platform: a flat-file CMS that allows you to create a fast, secure, and powerful website or blog in seconds. Its source code is publicly available.
Version Information
Checking the source code of the index.php file reveals the version information: HTMLy v2.7.5
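Assuming the version string appears verbatim in the served HTML (as observed above), it can be grepped out directly; a quick sketch:

```shell
# Pull the HTMLy version string out of the page source; the regex assumes
# the "HTMLy vX.Y.Z" format seen in index.php's source.
curl -s "http://$IP/index.php" | grep -oiE 'HTMLy v[0-9.]+' | head -n 1
```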
Vulnerabilities
┌──(kali㉿kali)-[~/PEN-200/PG_PRACTICE/sybaris]
└─$ searchsploit HTMLy
----------------------------------------------------------------------- ---------------------------------
Exploit Title | Path
----------------------------------------------------------------------- ---------------------------------
htmly 2.8.0 - 'description' Stored Cross-Site Scripting (XSS) | multiple/webapps/49772.py
HTMLy Version v2.9.6 - Stored XSS | php/webapps/51979.txt
----------------------------------------------------------------------- ---------------------------------
Shellcodes: No Results
Papers: No Results
The target instance suffers from XSS vulnerabilities; N/A.
User
In the footer, there is a username disclosure: pablo
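HTMLy typically links blog posts to author archive pages, so the disclosed username can be cross-checked by grepping the homepage for author links; a sketch (the `author/<name>` path pattern is an assumption based on HTMLy defaults, not confirmed output):

```shell
# Grep the homepage HTML for HTMLy-style author links and deduplicate;
# the "author/<name>" URL pattern is an assumed HTMLy default.
curl -s "http://$IP/" | grep -oE 'author/[A-Za-z0-9_-]+' | sort -u
```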
Admin Page
Attempting to access the admin page results in a 302 redirect to the login endpoint at /login
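The redirect target can be read from the Location header. Since the target rejects HEAD requests (only GET and POST are supported, per the earlier curl output), a sketch using a GET with dumped headers:

```shell
# Issue a GET (HEAD is rejected by this server) and dump the response
# headers to stdout, keeping only the redirect target.
curl -s -D - -o /dev/null "http://$IP/admin" | grep -i '^location'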
Username Enumeration
┌──(kali㉿kali)-[~/PEN-200/PG_PRACTICE/sybaris]
└─$ patator http_fuzz -t 128 url=http://$IP/login method=POST body='user=FILE0&password=blah&csrf_token=_CSRF_&submit=Login' 0=/usr/share/wordlists/seclists/Usernames/xato-net-10-million-usernames.txt follow=0 accept_cookie=1 -x ignore:fgrep='Username not found in our record.' before_urls="http://$IP/login" before_egrep='_CSRF_:name="csrf_token" value="(\w+)"'
12:39:13 patator INFO - Starting Patator 1.0 (https://github.com/lanjelot/patator) with python-3.13.2 at 2025-03-29 12:39 CET
12:39:13 patator INFO -
12:39:13 patator INFO - code size:clen time | candidate | num | mesg
12:39:13 patator INFO - -----------------------------------------------------------------------------
12:39:19 patator INFO - 200 3448:3129 0.037 | pablo | 935 | HTTP/1.1 200 OK
Testing out patator, an amazing new fuzzer that supports fetching and updating CSRF tokens. It does this by using the before_urls option to issue a preliminary request, then before_egrep to extract data from the before_urls response and place it into the main request via the custom variable _CSRF_. Additionally, it supports regex-based filtering of responses.
The pablo user appears to be the sole valid user.
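The CSRF flow that patator automates can be reproduced manually with curl; a sketch (field names and the failure string match the patator command above; the token-input HTML layout is an assumption):

```shell
# Fetch /login once to obtain a session cookie and the CSRF token, then
# submit one candidate username; absence of the "Username not found"
# message indicates a valid username.
TOKEN=$(curl -s -c /tmp/htmly_cookies "http://$IP/login" \
  | grep -oP 'name="csrf_token" value="\K\w+')
curl -s -b /tmp/htmly_cookies "http://$IP/login" \
  --data "user=pablo&password=blah&csrf_token=${TOKEN}&submit=Login" \
  | grep -o 'Username not found in our record.'
```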
Fuzzing
┌──(kali㉿kali)-[~/PEN-200/PG_PRACTICE/sybaris]
└─$ ffuf -c -w /usr/share/wordlists/seclists/Discovery/Web-Content/big.txt -u http://$IP/FUZZ -ic -e .txt,.html,.php
________________________________________________
:: Method : GET
:: URL : http://192.168.185.93/FUZZ
:: Wordlist : FUZZ: /usr/share/wordlists/seclists/Discovery/Web-Content/big.txt
:: Extensions : .txt .html .php
:: Follow redirects : false
:: Calibration : false
:: Timeout : 10
:: Threads : 40
:: Matcher : Response status: 200-299,301,302,307,401,403,405,500
________________________________________________
.htaccess [Status: 403, Size: 211, Words: 15, Lines: 9, Duration: 38ms]
.htaccess.txt [Status: 403, Size: 215, Words: 15, Lines: 9, Duration: 22ms]
.htaccess.php [Status: 403, Size: 215, Words: 15, Lines: 9, Duration: 22ms]
.htaccess.html [Status: 403, Size: 216, Words: 15, Lines: 9, Duration: 22ms]
.htpasswd [Status: 403, Size: 211, Words: 15, Lines: 9, Duration: 23ms]
.htpasswd.txt [Status: 403, Size: 215, Words: 15, Lines: 9, Duration: 22ms]
.htpasswd.html [Status: 403, Size: 216, Words: 15, Lines: 9, Duration: 33ms]
.htpasswd.php [Status: 403, Size: 215, Words: 15, Lines: 9, Duration: 29ms]
Index [Status: 200, Size: 7870, Words: 2318, Lines: 142, Duration: 64ms]
LICENSE.txt [Status: 200, Size: 18092, Words: 3133, Lines: 340, Duration: 75ms]
admin [Status: 302, Size: 0, Words: 1, Lines: 1, Duration: 83ms]
cache [Status: 301, Size: 236, Words: 14, Lines: 8, Duration: 77ms]
cgi-bin/ [Status: 403, Size: 210, Words: 15, Lines: 9, Duration: 53ms]
cgi-bin/.html [Status: 403, Size: 215, Words: 15, Lines: 9, Duration: 53ms]
config [Status: 403, Size: 208, Words: 15, Lines: 9, Duration: 43ms]
content [Status: 301, Size: 238, Words: 14, Lines: 8, Duration: 68ms]
favicon.ico [Status: 200, Size: 1150, Words: 4, Lines: 1, Duration: 57ms]
front [Status: 301, Size: 0, Words: 1, Lines: 1, Duration: 60ms]
humans.txt [Status: 200, Size: 1157, Words: 49, Lines: 59, Duration: 107ms]
index [Status: 200, Size: 7870, Words: 2318, Lines: 142, Duration: 60ms]
lang [Status: 301, Size: 235, Words: 14, Lines: 8, Duration: 69ms]
login [Status: 200, Size: 3046, Words: 616, Lines: 69, Duration: 47ms]
logout [Status: 302, Size: 0, Words: 1, Lines: 1, Duration: 70ms]
robots.txt [Status: 200, Size: 1154, Words: 112, Lines: 48, Duration: 59ms]
sitemap.xml [Status: 200, Size: 505, Words: 4, Lines: 1, Duration: 89ms]
system [Status: 301, Size: 237, Words: 14, Lines: 8, Duration: 68ms]
themes [Status: 301, Size: 237, Words: 14, Lines: 8, Duration: 67ms]
upload.php [Status: 302, Size: 0, Words: 1, Lines: 1, Duration: 80ms]
:: Progress: [81912/81912] :: Job [1/1] :: 114 req/sec :: Duration: [0:04:00] :: Errors: 0 ::
The /admin and robots.txt endpoints seem interesting.
/robots.txt
┌──(kali㉿kali)-[~/PEN-200/PG_PRACTICE/sybaris]
└─$ curl -s http://$IP/robots.txt | grep -v '^#'
User-agent: *
Disallow: /config/
Disallow: /system/
Disallow: /themes/
Disallow: /vendor/
Disallow: /cache/
Disallow: /changelog.txt
Disallow: /composer.json
Disallow: /composer.lock
Disallow: /composer.phar
Disallow: /search/
Disallow: /admin/
Allow: /themes/*/css/
Allow: /themes/*/images/
Allow: /themes/*/img/
Allow: /themes/*/js/
Allow: /themes/*/fonts/
Allow: /content/images/*.jpg
Allow: /content/images/*.png
Allow: /content/images/*.gif
The robots.txt file reveals many endpoints; N/A.
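The disallowed paths can be triaged in one pass; a sketch that parses the robots.txt above and prints the HTTP status code for each entry:

```shell
# Parse the Disallow entries out of robots.txt (stripping any CR line
# endings) and request each path, printing its status code.
curl -s "http://$IP/robots.txt" | tr -d '\r' \
  | awk '/^Disallow:/ {print $2}' \
  | while read -r p; do
      code=$(curl -s -o /dev/null -w '%{http_code}' "http://$IP$p")
      echo "$code $p"
    done
```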