
[Full-disclosure] WWWroot spring cleaning of neglected files



[ Tl;dr: do a cleanup, help create a web-scan jackpot DB ]


Ever temporarily uploaded, moved, or created files in a directory accessible from 
the web? How many times have you left them there? Have you ever used a wwwroot 
to transfer DBs (even if over HTTPS) from one place to another? Ever used short 
filenames that you assumed were too random for anyone to scan for? Read on.

I realize there are many 'web vulnerability scanners' out there with thousands 
of variations of potentially interesting web queries and such. The reason I'm 
asking you all to contribute ideas is that...

1) In practice, I found fewer usable results than I expected - especially in a 
plaintext dump (including dozens of weblogs).
2) Many of these 'lists' contain so much obsolete junk that they are unrealistic 
to use in a mass scan of a larger local network (or the internet, which is not 
my aim, by the way).
3) I hope to compile a list of neat locations that do not yet appear in any web 
scanner database, but are still worth mentioning and looking for.

The best way to contribute - beyond anything valid that comes to mind - would be 
to go check your wwwroots, do a spring cleaning, and share any file or directory 
name you found and removed that is likely used on other servers and could be of 
interest to an 'attacker'.
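For the spring-cleaning part, here is a minimal sketch of a stale-file sweep. 
The docroot path and the 180-day cutoff are my assumptions - adjust both to 
your own setup:

```python
import os
import time

def find_stale(root, max_age_days=180):
    """Return file paths under *root* not modified in max_age_days days."""
    cutoff = time.time() - max_age_days * 86400
    stale = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.path.getmtime(path) < cutoff:
                    stale.append(path)
            except OSError:
                pass  # file vanished between listing and stat
    return sorted(stale)

# Example (the path is an assumption - point it at your own docroot):
# for p in find_stale("/var/www"):
#     print(p)
```

Old mtime alone doesn't prove a file is junk, of course - it just gives you a 
short list worth eyeballing by hand.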

Mainly looking for:
- test, backup scripts
- DB/www backups
- source code in general
- temporary dirs for file sharing

Leave out the obvious, application-specific stuff (already out there in all the 
scanners):
- /admin
- /phpmyadmin
- /robots.txt
- /cgi-bin
- /scripts

Leave out generic ones (that would generate 'false positives' too often):
- /help
- /info
- /stat
- /doc
- /list
- /upload

A few ideas off the top of my head (I expect better from you guys :))
- /intranet
- /backup
- /backup(s).asp/php/py
- /database, /dbase, /dbs, /db, /_db, /save
- /backup.tgz, /backup.tar.gz, /backup.zip, /backup.rar
- /www.tgz, /www.tar.gz, /www.zip, /www.rar
- /db.tgz, /db.tar.gz, /db.zip, /db.rar
- /sql.tgz, /sql.tar.gz, /sql.zip, /sql.rar
- /user.sql, /users.sql, /customer.sql, /db.sql, /data.sql, /dump.sql
- /dump, /dump.tgz, /dump.tar.gz, /dump.zip, /dump.rar
- [hostname].tgz, [hostname].tar.gz, [hostname].zip, [hostname].rar
- /sql, /sqlbackup
- /inc, /include, /includes
- /a, /b, /c etc...
- /1, /2, /3, /4 etc...
- /2000, /2001, /2002, /2003, etc...
- /log.txt, /log, /logs, /weblog, /weblogs
- /zip, /zipfiles
- /htaccess.txt, /htpasswd.txt
- /manage
- /tmp
- /uploads
- /beta
- /test
- /excel, /xls
- /xml
- /www-sql
- /prv, /priv, /privat, /private
- /config, /configs
- /accounts
- /config.inc
- /index.phps
- /moderator, /moderators
- /useradmin, /dbadmin
- /dynamic
- /api
- /employees
- /fileadmin
- /hidden, /secret
- /shadow, /master.passwd, /pwd.db
- /.bash_history, /.history, /.mc, /.ssh
- /work
- /billing
- /auth.txt, /login.txt
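Many of the archive names above follow an obvious stem x extension pattern, so 
a candidate list can be expanded mechanically instead of typed by hand. A small 
sketch using only names from the list above (the hostname parameter is an 
illustration of the [hostname].tgz idea):

```python
def candidate_paths(hostname=None):
    """Expand stem x extension patterns into a list of paths to probe."""
    stems = ["backup", "www", "db", "sql", "dump"]
    exts = [".tgz", ".tar.gz", ".zip", ".rar"]
    if hostname:
        stems.append(hostname)  # e.g. [hostname].tgz and friends
    paths = ["/" + s + e for s in stems for e in exts]
    # A few of the plain directory names from the list:
    paths += ["/" + s for s in ("backup", "database", "dbase", "dbs",
                                "db", "_db", "save")]
    return paths
```

Generating the list this way also makes it trivial to prune or extend whole 
pattern families at once.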

After a few good replies and ideas, I would like anyone with access to a larger 
network with many webservers to run a scan (legally, of course) and report 
statistics on hits and false positives. I will do the same (unless this ends in 
a big FAIL / trollfest / flamewar - which is no doubt a possibility). I am also 
curious which programs (out of the many) you use to scan webservers, and why.
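For the scanning side, here is the bare idea in a few lines - not a replacement 
for a real scanner, just a sketch: HEAD each candidate path and keep the ones 
that answer 200. The base URL and timeout are placeholders:

```python
import urllib.error
import urllib.request

def probe(base_url, paths, timeout=5):
    """Return the subset of *paths* that answer HTTP 200 on *base_url*.

    HEAD requests are used so a hit on a large archive doesn't
    actually download it.
    """
    hits = []
    for path in paths:
        url = base_url.rstrip("/") + "/" + path.lstrip("/")
        req = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                if resp.status == 200:
                    hits.append(path)
        except (urllib.error.URLError, OSError):
            pass  # 404s, timeouts, refused connections: not interesting
    return hits
```

Note that some servers answer 200 (or a redirect to a login page) for 
everything, which is exactly the false-positive class a shared statistics 
effort would help quantify.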

My apologies if such a thread has already been posted here or if I'm missing 
something obvious (in any case, links and resources are of course welcome).

Kind regards,
http://tor.hu



_______________________________________________
Full-Disclosure - We believe in it.
Charter: http://lists.grok.org.uk/full-disclosure-charter.html
Hosted and sponsored by Secunia - http://secunia.com/