[FD] [Tool/API] desenmascara.me - Fingerprinting and assessing the web security awareness of websites
- To: Fulldisclosure <fulldisclosure@xxxxxxxxxxxx>
- Subject: [FD] [Tool/API] desenmascara.me - Fingerprinting and assessing the web security awareness of websites
- From: Emilio Casbas <ecasbasj@xxxxxxxx>
- Date: Wed, 15 Apr 2015 07:24:21 +0000 (UTC)
desenmascara.me (Spanish for "unmask me") is an online proof-of-concept (PoC)
tool whose goal is to raise web security awareness among website owners and
thereby help curb the constant rise of compromised websites.
The desenmascara.me PoC calculates a score, the 'security awareness value',
for any website (no crawling and no resource-intensive scanning) based on all
the metadata available.
The calculation is simple: the more weak metadata a site exposes, the worse
its score, and vice versa. The ranges are interpreted as follows:
-Score < 0: the website is considered prone to compromise (it runs known
buggy software such as old Joomla, or highly critical Drupal or TYPO3
versions...)
-Score between 0 and 20: the website is not considered security aware (its
owners have room for improvement and should take care)
-Score between 20 and 59: the website is considered somewhat security aware
(its owners have done some tweaking to the site). This is usually the normal
status.
-Score of 60 or higher: the website is considered security aware (its owners
have done some hardening, either on the platform or in the web architecture)
Additionally, since desenmascara.me collects all the metadata it can from any
website, it serves two audiences at the same time:
-Auditors/pentesters: a one-click method to fingerprint a web server.
-Website owners without a security background: a brief summary explaining the
site's web security awareness status based on all the information collected.
The metadata extraction is totally passive, just like browsing the website;
otherwise the tool could not be online for public use.
Some additional features of the tool, leveraging all the information
collected:
-Easy to use, only enter a website address to see what's behind the scenes
-Available in English and Spanish (based on the browser language)
-Detection of domains potentially being used for phishing (e.g.
mail-google.com.ve, applesupport.com.mx)
-Detection of sites mirrored from other sites (usually found in phishing,
fake and scareware websites)
-Detection of CMSs and their versions (WhatWeb core)
-Brief summary about the website configuration
-Different report colours to highlight web security awareness
-Detection of the domain registrar for .com and .net TLDs (some are more
security savvy than others)
-Detection of and warnings about risky third-party hosting providers.
-Warnings about old software being exploited in the wild, such as Joomla 1.5
or RoR CVE-2013-0156...
-Warnings about .com and .net domains expiring in the coming days
-Detection of leaked properties files in Ruby on Rails.
-Warnings about OpenSSL versions affected by Heartbleed.
-Warnings about Drupal core - Highly Critical PSA-2014-003.
-Warnings about TYPO3 - Highly Critical authentication bypass.
-Detection of hardening signs such as WAF, CDN, reverse proxy...
-For CloudFlare-protected websites, an attempt to show the real server IP.
-Detection of websites blacklisted by Google Safe Browsing
-Detection of suspicious iframes or hidden spam
-Detection of misconfigured robots.txt files (e.g. exposing confidential
information)
-Detection of defacements, directory listings, private IP addresses in
comments...
-For well-known websites (Forbes, EA, .gov...), information about known
security incidents they were victims of.
-Stats about general web security awareness and some details of compromised
websites
The goal of this tool is NOT to declare a website secure or insecure, but to
assess how security aware the website owners are, at the levels explained
above.
In my observations over the last years I have spotted a common pattern among
the vast majority of compromised websites used in all kinds of malware
campaigns: they are all poorly maintained.
With this PoC the goal is to highlight the importance of keeping websites
updated. The lower the score, the more vulnerable a website is and therefore
the more prone to compromise. Bear in mind that this analysis holds for most
current attacks on the web: websites compromised to serve as redirectors, to
host phishing pages, or to act as proxies for exploit kits and other
malicious purposes of malware campaigns. All of this activity takes place
without the owners' knowledge, on web servers that can stay online for years
without updates. All the malicious activity happens behind the scenes; only
later does the website owner get complaints, either from their users or from
the hosting company. The problem is that these owners are not security aware
(nor are the hosting companies). But if you have deployed a website you
should take some precautions; you cannot rely solely on the software vendors,
the community or the hosting providers. It is like driving a car: you need
training and a license, and then you need to keep the car in good condition,
doing the yearly checks and so on, in order not to be a danger on the road.
If you take no precautions you will be considered irresponsible and will end
up with fines at best. The same applies on the Internet: if you do not take
precautions with your website, you are feeding the malware ecosystem.
Having run some tests [1] with the desenmascara.me scoring against tens of
websites from the same malware campaign, and having observed the results over
the last years of using this scoring, I estimate the accuracy of the
correlation between poorly maintained websites and subsequent compromise at
around 80%.
Therefore I have decided to publish an API [2] for this service, with an
interesting use intended as an initial full disclosure and as a wake-up call
for the web owners concerned: querying URLs prone to be compromised:
http://desenmascara.me/api/howto#pronetld
You can play with http requests such as:
http://desenmascara.me/api/lessscorebytld/gov
http://desenmascara.me/api/lessscorebytld/br
http://desenmascara.me/api/lessscorebytld/com
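A trivial client for the requests above could look like this. The helper
name is mine, and since the post does not document the response format, the
raw body would simply be printed as-is:

```python
# Hypothetical helper for the lessscorebytld endpoint shown above; only the
# URL pattern comes from the post, everything else is illustrative.
def lowest_scores_url(tld):
    """Build the query URL for the lowest-scoring sites under a given TLD."""
    return "http://desenmascara.me/api/lessscorebytld/" + tld

# One-off usage (network call, commented out):
# import urllib.request
# with urllib.request.urlopen(lowest_scores_url("gov")) as resp:
#     print(resp.read().decode("utf-8", errors="replace"))
```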
There is nothing illegal or unethical about providing this information. All
the info is public; the desenmascara.me service just collects and interprets
it.
http://desenmascara.me
REFERENCES
[1] http://pwnedwebsites.com/how-to-spot-website-easy-target.html
[2] http://desenmascara.me/api/howto
Thanks
Emilio
_______________________________________________
Sent through the Full Disclosure mailing list
https://nmap.org/mailman/listinfo/fulldisclosure
Web Archives & RSS: http://seclists.org/fulldisclosure/