Unix Audit Script One Liners

Lately I’ve had quite a few requests come in from students and clients to review the audit script that companies are using to audit their Unix / Linux systems. It seems like every company has one person who, at some point in time, wrote a script to audit Unix systems, or they downloaded one from someone online. But in either case people keep wondering – exactly what information are they gathering, and is it the right information to help in an audit?

I know there are multiple languages a company could use to write their Unix audit script, but for me BASH (Bourne Again Shell) scripting has always been the way to go. I don't have any problem with a company using Perl or Python for their scripts. My only concern is that quite a few times during an audit, an interpreter for either language has not been on the system I'm auditing. And on principle, I just don't feel right asking a sysadmin to install an interpreter – not that they would even if I asked!

All that being said, I figured I would write up some Unix audit one liners to inspire you to write your own scripts. If you are writing your own, here are a few to get you started:

Display the name of the system:

Unix Host Name: $(hostname -s)

Display the DNS name of the system:

DNS Domain Name: $(hostname -d)

Display the CPU installed in the system:

CPU Model Name: $(awk -F": " '/model name/{CPU=$2} END{print CPU}' /proc/cpuinfo)

Display the CPU speed of the installed CPU:

CPU Speed (MHz): $(awk '/cpu MHz/ {speed=$4} END{print speed}' /proc/cpuinfo)

Display the installed physical memory:

Total Physical Memory (MB): $(free -m | awk '/Mem:/ { print $2 }')

Display the memory in use on the system:

Used Physical Memory (MB): $(free -m | awk '/Mem:/ { print $3 }')

Display the free memory on the system:

Free Physical Memory (MB): $(free -m | awk '/Mem:/ { print $4 }')
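Putting the pieces above together, a minimal audit script might look like the following sketch. Note that /proc/cpuinfo and free are Linux-specific, so the sketch guards them to degrade gracefully on other Unix flavors:

```shell
#!/bin/bash
# Minimal system-information snapshot -- a sketch, not a full audit script.

echo "Unix Host Name: $(hostname -s)"
echo "DNS Domain Name: $(hostname -d 2>/dev/null)"

# /proc/cpuinfo only exists on Linux, so guard the CPU checks
if [ -r /proc/cpuinfo ]; then
    echo "CPU Model Name: $(awk -F": " '/model name/{CPU=$2} END{print CPU}' /proc/cpuinfo)"
    echo "CPU Speed (MHz): $(awk '/cpu MHz/{speed=$4} END{print speed}' /proc/cpuinfo)"
fi

# free is also Linux-specific (procps); other flavors need vmstat or similar
if command -v free >/dev/null 2>&1; then
    echo "Total Physical Memory (MB): $(free -m | awk '/Mem:/ { print $2 }')"
    echo "Used Physical Memory (MB): $(free -m | awk '/Mem:/ { print $3 }')"
fi
```

Redirect the output to a dated text file and you have the start of a baseline you can compare against on your next visit.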

We will post more one liners later this month. Hopefully this inspires you to take a look at the script you use or maybe even start writing your own.

Comparing Text Files in Windows

So last month we wrote a post about the built in capabilities of Microsoft Windows for comparing two text files. Personally, when I am comparing two files, I want to be able to do it from the command line, easily automate the comparison, and get output that is easy to parse and understand.

The three most common options built into Windows that I have seen for comparing text files are:

  • FC.EXE (Windows Binary)
  • COMP.EXE (Windows Binary)
  • COMPARE-OBJECT (PowerShell cmdlet)

As I have said before, COMPARE-OBJECT is about the best option of the three choices, but it sure would be nice if there were more advanced options available with the tool. Ideally we would all have access to a free utility that would meet all the requirements we have been discussing here. But alas, I am not sure we are going to find that right now. So on the commercial side, what is left?

There are quite a few third party tools you might consider. Some of the more popular ones I have seen people use are:

  • WinDiff
  • ExamDiff
  • WinMerge
  • Beyond Compare

If you are looking for a free utility (open source) that also works from the command line, then you should check out WinMerge. It definitely looks like an intriguing project and it meets most of my requirements for a good file differential tool. You can learn more about it at http://winmerge.org/.

But lately I have been most impressed by Beyond Compare's software. The only downside to the product is that it is commercial. So if you're planning on deploying it to a large number of systems you may need to shell out some coin. But it's reasonably priced, so we cannot complain too much.

Not only does the tool meet all of our requirements, it also comes with a few other nice bells and whistles, like the ability to compare registry settings, the ability to use the unified diff output format, more granular control of the comparisons themselves, and so on. We do not even get a commission from these folks, but their software seems to be the best suited if you're looking for a fully capable tool.

My guess is that most of us can get away with using WinMerge on a large scale for day to day and basic differentials. But if you really need more advanced capabilities, then forking over the money for a Beyond Compare license is likely worth it.

Comparing Two Files with PowerShell

One of the concepts that we have written about over and over again on this blog is the principle of baselining: comparing the present state of a system with a known good snapshot of the same attribute of that system. If, for instance, we have a server with 10 running services on it today, and tomorrow we examine the system and discover that there are 11 running services, then something just isn't right. In the immortal words of Sesame Street, "One of these things is not like the other. One of these things is not the same…"

So let’s say you have a baseline created for some attribute of an operating system. How would you go about doing a comparison of a snapshot you took earlier and one that you just took? To start let’s assume that these snapshots are text based. If they are binary snapshots then we have a whole different set of issues to worry about. But assuming the two snapshots are the output of the same command (just taken at a different time) and the output is text based, then we’re in business.

If you’re working with a Unix / Linux system, then you would definitely reach for the stalwart DIFF utility. It’s been around forever and it is a system administrator’s favorite.
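For example, given yesterday's and today's snapshots of running services (the file names here are purely illustrative), diff pinpoints the new entry immediately:

```shell
# Build two illustrative snapshot files (in practice these would be the
# saved output of the same command, taken at two different times)
printf 'cron\nsshd\n' > services_baseline.txt
printf 'cron\nsshd\ntelnetd\n' > services_today.txt

# diff exits non-zero when the files differ, so tolerate that in scripts
diff services_baseline.txt services_today.txt || true
# The "> telnetd" line flags the service added since the baseline
```

The `>` and `<` markers tell you which file each differing line came from, which is exactly the kind of output that is easy to parse in an automated comparison.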

With Microsoft PowerShell though we have to make a decision. The DIFF binary is not available in PowerShell (although there is a DIFF alias). However there are two commands that are built into Windows and that have been available since the days of CMD.EXE. These two built in commands are FC.EXE and COMP.EXE. Unfortunately both of them tend to be unreliable and give strange results when using them for baselining. So what else can we do?

Thankfully PowerShell has introduced the cmdlet COMPARE-OBJECT (and yes, as you guessed, DIFF is an alias to this cmdlet). With PowerShell, you can take two objects, give them to COMPARE-OBJECT, and it will give you a comparison between the two objects. These objects can be anything, but for our purposes we will be focusing on text files. But there’s nothing to say you could not compare user accounts from Active Directory, Registry Keys, or any other objects.

The syntax for the command is:

Compare-object object1 object2

However there is one gotcha. If you compare two text files this way, then the cmdlet will simply say, "Yes, the names of the text files file1.txt and file2.txt are different." So if you want to compare the content of two text files there is an extra step to take. You have to introduce the GET-CONTENT cmdlet in order to compare the content of the two files. Therefore the new (and usable) syntax would work like this:

Compare-object (get-content file1.txt) (get-content file2.txt)

The results of this command will show you a side indicator of what is different between the two files and in which of the two files the added text exists. It's kind of a funny arrow based system, but it's easy enough to understand.

Now you should be all set. So if you have been getting in the habit of baselining your systems, then these commands might be useful when you are trying to automate a comparison between two snapshots. Next you probably want to automate things further with a command line email utility (I like the built in PowerShell mailing capabilities for this).

Auditing Windows Permissions with Get-ACL

One of the new Microsoft PowerShell cmdlets that auditors should appreciate is the GET-ACL cmdlet. Through native PowerShell commands, an auditor can retrieve a list of all the permissions associated with a given Windows object. The output from this command can be used to create a permissions baseline if someone is trying to alert on permissions changes, or simply to generate a list of all the permissions associated with a given object.

The simple syntax to run the command against a file system object would be the following:

Get-Acl c:\tools\ | Format-List

However you can also run this command against a number of different Microsoft Windows objects, including registry keys, Active Directory objects, printers, or anything else with an access control list associated with it. For example, to perform a similar command against a registry hive, the following command would work:

Get-Acl HKCU:\Software\Microsoft\Windows | Format-list

In addition, when using the –AUDIT parameter, an auditor can dump a list of the System Access Control Lists (SACLs) that are associated with an object in order to determine the logging settings configured on an object. The following shows an example of how to perform the command:

get-Acl -audit c:\tools\ | Format-list

Finally, Daniel Carrarini has posted an interesting script for dumping access control lists that shows off more of the command's capabilities. Here is a link to his blog post:

http://carrarini.blogspot.com/2011/08/powershell-script-for-dumping-access.html

Parsing Windows Firewall Rules

In our last post we discussed how to gather general information about the configuration of the Microsoft Windows Firewall, the host based firewall built into Windows. But what most people are really interested in when doing a firewall audit is how the firewall rules themselves are configured.

One of the challenges of auditing a Microsoft Windows Firewall ruleset is parsing all the firewall rules that Microsoft automatically creates for you. It is great that Microsoft automatically configures most of the rules – it helps encourage us to actually leave the firewall turned on. But with all those rules, especially the disabled ones, how does an auditor easily parse through the data? To make matters worse, the output of commands like NETSH is just text – not a PowerShell object – which makes it even more difficult and time consuming to parse.

So that got us thinking, what if we could convert the output of a NETSH SHOW command for Microsoft Windows Firewall rules into a PowerShell object that we could more easily parse?

So we did a little digging and found out that Jaap Brasser had already created a basic script to do that at PowerShell.com (http://powershell.com/cs/forums/t/13260.aspx). The following code allows us to take the output from a NETSH command, and convert it to a PowerShell object, so we can more easily parse the ruleset:

$Output = @(netsh advfirewall firewall show rule name=all)

$Rules = $Output | Where-Object { $_ -match '^([^:]+):\s*(\S.*)$' } | ForEach-Object -Begin {
    $FirstRun = $true
    $HashProps = @{}
} -Process {
    # Each "Rule Name:" line (after the first) marks the start of a new
    # rule, so emit the previous rule's properties as an object
    if (($Matches[1] -eq 'Rule Name') -and (-not $FirstRun)) {
        New-Object -TypeName PSCustomObject -Property $HashProps
        $HashProps = @{}
    }
    $HashProps.$($Matches[1]) = $Matches[2]
    $FirstRun = $false
} -End {
    # Emit the final rule
    New-Object -TypeName PSCustomObject -Property $HashProps
}

Now that the firewall rules are a PowerShell object we can use cmdlets like WHERE-OBJECT and SELECT-OBJECT to filter the information. We can perform ad hoc queries and work with the information however we see fit. Enjoy!

Script for Windows Firewall Baseline

Another baseline an auditor or system administrator might want to consider when assessing their systems is a baseline for the general configuration of the Microsoft Windows Firewall. Many organizations are starting to utilize the built in Microsoft Windows Firewall more and more when protecting even their internal systems. The use of a host based firewall should definitely be on the list of things to consider when evaluating the security of a system.

If the system that you are auditing is using a third party firewall product, then you will need to contact that vendor to see how best to automate collecting data about the firewall’s configuration. However if you are using the built in Windows Firewall, then the following script will help by gathering a baseline of the general configuration settings of the firewall:

$Global = (netsh advfirewall show global)
$Profiles = (netsh advfirewall show allprofiles)

$Global | Out-file firewall_config.txt
$Profiles | Out-file -append firewall_config.txt

It should be noted that this script will only work on systems running PowerShell – but you have probably already noticed that theme in our posts. But also please remember that this will only work on systems running the Windows Advanced Firewall that was released with Windows Vista and later systems. This will not work on the earlier, original firewall built into Windows XP.

Parsing Active Directory Groups

In a previous post we shared a PowerShell script that would allow an auditor to parse a list of groups and group members on a Microsoft Windows system as a part of a security assessment or baselining process. The question has come up though – what if someone wants to follow the same process but parse a list of Active Directory groups and their members instead?

It turns out that it is much easier to perform this task against Active Directory than against a local Microsoft Windows Security Accounts Manager (SAM) database. The native capabilities of the Active Directory PowerShell cmdlets make it much easier for us to accomplish this task.

The output requirement for this script will be the same as for the previous. The output should be in a CSV format that could be easily parsed by a spreadsheet program like Microsoft Excel. Also, like the previous script, empty groups should still be listed, but just without any group members listed along with them.

The code we used to parse a list of AD groups and their members is:

# List every AD group, then resolve its membership (including nested groups)
Get-ADGroup -Filter * | ForEach-Object {
    $GroupName = $_.Name
    $Members = Get-ADGroupMember $_.SamAccountName -Recursive | ForEach-Object { $_.Name }
    If ($Members.Count -ge 1) {
        $Out = $GroupName + "," + [String]::Join(",", $Members)
    }
    Else {
        # Empty groups are still reported, just with no members listed
        $Out = $GroupName
    }
    $Out | Out-File -Append ad_group_members.txt
}

Ideally an auditor would combine this script with our previous example and, using PowerShell remoting, run one script that parses local and Active Directory groups at the same time and outputs the data as an easily readable file. But between the two examples, hopefully you have enough inspiration to make this usable in your own audit efforts.

Parsing Local Windows Groups

One step that has become a staple part of any audit of Microsoft Windows systems is a listing of all the local groups on the system. Listing all the groups on a system, along with the members of each group, can help establish a baseline for the security configuration of a system. Certainly groups like the local Administrators group are a concern during a security audit. Ideally, though, auditors would have the ability to look at the members of each group on a system and validate that the list of members is appropriate for that system.

As with many of the scripts we have recommended, we decided to write a script using PowerShell to pull the information from a local system. We also wanted to make sure that when we pulled the information from a system that we could parse it easily once we were done. Therefore we wrote the script to allow us to pull all the local computer groups from a system, with the members of the group, and then save it as a CSV file for easier parsing with Excel.

Here is the code we used for the script:

# Gather the names of all local groups via WMI
$GroupList = Get-WmiObject Win32_Group | ForEach-Object { $_.Name }

# "." refers to the local computer
$ComputerName = "."

ForEach ($GL in $GroupList) {
    # Bind to the computer and locate the group through the ADSI WinNT provider
    $Computer = [ADSI]("WinNT://" + $ComputerName + ",computer")
    $Group = $Computer.psbase.children.find($GL)

    # Enumerate the members of the group
    $Members = $Group.psbase.invoke("Members") |
        ForEach-Object { $_.GetType().InvokeMember("Name", 'GetProperty', $null, $_, $null) }

    If ($Members.Count -ge 1) {
        $Out = $GL + "," + [String]::Join(",", $Members)
    }
    Else {
        # Empty groups are still reported, just with no members listed
        $Out = $GL
    }
    $Out | Out-File -Append local_group_members.csv
}

We ran into a couple of issues when writing the script. One was formatting the output as a CSV file. Normally we would use the EXPORT-CSV cmdlet for that, but in this case, since we had multiple data sources, we had to do some custom formatting. Also, since some of the groups were empty, we had to add the IF clause to make sure that even empty groups were reported in our output.

Once the concept of the script starts to make sense to you, it should be an easy next step to consider adding PowerShell remoting capabilities or a computer name list to the script to parse multiple systems at once. You might also consider saving the output from the script in a format (such as HTML) that is easier to read and include in your audit reports.

We hope the script is useful as you audit Microsoft Windows servers or establish security baselines for those systems.

Continuous Monitoring with PowerShell

Welcome back from the holidays! I imagine many of you are just settling back in and are ready to get started on those new year's resolutions. If one of them was to implement continuous monitoring or learn more about scripting, do I have a treat for you! Now that I'm back from some holiday travel myself, I think it's time to continue our series on automating continuous monitoring and the 20 Critical Controls.

I don't want these blog posts to be an introduction to PowerShell. There are plenty of fine references on that available to you already. In talking with Jason Fossen (our resident Windows guru), I have to agree with him that one of the best starter books on the topic is Windows PowerShell in Action, by Bruce Payette. So if you're looking to get started learning PowerShell, start there, or maybe try some of the Microsoft resources available at the Microsoft Scripting Center.

But let's say you've already made a bit of an investment in coding and you already know what tasks you'd like to perform. For example, maybe you wonder who is a member of your Domain Admins group, so you use Quest's ActiveRoles AD Management snap-in to run the following command:

(Get-QADGroup 'CN=Domain Admins,CN=Users,DC=auditscripts,DC=com').members

Or, on the other hand, maybe you want to generate a list of user accounts in Active Directory whose passwords are set to never expire; you'd likely have code such as:

Get-QADUser -passwordneverexpires

Or maybe you want to run an external binary, like nmap, to scan your machines; you might have a command such as:

nmap -sS -sV -O -p1-65535 10.1.1.0/24

In any case, the first step is to come up with the code you want to automate. That’s step one.

Next, you don't just want to run the code, you want the results to be emailed to you on a regular basis, say once a day or once a week. So the next step is to use a mailer to email you the results of your script. You have a few choices here. One option is to use a third party tool like blat to generate your email. But since we're using PowerShell, let's stick with that: version 2.0 of PowerShell has built in mailing capabilities in this regard.

The easiest way to get started is to save the output of the commands you want run to a temporary text file, mail the text file as the body of an email message, and then delete the temporary file. An easy way to do this to get started would be to use the following commands:

$filename = "sometextfilewithoutputresultsinit.txt"

$smtp = new-object Net.Mail.SmtpClient("mymailserver.auditscripts.com")
$subject="SANS Automated Report - $((Get-Date).ToShortDateString())"
$from="automation@sans.org"

$msg = New-Object system.net.mail.mailmessage
$msg.From = $from
$msg.To.add("automation@auditscripts.com")
$msg.Subject = $subject
$msg.Body = [string]::join("`r`n", (Get-Content $filename))

$smtp.Send($msg)
remove-item $filename

Save your commands as an appropriate PS1 file, schedule the script to run periodically using Task Scheduler, and you're off to the races!

We certainly have more to discuss, but hopefully this inspires some thinking on the matter. I'll post again soon with some other steps to consider before we move on to Bash. There's a lot we can talk about here. Until next time…

Scripting Automation for Continuous Auditing

One of the topics that we have been discussing with organizations a great deal lately is the idea of automation with regard to continuous auditing. Said a different way: the standard audit model involves auditors making a list of audit scopes that they want to cover in the course of a year. Then, one at a time, the auditor interviews people, examines the output from technical tools, and in general manually collects audit evidence.

What if the act of evidence collection could be automated? Then it would seem that auditors could spend less time collecting data and spend more time analyzing evidence for risk. Maybe even an auditor could consider additional scopes in the course of a year if less time was being spent manually collecting such evidence.

In the next few blog posts I’d like to consider tools that could be used to assist us in automating the collection of audit evidence. Ideally we would use tools suited to this purpose, but realistically I know many of us will be left on our own to develop tools that can accomplish this purpose. Therefore we will focus on scripts and built in operating system tools to achieve this end.

To begin this discussion, let’s identify some tools that we can use for these purposes. Next time we can start to delve into a couple of these at a time. But in the meanwhile, check these out as a way to automate data collection. Consider each of these a piece of a bigger recipe for continuous auditing.

Windows Automation Tools:

  • PowerShell (scripting language)
  • Task Scheduler (scheduling tasks)
  • Advanced Task Scheduler (scheduling tasks)
  • Blat (command line mailer)
  • Bmail (command line mailer)

Unix Automation Tools:

  • Bash Scripting (scripting language)
  • Cron (scheduling tasks)
  • Anacron (scheduling tasks)
  • Mail (command line mailer)
  • Mailsend (command line mailer)

As you can see this is just a start. We will need to use other tools to actually collect our data sets, but these tools will form the building blocks of each of our scripts. If you have other tools you would like to suggest, don’t be shy. We’d all love to hear what tools have helped you to be a more successful auditor.
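To show how these building blocks fit together on the Unix side, here is a sketch of a collection job that cron could run daily. The snapshot path, the recipient address, and the script location in the crontab comment are all placeholders; the mail step only fires if a mailer is installed and a recipient is configured:

```shell
#!/bin/bash
# Sketch of a continuous-audit collection job. The snapshot path and the
# MAIL_TO address are placeholders -- adjust them for your environment.

SNAPSHOT=/tmp/audit_snapshot.txt
MAIL_TO="${MAIL_TO:-}"   # e.g. auditor@example.com; empty disables mailing

# Collect a simple evidence snapshot
{
    echo "Audit snapshot for $(hostname) on $(date)"
    echo "Currently logged-in users:"
    who
} > "$SNAPSHOT"

# Mail the snapshot only if a recipient is set and a mailer is installed
if [ -n "$MAIL_TO" ] && command -v mail >/dev/null 2>&1; then
    mail -s "Automated audit snapshot" "$MAIL_TO" < "$SNAPSHOT"
fi

# A crontab entry to run this every morning at 6:00 might look like:
# 0 6 * * * /usr/local/bin/audit_snapshot.sh
```

Swap the `who` command for whatever evidence-collection commands your audit scope calls for; the collect-then-mail pattern stays the same.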