Automating Audit Baselines – A Case Study

We’ve talked many times on this blog about automating audit and assessment tasks, especially as they relate to system baselines. We’ve tried to give our readers tools for creating baselines, and we always hope that people will turn those baselines into automated processes that alert them to deviations or changes on their systems.

Certainly one way to do this is via commercial products. Many companies have purchased file integrity assessment toolkits, such as those by Tripwire, in order to help automate this process. Others have turned to lower cost mechanisms such as scripting tools to do the same thing, although maybe not as elegantly.

While doing a little reading at the Microsoft Script Center, I found a nice case study of someone who decided to use scripting to accomplish exactly these automation goals. Ben Wilkinson from Microsoft posted a series of blog articles where he describes the process of automating and alerting on changes to group memberships using Microsoft scripting technologies. Here’s a link to the full set of articles:

http://gallery.technet.microsoft.com/scriptcenter/655180ff-6236-4718-8dee-0f5b9b4a1f06
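
To give a flavor of the general idea (this is our own minimal sketch, not Ben’s actual script, and the baseline path and the choice of the Administrators group are assumptions on our part), here is one way you might compare a local group’s current membership against a saved baseline with WMI and PowerShell:

# Our own minimal sketch (not Ben's script): warn on changes to local
# Administrators group membership. The baseline path is hypothetical.
$baseline = "C:\Audit\admins-baseline.txt"
$current = get-wmiobject Win32_GroupUser |
    where { $_.GroupComponent -match 'Name="Administrators"' } |
    foreach { ($_.PartComponent -split 'Name="')[1].TrimEnd('"') }
# First run: save the baseline. Later runs: compare and warn on deviations.
if (-not (Test-Path $baseline)) { $current | Set-Content $baseline }
$diff = Compare-Object (Get-Content $baseline) $current
if ($diff) { $diff | Out-String | Write-Warning }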

I think it’s worth highlighting and applauding efforts like this. If you aren’t already using an automated process like this to assist you with your continuous audit efforts, hopefully you can receive some inspiration from what Ben has posted here. Thanks Ben for the posts.

More PowerShell Audit One Liners

In our last couple posts we described how to gather a general baseline of system demographics on a Microsoft Windows system you’ve been tasked with auditing. Hopefully the posts gave everyone some ideas for the capabilities that PowerShell offers, even if the information we gathered isn’t all that exciting. In this post I thought we would show you additional examples you could try if you want to explore other pieces of information you might be able to gather with PowerShell and WMI during an audit.

Once you have the general syntax of these commands down, even if you don’t fully understand the scripting behind them, you should be able to copy and paste them into an audit script. If you want a full library of the various WMI classes that Microsoft makes available and the attributes they return, check out this link over at Microsoft:

http://msdn.microsoft.com/en-us/library/aa394084(v=vs.85).aspx

So here are a few other examples of WMI queries that might be useful during an audit:

List the available IPv4 Address(es) from a system:

get-wmiobject Win32_NetworkAdapterConfiguration | fl Name,IPAddress

List the available IPv6 Address(es) from a system (the WMI IPAddress property includes both IPv4 and IPv6 addresses, so here we keep just the colon-delimited IPv6 entries):

get-wmiobject Win32_NetworkAdapterConfiguration | foreach { $_.IPAddress } | where { $_ -match ':' }

List the available MAC Address(es) from a system:

get-wmiobject Win32_NetworkAdapterConfiguration | fl Name,MACAddress

List the User Accounts on a system:

get-wmiobject Win32_UserAccount | ft Name,SID

List the Groups on a system:

get-wmiobject Win32_Group | ft Name,SID
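
If you want to see how these paste into a script, here is a minimal sketch (the output path is an assumption on our part) that wraps the user and group queries above and writes the results to a dated baseline file:

# Minimal sketch: collect one liner output into a dated baseline file.
# The C:\Audit path is hypothetical; use whatever location suits you.
$out = "C:\Audit\baseline-$(get-date -Format yyyyMMdd).txt"
"== User Accounts ==" | Out-File $out
get-wmiobject Win32_UserAccount | ft Name,SID | Out-File $out -Append
"== Groups ==" | Out-File $out -Append
get-wmiobject Win32_Group | ft Name,SID | Out-File $out -Append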

I hope these help to inspire you to try out scripting in your audits and maybe even consider writing a few audit scripts of your own.

Using SystemInfo.exe to Baseline a System

After our last post on gathering system demographics using PowerShell (specifically the Get-WMIObject cmdlet), we had a few auditors mention to us that there are other ways to do it as well. We couldn’t agree more, and we’re glad they brought it up. Microsoft seems to like to give us choices for how we perform job tasks, and this is no exception.

One other very popular way to gather information from a Microsoft Windows system is through the built-in systeminfo.exe utility. It has been available at the command line since Microsoft Windows XP, so in the course of an audit you’re very likely to find it native on any Windows system you happen to be auditing.

One of the other nice things about this command is that it is very, very simple to run. Simply type the name of the binary into a cmd.exe or powershell.exe terminal window and the tool will query information about the underlying system you’re examining.

There aren’t many options or command line switches that you can use to customize the output, but there are a few. Microsoft documents all of the options you do have at http://technet.microsoft.com/en-us/library/bb491007.aspx. From that same article, here are the options that they make available to you:

/s Computer : Specifies the name or IP address of a remote computer (do not use backslashes). The default is the local computer.
/u Domain\User : Runs the command with the account permissions of the user specified by User or Domain\User. The default is the permissions of the current logged on user on the computer issuing the command.
/p Password : Specifies the password of the user account that is specified in the /u parameter.
/fo { TABLE | LIST | CSV } : Specifies the format to use for the output. Valid values are TABLE, LIST, and CSV. The default format for output is LIST.
/nh : Suppresses column headers in the output. Valid when the /fo parameter is set to TABLE or CSV.
/? : Displays help at the command prompt.

So a few of the nice features you can already see from the utility are the ability to run the command against remote computers, the ability to specify the output format of the data (including CSV format), and even the ability to suppress the headers in a CSV file to make it easier to parse later.
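
For example, here are a couple of sketches of how those switches might be used (the server name and output path are hypothetical):

systeminfo /fo CSV /nh >> C:\Audit\systeminfo-baseline.csv
systeminfo /s SERVER01 /fo LIST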

So if you haven’t tried this utility as a part of your baselining efforts yet, we definitely would recommend that you check it out. It’s another one of those nice auditing goodies Microsoft has built into the operating system for us.

PowerShell Audit One Liners

Over our last few posts we’ve talked a lot about using Unix BASH scripting to audit Unix systems. But we certainly don’t want our Windows friends to feel left out. The more I talk with people and listen to their security challenges, the more interest I hear in using PowerShell for audit and security purposes. Who knows, maybe it’s your New Year’s resolution to learn PowerShell this year and integrate it into your audit activities. Well, if it is, maybe we can help inspire you and get you started on the right foot.

Just like last month, we thought we would post scripting one liners that you can use to query information about a system you’re auditing. These one liners also work very nicely in incident response scenarios, should you find yourself in that situation.

For consistency’s sake, I’ll start by following the same script we used on Unix the last few months. As a first step, what commands might someone issue in PowerShell to gather general demographic information about a Windows system they’re auditing? Here are a few to get started:

Display the name of the system:

(get-wmiobject win32_computersystem).Name

Display the domain name of the system:

(get-wmiobject win32_computersystem).Domain

Display the CPU installed in the system:

(get-wmiobject win32_processor).Name

Display the CPU speed of the installed CPU:

(get-wmiobject win32_processor).MaxClockSpeed

Display the installed physical memory in gigabytes:

(get-wmiobject Win32_ComputerSystem).TotalPhysicalMemory / 1GB

Display the available memory on the system (note that WMI reports FreePhysicalMemory in kilobytes, so we divide by 1MB to get gigabytes):

(get-wmiobject Win32_OperatingSystem).FreePhysicalMemory / 1MB

In all these cases so far we’re using the Get-WMIObject cmdlet in PowerShell to gather general demographic information. The nice thing about running each of these commands in PowerShell is that you can easily place them all into one script, and you aren’t dependent on any OS-specific or version-specific binaries being present on the system. As long as PowerShell is available (and by now nearly every Windows box should have it), you’re able to use these commands.
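
To illustrate, here is a minimal sketch (the labels and formatting are just our own preference) of those same one liners gathered into a single script:

# Minimal sketch: the demographic one liners above in a single script
$cs = get-wmiobject Win32_ComputerSystem
$os = get-wmiobject Win32_OperatingSystem
"System Name: $($cs.Name)"
"Domain Name: $($cs.Domain)"
"CPU: $((get-wmiobject Win32_Processor).Name)"
"Total Physical Memory (GB): $($cs.TotalPhysicalMemory / 1GB)"
"Free Physical Memory (GB): $($os.FreePhysicalMemory / 1MB)"
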
We’ll post more ideas to add to your scripts later, but hopefully scripting is on your list of things to learn this year and we can give you a little shove in the right direction.

Audit Script to Detect Unix Operating System

In the last few blog entries we have been focusing quite a bit on displaying information from a Unix system via a BASH script. One question that’s come up from quite a few people is: do these commands work on all Unix systems? That’s a very valid question.

It turns out that one of my favorite sayings about Unix is that “Unix systems are always the same, they’re just different.” In other words, most Unix flavors share all sorts of similarities, but there are often very subtle differences between them that make it difficult to write one script that works everywhere. So when writing an audit script you have to decide: do you write one script for all flavors and sacrifice some of the commands you run to keep it consistent, do you write multiple scripts (one per flavor of Unix), or do you try to do OS detection within your script?

If you decide to do OS detection, I know there are multiple ways to do it. But here is some basic code that we have used in the past in a BASH script to detect which operating system or flavor of Unix is running on a system:

if [ -f /etc/debian_version ]; then
    OS="Debian"
    VER=$(cat /etc/debian_version)
elif [ -f /etc/redhat-release ]; then
    OS="Red Hat"
    VER=$(cat /etc/redhat-release)
elif [ -f /etc/SuSE-release ]; then
    OS="SuSE"
    VER=$(cat /etc/SuSE-release)
else
    OS=$(uname -s)
    VER=$(uname -r)
fi

echo "Operating System Name: $OS"
echo "Operating System Version: $VER"

Like I said, this likely won’t work on all flavors, so you’ll need to test it out on your favorites to make sure it works. In fact, if you have suggestions to improve it, please submit them to [email protected] and we’d be happy to update ours too. Happy scripting!

Unix Audit Script for Disk Utilization

We’ve noticed an issue on some of our Unix servers lately. This may not be completely security related (unless of course we’re talking about the availability of a system). We have noticed quite a few times recently that disk usage on our Unix servers has started to grow out of control, to the point where the availability of the system was at risk. Sometimes this happens because log files grow too large, databases grow larger than expected, or backups don’t rotate like we planned. But in any case the result is the same: disk drives fill up to the point where there isn’t much space left.

So a while ago we implemented a few BASH scripts to check for free disk space and then report that disk utilization to the help desk on a regular, automated basis. Using the following simple Unix shell commands, we started reporting on the disk utilization for each of our systems.

Information on the installed physical disks on a system:

fdisk -l | grep '^Disk /dev'

Information on the installed physical partitions on a system:

fdisk -l | awk '/^\/dev\// { print $1 " " $5 " " $7 }'

Information on the available free disk space on a system:

df -h | awk '{if ($1 != "Filesystem" && $1 != "none") print $1 " " $5}'

Now, if you want to automate these scripts, I would recommend using the Cron or Anacron services to run these commands on a daily or weekly basis. For the reporting side, I really like the tool “SendEmail” for Unix and Windows systems to generate the email. But certainly that’s just a matter of preference.
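
As a hedged sketch of what the alerting piece might look like (the 90% threshold, script path, and schedule are all assumptions on our part):

# Hypothetical crontab entry: run the disk report daily at 6am
# 0 6 * * * /usr/local/bin/disk_report.sh
# Flag any filesystem over 90% utilization
df -h | awk 'NR > 1 && $1 != "none" { gsub("%","",$5); if ($5+0 > 90) print "WARNING: " $1 " is " $5 "% full" }'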

You might also consider adding these commands to your Unix baseline script. That way during a baseline activity or during an audit you can gather info on disk utilization as well.

More Unix Audit Script One Liners

In our last post we gave some examples of Unix audit script one liners for baselining information from a Unix system. It turns out there are more people than we thought who are interested in this topic and are looking to include commands like these in their scripts. We definitely appreciate everyone’s enthusiasm and decided to post more commands in an effort to help you with your scripts.

So here are a few more Unix audit script one liners for you to try. Enjoy!

List the available IPv4 Address(es) from a system:

ifconfig -a | awk '/inet addr:/ { print $2 }' | cut -d: -f2

List the available IPv6 Address(es) from a system:

ifconfig -a | awk '/inet6 addr:/ { print $3 }'

List the available MAC Address(es) from a system:

ifconfig -a | awk '/Ether/ { print $5 }'

Find all SUID files on a Unix system (note the leading dash in -perm -04000, which matches any file with the SUID bit set rather than only files with exactly that mode):

find / -ignore_readdir_race -perm -04000

Find all SGID files on a Unix system:

find / -ignore_readdir_race -perm -02000

Find all sticky bit files on a Unix system:

find / -ignore_readdir_race -perm -01000

Again, we would strongly recommend scheduling your scripts and creating automated alerts via email alert functions and your help desk software. It will help keep you honest and track your progress.

Another fun idea that one of our clients suggested was to create your script and then execute it at the conclusion of every vulnerability scan that you run. It turns out most vulnerability scanners give you the ability to execute a script when a scan completes, and your audit script could be what you execute. If you really want to get fancy, you could even centralize your output files via SSH to a central server so you, auditors, or sysadmins can review the results at will.
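
As a quick sketch of that last idea (the account, host name, and paths here are all hypothetical), centralizing the output could be as simple as an scp at the end of your audit script:

# Hypothetical: push this host's audit output to a central server over SSH
scp /tmp/audit-$(hostname -s).txt auditor@central-host:/srv/audit-results/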

Unix Audit Script One Liners

Lately I’ve had quite a few requests come in from students and clients to review the audit scripts their companies are using to audit their Unix / Linux systems. It seems like every company has one person who, at some point in time, wrote a script to audit Unix systems, or they downloaded one from someone online. But in either case people keep wondering: exactly what information are they gathering, and is it the right information to help in an audit?

I know there are multiple languages a company could use to write their Unix audit script, but for me BASH (Bourne Again Shell) scripting has always been the way to go. I don’t have any problem with a company using Perl or Python for their scripts. My only concern is that quite a few times during an audit, an interpreter for either language hasn’t been present on the system I’m auditing. And on principle, I just don’t feel right asking a sysadmin to install an interpreter – not that they would even if I asked!

All that being said, I figured I would write up some Unix audit one liners to inspire you to write your own scripts. Each of these is an echo statement you can drop straight into a script. If you are writing your own, here are a few to get you started:

Display the name of the system:

echo "Unix Host Name: $(hostname -s)"

Display the DNS name of the system:

echo "DNS Domain Name: $(hostname -d)"

Display the CPU installed in the system:

echo "CPU Model Name: $(cat /proc/cpuinfo | awk -F": " '/model name/{CPU=$2} END{print CPU}')"

Display the CPU speed of the installed CPU:

echo "CPU Speed (MHz): $(cat /proc/cpuinfo | awk '/cpu MHz/ { print $4 }')"

Display the installed physical memory:

echo "Total Physical Memory (MB): $(free -m | awk '/Mem:/ { print $2 }')"

Display the memory in use on the system:

echo "Used Physical Memory (MB): $(free -m | awk '/Mem:/ { print $3 }')"

Display the available memory on the system:

echo "Available Physical Memory (MB): $(free -m | awk '/Mem:/ { print $4 }')"
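
To show how these hang together, here is a minimal sketch of a few of the one liners above collected into a single script:

#!/bin/bash
# Minimal sketch: the demographic one liners above in one audit script
echo "Audit Date: $(date)"
echo "Unix Host Name: $(hostname -s)"
echo "DNS Domain Name: $(hostname -d)"
echo "Total Physical Memory (MB): $(free -m | awk '/Mem:/ { print $2 }')"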

We will post more one liners later this month. Hopefully this inspires you to take a look at the script you use or maybe even start writing your own.

Comparing Text Files in Windows

So last month we wrote a post about the built-in capabilities of Microsoft Windows for comparing two text files. Personally, when I am comparing two files I want to be able to do it from the command line, to automate the comparison easily, and to get output that is easy to parse and understand.

The three most common options built into Windows that I have seen for comparing text files are:

  • FC.EXE (Windows Binary)
  • COMP.EXE (Windows Binary)
  • COMPARE-OBJECT (PowerShell cmdlet)

As I have said before, COMPARE-OBJECT is about the best of the three choices, but it sure would be nice if the tool offered more advanced options. Ideally we would all have access to a free utility that meets all the requirements we have been discussing here. But alas, I am not sure we are going to find that right now. So beyond the built-in tools, what is left?

There are quite a few third party tools you might consider. Some of the more popular ones I have seen people use are:

  • WinDiff
  • ExamDiff
  • WinMerge
  • Beyond Compare

If you are looking for a free (open source) utility that also works from the command line, then you should check out WinMerge. It definitely looks like an intriguing project, and it meets most of my requirements for a good file differential tool. You can learn more about it at http://winmerge.org/.
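
As a minimal example of invoking it from the command line (the paths here are hypothetical, and you’ll want to check the WinMerge documentation for the full list of switches):

WinMergeU.exe C:\Snapshots\baseline.txt C:\Snapshots\current.txt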

But lately I have been most impressed by Beyond Compare (located at http://www.scootersoftware.com). The only downside to the product is that it is commercial, so if you’re planning on deploying it to a large number of systems you may need to shell out some coin. But it’s reasonably priced, so we cannot complain too much.

Not only does the tool meet all of our requirements, it also comes with a few other nice bells and whistles, like the ability to compare registry settings, support for the unified diff output format, more granular control over the comparisons themselves, and so on. We do not even get a commission from these folks, but their software seems the best suited if you’re looking for a fully capable system.

My guess is that most of us can get away with using WinMerge at scale for day-to-day, basic differentials. But if you really need more advanced capabilities, then forking over the money for a Beyond Compare license is likely worth it.

Comparing Two Files with PowerShell

One of the concepts that we have written about over and over again on this blog is the principle of baselining: comparing the present state of a system with a known good snapshot of the same attribute of that system. If, for instance, we have a server with 10 running services on it today, and tomorrow we examine the system and discover that there are 11 running services, then something just isn’t right. In the immortal words of Sesame Street, “One of these things is not like the others. One of these things just doesn’t belong…”

So let’s say you have a baseline created for some attribute of an operating system. How would you go about doing a comparison of a snapshot you took earlier and one that you just took? To start let’s assume that these snapshots are text based. If they are binary snapshots then we have a whole different set of issues to worry about. But assuming the two snapshots are the output of the same command (just taken at a different time) and the output is text based, then we’re in business.

If you’re working with a Unix / Linux system, then you would definitely reach for the stalwart DIFF utility. It’s been around forever and it is a system administrator’s favorite.

With Microsoft Windows, though, we have a decision to make. The DIFF binary is not available (although PowerShell does provide a DIFF alias). There are, however, two commands built into Windows that have been available since the days of CMD.EXE: FC.EXE and COMP.EXE. Unfortunately both of them tend to be unreliable and give strange results when used for baselining. So what else can we do?

Thankfully PowerShell has introduced the COMPARE-OBJECT cmdlet (and yes, as you guessed, DIFF is an alias for this cmdlet). With PowerShell, you can take two objects, give them to COMPARE-OBJECT, and it will give you a comparison between the two. These objects can be anything, but for our purposes we will focus on text files. There’s nothing to say you couldn’t compare user accounts from Active Directory, registry keys, or any other objects, though.

The syntax for the command is:

Compare-object object1 object2

However, there is one gotcha. If you compare two text files this way, the cmdlet will simply compare the file names and say, “Yes, the names of the text files file1.txt and file2.txt are different.” So if you want to compare the content of two text files, there is an extra step to take. You have to introduce the GET-CONTENT cmdlet in order to compare the content of the two files. Therefore the new (and usable) syntax looks like this:

Compare-object (get-content file1.txt) (get-content file2.txt)

The results of this command will show you a side indicator of what is different between the two files and which of the two files the added text exists in. It’s kind of a funny arrow-based system, but it’s easy enough to understand.
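
For example, with two hypothetical service lists, the output looks something like this (“=>” marks a line found only in the second file, “<=” one found only in the first):

InputObject SideIndicator
----------- -------------
Spooler     =>
Telnet      <=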

Now you should be all set. If you have been getting in the habit of baselining your systems, then these commands might be useful when you are trying to automate a comparison between two snapshots. Next you’ll probably want to automate things further with a command line email utility (I like the built-in PowerShell capabilities, such as the Send-MailMessage cmdlet, for this).
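
As a hedged sketch of that last step (the SMTP server, addresses, and file names below are all assumptions), something like this could round out the automation:

# Hedged sketch: mail the differences when two snapshots diverge
$diff = Compare-Object (Get-Content baseline.txt) (Get-Content current.txt)
if ($diff) {
    Send-MailMessage -SmtpServer "mail.example.com" `
        -From "[email protected]" -To "[email protected]" `
        -Subject "Baseline deviation detected" -Body ($diff | Out-String)
}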