There are a number of tools for doing this sort of work. In this case, I chose PowerShell because of how quickly I could accomplish the task. It took me less than 5 minutes to write and test the script, and it saved me a lot of time parsing the log files!
Using the PowerShell script, I can quickly change my queries, re-run them, export the results, and, most importantly, search a large number of log files in very little time.
Here's the code I used.
# Set the path to the log files
$path = "C:\Temp\Logs"

# Get a collection of all the log files (anything ending in .log)
$files = Get-ChildItem -Path $path -Filter "*.log"

# Pipe the collection of log files to the ForEach-Object cmdlet
# (the alias of ForEach-Object is %)
$files | %{
    # Call the OpenText method to return a System.IO.StreamReader object
    $file = $_.OpenText();

    # Track the current line number (to use in the console output)
    $lineNum = 1;

    Write-Host "Checking file $($_.Name)" -ForegroundColor Yellow;

    # Read each line of the file until EndOfStream returns true
    # (that is, until we reach the end of the file)
    while($file.EndOfStream -ne $true)
    {
        # Read the next line in the file
        $line = $file.ReadLine();

        if($line -ne $null)
        {
            # Use the String ToLower and Contains methods to check for occurrences
            # of the strings (or values) you need to check the file for.
            # In this example, I'm looking for any instances of the text "error" or "exception"
            if($line.ToLower().Contains("error") -or $line.ToLower().Contains("exception"))
            {
                # If the current line contains a match, write the line number
                # and line text out to the console
                Write-Host "Line: $lineNum " -NoNewline -ForegroundColor Green;
                Write-Host $line -ForegroundColor Red;
            }
        }

        # Increment the line number
        $lineNum++;
    }

    # Close the StreamReader so the file handle is released
    $file.Close();
}
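As an aside, if all you need is a quick search with an exportable result set, the built-in Select-String cmdlet can do much of the same work. The sketch below is not part of my original script, and the folder and output file names are just placeholders, but it shows the idea: Select-String reports the matching file, line number, and line text, and Export-Csv handles the export.

# A minimal sketch (not the script above): let Select-String do the matching
# and line numbering, then export the results to CSV.
# "C:\Temp\Logs" and "C:\Temp\LogMatches.csv" are placeholder paths.
Get-ChildItem -Path "C:\Temp\Logs" -Filter "*.log" |
    Select-String -Pattern "error", "exception" |
    Select-Object Path, LineNumber, Line |
    Export-Csv -Path "C:\Temp\LogMatches.csv" -NoTypeInformation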