Troubleshooting Fast-Growing Transaction Logs – Part 2

I am continuing from the previous post, "Troubleshooting Fast-Growing Transaction Logs".

In my experience, most cases where the transaction log drive fills up quickly are caused by ActiveSync-enabled devices.

Microsoft has provided a very good script for finding the devices that are causing the problem. Below is the link to that script, which I have combined with my own scripts to make it easy to find the problematic devices/users.

http://blogs.technet.com/b/exchange/archive/2012/01/31/a-script-to-troubleshoot-issues-with-exchange-activesync.aspx

Here are the scripts that I use along with the one above.

1. Extract the data from all the CAS servers (this maps a drive to each CAS, extracts the logs, and removes the mapping).

You have to input the date for which you want to extract data.

—————————————————————————————————————————–
Function Active ($CAS, $date, $hits) {
    # Map drive V: to the CAS server's E: drive, run the report, then remove the mapping
    $net = New-Object -ComObject WScript.Network
    $net.MapNetworkDrive("V:", "\\$CAS\e$")
    .\ActiveSyncReport.ps1 -IISLog "V:\LogFiles\W3SVC1" -LogparserExec "C:\Program Files (x86)\Log Parser 2.2\LogParser.exe" -ActiveSyncOutputFolder C:\EASReports -Date $date -MinimumHits $hits -ActiveSyncOutputPrefix $CAS
    $net.RemoveNetworkDrive("V:")
}

$time = "03-08-2012"
$Shoot = "1500"

Active CASP01 $time $Shoot
Active CASP02 $time $Shoot
Active CASP03 $time $Shoot

————————————————————————————————————————————————–
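The script above assumes ActiveSyncReport.ps1 and Log Parser 2.2 are already in place. Before running step 1, a quick sanity check like the following can save a failed run (a minimal sketch; the paths are the ones assumed above, so adjust them to your environment):

```powershell
# Paths the scripts above expect -- adjust for your environment
$paths = @(
    ".\ActiveSyncReport.ps1",                              # Microsoft's EAS reporting script
    "C:\Program Files (x86)\Log Parser 2.2\LogParser.exe", # Log Parser 2.2
    "C:\EASReports"                                        # output folder for the CSV reports
)

# Report each path as OK or MISSING
foreach ($p in $paths) {
    if (Test-Path $p) { Write-Host "OK      $p" }
    else              { Write-Host "MISSING $p" }
}
```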

2. Now, if you know which database is having issues:

You have to input the same date as in the script above, along with the CMS and database names.

$date = "03-08-2012"

$input1 = "C:\EASReports\EASyncOutputReport-casp01_" + $date + "_Minimum_Hits_of_1500.csv"
$input2 = "C:\EASReports\EASyncOutputReport-casp02_" + $date + "_Minimum_Hits_of_1500.csv"
$input3 = "C:\EASReports\EASyncOutputReport-casp03_" + $date + "_Minimum_Hits_of_1500.csv"

$server = "CMSP02"
$database = $server + "\SG11\DB11_AM"

foreach ($file in $input1, $input2, $input3) {
    $data = Import-Csv $file
    foreach ($i in $data) {
        # Get each user's mailbox and keep only those on the target server and database
        $mailbox = Get-Mailbox $i.user | Where-Object {$_.ServerName -eq $server -and $_.Database -eq $database}
        $mailbox | Select-Object Name, ServerName
    }
}
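If you want a single ranked view across all the CAS reports rather than per-file output, the CSVs can also be combined and grouped by user (a sketch under the assumption that the reports sit in C:\EASReports and contain the user column used above):

```powershell
# Combine every report CSV in the output folder and count rows per user,
# highest counts first; users at the top are the most active syncers
$reports = Get-ChildItem "C:\EASReports\*.csv" -ErrorAction SilentlyContinue
$reports | ForEach-Object { Import-Csv $_.FullName } |
    Group-Object -Property user |
    Sort-Object Count -Descending |
    Select-Object Name, Count -First 20
```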

Note: put all the scripts in the same folder (I have put them in C:\ActiveSync). Change the scripts according to the environment you are supporting.


3 thoughts on "Troubleshooting Fast-Growing Transaction Logs – Part 2"

  1. I am having issues with the ActiveSyncReport PowerShell script. When I run a query using it, the results always say that no ActiveSync information was found in the logs. However, when I run Log Parser Studio against the same log directory, it finds lots of information. Any thoughts on what is wrong?

  2. My script uses the ActiveSync script, which should exist in the same folder, and the ActiveSync script uses Log Parser at "C:\Program Files (x86)\Log Parser 2.2\LogParser.exe". Check that these paths are correct.

  3. Pingback: Exchange 2010 w3wp process High CPU utilization | Microsoft Technologies Blog
