Tuesday, May 26, 2009

Changing database options on multiple SQL Servers and databases

I have 150-odd SQL Servers across the country. A lot of them are SQL 2000, or SQL 2005 instances that have been upgraded, and a lot of the databases on them have AUTO_CLOSE ON, AUTO_SHRINK ON and PAGE_VERIFY set to NONE. The config exception report was 10 pages long! Enter PowerShell.

I wrote a PowerShell function to alter each setting, so three PowerShell functions. I could have written one, but with 150 servers and many databases on each, I only want to touch databases that are actually misconfigured. So I do three passes over my server list. On each pass I look for one incorrect setting and fix it only on the databases that have it, rather than doing a blanket update of everything.

The code for these functions is:



Function Set-AutoCloseOff {
Param ([string]$svr, [string]$dbname)
$dbn = get-sqldatabase $svr $dbname
$dbn.DatabaseOptions.AutoClose = $False
$dbn.Alter()
}

Function Set-AutoShrinkOff {
Param ([string]$svr, [string]$dbname)
$dbn = get-sqldatabase $svr $dbname
$dbn.DatabaseOptions.AutoShrink = $False
$dbn.Alter()
}

Function Set-PageVerify {
Param ([string]$svr, [string]$dbname)
$dbn = get-sqldatabase $svr $dbname
$dbn.DatabaseOptions.PageVerify = "TornPageDetection"
$dbn.Alter()
}



Each of these functions changes one setting for one database on one server. Now the trick is to find the incorrectly configured databases on each server and then call the above functions. Code:


. ./SQLPSX/LibrarySMO.ps1
. ./DBConfigFixes/Function_Set-AutoCloseOff.ps1
function Set-AutoClose ($s)
{
$svr = $s
$dblist = get-sqldatabase $svr
write-host "Checking Databases on Server $s" -foregroundcolor Green
# Build a list of the databases on this server that still have AutoClose turned on
$dbc = {foreach ($dbn in $dblist) {$dbn | select @{Name="DatabaseName";Expression={$dbn.Name}}, @{Name="AutoClose";Expression={$dbn.DatabaseOptions.AutoClose}}}}
$dbr = $dbc.invoke() | where {$_.AutoClose -eq $True} | select DatabaseName, AutoClose
# Fix only the databases that need fixing
if ($dbr -ne $null)
{
foreach ($dd in $dbr)
{
write-host "Changing AutoClose for Database $($dd.DatabaseName) on $s" -foregroundcolor "Red"
Set-AutoCloseOff $svr $dd.DatabaseName
}
}
}


Watch out for line wrapping in the code window above: each statement is on its own line, so if a long line wraps when you copy and paste it, rejoin it rather than leaving the break in.

What the above code does is get the list of databases on a given server, build a second list of the databases on that server that are not configured correctly, and then loop over that second list, fixing the setting with the function defined earlier.
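The other two passes follow exactly the same pattern. As a rough sketch (it simplifies the filtering a little), the AutoShrink pass could look like this, and the PageVerify pass is the same shape again:

function Set-AutoShrink ($s)
{
$svr = $s
$dblist = get-sqldatabase $svr
write-host "Checking Databases on Server $s" -foregroundcolor Green
# Only touch databases that still have AutoShrink turned on
$dbr = $dblist | where {$_.DatabaseOptions.AutoShrink -eq $True}
foreach ($dd in $dbr)
{
write-host "Changing AutoShrink for Database $($dd.Name) on $s" -foregroundcolor "Red"
Set-AutoShrinkOff $svr $dd.Name
}
}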

Now all I need is a list of servers and, for each server, to run the relevant pass. That's the easiest bit:

$ss = gc 'servers.txt'
foreach ($s in $ss) {Set-AutoClose $s}


Tada!

For the SQLPSX/LibrarySMO.ps1 library of SMO functions, search on www.sqlservercentral.com. This library contains the get-sqldatabase function.
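If you don't want to pull down the whole library, a minimal stand-in with the same shape will do for these examples. This is not the SQLPSX implementation, just a rough equivalent built straight on SMO:

# Rough stand-in for SQLPSX's get-sqldatabase -- returns SMO Database objects
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO")
function Get-SqlDatabase ([string]$sqlserver, [string]$dbname)
{
$server = new-object Microsoft.SqlServer.Management.Smo.Server $sqlserver
if ($dbname) { $server.Databases[$dbname] } else { $server.Databases }
}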

I ran the above against my 150 servers in about 8 minutes; three passes for the three config fixes took roughly 30 minutes in total. 438 config exceptions fixed. Badaboom!

Footnote: I've fixed the Set-PageVerify function to cater for the difference in behaviour between SQL 2000 and SQL 2005. If you want this function drop me a mail at dkorzennik@hotmail.com
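For reference, the gist of that fix is to branch on the server version, since CHECKSUM page verification only exists from SQL 2005 onwards while SQL 2000 only supports torn page detection. Here is a rough sketch of the idea (the function name is just for illustration, it is not the exact function I mail out, and the version check through the SMO Parent server is an assumption):

Function Set-PageVerifySafe {
Param ([string]$svr, [string]$dbname)
$dbn = get-sqldatabase $svr $dbname
# Checksum is only valid on SQL 2005 (version 9) and later; fall back to TornPageDetection on SQL 2000
if ($dbn.Parent.Information.Version.Major -ge 9)
{ $dbn.DatabaseOptions.PageVerify = "Checksum" }
else
{ $dbn.DatabaseOptions.PageVerify = "TornPageDetection" }
$dbn.Alter()
}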

Friday, May 22, 2009

Retrieving Partition Size, Free Space and Fragmentation Percentage from multiple servers

At my current client we receive an email every morning with partition information from a number of servers. This is run using VBScript and was set up a while ago. So, for one, the list of servers is outdated. The biggest bit of information missing from this report for me is the fragmentation percent of the partitions. I say this because one of the primary reasons for keeping enough free space available in a partition is to ensure effective defrags can run.

So I gathered my server list. This, after going through numerous spreadsheets, amounted to 483 servers! PowerShell eats this up like a little morsel.

Getting the partition size and free space is a straightforward GWMI query in PowerShell. Getting the fragmentation percent into the same result set requires some fancy footwork. My mate, Jean Louw, added this information using Add-Member. When that cmdlet is mentioned to him his eyes lose focus and his nether regions tighten. Check out his blog at: http://powershellneedfulthings.blogspot.com/.

So, getting down to some code. Here I've written a function that retrieves the ServerName, DriveLetter, Label, Capacity, FreeSpace, PercentFree and fragmentation percent. I've left this data raw without fancy formatting since I'm going to export it to a csv, bulk insert the csv into a SQL table and then present the information in Reporting Services. In T-SQL and Reporting Services I'll embellish the data as required.

Here is the function:


Function Get-FreeSpaceFrag ($s)
{
# If WMI can't be reached on a server, report it and carry on with the next one
trap {write-host "Can't connect to WMI on server $s" -ForeGroundColor "Red"
continue
}
$dt = get-date

# Build the WMI searcher by hand so a timeout can be set on the query
$Scope = new-object System.Management.ManagementScope "\\$s\root\cimv2"
$query = new-object System.Management.ObjectQuery "SELECT * FROM Win32_Volume"
$searcher = new-object System.Management.ManagementObjectSearcher $scope,$query
$SearchOption = $searcher.get_options()
$timeout = new-timespan -seconds 10
$SearchOption.set_timeout($timeout)
$searcher.set_options($SearchOption)
$volumes = $searcher.get()

# Run a defrag analysis per volume and tack the result onto the WMI object with Add-Member
$fr = {foreach ($v in $volumes | where {$_.Capacity -gt 0}){
$frag = ($v.DefragAnalysis().DefragAnalysis).TotalPercentFragmentation
$v | Add-Member -Name Frag -MemberType NoteProperty -Value $frag -Force -PassThru
} }
$fr.invoke() | select @{N="Server";E={$_.SystemName}}, DriveLetter, Label, Capacity, FreeSpace, @{N="PercentFree";E={"{0,9:N0}" -f ($_.FreeSpace/$_.Capacity*100)}}, Frag, @{N="InfoDate";E={$dt}}
}



The magic here is the Add-Member cmdlet getting my fragmentation percent. The other piece of magic is making sure the WMI query times out after 10 seconds!! Man, did I battle with this. It took me a whole morning to get right, and I start working at 6AM! The trick is to instantiate the ManagementObjectSearcher before invoking it; you can then set the timeout with the new-timespan cmdlet and the searcher's options. I had a server, smack in the middle of my list of 483 servers, that broke my script before I added this error handling. That's ugly in a script that takes almost 4 hours. And when I say broke, Ctrl+C doesn't even work. Click on the X, baby.

Great, now I have my unit of work defined: gather the required information from one server. It's go time! Next I use this to gather the information from a list of servers, export the results to csv, time the entire operation and finally bulk insert the data into a SQL table. Code:


# Load the functions, gather the data from each server, time the run and hand the csv over for bulk insert
sl D:\Powershell\ServerDriveSpaceFragInfo
. ./Function_Get-FreeSpaceFrag.ps1
sl ..
. ./function_Get-TimeDelta.ps1
$svl = gc 'serversall.txt'
$x = {foreach ($s in $svl) {write-host "Getting Disk Info for Server $s" -foregroundcolor "Green"; Get-FreeSpaceFrag $s}}
$t1 = get-date
$x.invoke() | export-csv "D:\Powershell\DiskInfo.csv" -NoTypeInformation
$t2 = get-date
Get-TimeDelta $t1 $t2
# Copy the csv to the SQL Server that will bulk insert it
cpi "D:\Powershell\DiskInfo.csv" "\\SRV1\d$"
sl D:\Powershell\ServerDriveSpaceFragInfo
./BulkInsertDiskInfo.ps1
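Get-TimeDelta is just a little helper that subtracts the two timestamps and prints the elapsed time; if you want something similar, a sketch along these lines will do (an approximation, not necessarily the exact helper I use):

# Rough equivalent of the Get-TimeDelta helper dot-sourced above
function Get-TimeDelta ($start, $end)
{
$delta = $end - $start
write-host ("Elapsed: {0:N0} minutes ({1:N0} seconds)" -f $delta.TotalMinutes, $delta.TotalSeconds)
}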



I save the above as a .ps1 file. I then call the ps1 file from SQL server using a SQL Agent job. The job step will be operating system (CmdExec) Type and the text would be:
"C:\WINDOWS\system32\windowspowershell\v1.0\powershell.exe" "D:\Powershell\DiskInfo\Eg_Get-FreeSpaceFrag.ps1"

For 483 servers this job runs for about 3 hours 43 minutes. Not bad considering they are all over the country, across some slow WAN links. The bulk insert code is as follows:


# Truncate the staging table, bulk insert the csv and run the cleanup proc
$SqlConnection = New-Object System.Data.SqlClient.SqlConnection
$SqlConnection.ConnectionString = "Server=SRV1;Database=master;Integrated Security=True"
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.Connection = $SqlConnection
$SqlConnection.Open()

# Clear out yesterday's data
$SqlCmd.CommandText = "use DBA; TRUNCATE TABLE DriveSpaceFragInfo;"
$SqlCmd.ExecuteNonQuery()

# Load today's csv into the staging table
$SqlCmd.CommandText = "BULK INSERT DBA..DriveSpaceFragInfo FROM 'D:\DiskInfo.csv' WITH (FIELDTERMINATOR = ',', FIRSTROW = 2, ROWTERMINATOR = '\n')"
$SqlCmd.ExecuteNonQuery()

# Clean up the imported data
$SqlCmd.CommandText = "Exec DBA..usp_UpdateDriveSpaceFragInfo"
$SqlCmd.ExecuteNonQuery()

$SqlConnection.Close()


The 3rd command runs a stored procedure that cleans up the data, basically just removing double quotes and one or two other things. Now I have nice, clean partition information that provides a wealth of insight. For example, a server that was hosting 2 virtual servers had 35% free space on its data partition and was 94% fragmented. 94%!!!! I have never seen such a high figure before. So the fact that a partition has 35% free space doesn't mean it is in a healthy state.
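Incidentally, the double quotes come from Export-Csv wrapping every field; they could also be stripped in PowerShell before the bulk insert instead of in the proc. A quick sketch, using the same csv path as above (fine here because none of the field values contain commas):

# Strip the quotes Export-Csv puts around every field before the file is bulk inserted
$csv = "D:\Powershell\DiskInfo.csv"
(Get-Content $csv) | ForEach-Object { $_ -replace '"', '' } | Set-Content $csv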

I then created some Reporting Services reports to show the top 20 partitions with the least amount of free space, and another report with the top 20 worst-fragmented partitions. Nice. So now, from a possible 1500 partitions on 483 servers, I can target the least healthy partitions first. Also, I keep the partition information in the database, so over time I can report on how long a partition has been in a certain state, when it was cleaned or defragged, and how quickly it filled up and fragmented again. Makes management happy. ;-)

Wednesday, May 13, 2009

Cleaning up full partitions

I work in an environment with a good few hundred servers all over the country. We get a report every morning that lists servers and drives with below 20% free space. The idea then is to free up space to get these drives back above 20% free. After the 1st week it becomes very tedious: check the SQL backup folder, check the W3SVC1 log files, check for out-of-date service packs... What's needed is a more methodical, and automated, approach. You guessed it: PowerShell. ;-)

In the intro I alluded to the approach: check for a number of known file types and conditions that can contribute to filling a partition. Basically I go to the partition and, for each check, either include or exclude certain file types, or include files greater than a certain size.

I've written a function that goes to a server and a partition on that server and checks for:
  1. Files bigger than 10MB Excluding SQL files
  2. All Office, pst and txt docs bigger than 100kb
  3. Log files bigger than 5MB
  4. JPeg and MP3 files bigger than 100KB
  5. SQL Backup Files

This normally accounts for most of the wasted space. It can obviously be expanded or customized to your particular needs or environment.

The function writes a csv file with a list of the files for each condition, with Name, FileSize, DirectoryName, FullName, CreationTimeUtc, LastAccessTimeUtc and DeleteCommand as fields. This information helps to confirm that a file can be deleted or compressed.

The neat bit here is the DeleteCommand. It's really just the Del command, which works in a command prompt window or in PS, with the full path and file name. Run it carefully though: no "Are you sure" prompt will be issued.
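Once the function below has written its csv files and you've reviewed one, the DeleteCommand column can be fed straight back to the shell. A cautious sketch (the csv file name here is just an example of the output naming, and the folder is a placeholder):

# Example output file name only -- substitute one of the csv files the function writes
$files = Import-Csv "D:\Cleanup\DIVSS108_big_Files.csv"

# Eyeball the list one more time before anything is deleted
$files | select Name, DirectoryName, LastAccessTimeUtc | Format-Table -AutoSize

# Run the generated Del commands -- remember, no "Are you sure" prompt
foreach ($file in $files) { cmd /c $file.DeleteCommand }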



Herewith the code:

Function Find-Files {
$a = new-object -comobject MSScriptControl.ScriptControl
$a.language = "vbscript"
$a.addcode("function getInput() getInput = inputbox(`"Enter Server Name`",`"Find Files`") end function" )
$s = $a.eval("getInput")

$b = new-object -comobject MSScriptControl.ScriptControl
$b.language = "vbscript"
$b.addcode("function getInput() getInput = inputbox(`"Enter Server Drive`",`"Find Files`") end function" )
$dr = $b.eval("getInput")

$c = new-object -comobject MSScriptControl.ScriptControl
$c.language = "vbscript"
$c.addcode("function getInput() getInput = inputbox(`"Enter location to save output`",`"Find Files`") end function" )
$d = $c.eval("getInput")


#$s = 'DIVSS108'
#$dr = 'e'
$sp = "\\$s\$dr$\"
$d = "$d\"
$L = 10*1024*1024

#All files bigger than 10MB
write-host "Getting Files on $s Bigger than 10MB excluding SQL files..." -foregroundcolor "Green"
$f = $d+$s+"_big_Files.csv"
gci $sp -recurse -exclude *.bak,*.mdf,*.ldf,*Full.rar | Where {($_.Length -ge $L)} | select Name, @{N=' FileSize';E={"{0,12:N0} KB" -f ($_.Length/1kb) }}, DirectoryName, FullName, CreationTimeUtc, LastAccessTimeUtc, @{Name="DeleteCommand";E={"Del "+'"'+$_.FullName.Tostring()+'"'}} | export-csv $f -NoTypeInformation

#All Office, pst and txt docs bigger than 100kb
write-host "Getting Office Files on $s bigger than 100kb ..." -foregroundcolor "Green"
$L = 0.1*1024*1024
$f = $d+$s+"_Office_Files.csv"
gci $sp -recurse -include *.xls,*.doc,*.ppt,*.txt, *.pst | Where {($_.Length -ge $L)} | select Name, @{N=' FileSize';E={"{0,12:N0} KB" -f ($_.Length/1kb) }}, DirectoryName, FullName, CreationTimeUtc, LastAccessTimeUtc, @{Name="DeleteCommand";E={"Del "+'"'+$_.FullName.Tostring()+'"'}} | export-csv $f -NoTypeInformation

#Log files bigger than 5MB
write-host "Getting Log Files on $s bigger than 5MB ..." -foregroundcolor "Green"
$L = 5*1024*1024
$f = $d+$s+"_Log_Files.csv"
gci $sp -recurse -include *.log | Where {($_.Length -ge $L)} | select Name, @{N=' FileSize';E={"{0,12:N0} KB" -f ($_.Length/1kb) }}, DirectoryName, FullName, CreationTimeUtc, LastAccessTimeUtc, @{Name="DeleteCommand";E={"Del "+'"'+$_.FullName.Tostring()+'"'}} | export-csv $f -NoTypeInformation

#JPeg and MP3 files bigger than 100KB
write-host "Getting Jpg & mp3 Files on $s bigger than 100kb ..." -foregroundcolor "Green"
$L = 0.1*1024*1024
$f = $d+$s+"_Jpg_Mp3_Files.csv"
gci $sp -recurse -include *.jpg, *.mp3 | Where {($_.Length -ge $L)} | select Name, @{N=' FileSize';E={"{0,12:N0} KB" -f ($_.Length/1kb) }}, DirectoryName, FullName, CreationTimeUtc, LastAccessTimeUtc, @{Name="DeleteCommand";E={"Del "+'"'+$_.FullName.Tostring()+'"'}} | export-csv $f -NoTypeInformation

#SQL Backup Files
write-host "Getting SQL Backup Files on $s ..." -foregroundcolor "Green"
$f = $d+$s+"_SqlBackup_Files.csv"
gci $sp -recurse -include *.bak,*.trn,*Full.rar | select Name, @{N=' FileSize';E={"{0,12:N0} KB" -f ($_.Length/1kb) }}, DirectoryName, FullName, CreationTimeUtc, LastAccessTimeUtc, @{Name="DeleteCommand";E={"Del "+'"'+$_.FullName.Tostring()+'"'}} | export-csv $f -NoTypeInformation
}



As you can see, the function uses some old school VBScript to get an input box. I could have used newer .NET code, but the VBScript is short and sweet.

When the function is called you are prompted for the server name, the drive and the location to store the output csv files. Make sure this location exists. I could have checked for the existence of the location and created it if it didn't (see the sketch below), but a mistyped path would then quietly create a folder in a location that can't be remembered or found later. Ctrl+C, Ctrl+V for the location. ;-)
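If you do prefer the folder to be created automatically, the check is only a couple of lines; a sketch of what could sit just after the input boxes in the function:

# Create the output folder if it doesn't exist yet -- convenient, but a typo in the
# path will quietly create a folder somewhere you didn't intend
if (-not (Test-Path $d)) { New-Item -Path $d -ItemType Directory | Out-Null }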

Happy cleaning and deleting.