  • Copyright Notice

    Copyright © Nancy Hidy Wilson, 2010-2013. Unauthorized use and/or duplication of this material without express and written permission from this blog’s author and/or owner is strictly prohibited. Excerpts and links may be used, provided that full and clear credit is given to Nancy Hidy Wilson and nancyhidywilson.wordpress.com with appropriate and specific direction to the original content.

Discontinued Features in SQL Server 2012 – Part 3

Part 1 discussed some discrepancies between the documentation of discontinued features in SQL Server 2012 and the actual implementation. Part 2 discussed the database compatibility level minimum requirement. Now, in this installment, let’s look at some features that have disappeared in SQL Server 2012 and may impact your administration scripts.

The first one, the DATABASEPROPERTY function, I already covered in Part 1. While it doesn’t appear to be completely gone, the documentation states that it is. So, as you edit your scripts for the other items mentioned below, don’t forget to change DATABASEPROPERTY to DATABASEPROPERTYEX.

Next up, sp_dboption is gone. That’s right – it’s not there… You get a nice red error message in SSMS when you try to use it.

Msg 2812, Level 16, State 62, Line 1

Could not find stored procedure ‘sp_dboption’.

Everything you needed to set using sp_dboption is now available via the ALTER DATABASE syntax – and pretty much has been since SQL Server 2000. But, if you are as “experienced” as I am (going back to 4.21), then you may still have some old scripts lying around with sp_dboption in them. Unless you still have some 7.0 or earlier systems on which to use those scripts, update them to use ALTER DATABASE and they should work on SQL Server 2000 and up (with an exception for the cross-database ownership chaining option, which was added in SQL Server 2000 SP3 and was only settable via sp_dboption in SQL Server 2000).
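The translation is mostly mechanical. Here’s a sketch of a couple of common settings, run via Invoke-Sqlcmd (the AdventureWorks database and MyServer\MyInstance names are just placeholders):

```powershell
# Old style (SQL Server 7.0 and earlier):
#   EXEC sp_dboption 'AdventureWorks', 'read only', 'true'
#   EXEC sp_dboption 'AdventureWorks', 'auto create statistics', 'true'
# ALTER DATABASE equivalents (SQL Server 2000 and up):
Invoke-Sqlcmd -ServerInstance "MyServer\MyInstance" -Query @"
ALTER DATABASE AdventureWorks SET READ_ONLY;
ALTER DATABASE AdventureWorks SET AUTO_CREATE_STATISTICS ON;
"@
```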

Now – how about a review of your Backup and Restore scripts? Did you make use of the WITH PASSWORD or WITH MEDIAPASSWORD options on your BACKUP DATABASE and BACKUP LOG commands? If so, you’ll need to remove those options for use on SQL Server 2012. These options have been deprecated since SQL Server 2005, so in theory you should have stopped using them at that point in time. You can still use these options with the RESTORE command for those old backups which used a password, but you can’t make any new backups using the password options. Keep in mind, though, that the ability to use the RESTORE command for backups with passwords is also deprecated and will be removed in a future version.

This next one will really date you (and me!) – do you have RESTORE DATABASE…WITH DBO_ONLY syntax in any scripts? If so, you should have replaced WITH DBO_ONLY with WITH RESTRICTED_USER starting with SQL Server 2000.

Do you run scripts to dump the metadata from the system tables/views? Do they include the sys.database_principal_aliases view? If so, it is gone now, too. You’ll receive error 208 (Invalid object name) if you reference it. Aliases have been replaced by roles.

OK – so now you know what to look for, but how about an easy way to find where these keywords are used in your scripts?  Here’s a quick PowerShell command example (adapted from http://guyellisrocks.com/powershell/powershell-grep/) to help: 

Get-ChildItem -Include *.sql -Recurse | Select-String "sp_dboption"
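Since Select-String accepts an array of regular expressions, you can hunt for all of the keywords above in one pass. A sketch (the C:\Scripts path is a placeholder; the DATABASEPROPERTY pattern requires an opening parenthesis so that DATABASEPROPERTYEX doesn’t match):

```powershell
# Patterns for the discontinued keywords discussed above.
$obsolete = "sp_dboption", "DATABASEPROPERTY\s*\(", "MEDIAPASSWORD",
            "DBO_ONLY", "database_principal_aliases"
Get-ChildItem -Path C:\Scripts -Include *.sql -Recurse |
    Select-String -Pattern $obsolete |
    Select-Object Path, LineNumber, Line
```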

Have fun hunting down these obsolete keywords!

PowerShell – Using Common Parameters

One of the most useful features provided in PowerShell is “Common Parameters”. These are parameters which you can utilize in your own scripts and functions to provide a similar look and feel to the standard PowerShell cmdlets which use them.

What are these Common Parameters? 

  • -Debug
  • -ErrorAction
  • -ErrorVariable
  • -OutBuffer
  • -OutVariable
  • -Verbose
  • -WarningAction
  • -WarningVariable

You can find out the basic functionality of each of these by running:

PowerShell Get-Help about_CommonParameters

But, how do you add them into your own script? It is actually pretty easy. Just include the following lines in your script or function as the first executable lines (you can only have comments preceding them; the CmdletBinding attribute must be followed by a Param block, even an empty one):

[CmdletBinding()]
Param ()

You can now use these Common Parameters with your script.

To demonstrate the regular Common Parameters, I’ll show examples with the two you are most likely to start with: the -Debug and -Verbose parameters. If you check the help, you see that both are basically a switch (true/false), but you can explicitly specify $true or $false as the value (e.g. -Verbose:$true). If you want to use Write-Verbose commands in your script, but only when needed, then by using the CmdletBinding attribute, you can pass the -Verbose parameter when calling your script and the Write-Verbose statements will display.  Consider the following test script (saved as Test-Parm.ps1):

[CmdletBinding()]
Param ()
Write-Host "Begin Testing Common Parameters" -ForegroundColor Green
Write-Verbose "Verbose Parm Used"
Write-Host "End Testing Common Parameters" -ForegroundColor Green

Execute the script as follows without any parameters:

Powershell .\Test-Parm.ps1 

You’ll just see the two output lines from the Write-Host statements in green.

Now execute the script with the -Verbose parameter:
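This time the Write-Verbose message appears (prefixed with VERBOSE:) between the two green lines:

```powershell
powershell .\Test-Parm.ps1 -Verbose
# Begin Testing Common Parameters
# VERBOSE: Verbose Parm Used
# End Testing Common Parameters
```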

You might think that you can just reference this parameter now in your script like $Verbose… Nope! However, when you use the CmdletBinding attribute, the PowerShell automatic variable $PSCmdlet is set, and these common parameters are treated as bound parameters.

This is helpful if you want to perform specific settings or actions within your script when one of these common parameters is used. For example, the default behavior for Write-Debug when the -Debug parameter is passed is to stop and prompt the user for the action to take (continue, halt the command, or suspend).

You can use the following syntax, however, to test if the -Debug parameter was passed and change the $DebugPreference variable to “Continue” in order for your script to automatically continue processing whenever a Write-Debug statement is encountered.

If ($PSCmdlet.MyInvocation.BoundParameters["Debug"].IsPresent) 
{
     $DebugPreference="Continue"
     Write-Debug "Debug Output activated"
}
Else 
{
     $DebugPreference="SilentlyContinue"
}

The IsPresent property evaluates to true whether you use the -Debug or -Debug:$true syntax. If -Debug:$false is supplied, or the -Debug parameter is omitted entirely, then IsPresent evaluates as false.

Consider the following script (saved as Test-Parm.ps1) and note the execution examples shown following it.

[CmdletBinding()]
Param ()
Write-Host "Begin Testing Common Parameters" -ForeGroundColor Green
Write-Verbose "The -Verbose parameter was provided" 
If ($PSCmdlet.MyInvocation.BoundParameters["Debug"].IsPresent) 
{
     $DebugPreference="Continue"
     Write-Debug "Debug Output activated"
}
Else 
{
     $DebugPreference="SilentlyContinue"
}
Write-Debug "This script was called with the -Debug parameter." 
Write-Host "End Testing Common Parameters" -ForeGroundColor Green
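Here are the kinds of runs to try (output sketched as plain text; colors not shown):

```powershell
powershell .\Test-Parm.ps1
# Begin Testing Common Parameters
# End Testing Common Parameters

powershell .\Test-Parm.ps1 -Verbose
# Begin Testing Common Parameters
# VERBOSE: The -Verbose parameter was provided
# End Testing Common Parameters

powershell .\Test-Parm.ps1 -Debug
# Begin Testing Common Parameters
# DEBUG: Debug Output activated
# DEBUG: This script was called with the -Debug parameter.
# End Testing Common Parameters
```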

Hopefully this tip will help you to incorporate these features into your next PowerShell script.

My Denali First Glance – PowerShell Enhancements

First, PowerShell 2.0 is now required as a prerequisite for installing SQL Server Denali (SQL Server 11). Hopefully, you’ll only be installing Denali in a Windows Server 2008 R2 environment, which has PowerShell 2.0 installed by default. But, if not, you’ve now been forewarned – just in case you don’t read the “readme” prior to installing!

SQLPS is now a module (a new feature in PowerShell 2.0). The SQLPS utility has been deprecated, but if you use it, it apparently will start PowerShell 2.0 and import the SQLPS module.

Additionally, there is now a module of cmdlets for SSAS (Analysis Services). After installing Denali, if you execute the command Get-Module -ListAvailable you’ll see that, in addition to SQLPS, SQLASCMDLETS is also displayed. This module contains 11 cmdlets.
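You can enumerate them for yourself on a machine with Denali’s PowerShell components installed:

```powershell
# List the Analysis Services cmdlets by name.
Import-Module SQLASCMDLETS
Get-Command -Module SQLASCMDLETS | Sort-Object Name | Select-Object Name
```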

But, my focus is Database Engine, so that’s all I’ll say about SSAS. If you are a BI guy or gal – check it out!

When you first import the SQLPS module, you will get a warning that some of its command names use unapproved verbs, unless you import the module using the -DisableNameChecking parameter.
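To avoid the warning, import it like this:

```powershell
# SQLPS exports a couple of commands with unapproved verbs (Encode-SqlName
# and Decode-SqlName), which is what triggers the warning;
# -DisableNameChecking suppresses it.
Import-Module SQLPS -DisableNameChecking
```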

You’ll notice that as soon as you import the SQLPS module, your location changes to the SQLSERVER provider. To see what’s new in this hierarchy, just type: dir

XEvent, IntegrationServices and SQLAS are all new. Thus, more features and components of the SQL Server stack are now being exposed for PowerShell management.

Then, there are all the new cmdlets available in SQLPS.

Considering there were only 5 cmdlets in SQLPS for SQL Server 2008, I now count 29! Most appear directed at new features in Denali such as High Availability\Disaster Recovery (HADR) and Availability Group configurations. I’m sure I’ll be looking at those in detail when I get around to studying AlwaysOn. However, two very traditional database commands have been provided now with their own cmdlets – Backup-SqlDatabase and Restore-SqlDatabase. These two cmdlets are modeled after the Microsoft.SqlServer.Management.Smo.Backup class, so if you are used to the properties in that class, then the parameters for these cmdlets will be very familiar.

That means you can now execute commands like this as a SQLAgent PowerShell job step:

Dir SQLSERVER:\SQL\MyServer\MyInstance\Databases | Backup-SqlDatabase

This simple command will back up all user databases on an instance using all the defaults, of course. But, you can start imagining the possibilities here – especially if you start integrating any additional parameters needed in your environment.
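For example, here is a sketch of the same pipeline extended to write compressed, copy-only backups to dated files (the server, instance, and backup share names are placeholders, and the parameters shown are just a sampling):

```powershell
# Compressed, copy-only full backups of every user database to a
# dated file per database.
$stamp = Get-Date -Format "yyyyMMdd_HHmmss"
Dir SQLSERVER:\SQL\MyServer\MyInstance\Databases |
    ForEach-Object {
        Backup-SqlDatabase -DatabaseObject $_ `
            -BackupFile "\\BackupShare\MyInstance\$($_.Name)_$stamp.bak" `
            -CompressionOption On -CopyOnly
    }
```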

As indicated by the title, this is just a “first glance” at what is coming in Denali; I’m sure I’ll be investigating and writing more about using PowerShell with Denali in the future. What new PowerShell enhancements are you looking forward to implementing?

SQL Server Deprecated Features – Part 2

In part 1 of this mini-series on deprecated features, I explained what that meant and where to find more information. This entry will focus on issues I’ve personally encountered in this area which require close attention not just to deprecation, but actual discontinuation.

While rewriting all of our “data collection” scripts in PowerShell and SQL-SMO (because SQL-DMO was discontinued in SQL 2008 R2), I discovered an unexpected “backwards compatibility” issue.  When I ran the database collection script against certain SQL 2000 servers, the script “failed” against some databases. My first suspicion was that the databases were “offline” and I needed to add a check for that condition.  However, I added that check and still got failures. When I explicitly connected to the instances with the problem databases using SQL Server Management Studio (SSMS), I got a semi-surprise – they all had databases in 65 compatibility mode… and those were the databases which caused my script to fail.   So, instead of pulling the database compatibility setting via SQL-SMO, I now have to select it using the Invoke-Sqlcmd SQLPS cmdlet using a T-SQL statement and verify that it is greater than 65 before continuing with the collection of the database’s metadata.
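A sketch of that pre-check ($instance and $dbName are placeholders; dbo.sysdatabases is used because it exists on SQL 2000 and remains available as a compatibility view on later versions):

```powershell
# Pull the compatibility level with plain T-SQL before touching the
# database with SQL-SMO.
$instance = "MyServer\MyInstance"
$dbName   = "MyDatabase"
$cmptlevel = (Invoke-Sqlcmd -ServerInstance $instance -Database master `
    -Query "SELECT cmptlevel FROM dbo.sysdatabases WHERE name = N'$dbName'").cmptlevel
if ($cmptlevel -gt 65) {
    # Safe to continue collecting this database's metadata via SQL-SMO.
}
```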

What surprised me was that this happened on a SQL 2000 instance.  I knew that SQL-SMO was not supported on SQL 6.5 instances, but it did not occur to me that it would also be invalid for a database in 65 compatibility mode on an instance whose versioning supported SQL-SMO.

At that time, I decided against collecting further metadata from 65 compatibility mode databases.  There are too few to justify the additional scripting required.  The main thing is that we will be able to report how many 65 databases are still around.

The next issue I realized was that SSMS cannot even manage a database in 65 compatibility mode, as SSMS uses SQL-SMO almost exclusively behind the scenes. Remember, I’m not in a day-to-day production DBA role, so I had not encountered this scenario before.

Then, I found another “backwards compatibility” issue when using SSMS 2008 R2 to script a drop/create of a table from a database on a SQL 2000 instance. The database containing the table was in 80 compatibility mode (so not quite the same issue as previously discussed). What I noticed in the auto-generated T-SQL was that the “exists” check for the table prior to the drop did not select from the SQL 2000 system tables, but from the newer system views in SQL 2005 and up. Consequently, if I had tried to execute the generated script, it would have failed, since sys.objects does not exist in SQL 2000. I have verified that this has actually been the case since SSMS 2005, so hopefully everyone besides me was already aware of this gotcha.

Bottom line: be very aware of the version of the tools you are using to access down-level instances and down-level databases.  In the words of Ronald Reagan, “Trust, but verify”.

Taking Advantage of the CMS to Collect Configuration Data

In an earlier blog entry, I showed you how you could build and populate your CMS groups to use with SSMS to run multi-server queries.  Okay, so now you are probably thinking of lots of configuration type info which you’d like to collect periodically from your instances.  Obviously, you don’t want to have to do this manually in an SSMS Query Editor window all the time, you want to automate it, right?  You can probably guess where I’m headed next… Yep – PowerShell!
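Here is a sketch of the idea (not my production script): query the CMS instance’s msdb for the servers registered in a given group, run a simple version query against each one, and write the results to a file. The server name, group name, and output path are all placeholders.

```powershell
# Read the CMS group membership from the shared registered servers views
# in msdb, then run a multi-server version query.
$cms = "MyCmsServer"
$query = @"
SELECT s.server_name
FROM msdb.dbo.sysmanagement_shared_registered_servers s
JOIN msdb.dbo.sysmanagement_shared_server_groups g
    ON s.server_group_id = g.server_group_id
WHERE g.name = N'Production'
"@
Invoke-Sqlcmd -ServerInstance $cms -Query $query | ForEach-Object {
    Invoke-Sqlcmd -ServerInstance $_.server_name -Query `
        "SELECT @@SERVERNAME AS ServerName, SERVERPROPERTY('ProductVersion') AS Version"
} | Out-File C:\Temp\InstanceVersions.txt
```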

OK, so that is interesting: you now have the SQL Server version results for all the instances registered in the defined CMS group in a file on your hard drive. But what you really want to do is collect several different properties of your server instances and store that data such that you can query (report on) it later.

For this case, let’s assume that you have a database called DBAdmin on the same instance as your CMS – this is also your “repository instance”.  In the DBAdmin database create the following table:
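The following is a minimal sketch of such a table (the table and column names are mine, not a prescribed schema; adjust them to the properties you plan to collect):

```powershell
# A hypothetical repository table for instance-level configuration data.
Invoke-Sqlcmd -ServerInstance "MyCmsServer" -Database DBAdmin -Query @"
CREATE TABLE dbo.InstanceConfig (
    ServerName     sysname      NOT NULL,
    VersionString  nvarchar(32) NULL,
    Edition        nvarchar(64) NULL,
    Collation      nvarchar(64) NULL,
    WebXPs         int          NULL,
    CollectionDate datetime     NOT NULL DEFAULT (GETDATE())
);
"@
```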

You do need to be aware, in multi-versioned environments, that some properties are not available in all versions. Thus, you’ll need to be able to handle null values returned by the SMO properties. For these properties, you will want to capture the value in a variable prior to building the insert statement, as shown by the $WebXPs variable in the sample code below.

Now, loop through the CMS group as before and collect the instance level information using SMO and store the results in the repository.
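Here is a sketch of such a loop (all server, group, and table names are placeholders; the repository table is assumed to be a hypothetical dbo.InstanceConfig with matching columns, and the "Web Assistant Procedures" configuration option is used as the example of a property that may not exist on every version):

```powershell
# Adapted sketch: read the CMS group membership, connect to each instance
# with SQL-SMO, capture a possibly-null property into $WebXPs first, then
# insert the values into the repository.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
$cms = "MyCmsServer"
Invoke-Sqlcmd -ServerInstance $cms -Query `
    "SELECT server_name FROM msdb.dbo.sysmanagement_shared_registered_servers" |
ForEach-Object {
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server($_.server_name)
    # This option is not exposed on every version, so grab it into a
    # variable first and handle the null case explicitly.
    $WebXPs = $srv.Configuration.Properties |
        Where-Object { $_.DisplayName -eq "Web Assistant Procedures" }
    $WebXPsValue = if ($WebXPs -eq $null) { "NULL" } else { $WebXPs.RunValue }
    Invoke-Sqlcmd -ServerInstance $cms -Database DBAdmin -Query `
        ("INSERT dbo.InstanceConfig (ServerName, VersionString, Edition, Collation, WebXPs) " +
         "VALUES (N'$($srv.Name)', N'$($srv.VersionString)', N'$($srv.Edition)', " +
         "N'$($srv.Collation)', $WebXPsValue)")
}
```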

Hopefully, you see the possibilities for extending beyond the properties shown here.

So now all that remains is to save your code into a PowerShell script and schedule it to run from your repository instance’s SQLAgent to collect your SQL Server configuration data on a regular basis. Don’t forget to decide how much historical data you want to keep and set up a step in your job to purge the repository of old data.  And finally, in case you didn’t think of this already, you’ll need to schedule the job steps running the PowerShell script to execute as an account which has the necessary access to all of the registered servers in the CMS Group.

So, why use CMS Groups for this task?  I think this gives you more flexibility in setting the scope of your collection scripts without requiring you to change any code within those scripts once you’ve finalized them.  The only thing you need to manage is the CMS Groups, which can be manual or automated or both, as discussed in my blog entry Managing the Enterprise with CMS.  It allows you to use multiple repositories which run the same scripts but against different CMS groups (for example regionally or by customer).

Finally, this is sample code to demonstrate what is possible. It should work, but use at your own risk and test, test, test in your own environment!