
SQLSaturday #150 – Baton Rouge: Presentations Uploaded

My presentation slide decks and demo scripts from SQLSaturday #150 have been uploaded.

Thanks to the planning team for selecting my sessions, and thanks to everyone who attended – I enjoyed the opportunity to share my passion.


What’s in Your msdb?

So, what’s in your wallet… er, msdb?

If you’ve been working with Microsoft SQL Server for any time at all, you can name the system databases fairly quickly – master, model, msdb, and tempdb. It almost rolls off your tongue as smoothly as Tenaha, Timpson, Bobo, and Blair (if you don’t know that reference, click here for some interesting Texas folklore). I’ve been working with SQL Server since 4.21, and the system db names have not changed from release to release. But what did change drastically over all those releases are the contents – especially for msdb. Have you taken a look lately? I did, and here is what I found.

Lots of new tables, lots of new views, and lots of new roles – well, just lots of new objects period! I always investigate what’s changed in master with every release, but it seems I’ve been delinquent in thoroughly examining msdb changes. It appears every new feature being added to the SQL Server database engine is being “managed” from the msdb database.
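If you want to take that look yourself, a rough count only takes a minute. Here’s a quick sketch using Invoke-Sqlcmd from the SQL Server PowerShell snap-in (the instance name is just a placeholder):

# Count the objects in msdb by type – run against any instance you like.
Invoke-Sqlcmd -ServerInstance "localhost" -Database "msdb" -Query @"
SELECT type_desc, COUNT(*) AS object_count
FROM sys.objects
GROUP BY type_desc
ORDER BY object_count DESC;
"@ | Format-Table -AutoSize

Compare the counts between, say, a SQL 2005 instance and a SQL 2008 R2 instance and you’ll see just how much has been added.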

The objects supporting SQLAgent have always been in msdb – jobs, operators, alerts – along with the objects supporting database backup and recovery sets. Then DTS came along, which later morphed into SSIS – guess where those objects are stored?  Maintenance plans, log shipping, and database mirroring also join the party in msdb. And, oh yeah – SQL Mail, which became Database Mail…

Now with SQL 2008 and R2, you can add to the list – CMS (Central Management Server), Data Collector, PBM (Policy Based Management), DAC (Data Tier Applications), UCP (Utility Control Point)… but “Wait!” you say. “Don’t the Data Collector and UCP have their own databases?”  Well, yes, they do – typically MDW and UMDW respectively. But, those databases only hold the collection results – the management and configuration of those features is all contained in msdb.

Here’s what the SQL Server 2008 R2 BOL says about msdb:

The msdb database is used by SQL Server Agent for scheduling alerts and jobs and by other features such as Service Broker and Database Mail.

After seeing this, I had to laugh – I totally missed Service Broker in my list, but then I’ve never implemented it. On the other hand, look at all the features using msdb that BOL just lumped into “other features”!  Bottom line – the BOL description really sells msdb short on its importance to so many features.  AND, the default recovery model for msdb is Simple. Before SQL 2008, I would have said that a daily full backup was typically sufficient. You didn’t usually make so many changes to jobs that you couldn’t easily recreate a day’s work if you lost it restoring to the prior day. And, best practice said to take a backup after making modifications which impacted it anyway… but now, with so many features tied to msdb – do you really know when you are making changes to msdb? Considering all that is now dependent upon msdb – is your backup and restore strategy for msdb adequate?
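If you’re not sure where msdb stands on your instances today, a quick check along these lines shows its current recovery model and most recent full backup (again via Invoke-Sqlcmd; the instance name is a placeholder):

# Show msdb's recovery model and its most recent full backup date.
Invoke-Sqlcmd -ServerInstance "localhost" -Database "master" -Query @"
SELECT d.name,
       d.recovery_model_desc,
       MAX(b.backup_finish_date) AS last_full_backup
FROM sys.databases AS d
LEFT JOIN msdb.dbo.backupset AS b
       ON b.database_name = d.name
      AND b.type = 'D'          -- 'D' = full database backup
WHERE d.name = 'msdb'
GROUP BY d.name, d.recovery_model_desc;
"@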

However, before I go examine my msdb recovery strategy, I’m thinking of opening a Connect suggestion – rename msdb to KitchenSinkDB. Do you think I could get some votes for that? 😉

Taking Advantage of the CMS to Collect Configuration Data

In an earlier blog entry, I showed you how you could build and populate your CMS groups to use with SSMS to run multi-server queries.  Okay, so now you are probably thinking of lots of configuration-type info which you’d like to collect periodically from your instances.  Obviously, you don’t want to have to do this manually in an SSMS Query Editor window all the time – you want to automate it, right?  You can probably guess where I’m headed next… Yep – PowerShell!
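For example, just grabbing the version of every instance registered in a CMS group and writing it to a file takes only a handful of lines. Here’s a rough sketch of the idea – the CMS instance name, group name, and output path are placeholders, so substitute your own (and, as always, test first):

# Load SMO (the old-school way, which works on PowerShell v2).
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null

$cmsInstance = "MyCMSServer"             # instance hosting your CMS
$groupName   = "Production"              # CMS group to loop through
$outFile     = "C:\Temp\SQLVersions.txt"

# Get the instances registered in the CMS group from the CMS instance's msdb.
$query = @"
SELECT s.server_name
FROM msdb.dbo.sysmanagement_shared_registered_servers AS s
JOIN msdb.dbo.sysmanagement_shared_server_groups AS g
  ON s.server_group_id = g.server_group_id
WHERE g.name = '$groupName';
"@
$servers = Invoke-Sqlcmd -ServerInstance $cmsInstance -Database "msdb" -Query $query

# Connect to each registered instance with SMO and write its version to the file.
foreach ($s in $servers) {
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server($s.server_name)
    "$($s.server_name)`t$($srv.VersionString)" | Out-File -FilePath $outFile -Append
}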

OK, so that is interesting – you have the SQL Server instance version results for all of the instances registered in the defined CMS group in a file on your hard drive. But what you really want to do is collect several different properties about your server instances and store that data so that you can query (report on) it later.

For this case, let’s assume that you have a database called DBAdmin on the same instance as your CMS – this is also your “repository instance”.  In the DBAdmin database create the following table:
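Something along these lines works as a starting point – the table name dbo.InstanceConfig and the column list are just one possibility; they simply mirror the properties collected by the script further down, so adjust them to suit what you want to capture:

# Repository table for instance-level configuration data (run once against DBAdmin).
Invoke-Sqlcmd -ServerInstance $cmsInstance -Database "DBAdmin" -Query @"
CREATE TABLE dbo.InstanceConfig (
    ServerName       sysname      NOT NULL,
    VersionString    varchar(32)  NULL,
    Edition          varchar(64)  NULL,
    ProductLevel     varchar(32)  NULL,
    Platform         varchar(32)  NULL,
    Processors       int          NULL,
    PhysicalMemoryMB int          NULL,
    IsClustered      bit          NULL,
    WebXPsEnabled    int          NULL,
    CollectionDate   datetime     NOT NULL DEFAULT (GETDATE())
);
"@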

You do need to be aware, in multi-versioned environments, that some properties are not available in all versions.  Thus you’ll need to be able to handle null values returned by the SMO properties.  For these properties, you will want to collect the value in a variable prior to building the insert statement, as shown by the $WebXPs variable in the sample code below.

Now, loop through the CMS group as before and collect the instance level information using SMO and store the results in the repository.
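The loop might look something like this – it reuses $servers and $cmsInstance from the earlier snippet and the dbo.InstanceConfig table shown above, and the exact list of SMO properties is just an example of what you might collect:

# Sketch: loop through the CMS group, pull instance-level properties via SMO,
# and insert one row per instance into the DBAdmin repository.
foreach ($s in $servers) {
    $srv = New-Object Microsoft.SqlServer.Management.Smo.Server($s.server_name)

    # Version-dependent properties can come back as null, so grab them into a
    # variable first and substitute a marker value if necessary.
    $WebXPs = $srv.Configuration.WebXPsEnabled.RunValue
    if ($WebXPs -eq $null) { $WebXPs = -1 }   # -1 = not applicable on this version

    $insert = @"
INSERT INTO dbo.InstanceConfig
    (ServerName, VersionString, Edition, ProductLevel, Platform,
     Processors, PhysicalMemoryMB, IsClustered, WebXPsEnabled)
VALUES
    ('$($s.server_name)', '$($srv.VersionString)', '$($srv.Edition)',
     '$($srv.ProductLevel)', '$($srv.Platform)', $($srv.Processors),
     $($srv.PhysicalMemory), $([int]$srv.IsClustered), $WebXPs);
"@
    Invoke-Sqlcmd -ServerInstance $cmsInstance -Database "DBAdmin" -Query $insert
}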

Hopefully, you see the possibilities for extending beyond the properties shown here.

So now all that remains is to save your code into a PowerShell script and schedule it to run from your repository instance’s SQLAgent to collect your SQL Server configuration data on a regular basis. Don’t forget to decide how much historical data you want to keep and set up a step in your job to purge the repository of old data.  And finally, in case you didn’t think of this already, you’ll need to schedule the job steps running the PowerShell script to execute as an account which has the necessary access to all of the registered servers in the CMS Group.
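For the purge step, something as simple as this does the job – the 90-day retention and the dbo.InstanceConfig table are just the examples used above, so adjust them to your own retention policy:

# Purge repository rows older than the chosen retention period (90 days here).
Invoke-Sqlcmd -ServerInstance $cmsInstance -Database "DBAdmin" -Query @"
DELETE FROM dbo.InstanceConfig
WHERE CollectionDate < DATEADD(DAY, -90, GETDATE());
"@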

So, why use CMS Groups for this task?  I think this gives you more flexibility in setting the scope of your collection scripts without requiring you to change any code within those scripts once you’ve finalized them.  The only thing you need to manage is the CMS Groups, which can be manual or automated or both, as discussed in my blog entry Managing the Enterprise with CMS.  It allows you to use multiple repositories which run the same scripts but against different CMS groups (for example regionally or by customer).

Finally, this is sample code to demonstrate what is possible. It should work, but use at your own risk and test, test, test in your own environment!

PASS Community Summit 2010 Recap

I’m a little slow on writing up my recap of my 12th PASS Community Summit, but I guess “better late than never”.  As usual, there were several highlights and one major lowlight.  Highlights included a record number of registrations – it was amazing how many people were in line to check in on Sunday night when the booths opened.  This truly speaks to the value this conference provides to the SQL Server professional community.  Over the past 12 years I have periodically had the opportunity to attend an additional conference during the year, but I continue to find that the best value for my training time and dollars is the PASS Community Summit.

So, I’ll go ahead and get my biggest disappointment of the conference out of the way – the Tina Turner impersonator at the opening.  I think the “Simply the Best” theme was good, but the choice of execution for the opening of the conference did not present a very professional image of PASS, especially considering the strong support for WIT (Women In Technology) that exists in the community.  I attended the Board of Directors Q&A later in the week, and I believe that additional scrutiny will be applied in the future as to how “performances” may come across to the diversity in our community (not just differences in gender, but also cultural differences in a global community).  OK – enough said about that subject.

The technical highlight from the keynotes (which are available in full here, including the WIT luncheon) was obviously the CTP1 announcement and demos of “Denali” aka “SQL Next” aka “SQL 11”.  The demo by Amir Netz of project “Crescent”, which provides data visualization, really got everyone pumped up.  In case you missed it, check the keynote video from Wednesday; an additional video from the Reporting Services Dev Team is also here.  Attendees were provided with a CTP1 DVD of Denali, and it is now also available for download.  Interestingly, it is available in both 32-bit and 64-bit versions. I don’t know about you, but I was really counting on this version being 64-bit only.  If you only want to start reading up on it, then the Denali BOL is also available here.

The Thursday keynote by Dr. David Dewitt just reminded me once more of why I fell in love with database technology.  His presentation on query optimization almost made me want to go back to school again!  I love this stuff!  You can find his slide deck on the Microsoft Jim Gray Research Lab’s Facebook Links tab.  How cool is that? But, seriously, you have to watch the keynote video, too.

There are so many great speakers and presentation topics to attend in person that you really do need to get the DVD set of the conference to have the chance to hear the ones where you couldn’t be in two (or even three or four) places at one time.  Or for that matter, to re-watch ones that you did attend!  If you were not able to attend PASS at all, then you can also now order the Summit 2010 DVDs here. If you did attend the Summit, but didn’t order the DVDs and want to now, then go here to email or call Shannon Cunningham to purchase at a reduced rate.  

Over the years, I’ve learned to change my plans for session attendance on the fly.  This year was no exception. Many of the Microsoft sessions which were to be about “future” features could not actually be promoted as such until Denali was announced in Tuesday’s keynote.  Thus, I found myself immediately changing session plans for the very first session following the keynote in order to attend Paul Mestemaker’s and Bob Ward’s session on “Atlanta” – a new configuration assessment service from the cloud.  Basically, Atlanta is a checklist of best practices/configurations as compiled by Microsoft’s SQL Server CSS team.  It is planned to be refreshed as needed and based in the cloud.  The goal is to help you prevent problems (not detect them), so it is complementary to SCOM (or other real-time monitoring tools).  It will require an agent to be deployed on your server to collect data and, of course, would require internet access, so it remains to be seen how useful it will be for production environments, as most I’m aware of do not allow their SQL Servers to have internet access.  You don’t have to wait for Denali to try out Atlanta, but it does require Windows 2008+ and SQL 2008+. You can try it out today at https://www.microsoftatlanta.com/. If you get the DVDs, then this was session DBA-226M.

I’ll just quickly mention two other sessions I attended where I picked up nuggets to be researched further.  The first was Bill Ramos’ DBA-450M session entitled “Advanced Data Collection & Reporting with the MDW” (or something to that effect).  If you are doing Data Collection, then Bill has published several very useful report definitions to his blog in an ongoing series. The second session was Joe Yong’s DBA-353M “Upcoming SQL Server Upgrade Planning”. The key learnings from this session: watch out for increased lockdown in W2K8R2 (if you haven’t moved there already), and there will be a distributed replay feature available in the CTP1 for Denali to help analyze upgrade impacts.

Besides the great sessions, one of the biggest reasons for attending conferences such as the PASS Community Summit is the networking.  Multiple opportunities exist for mingling with your peers – from the Opening Night reception (with the now infamous annual Quiz Bowl event) to lunch which is also geared around networking each day (BOF – Birds of a Feather, WIT Luncheon, Regional\Local User Groups).  I was able to speak with several people from the Houston area (and some planning to move to the Houston area) about our local user group (Houston Area SQL Server User Group) and upcoming SQL Saturday 57 planned for January 29, 2011. By the way, registration is still open for attendees, speakers, and sponsors for SQL Saturday 57!

Next year’s conference will be in Seattle again, but a month earlier than usual – October 10-14.  I’m hoping to experience slightly better weather in that timeframe, although we did see a significant amount of sunshine this year.  So, start your planning now!  There is a really good rate offered through the end of the year which includes the pre-conference seminars – $1295 for the full bundle (3 days plus 2 pre-cons) until Dec. 31; $995 (3 days only) until Jan. 15 – check it out at: www.PASSSummit.com. If you need to plan your training for earlier in the year or don’t have quite the budget for the full-blown conference, then you should check out SQL Rally.  This shorter 2-day conference with a 1-day pre-con will be held in Orlando from May 11-13 – details are at: www.SQLRally.com.

And, finally, if those dates and prices don’t work for you, did I mention that there is a SQLSaturday on January 29, 2011 in Houston?  SQLSaturday events are “free” training days (usually with a nominal lunch fee). If Houston isn’t in your travel plans, then there is likely to be another event scheduled close to you – check out the main website: www.SQLSaturday.com for places and dates.