Tuesday, November 29, 2011

Indexes and PowerShell

I ran across a blog post before the Thanksgiving holiday that caught my eye. It's something I've been intending to get into recently, but just hadn't found the time. So on the slow Wednesday before the Thanksgiving holiday I took the time to dig into it and play with it on my system. It was a post by SQLFool (Michelle Ufford | blog | twitter) on how she deals with indexes. She basically has a proc that shows her missing indexes. She also uses Kimberly Tripp's (blog | twitter) sp_HelpIndex2 proc to dig into the actual table that says it's missing indexes. The results of these two sets of information are manually processed, and decisions are made about indexes.

Simple enough. Armed with knowledge of how your systems are being accessed, the information these two procs provide, and a little time, you too can make informed decisions about your indexing needs.

I then coupled this information with a script from Pinal Dave (blog | twitter) that looks at unused indexes. I tweaked both these scripts from Pinal and Michelle to match the results I was looking for. Small tweaks, mind you. The result was two queries I could use to look into a particular database and see at a glance all missing and unused indexes. Using Kimberly's proc to review the table for existing indexes, and combining this with the knowledge I had of our systems, I made choices and either added or removed indexes at will. I felt powerful. The power of the DBA coursing through my veins...

< side note >
I want to take this chance to thank these folks for their work, their sharing, and their knowledge. It's nice to stand on their shoulders, grab solutions that work, tweak them to fit my needs, my system, and my topology, and devise a workable solution.
Thank You!!
< /side note >

After digging into several databases on several servers, I ended up in a particular database last night, just before I went home. While I was looking at this db, one of our database developers called me with a complaint about a slow-running report he was working on. After locating the spid and digging around to see what I could determine, we concluded that it was indeed processing, it was indeed working, just very slowly. It had been running for an hour and a half so far when he called me. My first thought was that I had caused the slowness with the index work I was performing. After disclosing this to him, and determining that it was not the case, we kept looking. At one point I decided to stop, remembering that I was just about to dig into this very database and look at its missing and unused indexes. I informed him we should do that now, instead of looking further.

He killed the report and we watched it complete. He could tell that it had processed somewhat, but was not finished yet. I then looked for missing indexes and, with his help, identified likely candidates that would affect his query in particular. We added a few indexes to these tables, along with two other tables. I think in total we added only 6 indexes to this database. We then removed several indexes that were unused. After this work I suggested he kick off the report again. Almost immediately we received an alert from SQLMonitor indicating that we had high processor utilization. This lasted for 3 minutes, and levels returned to normal. The report finished in 4 minutes. This report, remember, had previously been approaching 2 hours, and now completed in 4 minutes. We executed it again, and monitored again. Same results. Yeah!! Database developer is happy. I am happy. I go home.

The next day I realized that this is valuable information and could help me historically, if I could collect it. I set out to do so, and if you are still reading, you will not be disappointed. I succeeded, or I wouldn't be writing this blog about it; I'd still be working on it...

I converted the two queries I was using into PowerShell variables, and I now execute them against each database I want, on each database server I choose. I collect this data and store it in a staging table in my DatabaseMonitoring database. After it has all been processed, I call a proc that moves the collected data from the staging table into a historical table, adding an identity ID and a datetime stamp. Now I know when the execution occurred, along with the information about the missing and unused indexes, per server, per database, per table, etc.
I did not get fancy with the PowerShell and do any looping or anything like that. The DBs that I look into are hard-coded in the script. Someday, maybe, I can get back to the script and make it purty. But for now, I'm done. The code is in a job that fires off periodically and collects this information historically. Now I just have to look at it periodically as well.
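The collection step described above can be sketched roughly like this. This is a minimal sketch, not my actual script: the server and database lists, the staging table name (IndexFindings_Staging), the monitoring server name, and the proc name (usp_ProcessIndexFindings) are all placeholders, and it assumes Invoke-Sqlcmd is available from the SQL Server snap-in.

```powershell
# Rough sketch of the collection job. All object names here are
# placeholders -- substitute your own servers, databases, and proc.

$servers   = @('Server1', 'Server2')        # hard-coded lists, as described above
$databases = @('SalesDB', 'ReportingDB')

# $missingIndexQuery and $unusedIndexQuery hold the two tweaked queries
foreach ($server in $servers) {
    foreach ($db in $databases) {
        foreach ($query in @($missingIndexQuery, $unusedIndexQuery)) {
            Invoke-Sqlcmd -ServerInstance $server -Database $db -Query $query |
                ForEach-Object {
                    # Write each finding into the central staging table
                    $insert = "INSERT INTO dbo.IndexFindings_Staging
                               (ServerName, DatabaseName, TableName, Finding)
                               VALUES ('$server', '$db', '$($_.TableName)', '$($_.Finding)')"
                    Invoke-Sqlcmd -ServerInstance 'MonitorServer' `
                                  -Database 'DatabaseMonitoring' -Query $insert
                }
        }
    }
}

# Move staging rows into the historical table
# (the proc adds the identity ID and datetime stamp)
Invoke-Sqlcmd -ServerInstance 'MonitorServer' -Database 'DatabaseMonitoring' `
              -Query 'EXEC dbo.usp_ProcessIndexFindings'
```

The job step simply invokes this script on a schedule, so the staging table is transient and the historical table accumulates one timestamped batch per run.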

I want to take some time soon to pull this accumulated data into a report and add it to my 'Monitor Everything' report that sends me daily email statuses for all these processes I have added to our monitoring topology. Soon. After a few projects are complete, I will return.

Tuesday, November 15, 2011

a little powershell will do me good

I had an issue today and needed to report the OS and SQL Server versions of a few machines to support. I was unsure of the OS versions of these machines. After asking my team for any documentation on said machines, or what OS versions they were currently running, I thought... PowerShell can probably do this for me. I whipped out my google-fu, and a few minutes later I had a small working script I could execute against a box to determine the pieces of information I desired.


get-wmiobject Win32_OperatingSystem -comp Server1 `
| select __Server,Caption,ServicePackMajorVersion

get-wmiobject Win32_OperatingSystem -comp Server2 `
| select __Server,Caption,ServicePackMajorVersion

Here is a sample of the PowerShell script I used. It returns 3 columns: the server name, the OS version, and the service pack. A simple little table of information.

Obviously you could extend this script to loop over a list of servers. You could augment it to do many things. But for me, it was a simple solution to determining the OS version so that I could report it to the support staff I was dealing with. And what's better, I didn't have to bother any other humans for the needed information.
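For example, looping over a list of server names read from a file might look like this (servers.txt, holding one server name per line, is an assumed file name):

```powershell
# Read server names from servers.txt (one per line) and report
# the name, OS caption, and service pack level for each machine.
Get-Content .\servers.txt |
    ForEach-Object {
        Get-WmiObject Win32_OperatingSystem -ComputerName $_ |
            Select-Object __Server, Caption, ServicePackMajorVersion
    }
```

Same three columns as before, but now one row per server in the file instead of one hard-coded call per machine.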

Monday, November 14, 2011

What have I done?

I thought this was interesting when I compiled it recently for a consulting company that I am working with (SQLSolutionsGroup). It's basically a list of the speaking engagements I have performed, as well as articles written. I enjoyed putting it together, as it represents rows of data that indicate an activity I performed on a certain date. It's a table of data. And since I had never really sat down to write it down, to form the data, I thought it was interesting. I plan on keeping track of this and adding to it.

So, look, if you care to. This is mainly for me. I enjoy it.

Speaking Engagements

User Groups

Utah User Groups June 2005 Reporting Services

Utah User Groups August 2005 Profiler / Trace

Utah User Groups February 2006 Sarbanes-Oxley

Utah User Groups January 2008 Query Processing

Salt Lake City User Group 2004-2008 various topics

Utah County User Group 2006-2008 various topics

Code Camps

Salt Lake City Code Camp October 2006 Profiler / Trace

Salt Lake City Code Camp April 2008 Query Processing

Salt Lake City Code Camp November 2008 Query Processing


Microsoft 2006 Analysis Services

ACS 2006 Sarbanes-Oxley

PASS SIG 2007 Some topic

SQL Server World Wide User Group Virtual Conference

SSWUG vConference April 2009 Auditing your users

SSWUG vConference April 2009 Best Practices

SSWUG vConference April 2009 Job Execution System

SSWUG vConference April 2009 Monitor Index Fragmentation

SSWUG vConference April 2010 Configuring and Managing GoldenGate

SSWUG vConference April 2010 Database Switch

SSWUG vConference April 2010 SQL Server Profiler and Tracing

SSWUG vConference October 2010 Database Change Process

SSWUG vConference October 2010 On Call Duties

SSWUG vConference October 2010 Professional Development

SSWUG vConference October 2010 SQL Server Logical Query Processing

SSWUG vConference April 2011 Data File Sizes

SSWUG vConference April 2011 Documentation – you know you love it

SSWUG vConference April 2011 Server Profiler and Tracing, a “How To”

SSWUG vConference April 2011 Stand-alone DBA daily/weekly/monthly

SQL Saturday

SQL Saturday 54, SLC October 2010 Database Switch

SQL Saturday 54, SLC October 2010 SQL Server Profiler and Tracing

SQL Saturday 66, Colorado February 2011 Documentation – you know you love it

SQL Saturday 94, SLC October 2011 Data File Sizes

SQL Saturday 94, SLC October 2011 Documentation – you know you love it

SQL Saturday 97, Austin October 2011 Data File Sizes

SQL Saturday 104, Colorado Springs January 2012 Utility Belt Pre-Con


Articles

From the Real World: On Call Again – 04 Apr 2009

On-Call Duties – 27 Oct 2008

Audit Database Changes in the Real World – 11 June 2008

A Simple DR Solution – 31 July 2008

Job Execution System – 27 Mar 2008

A Failed Jobs Monitoring System – 4 Sept 2009


Can Twitter Help your Professional Development? - 16 Dec 2008


Monitoring Index Fragmentation - 14 Nov 2008

So, you find yourself On-Call - 22 Aug 2008

Automate Audit Requests - 26 May 2008

Audit Data Modifications - 15 Apr 2008

Monday, November 07, 2011

Meme Monday: SQL Family

Tom LaRock’s (blog|twitter) question this month: What does #sqlfamily mean to me?

When I think of Family, I think of my parents, my brothers and sister. I think back to the beginning, when I was younger and we all hung out together. I remember the fun things we did together. I recall the times that they helped me or that I helped them. I know that there were bad times. But I also know and remember the good times. If I haven't seen my brother for years, when I do, it always begins with an embrace, a recounting of stories, a telling of my current history. We recall the good things. And push out the bad things, because after all, we are Family. That trumps all. Family.

Now as I think about my #SQLFamily, I think the same things. Some of the members of this family have been around for a long time, and there is a long history of good memories to draw from. Let's recount those. Let's relish those. Let's repeat those by helping each other out when we need it most. When I wander around the virtual house we all inhabit late at night, worried about something, I know that one of my brothers or sisters will be there; they will see my hurt, and they will ask what they can do to help out. Just like my family would do. When I have known a member of my #SQLFamily only virtually, for months or even years in the intertubes we occupy, our relationship can still be formed and solidified even though no physical presence exists between us. But on that fateful day when we actually meet, you can bet there will be a joyous embrace and a retelling of the stories that made us better for having interacted. I can recall almost each and every instance when I finally met a member of my extended #SQLFamily for the first time and felt that tug of kinship, that knowingness of shared experiences, that pull of another family member with whom you have forged a relationship. I can feel that urge to help and be helped, and to better ourselves through our interactions.

Just like a family that loves each other, regardless of the goings-on, regardless of the silliness we all experience from time to time, our Family and #SQLFamily will always be there for us. We will be there for them. And they for us. Always.

This is what Family means to me. This is what #SQLFamily means to me.