Now I have to do this for multiple databases. Repeat those steps multiple times. I don't like repeating things, unless they are fun things. This is not a fun thing. I kinda want a portal of sorts that I can click and get the answer I want. This is not that solution.
I think about PowerShell. Hum... I could have a PS script that pulls the table counts and deposits them into a table. If I do this for each side of replication, for each database, I will have a simple table of results, and a simple SQL query will then let me show only those records that are not equal, meaning replication is off on those tables. Hum... this is promising...
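Before getting to the script, here is a minimal sketch of what that central results table could look like. The column names match what the collection query below returns; the schema and data types are my own assumption:

```sql
-- Hypothetical layout for the central results table.
-- Columns mirror what the collection script returns.
CREATE TABLE dbo.ReplicationReviewResults
(
    ServerName   sysname NOT NULL,  -- publisher or subscriber instance
    DatabaseName sysname NOT NULL,
    [Name]       sysname NOT NULL,  -- table name
    [Count]      bigint  NOT NULL   -- row count from sysindexes.rowcnt
);
```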
I easily create the script, which looks like the following:
```powershell
$DESTINATION = "DBSERVER"

# Row counts for the tables I care about, taken from sysindexes
$query = "SELECT
    @@ServerName AS [ServerName],
    DB_NAME() AS [DatabaseName],
    o.name AS [Name],
    i.rowcnt AS [Count]
FROM sysobjects o
JOIN sysindexes i ON i.id = o.id AND i.indid IN (0, 1)
WHERE o.xtype = 'U'
  AND o.name IN ('Table1', 'Table2')
ORDER BY i.rowcnt DESC"

# Publisher side: clear the previous results for this server/database, re-collect, and store
Invoke-Sqlcmd -Query "DELETE FROM ReplicationReviewResults WHERE ServerName = 'DBSERVER' AND DatabaseName = 'DBNAME'" -ServerInstance $DESTINATION -Database "DATABASEMONITORING"
$Data = Invoke-Sqlcmd -Query $query -ServerInstance "DBSERVER" -Database "DBNAME"
Write-DataTable -ServerInstance $DESTINATION -Database "DATABASEMONITORING" -TableName ReplicationReviewResults -Data $Data

# Subscriber side: same thing against the other side of replication
Invoke-Sqlcmd -Query "DELETE FROM ReplicationReviewResults WHERE ServerName = 'DBSERVER2' AND DatabaseName = 'DBNAME'" -ServerInstance $DESTINATION -Database "DATABASEMONITORING"
$Data = Invoke-Sqlcmd -Query $query -ServerInstance "DBSERVER2" -Database "DBNAME"
Write-DataTable -ServerInstance $DESTINATION -Database "DATABASEMONITORING" -TableName ReplicationReviewResults -Data $Data
```
This I repeat as many times as necessary; in my case, three times, with three different DBNames.
All this I shove into a PS script. I test it out and it works fine. It removes the data from the last execution for a particular DBNAME and DBSERVER, then collects it again and deposits it into a central database on some db server. Simple. As long as connectivity exists, PS can execute, the users exist, and all that type of config is in place, I am able to execute this and populate data. Woohoo... Closer.
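With both sides of replication landing in the same table, the "show me what doesn't match" query is simple. Here is one way it could look, as a sketch: it assumes the server names used in the collection script and a self-join on database and table name.

```sql
-- Tables whose row counts differ between the two sides of replication.
-- Server names are the ones from the collection script; adjust as needed.
SELECT
    pub.DatabaseName,
    pub.[Name]                AS TableName,
    pub.[Count]               AS PublisherCount,
    sub.[Count]               AS SubscriberCount,
    pub.[Count] - sub.[Count] AS [Difference]
FROM dbo.ReplicationReviewResults pub
JOIN dbo.ReplicationReviewResults sub
    ON  sub.DatabaseName = pub.DatabaseName
    AND sub.[Name]       = pub.[Name]
WHERE pub.ServerName = 'DBSERVER'
  AND sub.ServerName = 'DBSERVER2'
  AND pub.[Count] <> sub.[Count]
ORDER BY ABS(pub.[Count] - sub.[Count]) DESC;
```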
So now as I think about the portal idea... I think of Reporting Services. A report could easily display this data, summed or detailed, straight from the ReplicationReviewResults table. Easy. However, that data won't be accurate. It will be latent, only as fresh as the last time I ran the PS script mentioned above. So... hum... What if I put that PS script into a SQL Server job? Could that work? I do have other PS scripts executing with jobs... that should work.
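For reference, wrapping the PowerShell script in a SQL Agent job can be done along these lines. The job name, script path, and the choice to call a script file are all placeholders of my own:

```sql
-- Sketch of a SQL Agent job with a PowerShell job step.
USE msdb;
GO

EXEC dbo.sp_add_job
    @job_name = N'Collect Replication Row Counts',
    @enabled  = 1;

EXEC dbo.sp_add_jobstep
    @job_name  = N'Collect Replication Row Counts',
    @step_name = N'Run collection script',
    @subsystem = N'PowerShell',
    @command   = N'& "C:\Scripts\CollectReplicationCounts.ps1"';  -- placeholder path

EXEC dbo.sp_add_jobserver
    @job_name    = N'Collect Replication Row Counts',
    @server_name = N'(local)';
GO
```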
After testing the job and seeing success, I now have a way for my data to be populated. All I need to do is incorporate this into a report with a step to call the job execution, then maybe wait a bit for it to finish, then display the data. Yeah!! However, the user that the reports run under doesn't have a lot of rights, and running jobs isn't one of them. He is a simple datareader...
So, after playing with granting execute rights on sp_start_job and even SELECT on sysjobs, I'm still stuck. Duh...
I realize that this user needs Agent rights and recall that there are roles in msdb having to do with Agent jobs. I add this user to the [SQLAgentOperatorRole] role, and they can in fact now execute a job. Woohoo. So I am now armed with a PS script embedded in a job that can be executed by the user that reports run under. I create a proc embedding the job-start code along with the query to produce results, and then I create a report that calls this proc. Voila! I now have a portal that I can refresh; it retrieves new data and displays that data to me.
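Roughly, the role membership and the proc could look something like the sketch below. The report user name, the proc name, the job name, and the fixed delay are my own placeholders, and the final SELECT is the mismatch query from earlier:

```sql
-- Let the reporting user start Agent jobs (user name is a placeholder).
USE msdb;
GO
EXEC sp_addrolemember N'SQLAgentOperatorRole', N'ReportUser';
GO

USE DATABASEMONITORING;
GO
CREATE PROCEDURE dbo.usp_RefreshReplicationReview
AS
BEGIN
    SET NOCOUNT ON;

    -- Kick off the collection job, then give it a moment to finish.
    EXEC msdb.dbo.sp_start_job @job_name = N'Collect Replication Row Counts';
    WAITFOR DELAY '00:00:30';

    -- Return the tables whose counts don't match (same query as earlier).
    SELECT
        pub.DatabaseName,
        pub.[Name]  AS TableName,
        pub.[Count] AS PublisherCount,
        sub.[Count] AS SubscriberCount
    FROM dbo.ReplicationReviewResults pub
    JOIN dbo.ReplicationReviewResults sub
        ON  sub.DatabaseName = pub.DatabaseName
        AND sub.[Name]       = pub.[Name]
    WHERE pub.ServerName = 'DBSERVER'
      AND sub.ServerName = 'DBSERVER2'
      AND pub.[Count] <> sub.[Count];
END;
GO
```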
Replication status updates as simple as hitting F5. No more repetitive tasks taking me minutes of my precious day...