Tuesday, June 12, 2012

T-SQL Tuesday #31 – Logging

Today’s T-SQL Tuesday, hosted by Aaron Nelson (@sqlvariant | sqlvariant.com), has the theme Logging.



First, a little background on what my company does. We teach English to the children of the world. That's our motto and our goal. We do this through an interactive program that kids run at school now, at home soon, and elsewhere. The program assesses them, places them within the product, and then teaches them English. Those are the basics of the product.

So, there is a kid out there somewhere who is logging into a computer, connecting to a database server and a media server via a browser and our application. The app feeds this child activities, records their actions and responses, and keeps track of where they are, where they need to be, and so on. All this data gets stored locally on servers at the schools or districts. Periodically, this information is sent back to our corporate offices, where it is compiled into actionable information to help with sales, support, enhancements, and so on.

It's a lot of distinct data. I can see a particular child's information and responses. I can see a report on an entire district. I can see what versions of operating systems are being used in the field. I can see how much RAM is being used by some machines. I can see a lot of information. And it comes to us in various forms.
Some of this information is sent back to us using the Microsoft Sync Framework, while other data is put into a proprietary format, zipped up, and sent to us via web services. Still other data reaches our home systems in a variety of ways. And we need to know what it's doing, how it's doing, and that it's still doing it.

This is where my logging tale comes in. We log a lot of the goings-on into our various databases.

I keep track of the data files that are sent to us via web services. There is a job that runs every 15 minutes, goes to a few particular directories, counts the number of files within each directory, and records the min and max dates of the files contained within. This information tells me the tale of a process. Is the process running? Is it collecting similar amounts to what it did last week? Is the service that delivers the data files ON while the service that processes them is OFF, so that the size and quantity keep growing? Are any of the services in a hung state, preventing typical processing? All of this can be determined by the collection and interpretation of this logged data. All day long it's logging what it can collect, and humans and T-SQL decipher it into reports that help tell the tale of what's going on.
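The plumbing behind that is nothing fancy. Here is a minimal sketch of the idea; the table, columns, paths, and numbers are illustrative, not our actual schema. The job writes one row per folder per run, and T-SQL reads the trend back out:

```sql
-- Illustrative logging table for the file-drop watcher (names are hypothetical)
CREATE TABLE dbo.FileDropLog
(
    LogID       int IDENTITY(1,1) PRIMARY KEY,
    FolderPath  nvarchar(260) NOT NULL,
    FileCount   int           NOT NULL,
    OldestFile  datetime      NULL,
    NewestFile  datetime      NULL,
    CollectedAt datetime      NOT NULL DEFAULT (GETDATE())
);

-- Every 15 minutes the job inserts one row per watched folder, e.g.:
INSERT INTO dbo.FileDropLog (FolderPath, FileCount, OldestFile, NewestFile)
VALUES (N'\\server\drops\district01', 42, '20120612 07:05', '20120612 08:50');

-- Reporting side: is today's hourly volume in line with the same day last week?
SELECT  DATEPART(hour, CollectedAt) AS HourOfDay,
        AVG(CASE WHEN CAST(CollectedAt AS date) = CAST(GETDATE() AS date)
                 THEN FileCount END) AS AvgToday,
        AVG(CASE WHEN CAST(CollectedAt AS date) = CAST(DATEADD(day, -7, GETDATE()) AS date)
                 THEN FileCount END) AS AvgSameDayLastWeek
FROM    dbo.FileDropLog
WHERE   CollectedAt >= DATEADD(day, -8, GETDATE())
GROUP BY DATEPART(hour, CollectedAt)
ORDER BY HourOfDay;
```

The point isn't the particular query; it's that a dumb little table of counts and dates, collected on a schedule, is enough to answer every one of those questions.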

A service that was recently discovered to be OFF apparently didn't have sufficient logging around it. This service collects error log files and processes them. For some reason it was OFF. Nothing was logging its progress, and we were caught unaware that it had been OFF for an extended period of time. We need to implement several things to log its processing and functionality: many layers of logging. For example, if something were watching the folders where the log files are deposited, it could tell me that the quantity of files is increasing. If we had a process to determine whether the service was functioning, that would be helpful. If we monitored whether the server these services reside upon was up and running, that would be helpful as well. I am sure there are other things we could implement to log the goings-on of this process. And sometimes more than one is necessary to ensure proper coverage.
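For the folder-watching layer, at least, the same kind of logging table from the sketch above gets you an alert cheaply. Something like this (again, the names, path, and time window are hypothetical) would have caught the dead service: if the backlog only grows between samples, nothing is draining the folder.

```sql
-- Assumes the dbo.FileDropLog table from the earlier sketch; names are hypothetical
DECLARE @Folder  nvarchar(260) = N'\\server\drops\errorlogs';
DECLARE @Latest  int, @Earlier int;

-- Most recent sample for the error-log drop folder
SELECT TOP (1) @Latest = FileCount
FROM   dbo.FileDropLog
WHERE  FolderPath = @Folder
ORDER BY CollectedAt DESC;

-- Sample from roughly four hours ago
SELECT TOP (1) @Earlier = FileCount
FROM   dbo.FileDropLog
WHERE  FolderPath = @Folder
  AND  CollectedAt <= DATEADD(hour, -4, GETDATE())
ORDER BY CollectedAt DESC;

-- Files keep arriving but nothing is consuming them: raise a flag
IF @Latest > @Earlier
    RAISERROR (N'Error-log folder backlog is growing; check the processing service.', 16, 1);
```

That's only one layer, of course; a check on the service itself and on the host it runs on would sit alongside it.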

Another simple logging process I have watches the number of unreplicated commands for several native replication publications. This tells me the tale of how my replication is processing. However, it can be misleading unless I also know the historical average of replication commands we typically process on a given day. And all of this is for naught if the actual replication jobs are not functioning, so I also log whether the jobs are enabled and running. If they are not, I get an alert so that it can be remedied.
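Stripped down, those two checks are just a couple of queries against the distribution database and msdb. This is a simplified sketch; the real jobs write the results into a logging table and compare them against the historical averages mentioned above:

```sql
-- 1) Undelivered commands waiting in the distribution database, per agent
USE distribution;
SELECT  agent_id,
        SUM(UndelivCmdsInDistDB) AS PendingCommands
FROM    dbo.MSdistribution_status
GROUP BY agent_id
ORDER BY PendingCommands DESC;

-- 2) Replication agent jobs that are disabled (these feed the alert)
SELECT  j.name AS JobName,
        c.name AS Category
FROM    msdb.dbo.sysjobs AS j
JOIN    msdb.dbo.syscategories AS c
        ON c.category_id = j.category_id
WHERE   c.name LIKE N'REPL-%'
  AND   j.enabled = 0;
```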

All told, I have spent countless hours enabling and creating logging processes that simply sit and watch another process and save the data they collect. Something else will collect that data and report on it, letting me know that all is well. With this knowledge, I know that we are collecting the information, the valuable information that can help sales drive to their goals, that can arm support with data about problems in the field, and that can ease my mind that internal processes are functioning as expected. Logging is my best friend; it makes my day-to-day job easier, empowering me to see beyond the current fire, over the horizon, to the next potential problems and head them off at the pass. Logging makes me a better Data Professional.

So, remember that what I do is all for the kids. Proper logging of my databases, their attendant processes, and their data makes kids like this


not like this poor kid, who is obviously suffering because of my lack of logging.


