Central Management Servers to Evaluate Policies – “All about being in control”


Policy Based Management (PBM) was introduced in SQL Server 2008. This feature helps Database Administrators create policies and ensure that their environment is in compliance with those policies.

Let's take a simple example: the recovery model. Using PBM, a DBA can enforce a policy that checks the recovery model, so any database created with a recovery model other than Full can be caught during evaluation, or the setting can even be enforced outright.
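For context, the kind of manual check such a policy automates looks roughly like the sketch below (sys.databases exposes the recovery model on SQL Server 2005 and later):

-- A minimal sketch: list databases whose recovery model is not FULL
SELECT name, recovery_model_desc
FROM sys.databases
WHERE recovery_model_desc <> N'FULL';

PBM lets you express the same check once as a condition and then evaluate or enforce it, rather than running ad-hoc queries on every server.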

What if a DBA is running an environment with 100 production servers and wants to evaluate a policy across all of them?

The short answer is that we can use Central Management Servers to import policies and evaluate them.

Demo

1. Using SSMS I created a policy called ‘Check_Autoclose_Policy_Enterprise’ which uses the condition ‘Check_AutoClose_Condition_Enterprise’.

You can refer to the BOL topic [here] to understand how to create a policy and a condition.

This policy simply means that I have set a condition with an expected value of ‘False’ for the AutoClose property, and a policy has been created for that condition.

If I evaluate that policy against my servers, it will check the condition for each and every database and report any deviations. If AutoClose = True, then I am notified.

2. Next, I will export this policy. The export action creates an XML file, which I will store on a local drive.

3. I will now use Central Management Servers (available under Registered Servers). For the purpose of this demo I have already added 2 instances of SQL 2012 RC 0 to Central Management Servers, as shown below.

 

4. Using the root-level Central Management Server, we can import the policy which I had created.

5. Once the import action is completed, the policy is available across both instances.

6. For the purpose of the demo I created two similar databases, one on Instance B and one on Instance C. The name of the database is Test_PBM.

For Instance B – AutoClose is set to False

For Instance C – AutoClose is set to True
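To reproduce the two settings above, the property can be flipped with a one-line ALTER DATABASE on each instance (a sketch; run each statement on the respective instance):

-- On Instance B (compliant with the policy)
ALTER DATABASE [Test_PBM] SET AUTO_CLOSE OFF;

-- On Instance C (deliberately violating the policy)
ALTER DATABASE [Test_PBM] SET AUTO_CLOSE ON;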

7. Using the root-level Central Management Server, we can now evaluate the policy.

8. Once evaluation is completed, we are provided with a clear explanation of how the evaluation went and which databases were not in compliance with our policy.

The policy was created with the condition that the Auto Close property must be False for all databases, and after evaluation we found one deviation: our Test_PBM database residing on Instance C.

 

Conclusion – The power of PBM and Central Management Servers can be combined for great control.

Have you deployed anything like this in your environment? I am interested to know your scenarios.

Thanks for reading.

Installed SQL Server features discovery report


Have you ever faced a situation where you had to double-check which features you deployed for your SQL Server environment? Oh! Did I select the Full-Text Search feature during installation?

There are a number of ways you can double-check which features are installed:

1. Check the services running under SQL Server Configuration Manager (a T-SQL alternative is sketched after this list)

2. Check the install folders for logs (for example, C:\Program Files\Microsoft SQL Server\110\Setup Bootstrap\Log), which give details about the installed features.
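For point 1, newer builds (SQL Server 2008 R2 SP1 and later) also expose a DMV that lists the SQL Server services installed on the box; a minimal sketch, assuming the DMV exists on your build:

-- Lists the SQL Server related services on this machine
SELECT servicename, startup_type_desc, status_desc
FROM sys.dm_server_services;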

Think about getting a report of all this! Wow, that's a cool thing, isn't it? This is exactly what the ‘Installed SQL Server features discovery report’ wizard does for you.

You need to run SQL Server setup to launch this wizard, and it is available under the Tools section of the setup page which we have seen many times -

We just need to run this option and it will provide us with a very structured report covering all the features that were installed on the server.

I was amazed to learn that a couple of features (for example, LocalDB) were present on my machine.

The contents of the report are actually pulled from the summary.txt file which is created after the install is completed (C:\Program Files\Microsoft SQL Server\110\Setup Bootstrap\Log\summary.txt).

So, if you just need the data, it's easier to look at that file rather than running setup.exe. However, the report looks neat and structured.

Thanks for reading.

Running Performance Dashboard Reports for SQL 2008 R2 and SQL 2012 RC 0


Today I decided to leverage the Performance Dashboard reports for my SQL Server 2008 R2 instance and my SQL 2012 RC 0 instances.

SQL 2008 R2 RTM

I downloaded the msi file and ran setup.sql against the SQL 2008 R2 RTM instance. Straight away I was hit by this error:

Msg 207, Level 16, State 1, Procedure usp_Main_GetCPUHistory, Line 6
Invalid column name ‘cpu_ticks_in_ms’.
Msg 15151, Level 16, State 1, Line 1
Cannot find the object ‘usp_Main_GetCPUHistory’, because it does not exist or you do not have permission.

The error occurs because there is no column ‘cpu_ticks_in_ms’ in the DMV sys.dm_os_sys_info on this build, hence I changed the column name in Setup.sql to ms_ticks.
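If you want to confirm which columns your build exposes before editing Setup.sql, a quick check along these lines helps (a minimal sketch; the column list of sys.dm_os_sys_info varies between versions):

-- Should return one row if ms_ticks exists on this build
SELECT cpu_ticks, ms_ticks
FROM sys.dm_os_sys_info;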

Rob Carrol has a blog post which talks about the same issue, and you can read it here.

Once the column was changed, the script ran without any trouble and the report worked as expected.

SQL 2012 RC 0

The same testing was done for SQL 2012 RC 0 and I was amazed by the results. The RDL file worked perfectly for SQL 2012 too, after making the same change to setup.sql as mentioned above.

Here is a quick preview of the report which came straight out of SQL 2012 instance.

The Performance Dashboard tool gives you a good amount of information, and you can always build your own customized reports on top of it for better control.

I would like to hear from you if you have built any of these solutions for your environment.

Thanks for reading.

Deleted the TUF file!!! Boy, that’s trouble


Just 2 days back I wrote a post on TUF files related to log shipping. You can read the post here.

Today we will see what is going to happen if someone deleted the TUF file accidentally or by any chance it got missed.

I tried to simulate this on my test machine which had log shipping configured. Below are the steps which I followed -

1. Deleted the TUF file which was available in the secondary server.

2. The delete operation was successful.

3. Checked the log shipping status and found that the health was ‘Good’ (a quick T-SQL check for this is sketched after this list).

4. Both primary and secondary databases were synced and both had the same set of data, row by row, column by column.
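For step 3, the log shipping monitor tables in msdb are one quick way to check the status from T-SQL; a minimal sketch, assuming the default log shipping monitoring objects are in place:

-- On the secondary: last copied/restored file details per log shipped database
SELECT *
FROM msdb.dbo.log_shipping_monitor_secondary;

-- On the primary: last backup details for the log shipped database
SELECT *
FROM msdb.dbo.log_shipping_monitor_primary;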

Note - Ideally, deleting the TUF file should also cause issues for subsequent log restores on the secondary; however, my simulation did not exhibit that behavior.

All looks good, and you might be thinking that deleting a TUF file is harmless and not going to hurt much!!!

Now, let’s assume that we lost our primary database server due to a memory burn (short circuit) and we are in need of the secondary database.

The RTO and RPO requirements are quite relaxed and we are allowed to bring the secondary database up within 30 minutes. Walk in the park, right? We just have to bring the database up; the users/jobs/other objects are already taken care of, and only the database needs to come online.

Let’s write this simple one-liner in T-SQL to bring our database up.

RESTORE DATABASE [XenDevDS] WITH RECOVERY

XenDevDS is my test database available on the secondary server; its primary copy was residing on the server which just went for a trip (memory burn!).

As soon as we execute this command with a big smile, assuming that the database will come up, we get this message -

Msg 3013, Level 16, State 1, Line 1
RESTORE DATABASE is terminating abnormally.
Msg 3441, Level 17, State 1, Line 1
During startup of warm standby database ‘XenDevDS’ (database ID 7), its standby file (‘C:\Program Files\Microsoft SQL Server\MSSQL11.SERVER2012B\MSSQL\DATA\XenDevDS_20120112191505.tuf’) was inaccessible to the RESTORE statement. The operating system error was ‘2(The system cannot find the file specified.)’. Diagnose the operating system error, correct the problem, and retry startup.

What does it mean? It simply means that you have done a good job of deleting the TUF file, and now please bring it back.

The TUF file is required for the standby database to recover, and we will not be able to bring the database up without it.
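A quick way to see what state the secondary is in is to query sys.databases (a minimal sketch, using the demo database name):

-- Shows whether the database is online, restoring, or sitting in standby
SELECT name, state_desc, is_in_standby
FROM sys.databases
WHERE name = N'XenDevDS';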

As the simulation was in a very controlled environment, I brought back the TUF file and ran the restore command once again.

RESTORE DATABASE [XenDevDS] WITH RECOVERY

RESTORE DATABASE successfully processed 0 pages in 0.908 seconds (0.000 MB/sec).

The database was recovered and was accepting new connections.

Conclusion – The TUF file is a very important part of recovering a standby database, and we have to educate the server ops team, or anyone responsible for cleaning up files, to make sure it is left untouched.

Do you have any way to recover a standby database on a log shipping secondary without the TUF file? If yes, then please share your experience in the comments section of this post.

Thanks for reading.

SQL Beautifier


Today I came across some tools which offer a smart way of formatting your SQL code. Who doesn't like code which is well formatted and easy to read?

I remember my college days when I used to write multiple lines of code for my final semester project and manually format the code for printing and ease of understanding.

I would have scored better grades if I had thought about this tool long back!!! There are many tools out there; however, the one which I liked best is -

http://www.dpriver.com/pp/sqlformat.htm?ref=wangz.sqlformat.htm

You can get beautiful, well formatted code like this:

BEGIN TRAN

DECLARE @endDate DATETIME

SET @endDate = Dateadd(hh, 1, Getdate()) -- 1 hour from now

WHILE Getdate() < @endDate
BEGIN
INSERT INTO [Test]
VALUES      (1)

WAITFOR delay '00:00:02';
END

Hope you all liked this tool and happy coding!

<Update: Added 12/10/2012>

Today I found another tool which is pretty impressive; it's http://sqlformat.appspot.com/

</Update>

TUF File – Not a very famous member, but does his job pretty well!


I have seen various questions related to TUF files, and one of the discussions was interesting; it went something like this -

<Start>

John  – I don’t understand why we need this TUF file in SQL Server, what does it do? I have been looking around for more information, but seems there is no great information around the same.

Kim – Are you talking about .TRN files?

John – No, I am talking about .TUF files. Trust me it’s there!

Kim – Oh, then I am missing something. Let me check that out.

</End of discussion>

So what is this TUF file all about?

I was also not very sure what the TUF file deals with; however, after some research I was able to understand the concept of TUF files and decided to write this post.

A TUF file, or Transaction Undo File, is created when log shipping to a server in Standby mode. This file contains information about the transactions that were not yet committed at the time the log backup was taken.

This file matters in the Standby mode of log shipping, where you can access the secondary database for reads. Database recovery is done in standby mode each time a log backup is restored.

While restoring the log backup, uncommitted transactions are recorded in the undo file and only committed transactions are written to disk, thereby allowing users to read the database. When we restore the next T-log backup, SQL Server fetches the uncommitted transactions from the undo file and checks against the new T-log backup whether they have since committed. If a transaction is committed, it is written to disk; otherwise it stays in the undo file until it is committed or rolled back.
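This is also visible in the RESTORE syntax itself: when a log backup is restored in standby mode, the undo (TUF) file path is supplied through the STANDBY option. The log shipping restore job handles this for you; the statement below is only a manual sketch with hypothetical database and file names:

-- Manual sketch of a standby restore; database name and paths are hypothetical
RESTORE LOG [MyLogShippedDB]
FROM DISK = N'C:\LSCopy\MyLogShippedDB_tlog.trn'
WITH STANDBY = N'C:\LSCopy\MyLogShippedDB_undo.tuf';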

A small graphical representation of the above statement is shown below -

I configured log shipping to test TUF file and created a scenario like below -

1. Created a primary database.

2. Configured log shipping to another Instance within the same box.

3. Backup, Copy and Restore to happen every 15 minutes.

4. Continuously inserted data to the primary database to simulate TUF creation.

5. I was able to find the TUF file created under the same path where I had placed my system database files.

There seem to have been changes to the path where the TUF file is found. It is available in the root data folder, as mentioned above, for SQL Server 2008 and later, and used to be in the LS_Copy folder for earlier versions.

 

 

Coming up next - What happens when I delete this file? So please stay tuned my friends.

Thanks for reading.