Quick alert for servers that are not compliant with policies set using PBM


Last month I wrote about leveraging policy based management for regular DBA compliance checks. You can read that post here.

Policy Based Management (PBM) is indeed one of my favorite features. I normally like to play around exploring more and more of it, and I eventually end up learning something new (Yay!).

Today’s post is based on one such surprise learning – a small but very useful feature.

PBM currently lacks flexible reporting capabilities, like sending an automated email to the DBA every week stating the compliance status, or a reporting dashboard for the DBA to constantly monitor the details. I hope Microsoft is working on these enhancements, and we can expect some good news when the next major bits of SQLServer are out.

Note – The Enterprise Policy Management Framework has great enterprise-wide policy compliance reporting capabilities, and I’m still exploring this great tool. It is a CodePlex project, and you can explore it at http://epmframework.codeplex.com/

So here is my little story – when I walk into the office on a Monday morning, I want to know which servers have compliance issues. To start off, I’m interested in checking my super critical business servers. I used to connect to each of those servers and do a View History (under Management -> right-click Policy Management) to get the details of compliance issues.

Note – I evaluate my policies on a schedule.
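If you’d rather not click through View History on every server, the same details can be pulled with T-SQL from the PBM history tables in msdb. Here is a minimal sketch; treating result = 0 as a failed evaluation is my reading of the view, so verify it on your build –

--List recent failed policy evaluations (run against msdb)
SELECT p.name AS policy_name,
       h.start_date,
       h.exception_message
FROM msdb.dbo.syspolicy_policy_execution_history AS h
JOIN msdb.dbo.syspolicy_policies AS p
     ON h.policy_id = p.policy_id
WHERE h.result = 0 --0 = evaluation failed (verify on your build)
ORDER BY h.start_date DESC;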

Recently I observed a quick alert when I connected to a server from SSMS, which told me that there was a policy compliance issue and that I needed to act on it.

The alert looks like this –

PBM Visual Alert

There was indeed a policy violation, which was picked up by a schedule that ran over the weekend. On Monday morning, when I connected to that box, SSMS reported the policy violation to me via the small (but very helpful) graphical alert shown above.

Once we fix the compliance issues, the alert is removed.

PBM Visual Alert2

Conclusion

These simple but useful features are indeed helpful for DBAs to keep better control over compliance issues, and I’m really looking forward to more and more features and enhancements for PBM from the Microsoft SQLServer team.

Thanks for reading and keep watching this space for more.


Leverage Policy Based Management for regular Page Verify = Checksum compliance monitoring


Page Verify = Checksum is one option with which I like all my servers to be compliant. By default, all newly created databases will have page verify = checksum, as this option is inherited from the model database.

Note – Applicable from SQL2005 and above.

SELECT name,page_verify_option_desc FROM sys.databases
WHERE database_id=3

name      page_verify_option_desc
======    =======================
model     CHECKSUM

So far so good. But what will happen if we restore a database from SQL 2000, or restore a database that has a page verify option other than checksum?

Let’s check that out –

For demo purposes, Server A already has a database named Check_PageVerify which has its page_verify option set to NONE (bad idea!).

****Server A****

--Check the page_verify option of the db
SELECT name,page_verify_option_desc FROM sys.databases
WHERE database_id=6

name                page_verify_option_desc
================    =======================
Check_PageVerify    NONE

We will back this database up and restore it on Server B.

****Server A****

--Backup the database
BACKUP DATABASE Check_PageVerify
TO DISK='I:\Check_PageVerify.bak'

****Server B****

--RESTORE the database
RESTORE DATABASE Check_PageVerify
FROM DISK='I:\Check_PageVerify.bak'
WITH MOVE 'Check_PageVerify' TO 'D:\Check_PageVerify.mdf',
MOVE 'Check_PageVerify_log' TO 'D:\Check_PageVerify_log.LDF'

“Data file and log file on the same drive! (Bad idea.) However, as this is a demo, we can ignore it.”

On Server B, let’s check the page_verify option.

****Server B****

--Check the page_verify option of the db
SELECT name,page_verify_option_desc FROM sys.databases
WHERE database_id=6

name                page_verify_option_desc
================    =======================
Check_PageVerify    NONE

I started this post by saying Page Verify = Checksum is one option with which I like all my servers to be compliant. After this restore, my server is no longer compliant for page verify, and I will never come to know about this miss unless I regularly do compliance checks.
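As a quick ad-hoc check, a simple query against sys.databases will list every database that is out of line; this is a sketch of the manual approach which the rest of this post automates –

--Ad-hoc check: list all databases not using CHECKSUM page verification
SELECT name, page_verify_option_desc
FROM sys.databases
WHERE page_verify_option_desc <> 'CHECKSUM';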

So how do we do those compliance checks? This is where Policy Based Management (PBM) comes into the picture.
We will need to create a condition first using the wizard.

PBM PageVerify1

I’m using the facet ‘Database Options’ for this demo and added the expression @PageVerify = CheckSum.

Now, let’s create a policy for this condition.

PBM PageVerify2

When creating the policy, we will need to choose the condition which we created earlier. The evaluation modes for this policy are of 3 types –

1. On demand
2. On schedule
3. On change: log only

To know more about PBM concepts, please refer to http://technet.microsoft.com/en-us/library/bb510667.aspx

We are picking On schedule for this demo. This option needs a schedule to be created. We will create a schedule which runs at 12 AM every Sunday.

PBM PageVerify3

Once the schedule is created, we need to ensure that the policy is enabled.

PBM PageVerify4

The idea is to have the schedule run every Sunday to check if any databases violate the page verify = checksum rule; on Monday morning when we come to work, we just need to check the policy history to see if there are any violations.

For demo purposes, let’s trigger the schedule manually.

PBM PageVerify5

This is the job which was created automatically for the schedule which we created earlier.

After running the job successfully, all we need to do is check the history of the policy.

PBM PageVerify6

There you go, we got the first violation details of the policy

PBM PageVerify7

It clearly says that database Check_PageVerify is not compliant with the policy which we created.

If you have multiple databases which are not compliant, you also have the option to evaluate the policy manually, choose the ones which are not compliant, and force them to be compliant.

PBM PageVerify8

PBM PageVerify9

PBM PageVerify10

PBM PageVerify11

Great, now we have everything under control and all the databases are compliant.
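If you prefer T-SQL over the evaluate dialog, the same fix is a one-line ALTER DATABASE per offending database; a minimal sketch using the demo database from above –

--Force the demo database back into compliance
ALTER DATABASE Check_PageVerify SET PAGE_VERIFY CHECKSUM;

--Confirm the change
SELECT name, page_verify_option_desc
FROM sys.databases
WHERE name = 'Check_PageVerify';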

Conclusion

There are great things we can do with PBM, and if you have a better way to achieve this using PBM (alternate facets, conditions, etc.) or any other method, I’m eager to hear from you.

Keep watching this space for more.

Thanks for reading.

DBCC Memorystatus in SQL2012 – A quick observation on the Process/System Counts node


This post is not a detailed explanation of DBCC Memorystatus for SQL2012, but a quick observation I made recently about the Process/System counters node.

This is one area to look at in the DBCC Memorystatus output to get a quick snapshot of the current state of memory and understand if there is memory pressure.

In SQL 2012 you are provided with this information, i.e. the Process/System counters, as soon as you run DBCC Memorystatus. This is the first set of data which is returned by the DBCC command.

DBCCMem1

I really liked the idea of getting information like system physical memory high/low and process physical/virtual memory low right away when I run the DBCC command, as this is information I look at all the time during the initial analysis phase.

In SQL 2008 R2 you still have this data, but you will need to scroll all the way down to get it. Still good; at least we have this info for a quick reference.
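As a side note, if you’d rather not scroll through the DBCC output at all, similar low/high memory signals are exposed through DMVs on SQL 2008 and above; a minimal sketch –

--Process-level memory state (SQL 2008 and above)
SELECT physical_memory_in_use_kb,
       process_physical_memory_low,
       process_virtual_memory_low
FROM sys.dm_os_process_memory;

--System-level memory state
SELECT total_physical_memory_kb,
       available_physical_memory_kb,
       system_memory_state_desc
FROM sys.dm_os_sys_memory;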

In SQL2005 you don’t have this info at all. Yes, things were not so easy in the early days!

Conclusion

The Process/System counters section is very useful for a first look at memory pressure. Keep watching this space for more.

Thanks for reading.

SQLAgent job related reports and an out-of-the-box real-time job monitoring tool!


Let’s start the New Year with a very simple yet useful topic.

I am a big fan of the out-of-the-box reporting capabilities of SSMS. Some of the reports are really useful.

One of the reports which I really like is the Top Jobs report. This one can be pulled by navigating as shown below –

Job Monitoring 1

The report comes out really neat, with a graph and three major pieces of information related to SQL Agent jobs.

Job Monitoring 2

The ones which I’m really interested in all the time are the 20 Most Frequently Failing Jobs and the 20 Slowest Jobs.

This information is very important for me, as I can trend the job information over a period of time (even though it’s for a week’s time) for the most business-critical servers.

Job Monitoring 3

The reports provide a lot of useful information, such as Avg Run Duration, Total Executions, etc., which will help you observe how jobs behave under normal circumstances; you can even use this as a baseline. Pretty neat stuff.

We also have the Job Steps Execution History report, which provides information about the various steps within a job. Again, a useful one if you have a lot of steps configured for the same job.

Now let’s look at a scenario – your server is hitting 100% CPU, and you believe that a heavy SQL job running at that point in time might be the reason for the spike.

How do you check which SQL job is running at that very point in time? Well, there are multiple ways to accomplish this. A T-SQL query which joins a couple of job-related MSDB tables can quickly pull this information for you.
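Here is one possible version of that query, a sketch assuming the standard SQL Agent tables in msdb; the syssessions filter limits the output to the current Agent session –

--Jobs that are running right now
SELECT j.name AS job_name,
       ja.start_execution_date
FROM msdb.dbo.sysjobactivity AS ja
JOIN msdb.dbo.sysjobs AS j
     ON ja.job_id = j.job_id
WHERE ja.start_execution_date IS NOT NULL
  AND ja.stop_execution_date IS NULL
  AND ja.session_id = (SELECT TOP 1 session_id
                       FROM msdb.dbo.syssessions
                       ORDER BY agent_start_date DESC);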

But what if there is a lightweight tool sitting right inside SSMS which can pull that information for you? Let’s look at that right away.

The Job Activity Monitor option will help you monitor SQL jobs in real time.

Job Monitoring 4

If you need to know which jobs are executing right now, you will need to use the filter option and apply the required status.

Job Monitoring 5

Job Monitoring 6

Conclusion

There are multiple ways to monitor your jobs and their statistics; however, the methods mentioned above are simple and efficient ways to get a quick view of SQLAgent jobs and related information.

Next Gen File System ReFS (Resilient File System) and SQLServer!


“ReFS” – this term was absolutely new to me when I was reading the Microsoft white paper ‘Storage Windows Server 2012’.

ReFS is the next gen file system from Microsoft, and available in Windows Server 2012.

Here are some of the key ReFS benefits and capabilities –

1. Robust disk updating – Avoids problems associated with power failures during disk updates.

2. Data integrity – Ensures detection of all forms of disk corruption, using checksums.

3. Availability – Ensures that the whole volume never goes down when there is a corruption.

4. Scalability – ReFS is highly scalable, which is a great benefit considering that data demands keep growing higher and higher.

5. Proactive error identification – Scans the volume for errors.

For the complete list and detailed explanation, refer to the below mentioned white papers and blog posts –

Optimized Storage Efficiency with Windows Server 2012

Windows Server 2012 – Storage

Building the next generation file system for Windows: ReFS


Being a SQLServer professional, my next focus was to understand how ReFS is going to benefit my SQLServer configurations, and to my surprise there was no real good news!

Denny Cherry (B/T) has a great blog post on why SQLServer will have issues if run on ReFS. I would encourage you to read that post now.

This is mainly because some of the NTFS features, such as named streams and sparse files, are not supported in ReFS. SQL Server uses these features for DBCC CHECKDB and database snapshots.

Without named streams, DBCC CHECKDB will have to lock objects, and that is not a desirable situation from anyone’s perspective.
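In practice, when CHECKDB cannot create its internal snapshot, it has to be run with table locks instead; a minimal sketch of that fallback, using a hypothetical database name –

--When an internal snapshot cannot be created (e.g. on ReFS),
--CHECKDB must take locks instead of using a snapshot
DBCC CHECKDB (N'MyDatabase') WITH TABLOCK;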

I personally would like to test these situations and see what scenarios come up. I’m setting up an environment for this very purpose and am looking forward to coming back with more details.

Conclusion –

If you are deciding to leverage ReFS for SQLServer deployments, you need to think twice and hold off on that decision.

Backing up the “log” of a database which is in the Simple recovery model – Watch out for this!


In theory we cannot back up the log of a database which is in the Simple recovery model; however, there is a small bug in SQL 2012 which allows you to do this, and you need to be very careful about it.

Below are the detailed repro steps for this issue. The testing was done using SQL 2012 RTM bits.

1. Change the recovery model of the model database to Simple.

2. Create a new database.

3. The newly created database will have its recovery model set to Simple, as it inherits the properties of the model database.
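The setup for steps 1 to 3 in T-SQL, assuming Test_LogBackup as the database name used in the rest of the repro –

--Step 1: change the recovery model of model to Simple
ALTER DATABASE model SET RECOVERY SIMPLE;

--Step 2: create a new database (inherits Simple from model)
CREATE DATABASE Test_LogBackup;

--Step 3: confirm the inherited recovery model
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'Test_LogBackup';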

4. Backup the newly created database.

BACKUP DATABASE Test_LogBackup
TO DISK='I:\Test_LogBackup_Full.bak'
WITH STATS=1

Processed 264 pages for database 'Test_LogBackup', file 'Test_LogBackup' on file 1.
100 percent processed.
Processed 2 pages for database 'Test_LogBackup', file 'Test_LogBackup_log' on file 1.
BACKUP DATABASE successfully processed 266 pages in 0.570 seconds (3.632 MB/sec).

5. Now try to back up the log of this database which is in the Simple recovery model.

BACKUP LOG Test_LogBackup
TO DISK='I:\Test_LogBackup_Tran.trn'
WITH STATS=1

Processed 3 pages for database 'Test_LogBackup', file 'Test_LogBackup_log' on file 1.
BACKUP LOG successfully processed 3 pages in 0.003 seconds (5.371 MB/sec).

This behavior is not expected, and it was reported by one of the users (@nayanraval) on Twitter. After multiple successful repros, the user was advised to open a case with the Microsoft PSS group for a quick resolution. A Connect item is also open for you to sign in and vote on.

https://connect.microsoft.com/SQLServer/feedback/details/765315/when-model-db-is-set-to-simple-recovery-newly-created-databases-are-not-truely-in-simple-recovery

I will keep watching this connect item and will update this post once it’s resolved or we get a response from Microsoft on this behavior.

Thanks for reading.

Cannot assign a default value to a local variable!


You will hit the error “Cannot assign a default value to a local variable” if you are running SQL 2005 or below and you try to declare a variable and assign it a value in one statement.

Something like  –

DECLARE @Var Varchar(15) = 'Test'

Msg 139, Level 15, State 1, Line 0
Cannot assign a default value to a local variable

Variable declaration has been enhanced in SQL2008 and above, and I realized this today when I was deploying code across multiple systems.

The same declaration mentioned above will work great on 2008 and above.

We also have the flexibility to use just one DECLARE statement to declare multiple variables –

DECLARE 
@Var Varchar(15) = 'Test',
@Char Varchar(10) = 'Test2',
@Char2 Varchar(10) = 'Test3'

These are indeed “feel good to have” enhancements!

[Update Added on 10/24/202]

In SQL2005 or below, you will need to declare the variable first and then assign it a value using a SET statement.

Example –

DECLARE @Var Varchar(15) 
SET @Var = 'Test'

[Update]

Thanks for reading.

You don’t want to miss this: serious SQLServer training – the 24 Hours of PASS event!


Back-to-back quality SQLServer sessions! Yes, you heard it right, and that’s what 24 Hours of PASS is all about.

The event kicks off at 12:00 GMT on Sep 20 and runs for 24 consecutive hours.

The sessions are selected carefully, and are categorized into 6 tracks –

Enterprise Database Administration (DBA)

Application Development (AppDev)

BI Information Delivery (BID)

BI Platform Architecture, Development & Administration (BIA)

Cloud Application Development & Deployment (CLD)

Professional Development (PD)

Complete details of sessions are available at http://www.sqlpass.org/24hours/fall2012/SessionsbyTrack.aspx

I’m looking forward to attending some great sessions next week, and my favorite picks are –

Characteristics of a Great Relational Database by Louis Davidson

Digging Into the Plan Cache  by Jason Strate

Three Ways to Identify Slow Running Queries by Grant Fritchey

Fasten Your Seatbelt – Troubleshooting the Most Difficult SQL Server Problems  by Klaus Aschenbrenner

SQL Server Private Cloud != Azure by Allan Hirt and Ben DeBow

What’s All the Buzz about Hadoop and Hive? by Cindy Gross

DBCC, Statistics, and You by Erin Stellato

Best Practices for Upgrading to SQL Server 2012 by Robert Davis

PowerShell 101 for the SQL Server DBA by Allen White

Using SQL Server 2012 Always On by Denny Cherry

Leadership – Winning Influence in IT Teams by Kevin Kline

I’m really excited! Are you? Register here.

Thanks for reading.

SSMS Report Error – Index (zero based) must be greater than or equal to zero and less than the size of the argument list!


SSMS reports are one of the best ways to pull a great deal of server-level or database-level information within SQLServer.

Today I was running the ‘Disk Usage’ report (this one is my favorite) for one of the databases and immediately got this error!

Error – Index (zero based) must be greater than or equal to zero and less than the size of argument list!

Well, that’s one really cool error message to get when you are seriously working on something important!

Immediately I checked to see if the same report was causing issues for the other databases available on the same instance.

The report was working great for all databases except the one which gave the error.

All four ‘Disk Usage’ reports were having issues for that particular database, so I decided to compare this database with one for which the report was running fine.

Eventually the culprit was found. The database had compatibility level 80 (SQL 2000) and was residing on a SQL Server 2005 SP4 instance. The SSMS I used to run the reports was version 10.50.2500 (2008 R2 SP1).

There are multiple T-SQL statements which are run to pull this report, and these can be tracked by running a SQL Profiler trace (again, with caution in a production environment).

Changing the compatibility level to 90 fixed the issue, and the reports ran as expected.

Note – Changing the compatibility level needs to be reviewed properly with the application owners, as it can bring unexpected results. Proper testing and approvals are needed.

Conclusion –

Yet another reason to find all the databases running at lower compatibility levels, push application owners to fix their code wherever it’s required, and move forward with the latest compatibility levels.
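Here is a quick sketch of both halves of that task – finding the stragglers and bumping one up. MyOldDatabase is a hypothetical name; note that the ALTER DATABASE syntax applies to SQL 2008 and above, while SQL 2005 uses sp_dbcmptlevel.

--Find databases still running below compatibility level 90
SELECT name, compatibility_level
FROM sys.databases
WHERE compatibility_level < 90;

--Bump one up (SQL 2008 and above)
ALTER DATABASE MyOldDatabase SET COMPATIBILITY_LEVEL = 90;

--SQL 2005 equivalent
EXEC sp_dbcmptlevel 'MyOldDatabase', 90;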

With SQL Server 2012, compatibility level 80 is no longer supported, hence you won’t face this type of compatibility-related issue anymore.

How many SQL Servers do you run within your environment? – MAP could be your best bet to answer this question!


A blog post about the Microsoft Assessment and Planning Toolkit has been on my list for quite a while, and today I decided to write about it.

Let’s assume that you are working for a pretty large organization and there are a lot of database servers running within the environment.

One fine day, the company’s CTO or even the DBA manager walks up to your desk, taps your shoulder, and asks this question –

“How many SQL Servers are we running here?”

In case you are maintaining a proper repository after building each and every server, the answer can be given fast.

If you don’t have any repository and you don’t maintain a registered server list, the question can be as good as a direct punch!

A smart DBA is one who has the right set of tools to get the right information at the right time.

We have a great tool called the Microsoft Assessment and Planning Toolkit which does a great job of providing you sufficient information to answer the above question and much more. There are a lot of features available in this toolkit, and we will try to cover a few of them.

A very recent copy of MAP was published last month, and the current version is 7.0. The toolkit can be downloaded from here.

Normally I prefer to run this tool on my laptop or work machine to pull information/reports; I never install it on a server.

In case you are running the old Windows XP OS on your laptop or work machine, there is bad news. The latest version will work only with the below mentioned operating systems –

  • Windows 7 with Service Pack 1 (Professional, Enterprise, and Ultimate editions only)
  • Windows Vista with Service Pack 2 (Business, Enterprise, and Ultimate editions only)
  • Windows Server 2008 R2 with Service Pack 1
  • Windows Server 2008 with Service Pack 2

Now let’s do the MAP install and see what this tool is all about.

Once the setup is downloaded, we can run it to start the install process. The setup will do some initial checks, like disk free space availability, and will give you this screen.

The summary clearly states that the setup will install SQL Server Express LocalDB along with the tool. Later on we will see why SQL Express is required for the tool.

The install is pretty fast, and once the setup is completed, the first launch screen looks like the one below.

The initial setup screen will prompt us to create a database.

In this walk-through, we will name the database ServerInventory. Clicking OK will create the database for us.

Once the database is created, the initial screen will show us the details of steps to be completed.

We will now do the Perform Inventory step. Clicking GO will lead us to the below screen.

This window confirms how powerful this tool is. You can collect inventory from multiple application servers and even Oracle, MySQL, and Sybase.

We are interested in SQL Server, so we will select the option SQL Server with Database Details. Clicking Next will lead us to the screen below, which asks which method we want to use to discover the servers. For this demo we will choose a single server, hence we select the option as below.

The next two screens deal with adding a computer name and the credentials. The final action is to hit Finish, and the assessment will begin.

Once the assessment is completed, we can close this window and start exploring the reports under the Scenario Status section.

We can click the View option of the SQL Server Discovery report to get a very neatly formatted report of the instances available on the particular server.

In case you are interested only in the total Database Engine instances, the section to look at is the Microsoft SQL Server Database Engine Editions section. That will give you an accurate count.

Another report which looked really cool to me is the Software usage tracking report.

Another report which excites me a lot is the Virtual Machine Discover Results report –

Conclusion

The MAP tool has great potential, and it allows you to make crucial business decisions during consolidation efforts.

Thanks for reading.