SQL Server Data Tools is now a separate installer – Things to know!


Recently I had to develop an SSRS report, and my desktop machine had SQL Server 2014 installed on it. I had previously used SQL Server Data Tools, so I tried to search for it under Programs. To my surprise, it was missing.

I thought I might have missed adding it while installing SQL Server 2014, and re-ran the setup to add it again. There came my next surprise –

There is no option to add SQL Server Data Tools!

SSDT1

After some research I learned that Microsoft moved away from bundling Data Tools with SQL Server 2014 (for obvious reasons!), and this link provides the required bits.

Picking the right installer from the above URL was again a little confusing, but eventually I was able to download the right product after carefully checking the description.

SSDT2

Installing the tool was pretty straightforward and here are the instructions –

  1. Run setup.exe (after extracting the installer).
  2. Click on New SQL Server stand-alone installation. SSDT3
  3. Perform a new installation. SSDT4
  4. Choose SQL Server Data Tools – BI for Visual Studio 2013. SSDT5
  5. Install.
  6. Eventually you will have it in the Programs list. SSDT6

Conclusion –

Microsoft’s decision to un-bundle SSDT from the SQL Server 2014 installer is acceptable, as it gives them the opportunity to update the tool at a faster pace.

Thanks for reading, and keep watching this space for more!

 

Prevent creating stored procedures with RECOMPILE hint – Policy Based Management is your friend!


I got motivated to write this post by a question on Twitter, which went something like this –

PBM Recompile 1

The user was checking whether there was a way to prevent stored procedures from being created with the RECOMPILE hint.

I was pretty sure that Policy Based Management provides a Stored Procedure facet, and that we could build an expression on it to ensure that no new stored procedures are created with this hint.

Will I ever use the RECOMPILE hint in my stored procedures? That’s a huge ‘it depends’ question. I might not use it within a stored procedure definition, but at times I might have to use it at run time.
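To illustrate what I mean, here is a minimal sketch of asking for a recompile at run time rather than baking it into the procedure definition – the procedure, parameter, and table names are just placeholders:

-- Recompile only this one execution, without the hint in the procedure definition
EXEC dbo.SomeProcedure @SomeParameter = 1 WITH RECOMPILE;

-- Or recompile just a single statement via the query hint
SELECT *
FROM dbo.SomeTable
OPTION (RECOMPILE);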

Kendra Little (B/T) has an excellent video on recompile, and you can find it here. [Really nice stuff out there!]

I replied to the user that I was positive there would be some option, but that it needed to be tested out.

So, this post is my test case to see if we can actually prevent users from creating stored procedures with the RECOMPILE hint using PBM.

First off, I created a condition using the Stored Procedure facet and added an expression like the one shown in the image below –

PBM Recompile 2

I went ahead and created a policy using the condition created in the first step, and chose the evaluation mode On Change: Prevent.

PBM Recompile 3

Note – I’m doing this check for every database and every stored procedure as this is a test case. This can be scoped to a single database using another condition, and that’s the recommended approach.

Once the policy was created, I made sure that it was in an enabled state. [Nothing will be validated if the policy is not enabled.]
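If you prefer a quick T-SQL check over the GUI, the policy metadata lives in msdb. A minimal sketch, using the policy name that shows up in the violation message later in this post:

-- Verify the policy exists and is enabled
SELECT name, is_enabled
FROM msdb.dbo.syspolicy_policies
WHERE name = N'Policy_Prevent_Recompile';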

The next step is to try to create a stored procedure with the RECOMPILE hint. For this purpose I will be using a database called SuperMan (yes, just for fun). This database has a table called People with 3 rows of data.

PBM Recompile 4
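In case you want to reproduce the test case, here is a rough setup sketch of the database and table I used – the column names and sample rows are made up for the demo:

CREATE DATABASE SuperMan;
GO

USE SuperMan;
GO

CREATE TABLE dbo.People
(
    PersonId   INT IDENTITY(1,1) PRIMARY KEY,
    PersonName VARCHAR(50) NOT NULL
);
GO

-- 3 rows of data, as mentioned above
INSERT INTO dbo.People (PersonName)
VALUES ('Clark Kent'), ('Lois Lane'), ('Lex Luthor');
GO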

I tried to create a simple stored procedure named FindSuperMan which does a simple SELECT * from the table. This stored procedure tries to force the RECOMPILE hint.

CREATE PROCEDURE dbo.FindSuperMan 
WITH RECOMPILE
AS
SELECT * 
FROM [dbo].[People]
GO

As soon as I ran the command there was an error which said  –

Policy ‘Policy_Prevent_Recompile’ has been violated by ‘SQLSERVER:\SQL\TESTServer\DEFAULT\Databases\SuperMan\StoredProcedures\dbo.FindSuperMan’.
This transaction will be rolled back.
Policy condition: ‘@Recompile != True()’
Policy description: ”
Additional help: ” : ”
Statement: ‘CREATE PROCEDURE dbo.FindSuperMan
WITH RECOMPILE
AS
SELECT *
FROM [dbo].[People]
‘.
Msg 3609, Level 16, State 1, Procedure sp_syspolicy_dispatch_event, Line 65
The transaction ended in the trigger. The batch has been aborted.

What actually happens here is that the transaction is rolled back because it violated the policy which was defined.

Conclusion

This proves that PBM can indeed be used to prevent conditions which you don’t want to happen in your database/environment.

Thanks for reading and keep watching this space for more!

Have you configured Database Mail to utilize multiple SMTP accounts?


Let’s start discussing this subject by closely looking at the term redundancy. What is redundancy all about?

Wikipedia defines redundancy as “In engineering, redundancy is the duplication of critical components or functions of a system with the intention of increasing reliability of the system, usually in the case of a backup or fail-safe.”

Technically this is redundancy too!

Redundency

That’s a very nice and to-the-point definition, isn’t it? Well, very true.

What’s all this hype about redundancy in Database Mail? Let’s look closely at Database Mail for more details.

When you create a new database mail profile, you have an option to add multiple SMTP accounts. If an account fails when sending an email, the profile uses the next account in the priority list.

Yes, this is some level of redundancy for database mail.

SMTP Red

You can set the priority of accounts by moving them up and down.

Adding a new account is a simple process and can be done easily with the GUI.

SMTP Red2
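If you would rather script it than click through the GUI, the Database Mail stored procedures in msdb can do the same thing. A minimal sketch, assuming a profile named 'DBA Alerts' and a made-up SMTP server – the @sequence_number parameter is what controls the failover priority:

-- Add a secondary SMTP account
EXEC msdb.dbo.sysmail_add_account_sp
     @account_name    = 'DBA Alerts - Secondary',
     @email_address   = 'sqlalerts@contoso.com',
     @display_name    = 'SQL Server Alerts',
     @mailserver_name = 'smtp2.contoso.com';

-- Attach it to the profile with a lower priority (higher sequence number)
EXEC msdb.dbo.sysmail_add_profileaccount_sp
     @profile_name    = 'DBA Alerts',
     @account_name    = 'DBA Alerts - Secondary',
     @sequence_number = 2;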

The ability to configure multiple SMTP accounts comes in really handy when your enterprise infrastructure has multiple SMTP servers as part of its redundancy plan.
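To confirm which account actually delivered a given message (handy when you suspect the primary SMTP server was skipped), the Database Mail log in msdb records the account used. A rough sketch, again assuming the made-up profile and addresses from the example above:

-- Send a test message through the profile
EXEC msdb.dbo.sp_send_dbmail
     @profile_name = 'DBA Alerts',
     @recipients   = 'dba.team@contoso.com',
     @subject      = 'Database Mail failover test',
     @body         = 'Checking which SMTP account delivers this message.';

-- sent_account_id tells you which account ended up sending each item
SELECT mailitem_id, subject, sent_status, sent_account_id, sent_date
FROM msdb.dbo.sysmail_allitems
ORDER BY send_request_date DESC;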

Conclusion

In case you do have an opportunity to add multiple SMTP accounts for Database Mail, don’t think twice. It’s always good to have a backup account to ensure that you never miss that important alert/email.

Thanks for reading and keep watching this space for more.

Quick alert on servers which are not compliant for policies set using PBM


Last month I wrote about leveraging policy based management for regular DBA compliance checks. You can read that post here.

Policy Based Management (PBM) is indeed one of my favorite features. I normally like to play around with it, exploring more and more, and I eventually end up learning something new (yay!).

Today’s post is based on one of those surprise learnings – a small but very useful feature.

PBM currently lacks flexible reporting capabilities, like sending the DBA an automated weekly email stating compliance status, or even a reporting dashboard for the DBA to constantly monitor the details. I hope Microsoft is working on these enhancements, and we can expect some good news when the next major bits of SQL Server are out.

Note – The Enterprise Policy Management Framework has great enterprise-wide policy compliance reporting capabilities, and I’m still exploring this great tool. It is a CodePlex project and you can explore it at http://epmframework.codeplex.com/

So here is my little story – when I walk into the office on a Monday morning, I want to know which servers have compliance issues. To start off, I’m interested in checking my super-critical business servers. I used to connect to those servers and do a View History (under Management -> right-click Policy Management) to get the details of compliance issues.

Note – I evaluate my policies on a schedule.
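If you would rather not click through View History on every box, the same information can be pulled with T-SQL. A rough sketch, assuming the standard policy management views in msdb, that lists the most recent failed evaluations:

-- Recent policy evaluations that did not pass
SELECT p.name AS policy_name,
       h.start_date,
       h.result,
       h.exception_message
FROM msdb.dbo.syspolicy_policy_execution_history AS h
INNER JOIN msdb.dbo.syspolicy_policies AS p
        ON p.policy_id = h.policy_id
WHERE h.result = 0
ORDER BY h.start_date DESC;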

Recently I observed a quick alert when connecting to a server from SSMS, telling me that there was a policy compliance issue and I needed to act on it.

The alert looks like this –

PBM Visual Alert

There was indeed a policy violation, picked up by a schedule which ran over the weekend. On Monday morning when I connected to that box, SSMS reported the policy violation to me via the small (but very helpful) graphical alert described above.

Once we fix the compliance issues, the alert is removed.

PBM Visual Alert2

Conclusion

Simple but useful features like this help DBAs keep better control over compliance issues, and I’m really looking forward to more features and enhancements for PBM from the Microsoft SQL Server team.

Thanks for reading and keep watching this space for more.

Leverage Policy Based Management for regular Page Verify = Checksum compliance monitoring


Page Verify = Checksum is one option I like all my servers to be compliant with. By default, all new databases are created with page verify = checksum, as this option is inherited from the model database.

NOTE – APPLICABLE FROM SQL2005 AND ABOVE.

SELECT name,page_verify_option_desc FROM sys.databases
WHERE database_id=3

name      page_verify_option_desc
=====     =======================
model     CHECKSUM

So far so good. But what happens if we restore a database from SQL Server 2000, or restore a database which has a page verify option other than checksum?

Let’s check that out –

For the demo, on Server A we already have a database named Check_PageVerify which has its page_verify option set to NONE (bad idea!).

****Server A****

--Check the page_verify option of the db
SELECT name,page_verify_option_desc FROM sys.databases
WHERE database_id=6

name                  page_verify_option_desc
================      =======================
Check_PageVerify      NONE

We will back this database up and restore it on Server B.

****Server A****

--Backup the database
BACKUP DATABASE Check_PageVerify
TO DISK='I:\Check_PageVerify.bak'

****Server B****

--RESTORE the database
RESTORE DATABASE Check_PageVerify
FROM DISK='I:\Check_PageVerify.bak'
WITH MOVE 'Check_PageVerify' TO 'D:\Check_PageVerify.mdf',
MOVE 'Check_PageVerify_log' TO 'D:\Check_PageVerify_log.LDF'

Data file and log file on the same drive! (Bad idea – however, as this is a demo, we can ignore this.)

On Server B, let’s check the page_verify option –

****Server B****

--Check the page_verify option of the db
SELECT name,page_verify_option_desc FROM sys.databases
WHERE database_id=6

name                  page_verify_option_desc
================      =======================
Check_PageVerify      NONE

I started this post by saying Page Verify = Checksum is one option I like all my servers to be compliant with. After this restore, my server is no longer compliant for page verify, and I will never come to know about this miss unless I regularly do compliance checks.
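Before getting to PBM, note that a quick ad-hoc check across the whole instance is a one-liner – this simply lists every database whose page verify option is not CHECKSUM:

--Find databases that are not compliant with page verify = checksum
SELECT name, page_verify_option_desc
FROM sys.databases
WHERE page_verify_option_desc <> 'CHECKSUM';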

So how do we do those compliance checks? This is where Policy Based Management (PBM) comes into the picture. We will need to create a condition first using the wizard.

PBM PageVerify1

I’m using the ‘Database Options’ facet for this demo and added the expression @PageVerify = Checksum.

Now, let’s create a policy for this condition.

PBM PageVerify2

When creating the policy we will need to choose the condition which we created earlier. The evaluation modes available for this policy are of 3 types –

  1. On demand
  2. On schedule
  3. On change: log only

To know more about PBM concepts, please refer to http://technet.microsoft.com/en-us/library/bb510667.aspx

We are picking On schedule for this demo. This option needs a schedule to be created. We will create a schedule which will run at 12 AM every Sunday.

PBM PageVerify3

Once the schedule is created, we need to ensure that the policy is enabled.

PBM PageVerify4

The idea is to have the schedule run every Sunday to check if any databases are violating the page verify = checksum rule, so that on Monday morning when we come to work we just need to check the history of the policy to see if there are any violations.

For the demo, let’s trigger the schedule to run manually.

PBM PageVerify5

This is the job which was created automatically for the schedule which we created earlier.

After running the job successfully, all we need to do is check the history of the policy –

PBM PageVerify6

There you go, we got the first violation details of the policy

PBM PageVerify7

It clearly says database Check_PageVerify is not compliant for the policy which we created.

If you have multiple databases which are not compliant, you have the option to evaluate the policy manually, choose the ones which are not compliant, and force them to be compliant.

PBM PageVerify8

PBM PageVerify9

PBM PageVerify10

PBM PageVerify11
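If you would rather fix a non-compliant database directly with T-SQL instead of (or in addition to) the evaluation dialog, the change itself is a single statement per database:

--Bring the demo database back into compliance manually
ALTER DATABASE Check_PageVerify SET PAGE_VERIFY CHECKSUM;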

Great, now we have everything under control and all the databases are compliant.

Conclusion

There are great things we can do with PBM, and if you have a better way to achieve this using PBM (alternate facets, conditions etc.) or any other method, then I’m eager to hear from you.

Keep watching this space for more.

Thanks for reading.

How many SQL Servers do you run within your environment? – MAP could be your best bet to answer this question!


A blog post about the Microsoft Assessment and Planning (MAP) Toolkit had been on my list for quite a while, and today I decided to write about it.

Let’s assume that you are working for a pretty large organization and there are a lot of database servers running within the environment.

One fine day the company’s CTO, or even the DBA manager, walks up to your desk and taps your shoulder to ask this question –

“How many SQL Servers are we running here?”

In case you maintain a proper repository after building each and every server, then the answer can be given fast.

If you don’t have any repository and you don’t maintain a registered server list, then the question can be as good as a direct punch!

A smart DBA is one who has the right set of tools to get the right information at the right time.

We have a great tool called the Microsoft Assessment and Planning (MAP) Toolkit which does a great job of providing sufficient information to answer the above question and much more. There are many features available in this toolkit, and we will try to cover a few of them.

A very recent copy of MAP was published last month, and the current version is 7.0. The toolkit can be downloaded from here.

Normally I prefer to run this tool on my laptop or work machine to pull information/reports, and I never install it on a server.

In case you are running the old Windows XP OS on your laptop or work machine, then there is bad news. The latest version will work only with the operating systems mentioned below –

  • Windows 7 with Service Pack 1 (Professional, Enterprise, and Ultimate editions only)
  • Windows Vista with Service Pack 2 (Business, Enterprise, and Ultimate editions only)
  • Windows Server 2008 R2 with Service Pack 1
  • Windows Server 2008 with Service Pack 2

Now let’s install MAP and see how cool this tool really is.

Once the setup is downloaded we can run it to start the whole install process. The setup will do some initial checks, like free disk space availability, and will give you this screen.

The summary clearly states that the setup will also install SQL Server Express LocalDB along with the tool. Later on we will see why SQL Express is required for the tool.

The install is pretty fast, and once the setup is completed the first launch screen looks like the one below.

The initial setup screen will prompt us to create a database.

In this walkthrough we will name the database ServerInventory. Clicking OK will create the database for us.

Once the database is created, the initial screen will show us the details of steps to be completed.

We will now do the Perform Inventory step. Clicking Go will lead us to the screen below.

This window confirms how powerful this tool is. You can collect inventory from multiple application servers and even Oracle, MySQL, and Sybase.

We are interested in SQL Server and will select the option SQL Server with Database Details. Clicking Next will lead us to the screen below, which asks which method we want to use to discover the servers. For this demo we will choose a single server, hence we select the option as below.

The next 2 screens deal with adding a computer name and the credentials. The final action is to hit Finish, and the assessment will begin.

Once the assessment is completed, we can close this window and start exploring the reports under the Scenario Status section.

We can click the View option of the SQL Server Discovery report to get a very neatly formatted report of the instances available on that particular server.

In case you are interested only in the total number of Database Engine instances, the section to look at is the Microsoft SQL Server Database Engine Editions section. That will give you an accurate count.

Another report which looked really cool to me is the Software usage tracking report.

Another report which excites me a lot is the Virtual Machine Discovery Results report –

Conclusion

The MAP toolkit has great potential, and it allows you to make crucial business decisions during consolidation efforts.

Thanks for reading.

Is there a way to find historical recovery model changes on a database? Yes, you are in luck!


Today there was an interesting question posted on Twitter, and that made me think about how we can play around with SQL Server features/options to find solutions for tricky questions and requirements.

There are multiple ways to find this info, and the first and foremost thing which came to my mind was to read the error logs.

1. We can use the sp_readerrorlog procedure to pull the details easily; however, these details will be gone once the error logs are recycled.

EXEC sp_readerrorlog 0, 1, 'recovery'

The above method is great when you don’t need a huge amount of historical data and are looking only at a recent, specific time frame.

2. The second option is to use the default trace to find this info; however, this is very tough, as recovery model changes are tracked under event class 164 (Object:Altered) and there is no specific info which says the recovery model was changed.

A good post by Simon Sabin (B/T) which leverages the default trace + sp_readerrorlog + sp_enumerrorlogs is available here.
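For reference, here is a rough sketch of what reading the default trace for those events can look like. It assumes the default trace is enabled (it is by default) and simply filters on event class 164; as noted above, the event only tells you the database object was altered, not specifically that the recovery model changed. The database name is the one used in the test case later in this post:

DECLARE @path NVARCHAR(260);

--Locate the current default trace file
SELECT @path = path
FROM sys.traces
WHERE is_default = 1;

--Pull Object:Altered (event class 164) events for the database in question
SELECT t.StartTime,
       t.LoginName,
       t.DatabaseName,
       t.ObjectName,
       t.EventSubClass   --DDL phase (begin/commit/rollback)
FROM sys.fn_trace_gettable(@path, DEFAULT) AS t
WHERE t.EventClass = 164
  AND t.DatabaseName = N'AdventureWorks2012'
ORDER BY t.StartTime DESC;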

In case you are using SQL Server 2008 or above and would like to track this from now on, you can also leverage PBM (Policy Based Management) for this purpose.

The idea of using PBM didn’t come to my mind early on; however, it eventually did, and I decided to test whether we can track historical info on recovery model changes.

A policy was created for all the databases, with the evaluation mode set to ‘On Schedule’.

The schedule was set for every 1 minute to ensure that multiple recovery model changes are evaluated and not missed.

The condition was also created as follows –

The condition @RecoveryModel = Full is just one value chosen for creating a condition, so that we can evaluate the policy.

Initially the database for this test case, AdventureWorks2012, was running in the Full recovery model, and I changed it to Simple, then Full, and then Bulk-Logged.

This means that I changed the recovery model 3 times, and currently it’s running in the Bulk-Logged recovery model.
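For completeness, the changes themselves are plain ALTER DATABASE statements, and sys.databases confirms the current setting – a quick sketch of what was run for this test:

ALTER DATABASE AdventureWorks2012 SET RECOVERY SIMPLE;
ALTER DATABASE AdventureWorks2012 SET RECOVERY FULL;
ALTER DATABASE AdventureWorks2012 SET RECOVERY BULK_LOGGED;

--Confirm the current recovery model
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = N'AdventureWorks2012';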

If we want to evaluate the policy manually we can do that too. If we opt for a manual evaluation, we get evaluation details which clearly say that the database AdventureWorks2012 has a different recovery model than what is mentioned in the policy.


Our policy evaluation is running in the background to ensure that manual intervention is not required and we don’t miss any recovery model changes which might happen frequently (multiple changes within 1 minute can still be missed).

Note – Running schedules every 1 minute is not really recommended and should be used only if there is a specific requirement.

The tracking of recovery model changes comes under the View History option available under Policy Management.

The first policy deviation was when the database was set to the Simple recovery model. If we click the hyperlink under the Detail section, we get the details of the actual value and the expected value.
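The same actual-versus-expected detail can also be pulled with a query instead of the history viewer. A rough sketch, assuming the standard msdb policy management views; the policy name here is a placeholder for whatever you called the recovery model policy:

SELECT d.execution_date,
       d.target_query_expression,   --the database that was evaluated
       d.result,                    --whether the target was compliant
       d.result_detail              --XML containing expected vs. actual values
FROM msdb.dbo.syspolicy_policy_execution_history_details AS d
INNER JOIN msdb.dbo.syspolicy_policy_execution_history AS h
        ON h.history_id = d.history_id
INNER JOIN msdb.dbo.syspolicy_policies AS p
        ON p.policy_id = h.policy_id
WHERE p.name = N'Track_RecoveryModel_Changes'   --placeholder policy name
ORDER BY d.execution_date DESC;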

The second policy evaluation shows a success message – that is when the recovery model was changed back to Full.

The third policy deviation was when the database was set to the Bulk-Logged recovery model.

This way we can track recovery model changes, and we can disable/delete this policy once the requirement is complete.

The total number of days to retain this history for all policies is configurable too. It would have been great if we could retain history per policy; however, that option is not available at this moment.

Conclusion 

As a hard and fast rule, I definitely won’t allow anyone to change the database recovery model option. I could enforce a policy to restrict it too; however, as the user’s requirement was to track the changes, I mentioned a couple of ways to do it.

I am interested to know if you have dealt with any similar situation. Feel free to pass on your experience in the comments.

Thanks for reading.

The compressed (zipped) folder is invalid or corrupted – not always true!


Have you ever encountered this error message when trying to extract zipped backup files using the built-in ‘Extract All’ facility of Windows?

The error message will first give you the impression that the zip file might be corrupt.

This might be true in some cases – a file not fully downloaded from FTP, a file corrupted while being copied over a network, etc.

In most of the cases I have experienced, the zip file was fully functional and never had any corruption. The Extract All facility still throws errors even though the zip file is valid.

I am not at all a big fan of the zip/unzip facility the OS provides by default, hence I depend on tools like WinRAR, PKZIP etc., which offer better compression rates, speed etc.

I would recommend extracting the zipped file using any of the above-mentioned tools to see if the zip file is really corrupted.

The same zip file which showed errors when extracted using the default Extract All feature proved to be fine and was inflated using the PKZIP utility.

Conclusion

It’s not always true that the zipped file is corrupt, hence you should always leverage third-party tools to test the file’s validity.

Thanks for reading.

Data collector jobs [collection_set_2_collection / collection_set_3_collection] long running duration!


Recently there was an interesting question from one of the forum users. The question was something like ‘How long are the data collector jobs scheduled to run when I configure Data Collection in SQL Server 2008? My job has been running for 3000 minutes.’

To understand this behaviour, we need to look into what the Data Collector is all about. The Data Collector was first introduced in SQL Server 2008; it collects various data and stores it in a database called the Management Data Warehouse (MDW). The data is collected and stored in the database so that it can be analyzed over different time periods. The Data Collector currently provides 3 reports: Server Activity, Query Statistics, and Disk Usage.

To understand more about the Data Collector, you can always refer to http://msdn.microsoft.com/en-us/library/bb677248.aspx

The data collector jobs will run continuously as long as SQL Server Agent is running [this is the default setting], and they are scheduled to start when the Agent starts. These jobs collect data continuously, and the upload job runs as per its schedule (by default, every 15 minutes).
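If you want to see this from the metadata rather than from Job Activity Monitor, a rough sketch against the data collector catalog views in msdb (assuming the standard views) shows the collection mode and running state of each collection set:

--collection_mode: 0 = cached (collect continuously, upload on its own schedule), 1 = non-cached
SELECT name,
       collection_mode,
       is_running
FROM msdb.dbo.syscollector_collection_sets;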

 

[The Data Collector is available only in SQL Server 2008 and above.]

Best Practices Analyzer – Can sneak in and give some important tips!


Let’s start the month with some best practices.

Microsoft® SQL Server® 2008 R2 Best Practices Analyzer allows you to scan your SQL Server instance and reports potential issues. There is an error area and a warning area which give a detailed explanation of each issue.

Looking at this report will help you to fix potential problems.

The Best Practices Analyzer works with the Microsoft Baseline Configuration Analyzer, and the Baseline Configuration Analyzer is used to scan the instance.

Recently when I scanned an instance, I got the report below, which quickly helped me to fix some potential issues.

Just take a look at the error part. You should never see this in a PRODUCTION environment!