Data Collector in SQL Server 2014 – What’s New!


Today I noticed a couple of changes (neat changes!) for Data Collector in SQL Server 2014. This post is a detailed walk-through of setting up Data Collector in SQL2014. If a feature or change in the walk-through is specific to SQL2014, I will highlight it.

As soon as you right-click Data Collector and choose Tasks, you can see 3 options (this is a change in SQL2014) –

  • Configure Management Data Warehouse
  • Configure Data Collection
  • Cleanup Data Collectors (This is a change in SQL2014)

Let’s talk about each option in detail now.

Option 1

The first option will create and configure a Management Data Warehouse.

[Screenshot: DataCollector20141]

We will proceed by clicking Next, and it will take you to the screen which will help you create/choose the MDW database. I have already created a blank database, so I will select that db in this walk-through and proceed.

[Screenshot: DataCollector20142]

Next up is granting permissions to users. The reader role is for running reports, the writer role is for uploading data, and the admin role has all of these rights.
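If you prefer to script these mappings instead of using the wizard, here is a minimal sketch – MDW is the name of my warehouse database and the login is just a placeholder; mdw_reader, mdw_writer and mdw_admin are the roles the wizard maps to –

--MAP A LOGIN TO THE MDW READER ROLE (login name is a placeholder)
USE MDW
GO
CREATE USER [DOMAIN\ReportUser] FOR LOGIN [DOMAIN\ReportUser]
GO
ALTER ROLE mdw_reader ADD MEMBER [DOMAIN\ReportUser]
GO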

[Screenshot: DataCollector20143]

Final step is to finish the operation.

[Screenshot: DataCollector20144]

[Screenshot: DataCollector20145]

Option 2

The Configure Data Collection option helps you start data collection and store the collected data in the MDW.

[Screenshot: DataCollector20146]

This step lets you choose the MDW database, a cache directory to store the data before it’s uploaded to the MDW, and the collection sets. There are 2 collection sets –

  • System Data Collection Sets
  • Transaction Performance Collection Sets (This is new in SQL2014)

The above selections will create 2 new data collection sets called Stored Procedure Usage Analysis and Table Usage Analysis, along with the regular 3 data collection sets which were available in the previous versions.

[Screenshot: DataCollector20147]

Complete the wizard by clicking Finish.

[Screenshot: DataCollector20148]

[Screenshot: DataCollector20149]

Now that we have configured an MDW and started data collection, let’s see what happens in the cache folder C:\MDW_Temp.

There you go, a lot of CACHE files out there, and they are getting loaded/cleared very fast.

[Screenshot: DataCollector201410]

Option 3 – (This is new in SQL2014)

Cleanup Data Collectors is an option which is enabled after you enable data collection, and this wizard will help you clean up the collection sets.

[Screenshot: DataCollector201411]

[Screenshot: DataCollector201412]

[Screenshot: DataCollector201413]

Now that the 1st phase of configuring Data Collector is over, we will drill down a little more.

If you expand the Data Collection tree you can see 2 new data collection sets (this is new in SQL2014) –

  • Stored Procedure Usage Analysis
  • Table Usage Analysis
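A quick way to confirm what got created is to query the data collector catalog views in msdb – a small sketch, showing the columns I find most useful –

--LIST THE COLLECTION SETS AND THEIR STATE
SELECT name, collection_mode, is_running, is_system
FROM msdb.dbo.syscollector_collection_sets
GO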

[Screenshot: DataCollector201414]

These collection sets are integrated with Data Collector for a purpose. The data which is collected will provide you with valuable insights on tables and stored procedures which are good candidates to migrate to In-Memory OLTP (Hekaton).

I would recommend reading a great blog post by Benjamin Nevarez (B/T) on this very topic.

The regular data collection sets which were there in the previous versions are also available, under System Data Collection Sets –

[Screenshot: DataCollector201415]

Let’s have a quick look at the reports which Data Collector offers. These are pretty cool out-of-the-box basic reports, and they will help you understand some level of trending.

Query Stats History Report –

This report will give you details of the top 10 T-SQL queries, and you have the option to rank queries on the basis of CPU, Duration, Total I/O, Physical Reads, or Logical Writes. There is also an option to play back the timeline.
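Under the covers the report reads from tables in the MDW database. As a rough sketch of the CPU ranking – assuming the default MDW schema, so verify the table and column names in your own warehouse – you could try something like this –

--TOP 10 QUERIES BY CPU FROM THE MDW (table names assume the default schema)
USE MDW
GO
SELECT TOP (10)
       qt.text AS query_text,
       SUM(qs.total_worker_time) AS total_cpu_time,
       SUM(qs.execution_count) AS executions
FROM snapshots.query_stats AS qs
JOIN snapshots.notable_query_text AS qt
  ON qt.sql_handle = qs.sql_handle
GROUP BY qt.text
ORDER BY total_cpu_time DESC
GO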

[Screenshot: DataCollector201416]

[Screenshot: DataCollector201417]

You also have an option to drill down to particular queries to get more details.

[Screenshot: DataCollector201418]

[Screenshot: DataCollector201419]

Server Activity History Report –

This is one of my favorite reports. It tells you about CPU, memory, disk, and network usage trends. The report also gives valuable insights into wait stats and server activity. Again, you have the option to drill down into some of the graphs.

[Screenshot: DataCollector201420]

Disk Usage Summary Report –

This is a great source for knowing database growth trends.

[Screenshot: DataCollector201421]

Conclusion

Data Collector is a great feature, and with In-Memory OLTP being the buzzword, the data it collects can provide you with valuable insights.

Thanks for reading and keep watching this space for more! (Next up: some cool things about Windows Azure (oops!) Microsoft Azure!)

SQL Server 2014 Install – Step by Step


The hot cake for the day is the SQL Server 2014 RTM bits. It’s generally available starting today, and I downloaded a fresh copy to start testing it.

I will be writing a separate post on the features I love about SQL Server 2014, but here is the step-by-step install process for a standalone install.

There are no changes to the GUI-based install process compared with SQL Server 2012, but one thing I’m pretty sure of is that the installer is sleeker and faster.

Step 1 – The landing page looks pretty similar to the earlier versions, and you can choose the Installation tab to get things started.

[Screenshot: SQL2014 RTM Install1]

Step 2 – Choose the stand-alone install option, and it will take you to the product key page.

[Screenshot: SQL2014 RTM Install2]

Step 3 – This screen is where you need to accept the license terms. There is also an option to turn on the Customer Experience Improvement Program and Error Reporting.

[Screenshot: SQL2014 RTM Install3]

Steps 4–6 – These are the steps where Global Rules and Product Updates are checked, and the setup files are extracted and installed. If the checks are good, the install will proceed to the next step automatically.

[Screenshot: SQL2014 RTM Install4]

Step 7 – Install Rules are checked in this step, and I have highlighted a check which was interesting to me.

[Screenshot: SQL2014 RTM Install5]

Step 8 – This step provides you with feature installation options. In this install I will go with the first option, i.e. SQL Server Feature Installation.

[Screenshot: SQL2014 RTM Install6]

Step 9 – This step is where you will select the features. I’m choosing only the ones which I need for my testing.

[Screenshot: SQL2014 RTM Install7]

Step 10 – This step lets you decide if you want to go with a Default Instance or a Named Instance.

[Screenshot: SQL2014 RTM Install8]

Step 11 – This step lets you select the accounts for the services. I’m choosing all defaults, as that’s good for my test lab.

[Screenshot: SQL2014 RTM Install9]

Step 12 – This step lets you choose the authentication mode, data directories, and FILESTREAM options.

[Screenshot: SQL2014 RTM Install10]

[Screenshot: SQL2014 RTM Install11]

[Screenshot: SQL2014 RTM Install12]

Step 13 – The Feature Configuration Rules step. As mentioned earlier, if the checks are fine, setup automatically proceeds to the next step.

[Screenshot: SQL2014 RTM Install13]

Step 14 – This is the final step of the installation, and you can pick up the ConfigurationFile.ini from this path –

C:\Program Files\Microsoft SQL Server\120\Setup Bootstrap\Log\Date_Time\ConfigurationFile.ini
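This INI file can be reused to repeat the same install unattended on another server – a minimal sketch of the documented setup syntax (the path below is just an example) –

Setup.exe /ConfigurationFile="C:\Temp\ConfigurationFile.ini"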

[Screenshot: SQL2014 RTM Install14]

If everything goes smoothly, your install will succeed.

[Screenshot: SQL2014 RTM Install15]

I’m really excited about some really good features in SQL Server 2014, and I will be writing about them over the next couple of days.

Thanks for reading and keep watching this space for more!

Folks, Know Your Outbound Data Transfer Costs – Windows Azure


The cost associated with outbound data transfer is very important to consider when bridging your on-premises network with Windows Azure.

Yes, when I said cost I indeed meant the $ amount which is involved. The good news, however, is that all the inbound data sent to Azure is free. The outbound data is what really matters.

As per Microsoft

Outbound data is charged based on the total amount of data moving out of the Windows Azure datacenters via the Internet in a given billing cycle. Data transfers between Windows Azure services located within the same datacenter are not subject to any charge. The first 5 GB of outbound data transfers per billing month are also free.

Let’s talk about a simple scenario – you have deployed SQL Server on a Windows Azure Virtual Machine. Data from the SQL Server database will be accessed by clients which are located in your on-premises data center. The diagram below explains the scenario.

Note – This is a high-level diagram and doesn’t explain the actual network intricacies which are involved while bridging your network with Azure.

[Diagram: Outbound Azure]

All the data flowing out of the Azure data center is a cost ($) factor for you, and inbound data is free.

How much should I pay? Well, that really depends on your usage.

The first 5 GB a month is free, and beyond that the cost is as follows (as of today) –

[Screenshot: Outbound Azure2 – pricing table]

Ref – http://www.windowsazure.com/en-us/pricing/details/data-transfers/

Note – Always refer to http://www.windowsazure.com for any pricing estimates, as these numbers ($) tend to change.
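As a quick worked example (with a purely illustrative rate – check the pricing page for the real numbers): if outbound data beyond the free tier were billed at $0.12 per GB, transferring 100 GB out of the datacenter in a billing month would cost (100 − 5) × $0.12 ≈ $11.40.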

Conclusion

Analyzing and understanding the various cost factors involved with Windows Azure is one of the key areas an architect should focus on, and comparing these costs with ongoing on-premises operational costs will help management get a clear picture.

Thanks for reading and keep watching this space for more!

Rollback DBCC CHECKDB REPAIR!


I learned something new today while I was listening to one of the courses on corruption authored by the master, Paul Randal (B/T).

In the course Paul mentioned that you can actually roll back a DBCC CHECKDB repair by running it within a transaction.

This is something which is absolutely new to me, and I decided to test this stuff for some fun.

Why would we need to roll back a repair?

The short answer is – there is no specific need to roll back a repair, but technically it works and it’s possible.

Here is a quick demo on how this works –

We will be using a database named DBCC_PLAY which is already corrupt, and the corruption is on page ID 288. An extract from the DBCC CHECKDB output is shown below.
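For reference, the extract below came from running the usual consistency check (the same command we use later in this post) –

--CHECK CORRUPTION
DBCC CHECKDB('DBCC_PLAY') WITH NO_INFOMSGS;
GO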

Msg 8928, Level 16, State 1, Line 55
Object ID 245575913, index ID 0, partition ID 72057594040549376, alloc unit ID 72057594045792256 (type In-row data): Page (1:288) could not be processed. See other errors for details.

If we try to do a read (SELECT) from one of the tables in the database, that will also throw an error.

--SELECT DATA FROM THE TABLE
SELECT * FROM Test_Table1
GO

Msg 824, Level 24, State 2, Line 35
SQL Server detected a logical consistency-based I/O error: incorrect checksum (expected: 0xdf2e6f16; actual: 0xdfaeef16). It occurred during a read of page (1:288) in database ID 5 at offset 0x00000000240000 in file ‘C:\SQLDATA\DBCC_PLAY.mdf’. 

Before we try the repair, we will bring the database into single-user mode.

--SET DB SINGLE_USER FOR REPAIR
ALTER DATABASE DBCC_PLAY SET SINGLE_USER
GO

Now we will run the repair in a transaction, but we won’t commit it.

--REPAIR IN A TRAN DEMO
BEGIN TRAN
GO
DBCC CHECKDB('DBCC_PLAY',REPAIR_ALLOW_DATA_LOSS)
GO
--ROLLBACK TRAN

DBCC CHECKDB with REPAIR_ALLOW_DATA_LOSS does its job, and the errors are reported as repaired –

Msg 8928, Level 16, State 1, Line 65
Object ID 245575913, index ID 0, partition ID 72057594040549376, alloc unit ID 72057594045792256 (type In-row data): Page (1:288) could not be processed. See other errors for details.
The error has been repaired.
Msg 8939, Level 16, State 98, Line 65
Table error: Object ID 245575913, index ID 0, partition ID 72057594040549376, alloc unit ID 72057594045792256 (type In-row data), page (1:288). Test (IS_OFF (BUF_IOERR, pBUF->bstat)) failed. Values are 10249 and -4.
The error has been repaired.

We can run DBCC CHECKDB to be sure.

--CHECK CORRUPTION
DBCC CHECKDB('DBCC_PLAY') WITH NO_INFOMSGS;
GO

Command(s) completed successfully.

We can also do a SELECT from the table and see if that’s working too.

--SELECT DATA FROM THE TABLE
SELECT * FROM Test_Table1
GO

(832 row(s) affected)

The SELECT is also working, and we now don’t have any logical consistency errors.

Now comes the fun part. Let’s ROLLBACK the transaction which we had started for the repair.

ROLLBACK TRAN

Command(s) completed successfully.

Let’s run DBCC CHECKDB again –

--CHECK CORRUPTION
DBCC CHECKDB('DBCC_PLAY') WITH NO_INFOMSGS;
GO

Gosh, the corruption is back!

Msg 8928, Level 16, State 1, Line 55
Object ID 245575913, index ID 0, partition ID 72057594040549376, alloc unit ID 72057594045792256 (type In-row data): Page (1:288) could not be processed. See other errors for details.

Can we do a SELECT from the table? Let’s try –

--SELECT DATA FROM THE TABLE
SELECT * FROM Test_Table1
GO

Nope, here comes the logical consistency error again –

Msg 824, Level 24, State 2, Line 35
SQL Server detected a logical consistency-based I/O error: incorrect checksum (expected: 0xdf2e6f16; actual: 0xdfaeef16). It occurred during a read of page (1:288) in database ID 5 at offset 0x00000000240000 in file ‘C:\SQLDATA\DBCC_PLAY.mdf’. 

This clearly shows that you have the ability to roll back a repair.

Let’s clean things up by repairing the database and committing the transaction this time.

--REPAIR IN A TRAN DEMO
BEGIN TRAN
GO
DBCC CHECKDB('DBCC_PLAY',REPAIR_ALLOW_DATA_LOSS)
GO
COMMIT TRAN

Command(s) completed successfully.

Run DBCC CHECKDB again to ensure that there is no more corruption.

--CHECK CORRUPTION
DBCC CHECKDB('DBCC_PLAY') WITH NO_INFOMSGS;
GO

Command(s) completed successfully.
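One last cleanup step which the demo above skipped – the database is still in single-user mode from earlier, so set it back to multi-user –

--SET DB BACK TO MULTI_USER
ALTER DATABASE DBCC_PLAY SET MULTI_USER
GO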

Thanks for reading and keep watching this space for more!

One error which you don’t want to see while doing upgrades – Missing MSI!


The cached MSI file ‘C:\Windows\Installer\<name>.msi’ is missing!!! This is one error which you don’t want to see when you are trying to do a service pack upgrade or a CU update during a critical change window. Here is what the error looks like –

[Screenshot: Missing MSI1]

If you are hit with this error there is definitely a good amount of help out there, but the time it takes to fix the error can vary from case to case. There is a good chance that multiple MSIs are missing, and you will need to replace all of them before re-running the SP/CU setup.

So why does this error come up? The short answer is that the Windows Installer cache (under C:\Windows\Installer) on your server is compromised. This folder is used to cache important setup files, and these files are needed during situations like patch upgrades or even uninstalls. If this cache is compromised, you will definitely need to work with your Windows admins to understand why.

As mentioned earlier, there is a good chance that multiple MSIs are missing, and replacing them is time-consuming.

Microsoft has a great KB article, http://support.microsoft.com/kb/969052, which explains the entire process of taking corrective action to fix this problem, and it includes a smart VB script which will tell you which files are missing and provide you with commands to fix them. I would definitely encourage you to read this article.

In short, here are the steps which I followed when I faced this error while doing an SP2 upgrade for SQL Server 2008 R2 –

1. Copied the FindSQLInstalls.vbs script from the KB and placed it under C:\Script.

2. Opened a command window (as Admin) and ran the command –

Cscript FindSQLInstalls.vbs %computername%_sql_install_details.txt

This created a txt file, and that file is your best bet for fixing the missing MSI errors.

3. Opened the txt file to see the details it has on the missing MSI files. I was interested in the “Action needed” area, which provides a copy command with the MSI details. Something like –

Copy "O:\SQLINSTALL\SQL2008StdRTM\x64\setup\sql_bids.msi" C:\windows\Installer\df2ee59.msi

I ran the command (as Admin) and it copied the file to the cache folder. Scrolling down the txt file, I found that multiple MSI files were missing, and I had to run the copy command multiple times (I counted 15!).

4. Once I had run all the copy commands listed in the txt file, I ran the SP setup again, and it was successful this time.

Being proactive

If you have a really tight change window and don’t want to deal with these errors, I would recommend running the VB script in advance for your server to see if it reports any missing MSIs for your SQL Server. If there are errors, you can fix them ahead of time and ensure that the upgrade goes smoothly. If there are no errors, you can be reasonably confident that you will not hit missing MSI errors.

Thanks for reading and keep watching this space for more!

Setting up the Min and Max Memory options on the fly – A word of caution!


Setting the Min and Max Server Memory options for SQL Server should definitely be encouraged, and no server should be left with the default settings. But here is a quick word of caution around this –

Do you want to change these numbers on the fly when there is a good amount of load on the production system?

The answer is ‘No’. You shouldn’t be changing these numbers on the fly for random testing on a production box under a good amount of load.

Changing these numbers using sp_configure and hardening the change with RECONFIGURE (or doing it via the GUI) will clear the plan cache, and that’s not something you would like to happen on a busy production box.

Let me show a quick demo to explain this.

Step 1

I will query the Employee table which is available in the AdventureWorks2012 database –

USE [AdventureWorks2012]
GO
SELECT * FROM [HumanResources].[Employee]

Step 2 

The above query will cause an ad hoc plan to be stored in the cache. We can verify this by querying the plan cache –

--QUERY THE PLAN CACHE FOR ADHOC PLANS
SELECT st.text, qp.query_plan, cp.cacheobjtype,
cp.objtype, cp.plan_handle
FROM sys.dm_exec_cached_plans cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) st
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) qp
WHERE cp.objtype = 'Adhoc';

[Screenshot: Min Max Cache Relation1]

Step 3

I will change the Min Server Memory of the server using sp_configure –

sp_configure 'Min Server Memory (MB)', 501
RECONFIGURE

As soon as I run this command, I can check the SQL Server error log and see messages like –

SQL Server has encountered 1 occurrence(s) of cachestore flush for the ‘Object Plans’ cachestore (part of plan cache) due to some database maintenance or reconfigure operations.
SQL Server has encountered 1 occurrence(s) of cachestore flush for the ‘SQL Plans’ cachestore (part of plan cache) due to some database maintenance or reconfigure operations.
SQL Server has encountered 1 occurrence(s) of cachestore flush for the ‘Bound Trees’ cachestore (part of plan cache) due to some database maintenance or reconfigure operations.

Step 4

Query the plan cache again to see that it’s cleared –
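This is the same DMV query from Step 2 – it should now return no rows for our ad hoc plan –

--QUERY THE PLAN CACHE AGAIN
SELECT st.text, qp.query_plan, cp.cacheobjtype,
cp.objtype, cp.plan_handle
FROM sys.dm_exec_cached_plans cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) st
CROSS APPLY sys.dm_exec_query_plan(cp.plan_handle) qp
WHERE cp.objtype = 'Adhoc';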

[Screenshot: Min Max Cache Relation2]

Conclusion

Changing instance-level settings needs to be done with care, and if possible it should be done during a proper change/maintenance window for production systems.

Thanks for reading and keep watching this space for more.

Prevent creating stored procedures with the RECOMPILE hint – Policy-Based Management is your friend!


I got motivated to write this post by a question on Twitter which went like this –

[Screenshot: PBM Recompile 1]

The user was checking if there was a way to prevent procedures from being created with the RECOMPILE hint.

I was pretty sure that Policy-Based Management provides a Stored Procedure facet, and that there would be some expression we could create using it to ensure that no new stored procedures are created with this hint.

Will I ever use the RECOMPILE hint in my stored procedures? That’s a huge ‘it depends’ question. I might not use it within a stored procedure definition, but at times I might have to use it at run time, as shown below.
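For illustration, here are two quick sketches of asking for a recompile at run time instead – the object names are hypothetical –

--RECOMPILE A PROCEDURE JUST FOR THIS EXECUTION (procedure name is hypothetical)
EXEC dbo.SomeProcedure WITH RECOMPILE
GO

--OR ASK FOR A STATEMENT-LEVEL RECOMPILE
SELECT * FROM dbo.SomeTable OPTION (RECOMPILE)
GO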

Kendra Little (B/T) has an excellent video on recompilation, and you can find it here. [Really nice stuff!]

I replied to the user that I was positive there would be some option, but that it needed to be tested out.

So, this post is my test case to see if we can actually prevent users from creating stored procedures with the RECOMPILE hint using PBM.

First off, I created a condition using the Stored Procedure facet and added an expression like the one shown in the image below –

[Screenshot: PBM Recompile 2]

I went ahead and created a policy which used the condition created in the first step, and chose the evaluation mode On Change: Prevent –

[Screenshot: PBM Recompile 3]

Note – I’m doing this check for every database and every stored procedure as this is a test case. The check can be scoped to a single database using another condition, and that’s the recommended approach.

Once the policy was created, I made sure that it was in the enabled state. [Nothing will be validated if the policy is not enabled.]

The next step is to try to create a stored procedure with the RECOMPILE hint. For this purpose I will be using a database called SuperMan (yes, just for fun). This db has a table called People with 3 rows of data.

[Screenshot: PBM Recompile 4]

I tried to create a simple stored procedure named FindSuperMan which does a simple SELECT * from the table. The stored procedure tries to force the RECOMPILE hint –

CREATE PROCEDURE dbo.FindSuperMan 
WITH RECOMPILE
AS
SELECT * 
FROM [dbo].[People]
GO

As soon as I ran the command, I got an error which said –

Policy ‘Policy_Prevent_Recompile’ has been violated by ‘SQLSERVER:\SQL\TESTServer\DEFAULT\Databases\SuperMan\StoredProcedures\dbo.FindSuperMan’.
This transaction will be rolled back.
Policy condition: ‘@Recompile != True()’
Policy description: ”
Additional help: ” : ”
Statement: ‘CREATE PROCEDURE dbo.FindSuperMan
WITH RECOMPILE
AS
SELECT *
FROM [dbo].[People]
‘.
Msg 3609, Level 16, State 1, Procedure sp_syspolicy_dispatch_event, Line 65
The transaction ended in the trigger. The batch has been aborted.

What actually happens here is that the transaction is rolled back as it violated the policy which was defined.

Conclusion

This proves that PBM can indeed be used to prevent things which you don’t want to happen in your db/environment.

Thanks for reading and keep watching this space for more!