Deleting a Storage Account from Windows Azure – the right way!

I like to clean things up after I'm done with my testing, and this morning I decided to clean up the VMs and databases I had created under Windows Azure.

Everything went smoothly until I tried to delete my storage account. As soon as I attempted the delete, an exception was thrown.

Azure Storage Delete1

The ‘Details’ section explained why this action can't be completed.

Azure Storage Delete2

The error message is quite self-explanatory: the storage account I was trying to delete contained a container which was holding an active image.

This was indeed true. I decided to go ahead and explore the container for the account. I had created multiple files under this container before.

Azure Storage Delete3

Then I checked the blobs under the container, and there was indeed a blob present. As a matter of fact, the container itself shows up in this view much like a blob.

Azure Storage Delete4

I decided to delete this blob, and then successfully deleted the storage account from the portal.

To sum up, the right way to delete a storage account from Windows Azure is to check for any active blobs under its containers and remove those blobs before you try to remove the storage account.
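The dependency chain the portal enforces can be sketched as a tiny model in Python (purely illustrative; the container and blob names here are made up):

```python
# Illustrative model: a storage account holds containers, and containers
# hold blobs. The portal refuses to delete the account while any blob exists.
account = {"vhds": ["myimage.vhd"], "logs": []}  # container -> blobs

def try_delete_account(acct):
    if any(blobs for blobs in acct.values()):
        raise RuntimeError("Storage account still holds blobs; delete them first.")
    acct.clear()  # the account (and its now-empty containers) can go

# The right order: remove the blobs first, then delete the account.
account["vhds"].remove("myimage.vhd")
try_delete_account(account)
print("account deleted")  # succeeds because no blob is left
```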


Hope this short explanation helps you understand how storage is handled within Azure and how things are categorized with respect to accounts, containers, and the blobs within the containers.


Check out SQL Server 2014 CTP1 and Windows Server 2012 R2 Preview up in the cloud

You must be really busy spinning up VMs to test SQL Server 2014 CTP1 and Windows Server 2012 R2 preview.

If your on-premises desktop computer or laptop is running low on resources, and you don't want to take the pain of handling multiple VMs locally, then you have the option to test them out right up in the cloud.

Yes, you can always spin up a Windows Azure VM, do all your testing, and shut it down once you are done with your work to save some money.

Sounds like a deal, right? Yes, it's pretty easy and fast. It takes less than 10 minutes to provision a Windows Azure VM.

Let's look at the options we have in Azure for the test VMs –

1. SQL Server 2014 CTP1 Evaluation Edition on Windows Server 2012

This image will set up SQL 2014 CTP1 for you on Windows Server 2012. Once the VM is provisioned and in a running state, you can connect to it via RDP and see that SQL Server is ready to test.


2. Windows Server 2012 R2 Preview

This image will provision a VM for you with Windows Server 2012 R2 preview. You can spin up multiple VMs to do your hardcore testing with Windows.



This is indeed a great way to test the latest builds with minimal effort, and you always have the option to shut down the VMs after use to avoid a heavy bill.

Thanks for reading and keep watching this space for more.

Moving a database from on-premise to the cloud – Step by Step Walkthrough

The cloud is the future, and moving there is now as easy as a few mouse clicks.

Today we will see how an on-premises database can be moved to a SQL Server instance running on a Windows Azure Virtual Machine. There are multiple ways to accomplish this, and in this post we will see how it can be achieved using the SSMS wizards.

Step 1

I have a database named “MoveMeToCloud”, and it is ready to take flight and move to the cloud. This database is deployed on my on-premises SQL Server instance.

The first step is to export this database. For that purpose we will choose Tasks –> Export Data-tier Application.


As always, an introduction screen opens up, and you can proceed by choosing Next. (You can also tick the 'do not show this again' option!)


Step 2

In this step we will connect to the Windows Azure storage account and export the BACPAC directly to cloud storage.


When we connect to the Azure storage account we will need to provide the account name and the key. Manage Access Keys under the Azure portal (Storage section) will get this info for us.



Step 3

Once the storage account connection is established we can proceed and finish this export activity.


The wizard will do multiple checks, and the process will eventually complete (provided all the conditions are met).

I will talk about the conditions in a different post.


Step 4

Once the export is completed successfully, we can check and verify that the BACPAC is indeed available under the Windows Azure storage account.


Step 5

As the BACPAC is now local to the cloud, we can proceed and do an import of that file.

We will connect to a Windows Azure Virtual Machine which is running SQL Server 2012 up in Azure to do the import.

We will choose Import Data-tier Application to initiate the import.


We will need to provide the storage account details under the import settings.


Once the connection is established correctly, we can proceed and choose the database settings.


Database settings will allow us to place the data file and log file according to our requirements.

We can proceed and finish the wizard to complete the import activity.



Step 6

Once the import is completed we are good to access the database up in the cloud.

SSMS Cloud

Thanks for reading and keep watching this space for more.

Windows Azure pay-per-minute – Time is money!

One of the most exciting announcements yesterday was related to Windows Azure. When Scott Guthrie mentioned that the billing model for Windows Azure is going to be per-minute from now on, there was a great cheer from the crowd.

Time is money

This is indeed a game-changing announcement. I definitely foresee a great amount of Windows Azure usage in the future, and this announcement is a great deal for cloud adopters.

Earlier, if I had used my cloud service for 20 minutes and then turned it off, I was still charged for the full hour. Many cloud providers still operate with this policy.

Going forward I will be charged for what I use: just 20 minutes, no questions asked.
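To put a number on it, here is a quick back-of-the-envelope comparison of the two billing models (Python; the hourly rate is a made-up figure, not a real Azure price):

```python
import math

def cost_hourly_billing(minutes, rate_per_hour):
    # Old model: any started hour is billed as a full hour.
    return math.ceil(minutes / 60) * rate_per_hour

def cost_per_minute_billing(minutes, rate_per_hour):
    # New model: you pay only for the minutes you actually used.
    return (minutes / 60) * rate_per_hour

rate = 0.12  # hypothetical $/hour
print(cost_hourly_billing(20, rate))      # a full hour is billed
print(cost_per_minute_billing(20, rate))  # only the 20 minutes are billed
```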

Another super piece of news which was really exciting for me was that there is no charge for stopped VMs.

Think about a situation –

Your Dev/Test folks work a 9 AM – 5 PM window. On-premises Dev/Test boxes sit idle after 5 PM, consuming power and other compute resources in your data center. This is a sheer waste of money and resources, and in 2013 you should definitely think about moving these to the cloud.

Dev/Test environments can easily be moved to Windows Azure VMs, and you can simply stop them after 5 PM and not pay for anything. This is a big deal, isn't it? I have no second thoughts about it.

Cloud is the future and these great announcements are making it more affordable to the public.

Thanks for reading and keep watching this space for more.

SQL Server 2014 – A new and exciting journey! “Into the cloud”

Today Microsoft announced SQL Server 2014, the next major version of SQL Server, at TechEd NA 2013.

This version of SQL Server is indeed a reflection of Microsoft's vision for future computing. A bright and colorful “into the cloud” future!

The journey of SQL Server from SQL 2000 to SQL 2014 is just amazing. On a related note, it's worth reading this post by Quentin Clark, which explains this journey with the help of a neat diagram.

So when can I start exploring the new features? Where can I download it for evaluation?

Thinking boy

Here is the answer for you –

Go to  and choose option Get-Notified >

You will be asked to enter a few details, and then you can sit and wait for that “email” which will eventually allow you to download the bits and do some testing.

What's new in SQL 2014 that I am excited about? Let's do a quick walkthrough of my favorite features –

Note – This is not an exhaustive list of the features SQL 2014 will provide; I will write about that in a separate post. These are just some of the features I really like.
  • In-Memory OLTP (code-named Hekaton) – 

This is one of my favorite features and the one I'm really looking forward to working with. This feature will be called the SQL Server In-Memory OLTP Engine in SQL 2014.

The SQL Server In-Memory OLTP Engine will boost OLTP workloads to perform remarkably better. The beauty of this feature is that you can pick individual tables and define them as memory-optimized, and these tables remain fully transactional.

I would suggest you download and read the white paper ‘SQL Server In-Memory OLTP Project “Hekaton” Internals Overview for CTP1’ by Kalen Delaney (B/T) to understand this feature really well.

  • 8 Readable Secondaries –

With SQL 2014 we will have the ability to add up to 8 readable secondaries for our read-only workloads (mainly reporting queries).

  • AlwaysOn to Windows Azure Virtual Machine – 

This is one feature which I'm really looking forward to. It will enable us to add a secondary replica directly in a Windows Azure Virtual Machine. This is a feature the CIOs will love, as you pay per use for Windows Azure VMs.

  • Buffer pool extension using SSDs – 

This can improve query performance by allowing the use of non-volatile solid-state drives to reduce SQL Server memory pressure with no risk of data loss.

Brent Ozar (B/T) has a great post explaining this, and you can read it here. He has also provided a great deal of information about other SQL 2014 features. Always a great read.

  • Performance Data Collector – 

I'm also looking forward to the enhancements coming to the Performance Data Collector module. The details are not out yet on what is new, but I definitely hope there will be some good changes here and deep integration with SSMS for this module.


With SQL 2014 I'm pretty sure that your move towards the cloud will be easier and a nicer experience, and I'm really looking forward to it.

Thanks for reading and keep watching this space for more (lots of SQL 2014 for sure!)

How many SQL Servers do you run within your environment? – MAP could be your best bet to answer this question!

A blog post about the Microsoft Assessment and Planning (MAP) Toolkit had been on my list for quite a while, and today I decided to write about it.

Let’s assume that you are working for a pretty large organization and there are a lot of database servers running within the environment.

One fine day the company's CTO, or even the DBA manager, walks to your desk and taps your shoulder to ask this question –

“How many SQL Servers are we running here?”

In case you have maintained a proper repository after building each and every server, then the answer can be given quickly.

If you don't have any repository and you don't maintain a registered server list, then the question can be as good as a direct punch!

A smart DBA is one who has the right set of tools to get the right information at the right time.

We have a great tool called the Microsoft Assessment and Planning Toolkit which does a great job of providing sufficient information to answer the above question and much more. There are many features available in this toolkit, and we will cover a few of them.

A very recent copy of MAP was published last month, and the current version is 7.0. The toolkit can be downloaded from here.

Normally I prefer to run this tool on my laptop or work machine to pull information/reports, and I never install it on a server.

In case you are running the old Windows XP OS on your laptop or work machine, then there is bad news: the latest version will work only on the operating systems mentioned below –

  • Windows 7 with Service Pack 1 (Professional, Enterprise, and Ultimate editions only)
  • Windows Vista with Service Pack 2 (Business, Enterprise, and Ultimate editions only)
  • Windows Server 2008 R2 with Service Pack 1
  • Windows Server 2008 with Service Pack 2

Now let's install MAP and see what this cool tool is all about.

Once the setup is downloaded we can run it to start the whole install process. The setup will do some initial checks, like free disk space availability, and will give you this screen.

The summary clearly states that the setup will install SQL Server Express LocalDB along with the tool. Later on we will see why SQL Express is required for the tool.

The install is pretty fast, and once the setup is completed the first launch screen looks like the one below.

The initial setup screen will prompt us to create a database.

In this walk through we will name the database as ServerInventory. Clicking on OK will create the database for us.

Once the database is created, the initial screen will show us the details of steps to be completed.

We will now do the Perform Inventory step. Clicking on Go will lead us to the screen below.

This window confirms how powerful this tool is. You can collect an inventory of multiple application servers and even Oracle, MySQL, and Sybase.

We are interested in SQL Server, so we will select the option SQL Server with Database details. Clicking on Next will lead us to the screen below, which asks which method we want to use to discover the servers. For this demo we will target a single server, hence we select the option as below.

The next 2 screens deal with adding a computer name and the credentials. The final action is to hit Finish, and the assessment will begin.

Once the assessment is completed, we can close this window and can start exploring the reports under Scenario Status section.

We can click on the View option of the SQL Server Discovery report to get a very neatly formatted report of the instances available on the particular server.

In case you are interested only in the total Database Engine instances, then the section to look at is the Microsoft SQL Server Database Engine Editions section. That will give you an accurate count.

Another report which looked really cool to me is the Software usage tracking report.

Another report which excites me a lot is the Virtual Machine Discovery Results report –


The MAP tool has great potential, and it allows you to make crucial business decisions during consolidation efforts.

Thanks for reading.

SQLSailor is exploring(Part6) – Understanding firewall concepts for SQLDatabase on Windows Azure

Security is one of the most discussed topics for any cloud deployment plans and no organisation would like to compromise on security when they make that ‘final’ decision to move their database to the cloud.

In this blog post we will review how our SQL Databases within Windows Azure are protected using firewalls.

I encountered a couple of firewall-related issues when I first started working with Windows Azure, and that motivated me to write this blog post.

One of the error messages (or warnings) was

The error message was pretty much saying that the IP address (hidden in the picture) is not configured to manage the SQL Database within the server.

Before we see how we can configure the IP address to grant access, we will need to review what levels of firewall rules can be set on Windows Azure.

We can set firewall rules at the server level and at the database level.

A server-level rule simply means that a client with a particular IP, or within a range of IPs, can access all of the databases on that SQL Database server.

In case you would like to restrict a client with a specific IP, or a range of IPs, to access only specific databases within the SQL Database server, then database-level firewall rules come into play.

The SQL Database firewall is a very important part of the Windows Azure architecture. Connection attempts from the Internet and from Windows Azure must first pass through the SQL Database firewall before they can reach your SQL Database server or database.

Let’s consider a simple example to learn this in a better way –

When computer A tries to connect to the SQL Database server from the web, the originating IP address is first checked by the SQL Database firewall against the server-level rules, and access is granted if the IP falls within a server-level range. If the IP is not found within any server-level range, the database-level rules are checked, and the connection is granted for the specific database if the IP falls within a database-level rule.
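The check order described above can be sketched in a few lines of Python (the rules, IP addresses, and database names here are hypothetical, and the real evaluation of course happens inside the Azure service, not on the client):

```python
import ipaddress

def connection_allowed(client_ip, database, server_rules, db_rules):
    """server_rules: list of (start_ip, end_ip) ranges.
    db_rules: dict mapping database name -> list of (start_ip, end_ip) ranges."""
    ip = ipaddress.ip_address(client_ip)
    # 1. Server-level rules grant access to every database on the server.
    for start, end in server_rules:
        if ipaddress.ip_address(start) <= ip <= ipaddress.ip_address(end):
            return True
    # 2. Otherwise, fall back to the rules stored inside the target database.
    for start, end in db_rules.get(database, []):
        if ipaddress.ip_address(start) <= ip <= ipaddress.ip_address(end):
            return True
    return False

server_rules = [("10.0.0.1", "10.0.0.50")]
db_rules = {"SalesDB": [("192.168.1.10", "192.168.1.10")]}

print(connection_allowed("10.0.0.7", "AnyDB", server_rules, db_rules))        # True
print(connection_allowed("192.168.1.10", "SalesDB", server_rules, db_rules))  # True
print(connection_allowed("192.168.1.10", "OtherDB", server_rules, db_rules))  # False
```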

Setting up server-level rules

The easiest method to create a server-level firewall rule is via the Windows Azure portal itself.

We will need to select the server and choose the Config option to set a range of IPs or a single IP.

We can check which IPs or IP ranges are configured at the server level using the view sys.firewall_rules after connecting to the master database.

Setting up database-level rules

The recommended approach to create a database-level firewall rule is to use the stored procedure sp_set_database_firewall_rule.

We will need to connect to the exact database for which we need to create the rule and run the stored procedure with the parameters.

[The IP mentioned above is just a test case one]

The database-level IPs or ranges can be checked using another view, sys.database_firewall_rules; we will need to connect to the database and query this view to get the accurate details.

A request from that IP will be able to access only the database for which the rule was set, not all the databases within the SQL Database server.


There is a good amount of protection available on the Windows Azure platform for your mission-critical SQL Databases, and in this post we had a quick overview of firewall-level protection.

You can read more about the comprehensive security concepts for a SQLDatabase on Windows Azure here –

Windows Azure SQL Database Connection Security

Keep checking this space for more; there is lots of new and exciting stuff coming up!

This post is in continuation with my earlier posts

SQLSailor is exploring(Part1) – Creating my first SQL Database on Windows Azure

SQLSailor is exploring(Part 2) – Managing SQL Databases on Windows Azure

SQLSailor is exploring(Part3) – Designing SQL Databases on Windows Azure

SQLSailor is exploring(Part4) – Basic DML,DDL operations with SQL Databases on Windows Azure

SQLSailor is exploring(Part5) – SQLDatabase Dashboard on Windows Azure

Thanks for reading.

SQLSailor is exploring(Part5) – SQLDatabase Dashboard on Windows Azure

Well, that is one impressive dashboard on the Audi RS4! It gives the driver a huge amount of flexibility to control the transmission with ease and get a quick view of crucial information.

Windows Azure portal does provide a dashboard for SQLDatabase too!

It will give information regarding space usage, available space, status of the database, the server under which the database lives, collation, edition, location, etc.

One interesting option I noticed under the dashboard is Show Connection strings.

It gives you the connection strings for ADO.NET, JDBC, PHP, and ODBC.
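As a rough illustration of what the portal hands back, an ADO.NET-style string for a SQL Database generally follows this shape (a Python sketch; the server, database, and user names are placeholders, and the exact format the portal shows may differ):

```python
def adonet_connection_string(server, database, user, password):
    # Typical shape of an ADO.NET connection string for SQL Database:
    # full server DNS name, user qualified with @server, encryption on.
    return (
        f"Server=tcp:{server}.database.windows.net,1433;"
        f"Database={database};"
        f"User ID={user}@{server};"
        f"Password={password};"
        "Encrypt=True;TrustServerCertificate=False;"
    )

print(adonet_connection_string("myserver", "MySQLDatabase", "admin", "***"))
```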

There is a dashboard view for the server too.

Next time when someone asks you a question like “How many databases can be created on each SQL Database server?”

You can answer them by saying “You can create up to 149 databases in each SQL Database server!”

Each server supports 150 databases: one of them is master, and 149 are user databases.

This post is in continuation with my earlier posts

SQLSailor is exploring(Part1) – Creating my first SQL Database on Windows Azure

SQLSailor is exploring(Part 2) – Managing SQL Databases on Windows Azure

SQLSailor is exploring(Part3) – Designing SQL Databases on Windows Azure

SQLSailor is exploring(Part4) – Basic DML,DDL operations with SQL Databases on Windows Azure


The dashboard gives you a quick overview of a database as well as a server on Windows Azure, and I hope that over the long run Microsoft will provide more and more features and flexibility for the dashboard views.

Thanks for reading.

SQLSailor is exploring(Part4) – Basic DML,DDL operations with SQL Databases on Windows Azure

In this post we will review some basic DML and DDL capabilities of SQL Database on Windows Azure.

This post is in continuation with my earlier 3 posts

SQLSailor is exploring(Part1) – Creating my first SQL Database on Windows Azure

SQLSailor is exploring(Part 2) – Managing SQL Databases on Windows Azure

SQLSailor is exploring(Part3) – Designing SQL Databases on Windows Azure

For the purpose of this demo we will use a database named SQLDatabase and a table named dbo.Employee.

Some data has already been entered into this table. Using the Design view, under the Data column, we can see these details.

The part of the portal I liked best is that it gives a clear tree-like structure of my activities, and the object hierarchy is shown pretty neatly too.

We can write a simple SELECT statement using the New Query option to retrieve the data.

If you run with the Actual Execution Plan option, then the query plan details are also made available.

There are 3 options to view the Query plans and they are –

Graph, Grid and Tree

Now we will quickly run a simple DELETE statement for one of the rows. Once the delete is completed, we can go back to the Table view under My Work and see that the changes are reflected (a refresh is required).
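If you want to try the same SELECT/DELETE round trip locally, here is a small sketch using Python's built-in sqlite3 as a stand-in for the portal's query window (the table and rows are made up, and sqlite3 is of course not SQL Database):

```python
import sqlite3

# A throwaway in-memory database with a tiny Employee table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Employee (Id INTEGER PRIMARY KEY, Name TEXT)")
conn.executemany("INSERT INTO Employee VALUES (?, ?)",
                 [(1, "John"), (2, "Jane"), (3, "Joe")])

# A simple SELECT, as in the portal's New Query option.
rows = conn.execute("SELECT Id, Name FROM Employee ORDER BY Id").fetchall()
print(rows)  # [(1, 'John'), (2, 'Jane'), (3, 'Joe')]

# A simple DELETE of one row; a refresh of the table view would reflect it.
conn.execute("DELETE FROM Employee WHERE Id = 2")
print(conn.execute("SELECT COUNT(*) FROM Employee").fetchone()[0])  # 2
```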

It's pretty easy to create your own scripts, open them in the design view of the portal, and create your objects and relations.


We quickly looked at simple DML and DDL operations within a SQL Database, and for a complete list of supported T-SQL statements for Windows Azure SQL Database you can bookmark the link below.

Supported Transact-SQL Statements (Windows Azure SQL Database)

There is more exciting stuff lined up, so keep checking this space !

Thanks for reading.

SQLSailor is exploring(Part3) – Designing SQL Databases on Windows Azure

This post is in continuation with my earlier posts

SQLSailor is exploring(Part1) – Creating my first SQL Database on Windows Azure

SQLSailor is exploring(Part 2) – Managing SQL Databases on Windows Azure

In this post we will check out how to design a SQL Database which is on Windows Azure.

Once the database is created we have the Manage button to start working on the administration and design of the database (as mentioned in the earlier posts).

Clicking on the Manage button will route us to another page where we will have to enter the credentials (set during database creation).

Once the credentials are validated, we will land on a page where we can choose the design option.

Let's start building a table for the database MySQLDatabase which we created earlier.

We have an option called New table which will help us start designing the table and its columns.

We can specify the table name, start building the columns, set a primary key, set default values, specify Is Identity, Is Required, etc.

Once the required details have been filled in we have the option to Save the structure.

We have the option to create and manage indexes too. The Indexes and Keys section will display the existing indexes, and we can create new ones as well.

Creating an index is also pretty easy and we have some advanced options like Auto Update Stats and Rebuild Index Online.

Once the database and table are ready, we will need some data to perform DML operations. We can insert some data via the Data section.

The portal is pretty smart, as it validates the data as you are entering it.


Designing tables, indexes, relations, etc. is really easy via the portal, and we saw some real-time validation too.

I will write about some DML operations in the coming days, and there is some exciting stuff coming up too. Please stay tuned.

Thanks for reading.