Windows Azure Service Dashboard – Check out the health of Azure Services


Windows Azure Service Dashboard is an excellent way to check the health and availability of your Windows Azure services.


You will often want to know if any service is down or suffering performance degradation, and with the Windows Azure Service Dashboard this information is just a few clicks away.

All you need to do is access http://www.windowsazure.com/en-us/support/service-dashboard/

This portal is available to everyone, whether or not you have an Azure account.

To me this is a great initiative by Microsoft: reporting service availability details openly is super useful from a user's perspective.

The portal is really impressive with the amount of information it provides. It will help you understand whether:

1. The service availability is normal

2. There is a performance degradation

3. There is a service interruption

4. There is any specific information (FYI type) regarding a service

WindowsAzureDash1

You can drill down into each service and see the status for each region.

WindowsAzureDash2

In case there is a problem, drilling down into the problem description will help you understand the current status of the fix and how the support team is acting on the problem.

WindowsAzureDash3

I'm really impressed with this level of detail. It will come in handy when you have production deployments in Azure and need to keep your business users informed when there is a problem.

Now comes the icing on the cake –

Historical data! Yes, you have the flexibility to see historical data and understand if there was a problem.

WindowsAzureDash4

Conclusion

These are cool facilities made available to you by the Azure team, and they will definitely prove helpful to you.

Thanks for reading and keep watching this space for more!


High Availability within Windows Azure VMs – An exciting opportunity!


Windows Azure Virtual Machines is a real game changer, as it gives us the ability to spin up VMs in no time to set up a full-blown SQL Server.

The topic of High Availability (HA) will come to mind if you are planning to deploy a mission-critical SQL Server on Windows Azure VMs and cannot afford any downtime.

Does Microsoft take the VMs in Azure down for maintenance? Yes. There are scheduled windows when Microsoft performs maintenance on its data centers and host machines, restarting the VMs running on those hosts.

What does this mean for your mission-critical application which utilizes SQL Server deployed on this VM?

“Outage”

Though Microsoft is really careful to plan and schedule this maintenance during non-business hours, it might still impact your SLAs (how much time your server will be up and running).

Is there a way to avoid this?

Yes, there are ways to avoid this, and that is where the concept of Availability Sets comes in.

In a nutshell, you will need two or more VMs for the application to be highly available, and you configure them in the same Availability Set.

When you configure 2 VMs in an availability set, Microsoft will never take them down at the same time during maintenance windows (update domains), and the set also overcomes single points of failure within the racks (fault domains).

The picture below should help explain the concept.

AzureVMHA

VM2 and VM8 are part of an availability set, and with these VMs your application can be highly available. During a maintenance window these 2 VMs won't go down at the same time.

You can create an availability set when configuring your first VM.

AlwaysONHA1

When creating the second VM, you have the option to add it to the availability set.

Azure VM Availability Set

AlwaysOn Availability Groups can be configured on top of this to ensure that outages are reduced or avoided.

Conclusion

Windows Azure is gaining traction, and with the inclusion of HA features, your move to the cloud is more reliable now.

Thanks for reading and keep watching this space for more!

Back up your Windows Azure SQL Databases – Yes, you should!


Care about RPOs and RTOs? Then you should be backing up your Windows Azure SQL Databases (formerly SQL Azure).


Windows Azure SQL Database is highly available and your data is redundant (three copies of your database are stored), but that doesn't help you recover from the situation below –

“Hey DBA, I accidentally deleted a few records from the database!!! Can you please recover the database for me?”

You definitely need a backup of the database to recover from this situation.

One of the assumptions I normally hear while talking about Windows Azure SQL Database is that you don't need to back up your databases because Microsoft takes care of it under the hood! This is wrong, and you should back up your databases in case you need to tackle situations like the one mentioned above.

You can either do a manual export of your database to your storage account or schedule the exports (a new update; scroll down for details). This exported copy can be used to do restores (imports).

The import options are really limited. You cannot do operations like overwriting (replacing) a database. I'm really confident that Azure will reach that point pretty soon.

In this post we will see how the manual export process works, and also how we can import an exported database back.

When doing this manually, it's always a good idea to get a transactionally consistent backup copy. For this purpose we will need to copy the database to the same server or to a different server. In this post we will do a copy to the same server.

So, we have a database called WASDRocks with a table named ChildTable. The table has 2 records, as shown below.

Azure backup1
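For reference, the sample table might have been created with something like this. The post doesn't show the column definitions, so the column names and types here are hypothetical:

```sql
-- Hypothetical schema for the demo table (column names and types assumed).
-- Windows Azure SQL Database requires a clustered index on every table;
-- the PRIMARY KEY below creates one by default.
CREATE TABLE dbo.ChildTable
(
    ChildID   INT IDENTITY(1,1) NOT NULL PRIMARY KEY,
    ChildName NVARCHAR(50) NOT NULL
);

INSERT INTO dbo.ChildTable (ChildName)
VALUES (N'Row one'), (N'Row two');
```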

We will now do a database copy to the same server using the command:

CREATE DATABASE WASDROCKS_Copy AS COPY OF [WASDROCKS]

There you go – the new database, a transactionally consistent copy, is now ready.

Azure backup2
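The copy runs asynchronously, so for larger databases it won't be ready immediately. Assuming you are connected to the master database of the same server, you can monitor the copy's progress with the sys.dm_database_copies DMV:

```sql
-- Run against the master database of the server hosting the copy.
-- percent_complete shows copy progress; the row disappears once the copy finishes.
SELECT database_id, start_date, percent_complete, error_code
FROM sys.dm_database_copies;

-- The state of the new database can also be checked from sys.databases;
-- it shows COPYING until the copy completes, then ONLINE.
SELECT name, state_desc
FROM sys.databases
WHERE name = 'WASDROCKS_Copy';
```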

We will now export WASDROCKS_Copy and keep it safe in our storage account. The Export option is available right below the database selection.

Azure backup3

The storage account needs to be selected along with the container details, and once the credentials are entered correctly (there is a real-time check of passwords!) the .BACPAC will be available.

Azure backup4

Azure backup5

Great, so now we have a transactionally consistent database backup. We can drop the copy database to avoid additional costs (yes, it's billed).

Now, let's do some deletes!!! We will delete a record from ChildTable.

Azure backup6

We can recover this data using the backup which we took earlier. All we need to do is an Import.

Azure backup7

Note – In a real-world situation, be very careful about your RPO values. You might have to increase or decrease the number of exported copies to achieve your SLA. More exported copies means more cost overhead for the storage.

If you try to overwrite the database by giving the same database name, i.e. WASDROCKS in our case, then there will be an error.

Azure backup8

This clearly demonstrates the limitation of import which we talked about earlier: you cannot overwrite an existing database.

We will import the backup copy as a different database named WASDROCKS_Recovery.

Azure backup9

There you go, the recovered database is ready for use now.

Azure backup10

If we connect to the recovered database and check for the table, then we can find the details of the deleted row.

Azure backup11

Granted, this is not flexible enough for point-in-time restores, but it works just fine. What really matters is your RPO/RTO numbers, and you can plan your exports according to that need.

Is there a way to schedule exports, so that manual intervention is limited?

Yes, and this is what I really love about the Windows Azure team. They are really aggressive and they keep adding features at a great pace.

Automated Database Export was released last month; please refer to this blog post by Scott Guthrie for more details.

Keep backing up your databases, and do random restores to ensure that they are good.

Thanks for reading and keep watching this space for more!

You don't want to miss this: serious SQL Server training – the 24 Hours of PASS event!


Back-to-back quality SQL Server sessions! Yes, you heard it right, and that's what 24 Hours of PASS is all about.

The event kicks off at 12:00 GMT on Sep 20 and runs for 24 consecutive hours.

The sessions are selected carefully and are categorized into 6 tracks –

Enterprise Database Administration (DBA)

Application Development (AppDev)

BI Information Delivery (BID)

BI Platform Architecture, Development & Administration (BIA)

Cloud Application Development & Deployment (CLD)

Professional Development (PD)

Complete details of sessions are available at http://www.sqlpass.org/24hours/fall2012/SessionsbyTrack.aspx

I'm looking forward to attending some great sessions next week, and my favorite picks are –

Characteristics of a Great Relational Database by Louis Davidson

Digging Into the Plan Cache by Jason Strate

Three Ways to Identify Slow Running Queries by Grant Fritchey

Fasten Your Seatbelt – Troubleshooting the Most Difficult SQL Server Problems by Klaus Aschenbrenner

SQL Server Private Cloud != Azure by Allan Hirt and Ben DeBow

What’s All the Buzz about Hadoop and Hive? by Cindy Gross

DBCC, Statistics, and You by Erin Stellato

Best Practices for Upgrading to SQL Server 2012 by Robert Davis

PowerShell 101 for the SQL Server DBA by Allen White

Using SQL Server 2012 Always On by Denny Cherry

Leadership – Winning Influence in IT Teams by Kevin Kline

I'm really excited! Are you? Register here.

Thanks for reading.

SQLSailor is exploring(Part6) – Understanding firewall concepts for SQLDatabase on Windows Azure


Security is one of the most discussed topics in any cloud deployment plan, and no organisation would like to compromise on security when making that 'final' decision to move their database to the cloud.

In this blog post we will review how our SQL Databases are protected within Windows Azure using firewalls.

I encountered a couple of firewall-related issues when I first started working with Windows Azure, and that motivated me to write this blog post.

One of the error messages (more of a warning, really) I received was pretty much saying that the IP address (hidden in the picture) is not configured to manage the SQL Database within the server.

Before we see how to configure the IP address to grant access, we need to review what levels of firewall rules can be set on Windows Azure.

We can set firewall rules at the server level and at the database level.

A server-level rule simply means that a client with a particular IP, or within a range of IPs, can access all the databases within the same SQL Database server.

In case you would like to restrict a client with a specific IP, or a range of IPs, to access only specific databases within the SQL Database server, that is where database-level firewall rules come in.

The SQL Database firewall is considered very important within the Windows Azure architecture. Connection attempts from the Internet and from Windows Azure must first pass through the SQL Database firewall before they can reach your SQL Database server or database.

Let's consider a simple example to understand this better –

When computer A tries to connect to the SQL Database server from the web, the originating IP address is first checked by the SQL Database firewall against the server-level rules, and access is granted if the IP is within a server-level range. If the IP is not found within the server-level ranges, the firewall then checks the database-level rules, and the connection is granted for the specific database if the IP falls within a database-level rule.

Setting up server level rules

The easiest way to create a server-level firewall rule is via the Windows Azure portal itself.

We will need to select the server and choose the Config option to set a range of IPs or a single IP.

We can check which IPs or IP ranges are configured at the server level using the view sys.firewall_rules after connecting to the master database.
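Server-level rules can also be managed entirely with T-SQL. A minimal sketch, assuming a connection to the master database of your SQL Database server; the rule name and IP range below are made up for illustration:

```sql
-- Run against the master database of the SQL Database server.

-- Create (or update) a server-level firewall rule:
EXEC sp_set_firewall_rule
    @name             = N'OfficeNetwork',  -- hypothetical rule name
    @start_ip_address = '203.0.113.1',     -- example documentation range
    @end_ip_address   = '203.0.113.50';

-- List all server-level rules:
SELECT name, start_ip_address, end_ip_address
FROM sys.firewall_rules;

-- Remove the rule again:
EXEC sp_delete_firewall_rule @name = N'OfficeNetwork';
```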

Setting up database level rules

The recommended approach to create a database-level firewall rule is to use the stored procedure sp_set_database_firewall_rule.

We will need to connect to the exact database for which we need to create the rule and run the stored procedure with the parameters.

[The IP mentioned above is just a test case one]

The database-level IPs or ranges can be checked using another view, sys.database_firewall_rules; we will need to connect to the database itself and query this view to get accurate details.
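Putting the two statements together — connect to the specific database (not master) and run something like the following; the rule name is made up, and the IP is just a test value:

```sql
-- Run inside the database that the rule should apply to.
EXEC sp_set_database_firewall_rule
    @name             = N'SingleClient',  -- hypothetical rule name
    @start_ip_address = '10.0.0.5',       -- test IP only
    @end_ip_address   = '10.0.0.5';

-- Verify the database-level rules from within the same database:
SELECT name, start_ip_address, end_ip_address
FROM sys.database_firewall_rules;

-- EXEC sp_delete_database_firewall_rule @name = N'SingleClient'
-- removes the rule again.
```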

A request with IP 10.0.0.5 will be able to access only the database for which the rule was set and not all the databases within the SQLDatabase server.

Conclusion

There is a good amount of protection available on the Windows Azure platform for your mission-critical SQL Databases, and in this post we had a quick overview of firewall-level protection.

You can read more about the comprehensive security concepts for a SQLDatabase on Windows Azure here –

Windows Azure SQL Database Connection Security

Keep checking this space for more – there is lots of new and exciting stuff coming up!

This post is in continuation with my earlier posts

SQLSailor is exploring(Part1) – Creating my first SQL Database on Windows Azure

SQLSailor is exploring(Part 2) – Managing SQL Databases on Windows Azure

SQLSailor is exploring(Part3) – Designing SQL Databases on Windows Azure

SQLSailor is exploring(Part4) – Basic DML,DDL operations with SQL Databases on Windows Azure

SQLSailor is exploring(Part5) – SQLDatabase Dashboard on Windows Azure

Thanks for reading.

SQLSailor is exploring(Part3) – Designing SQL Databases on Windows Azure


This post is in continuation with my earlier posts

SQLSailor is exploring(Part1) – Creating my first SQL Database on Windows Azure

SQLSailor is exploring(Part 2) – Managing SQL Databases on Windows Azure

In this post we will check out how to design a SQL Database on Windows Azure.

Once the database is created, we have the Manage button to start working on the administration and design of the database (as mentioned in the earlier posts).

Clicking the Manage button routes us to another page where we have to enter the credentials (set during database creation).

Once the credentials are validated, we land on a page where we can choose the design option.

Let’s start building a table for the database MySQLDatabase which we have already created earlier.

We have an option called New table which helps us start designing the tables and columns.

We can specify the table name, start building the columns, set the primary key, set default values, specify Is Identity, Is Required, etc.

Once the required details have been filled in we have the option to Save the structure.

We have the option to create and manage the indexes too. The Indexes and Keys section will display the existing indexes and we have the option to create new ones too.

Creating an index is also pretty easy, and we have some advanced options like Auto Update Stats and Rebuild Index Online.
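Everything the portal designer does can also be expressed in plain T-SQL. A rough sketch of the options described above — the table, column, and index names here are made up for illustration:

```sql
-- Table with an identity primary key, a required column, and a default value
CREATE TABLE dbo.SampleTable
(
    Id        INT IDENTITY(1,1) NOT NULL PRIMARY KEY,  -- "Is Identity" + primary key
    Name      NVARCHAR(100) NOT NULL,                  -- "Is Required"
    CreatedOn DATETIME NOT NULL DEFAULT (GETUTCDATE()) -- default value
);

-- Nonclustered index using the advanced options the portal exposes
CREATE NONCLUSTERED INDEX IX_SampleTable_Name
ON dbo.SampleTable (Name)
WITH (STATISTICS_NORECOMPUTE = OFF,  -- "Auto Update Stats" enabled
      ONLINE = ON);                  -- "Rebuild Index Online"
```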

Once the database and table are ready, we will need some data to do DML operations. We can insert some data via the Data section.

The portal is pretty smart, as it validates the data as you are entering it.

Conclusion


Designing tables, indexes, relations, etc. is really easy via the portal, and we saw some real-time validations too.

I will write about some DML operations in the coming days, and there is some exciting stuff coming up too. Please stay tuned.

Thanks for reading.

SQL Azure Compatibility Assessment Tool


Today I was thrilled to hear about the new tool released by Microsoft yesterday (3rd Jan 2012). The code name for the tool is “SQL Azure Compatibility Assessment Tool”, and it is the right tool to test compatibility if you are planning to move your databases to SQL Azure.

All the details of the release are available at

http://social.technet.microsoft.com/wiki/contents/articles/6246.aspx

I decided to test this wonderful tool, which in a matter of minutes will help me understand any compatibility issues if I move my databases to the cloud (SQL Azure).

For the test to be successful you will need a Live ID and a .dacpac.

I already had a Live ID which I have been using for many years, so I went ahead with the process of creating a .dacpac.

How do I prepare a .dacpac – 

1. You will need to download and install SQL Server Data Tools to create a .dacpac.

SQL Server Data Tools requires Visual Studio 2010 with SP1, and as I already had them installed on my test server, the tool installed correctly.

2. I opened the SQL Server Data Tools GUI to create a database project and imported a database which I had already backed up from SQL Server 2005 and restored to SQL Server 2012 RC0.

The next step was to build the project.

After the build completed, a .dacpac was created. It was available under

\\Path\Visual Studio 2010\Projects\Database2\bin\Debug

Assessment after .dacpac creation

I went ahead and accessed the portal https://assess.sql.azure.com, logged in with my Live ID, and chose New Assessment.

This gave me an option to upload the .dacpac created earlier. As soon as I uploaded the file, I got an Assessment In Progress screen.

I then did a refresh to see if the assessment had completed, and found that it was done and the report was ready.

The View button gave me this report, which had 2 sections: Not Supported and Need to Fix.

My test database had 2 Not Supported explanations, and they can be found here –