Microsoft + Oracle = New opportunity… Windows Azure is a happening place!


A couple of months back (in June, to be specific), Microsoft and Oracle announced a partnership to help enterprise customers embrace the cloud.

It was fairly clear that Oracle software would become available on Windows Azure under the IaaS platform (Virtual Machines), and today, during one of the Oracle OpenWorld keynotes, this availability was officially announced.

I guess Brad Anderson (Corporate Vice President, Cloud and Enterprise Engineering, Microsoft) will be the first non-Oracle person to deliver a keynote at an Oracle conference (read: cloud is changing the game).

MSORA2

MSORA3

As of today our Oracle friends can configure and deploy their databases on Windows Azure. Here is what the Azure VM gallery looks like -

MSORA

MSORA1

Note – Some of these images are still in preview, though.

This is really exciting news. Giants like Microsoft and Oracle teaming up is definitely going to benefit enterprise business, and I'm really positive about that.

Are you excited about this? Please share your thoughts in the comments section.

Thanks for reading and keep watching this space for more!

Windows Azure Service Dashboard – Check out the health of Azure Services


Windows Azure Service Dashboard is an excellent way to know the health/availability of your Windows Azure Services.

Azure Doctor

You will always want to know if any service is down or experiencing performance degradation, and with the Windows Azure Service Dashboard this information is just a few clicks away.

All you need to do is access http://www.windowsazure.com/en-us/support/service-dashboard/

This portal is available to everyone, irrespective of whether you have an Azure account.

To me, reporting service availability details this openly is a great initiative by Microsoft, and it is super useful from a user's perspective.

The portal is really impressive with the amount of information it provides. It will help you understand whether:

1. The service availability is normal

2. There is a performance degradation

3. There is a service interruption

4. There is any specific (FYI-type) information regarding a service
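If I remember right, each service on the dashboard also exposes an RSS feed (look for the RSS icon next to a service), so the same status information can be watched programmatically. Here is a minimal sketch that parses such a feed with Python's standard library; the XML below is a made-up sample, and in practice you would fetch the real feed URL from the dashboard:

```python
# Sketch: parse a service-status RSS feed. SAMPLE_FEED is fabricated for
# illustration; a real script would download the feed from the dashboard.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<rss version="2.0"><channel>
  <title>Windows Azure SQL Database - North Europe</title>
  <item><title>[Performance Degradation]</title>
        <description>Investigating increased latency.</description></item>
</channel></rss>"""

def service_incidents(feed_xml):
    """Return (service name, list of incident titles) from a status feed."""
    channel = ET.fromstring(feed_xml).find("channel")
    service = channel.findtext("title")
    incidents = [item.findtext("title") for item in channel.findall("item")]
    return service, incidents

print(service_incidents(SAMPLE_FEED))
# → ('Windows Azure SQL Database - North Europe', ['[Performance Degradation]'])
```

A scheduled job running something like this could alert your team before users start calling.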

WindowsAzureDash1

You can drill down into each service and see its status for each region.

WindowsAzureDash2

In case there is a problem, drilling down into the problem description will help you understand the current status of the fix and how the support team is acting on it.

WindowsAzureDash3

I'm really impressed with this level of detail. It will come in handy when you have production deployments in Azure and need to keep your business users informed when there is a problem.

Now comes the icing on the cake:

Historical data! Yes, you have the flexibility to look back at historical data and understand whether there was a problem in the past.

WindowsAzureDash4

Conclusion

The dashboard is a cool facility made available by the Azure team, and it will definitely prove helpful to you.

Thanks for reading and keep watching this space for more!

High Availability within Windows Azure VMs – An exciting opportunity!


Windows Azure Virtual Machines is a real game changer, as it gives us the ability to spin up VMs in no time to set up a full-blown SQL Server.

The topic of High Availability (HA) will come to mind if you are planning to deploy a mission-critical SQL Server on Windows Azure VMs and cannot afford any downtime.

Does Microsoft take the VMs in Azure down for maintenance? Yes. There are scheduled windows when Microsoft performs maintenance on its data centers and host machines, and the VMs running on a host are restarted as part of that work.

What does this mean for your mission-critical application that uses the SQL Server deployed on this VM?

“Outage”

Though Microsoft is careful to plan and schedule this maintenance during non-business hours, it can still impact your SLAs (the amount of time your server is up and running).

Is there a way to avoid this?

Yes, there is, and this is where the concept of Availability Sets comes in.

In a nutshell, you need two or more VMs for the application to be highly available, and you configure them in the same availability set.

When you configure two VMs in an availability set, Microsoft will never take them down at the same time during maintenance windows (thanks to update domains), and the set also protects against single points of failure within the racks (fault domains).

The picture below should help explain the concept.

AzureVMHA

VM2 and VM8 are part of an availability set, and with these VMs your application can be highly available. During a maintenance window these two VMs won't go down at the same time.
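To make the idea concrete, here is a toy Python model of the placement. The round-robin assignment and the default counts (five update domains and two fault domains on the classic platform) are illustrative simplifications, not Azure's actual placement algorithm:

```python
# Toy model of how an availability set spreads VMs across update domains (UDs)
# and fault domains (FDs). Illustrative only, not Azure's real placement logic.
def place(vms, update_domains=5, fault_domains=2):
    """Assign each VM a (UD, FD) pair round-robin."""
    return {vm: (i % update_domains, i % fault_domains)
            for i, vm in enumerate(vms)}

layout = place(["VM2", "VM8"])
print(layout)   # → {'VM2': (0, 0), 'VM8': (1, 1)}

# The two VMs share neither a UD nor an FD, so a host update (which walks one
# UD at a time) or a rack failure (one FD) never takes both down together.
assert len({ud for ud, fd in layout.values()}) == 2
assert len({fd for ud, fd in layout.values()}) == 2
```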

You can create an availability set when configuring your first VM.

AlwaysONHA1

When creating the second VM, you have the option to add it to the same availability set.

Azure VM Availability Set

AlwaysOn Availability Groups can then be configured on top of these VMs to ensure that outages are reduced or avoided.

Conclusion

Windows Azure is gaining traction, and with the inclusion of HA features, your move to the cloud is more reliable now.

Thanks for reading and keep watching this space for more!

Back up your Windows Azure SQL Databases – Yes, you should!


Care about RPOs and RTOs? Then you should be backing up your Windows Azure SQL Databases (formerly SQL Azure).

clock

Windows Azure SQL Database is highly available and the data is redundant (three copies of your database are maintained), however that doesn't help you recover from the situation below:

“Hey DBA, I accidentally deleted a few records from the database! Can you please recover it for me?”

You definitely need a backup of the database to recover from this situation.

One assumption I often hear when talking about Windows Azure SQL Database is that you don't need to back up your databases because Microsoft takes care of it under the hood. This is wrong, and you should take backups in case you need to handle situations like the one above.

You can either do a manual export of your database to your storage account, or you can schedule the exports (a new update; scroll down for details). The exported copy can then be used for restores (imports).

The import options are really limited. You cannot, for example, overwrite (replace) an existing database. I'm confident Azure will get there pretty soon.

In this post we will see how the manual export process works and will also see how we can import an exported database back.

When doing this manually, it's always a good idea to take transactionally consistent backup copies. For this we first copy the database, either to the same server or to a different one. In this post we will copy to the same server.

So, we have a database called WASDRocks with a table named ChildTable. The table has two records, as shown below.

Azure backup1

We will now do a database copy to the same server using the command:

CREATE DATABASE WASDROCKS_Copy AS COPY OF [WASDROCKS]

There you go; the new database, a transactionally consistent copy, is now ready.
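While a copy is in flight you can also keep an eye on it from T-SQL. A quick sketch, run while connected to the logical server's master database (sys.dm_database_copies is the DMV I recall tracking copy progress; verify against your server version):

```sql
-- Monitor a database copy in flight (connect to the master database).
SELECT * FROM sys.dm_database_copies;

-- The copy is usable only once it reports ONLINE here:
SELECT name, state_desc FROM sys.databases
WHERE name = 'WASDROCKS_Copy';
```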

Azure backup2

We will now export WASDROCKS_Copy and keep it safe in our storage account. The Export option is available right below the database selection.

Azure backup3

The storage account needs to be selected along with the container details, and once the credentials are entered correctly (there is a real-time password check!) the .BACPAC file will be available.

Azure backup4

Azure backup5

Great, so now we have a transactionally consistent database backup. We can drop the copy database to avoid additional costs (yes, it is billed).

Now, let's do some deletes! We will delete a record from ChildTable.

Azure backup6

We can recover this data using the backup we took earlier. All we need to do is an import.

Azure backup7

Note – In a real-world situation, be very careful about your RPO values. You might have to increase or decrease the frequency of exports to achieve your SLA. More exported copies mean more storage cost overhead.
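As a back-of-the-envelope helper, the relationship between RPO, retention, and storage cost can be sketched like this (the 2 GB BACPAC size and the per-GB price in the example are made-up assumptions, not Azure's actual rates):

```python
# Toy calculator relating export frequency and retention to storage cost.
# All concrete numbers below are fabricated for illustration.

def export_plan(rpo_hours, retention_days, bacpac_gb, price_per_gb_month):
    """Return (exports_per_day, copies_retained, est_monthly_storage_cost)."""
    exports_per_day = 24 / rpo_hours          # one export per RPO window
    copies_retained = int(exports_per_day * retention_days)
    est_cost = copies_retained * bacpac_gb * price_per_gb_month
    return exports_per_day, copies_retained, round(est_cost, 2)

# Example: 6-hour RPO, keep 7 days of exports, 2 GB BACPAC, $0.10/GB-month.
print(export_plan(6, 7, 2.0, 0.10))   # → (4.0, 28, 5.6)
```

Halving the RPO doubles both the export count and the storage bill, which is exactly the trade-off the note above warns about.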

If you try to overwrite an existing database by giving the same database name, i.e. WASDROCKS in our case, you will get an error.

Azure backup8

This clearly demonstrates the import limitation we talked about earlier: you cannot overwrite an existing database.

We will import the backup copy as a different database named WASDROCKS_Recovery.

Azure backup9

There you go, the recovered database is ready for use now.

Azure backup10

If we connect to the recovered database and check for the table, then we can find the details of the deleted row.

Azure backup11

Granted, this approach is not flexible enough for true point-in-time restores, but it works just fine. What really matters are your RPO/RTO numbers, and you can plan your exports around them.

Is there a way to schedule exports, so that manual intervention is limited?

Yes, and this is what I really love about the Windows Azure team: they are really aggressive and keep adding features at a great pace.

Automated Database Exports was released last month; please refer to this blog post by Scott Guthrie for more details.

Keep backing up your databases, and do random restores to ensure the backups are good.

Thanks for reading and keep watching this space for more!

Deleting a Storage Account from Windows Azure – The right way!


I like to clean up things after I'm done with my testing, and this morning I decided to clean up the VMs and databases I had created under Windows Azure.

Everything went smoothly until I tried to delete my storage account; the delete immediately threw an exception.

Azure Storage Delete1

The ‘Details’ section explained why the action could not be completed.

Azure Storage Delete2

The error message is quite self-explanatory: the storage account I was trying to delete had a container holding an active image.

This was indeed true. I decided to go ahead and explore the account's container; I had created multiple files under it before.

Azure Storage Delete3

Then I checked the blobs under the container, and there was indeed a blob present; the VM image itself is ultimately stored as a blob in this container.

Azure Storage Delete4

I deleted this blob, and was then able to delete the storage account from the portal.

To sum up, the right way to delete a storage account from Windows Azure is to check whether any containers still hold active blobs, remove those blobs first, and only then remove the storage account.
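That ordering can be summed up in a toy model: blobs first, then the account. This is purely illustrative Python mimicking the portal's behavior, not the Azure SDK:

```python
# Toy model of the delete ordering the portal enforces: a storage account
# cannot be removed while any of its containers still holds a blob.

class StorageAccount:
    def __init__(self, name):
        self.name = name
        self.containers = {}            # container name -> set of blob names

    def delete(self):
        for container, blobs in self.containers.items():
            if blobs:
                raise RuntimeError(
                    f"container '{container}' still holds blobs: {sorted(blobs)}")
        return f"storage account '{self.name}' deleted"

acct = StorageAccount("mystorage")
acct.containers["vhds"] = {"myimage.vhd"}

try:
    acct.delete()                       # fails: an active blob remains
except RuntimeError as e:
    print(e)

acct.containers["vhds"].discard("myimage.vhd")
print(acct.delete())                    # succeeds once containers are empty
```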

Conclusion

Hope this short explanation helps you understand how storage is handled within Azure and how things are categorized with respect to accounts, containers, and the blobs within the containers.

Monitoring Memory Usage of Memory Optimized Objects – SQL Server 2014


Starting with SQL Server 2014, monitoring the memory usage of memory-optimized objects is super important to ensure that your instance doesn't run out of memory and land you in a really bad situation.

One way of ensuring that memory-optimized objects won't use more than a certain amount of memory is to set up Resource Governor. This gives you a great level of control when multiple databases run on the same instance and you don't want memory-optimized tables eating up all the memory.
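A sketch of that setup looks like the following (the pool name, the 50% cap, and the database name MyInMemDB are illustrative choices; sp_xtp_bind_db_resource_pool does the binding):

```sql
-- Sketch: cap In-Memory OLTP memory with Resource Governor.
CREATE RESOURCE POOL InMemPool WITH (MAX_MEMORY_PERCENT = 50);
ALTER RESOURCE GOVERNOR RECONFIGURE;

-- Bind the database containing memory-optimized tables to the pool.
EXEC sp_xtp_bind_db_resource_pool 'MyInMemDB', 'InMemPool';

-- The binding takes effect the next time the database comes online.
ALTER DATABASE MyInMemDB SET OFFLINE;
ALTER DATABASE MyInMemDB SET ONLINE;
```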

Is there an easy way to get a quick overview of the memory usage of memory-optimized tables in SQL Server 2014?

Yes, there is a really good SSMS report just for this purpose. The report uses DMVs under the hood and provides some valuable information.

InmemOLTP Reports

Let's do a quick walkthrough of what this report gives us:

InmemOLTP Reports1

At any point in time, the Table Used Memory counter is the one I'm most interested in.

Index Used Memory is driven by the bucket count you specify when creating a nonclustered hash index.

The query the report runs under the hood to produce these usage values is:

SELECT t.object_id, t.name, 
ISNULL((SELECT CONVERT(decimal(18,2),(TMS.memory_used_by_table_kb)/1024.00)), 0.00) 
 AS table_used_memory_in_mb,
ISNULL((SELECT CONVERT(decimal(18,2),(TMS.memory_allocated_for_table_kb - TMS.memory_used_by_table_kb)/1024.00)), 0.00) 
 AS table_unused_memory_in_mb,
ISNULL((SELECT CONVERT(decimal(18,2),(TMS.memory_used_by_indexes_kb)/1024.00)), 0.00) 
 AS index_used_memory_in_mb,
ISNULL((SELECT CONVERT(decimal(18,2),(TMS.memory_allocated_for_indexes_kb - TMS.memory_used_by_indexes_kb)/1024.00)), 0.00) 
 AS index_unused_memory_in_mb
FROM sys.tables t JOIN sys.dm_db_xtp_table_memory_stats TMS 
ON (t.object_id = TMS.object_id)

This query leverages a new DMV, sys.dm_db_xtp_table_memory_stats.

Conclusion

SSMS reports are a great way to get a quick overview of what is happening, and I expect more powerful reports to be incorporated into SQL Server 2014 in the coming days.

Thanks for reading and keep watching this space for more.

Check out SQL Server 2014 CTP1 and the Windows Server 2012 R2 preview up in the cloud


You must be really busy spinning up VMs to test SQL Server 2014 CTP1 and Windows Server 2012 R2 preview.

If your on-premises desktop or laptop is running low on resources and you don't want the pain of handling multiple VMs locally, you have the option to test them right up in the cloud.

Yes, you can always spin up a Windows Azure VM, do all your testing, and shut it down once you are done to save some money.

Sounds like a deal, right? It's pretty easy and fast; it takes less than 10 minutes to provision a Windows Azure VM.

Let's look at the options we have in Azure for the test VMs:

1. SQL Server 2014 CTP1 Evaluation Edition on Windows Server 2012

This image sets up SQL Server 2014 CTP1 for you on Windows Server 2012. Once the VM is provisioned and running, you can connect to it via RDP, and SQL Server is ready to test.

VMs

2. Windows Server 2012 R2 Preview

This image provisions a VM with the Windows Server 2012 R2 preview. You can spin up multiple VMs to do your hardcore Windows testing.

VMs1

Conclusion

This is indeed a great way to test the latest builds with minimum effort, and you always have the option to shut down the VMs after use to avoid a heavy bill.

Thanks for reading and keep watching this space for more.