Full Domain Trust and Natively Compiled Stored Procedures – In Memory OLTP

Recently one of my developers brought this specific problem to my attention. He was trying to create a natively compiled stored procedure (with EXECUTE AS OWNER), and was getting an error –


Could not obtain information about Windows NT group/user ‘Domain\UserName’, error code 0x5.

My first reaction was to check the build and ensure that the user was running the latest CU (no SP1 yet !). The user was indeed running the latest SQL 2014 build.

I decided to repro this issue at my end, and I was able to create/compile the procedure without any errors.
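
To set the context, the procedure being created was roughly along these lines. This is only a sketch – the table, procedure and parameter names are placeholders, and it assumes a memory-optimized table dbo.Employee already exists – but the EXECUTE AS OWNER clause is the part that matters, since natively compiled procedures in SQL 2014 cannot use EXECUTE AS CALLER and therefore end up impersonating a specific principal.

-- Sketch only: names are hypothetical, dbo.Employee is assumed to be a memory-optimized table
CREATE PROCEDURE dbo.usp_InsertEmployee
    @ID INT,
    @Name NVARCHAR(100)
WITH NATIVE_COMPILATION, SCHEMABINDING, EXECUTE AS OWNER
AS
BEGIN ATOMIC WITH (TRANSACTION ISOLATION LEVEL = SNAPSHOT, LANGUAGE = N'us_english')
    INSERT INTO dbo.Employee (ID, Name) VALUES (@ID, @Name);
END;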

My next troubleshooting direction was to check user permissions, and understand more about the environment where the user is running into this error.

The user in this particular scenario was using his domain credentials to create the SPROC. The account was something like HQ\Developer, where HQ is the domain and Developer is the user name.

The SQL Engine service was running under the account LAB\SQL, where LAB is the domain and SQL is the SQL service account.

I read more about natively compiled stored procedures, and especially focused on this MSDN article, Creating Natively Compiled Stored Procedures.

This portion of the article was my focus area (it underlines the main point) –

” Regarding EXECUTE AS and Windows logins, an error can occur because of the impersonation done through EXECUTE AS. If a user account uses Windows Authentication, there must be full trust between the service account used for the SQL Server instance and the domain of the Windows login. If there is not full trust, the following error message is returned when creating a natively compiled stored procedure: Msg 15404, Could not obtain information about Windows NT group/user ‘username’, error code 0x5.”

In my case there was a one-way trust between HQ and LAB; however, LAB didn’t trust HQ. With that, the error message the user got made perfect sense.

There are multiple workarounds to solve this problem, and they are listed in the article too –

  • Use an account from the same domain as the Windows user for the SQL Server service.
  • If SQL Server is using a machine account such as Network Service or Local System, the machine must be trusted by the domain containing the Windows user.
  • Use SQL Server Authentication.

I decided to leverage a SQL account and the user was able to create the procedure with that specific account.
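
For reference, the SQL Server Authentication route is as simple as creating a SQL login and a corresponding database user, and then creating the procedure while connected under that account. A minimal sketch (login name, password and database name are placeholders, and the exact permissions you grant will depend on your environment):

-- Sketch only: names and password are placeholders
CREATE LOGIN NativeProcDev WITH PASSWORD = 'StrongPassword!123';
GO
USE TestDB;
CREATE USER NativeProcDev FOR LOGIN NativeProcDev;
GRANT CREATE PROCEDURE TO NativeProcDev;
-- Connect as NativeProcDev and create the natively compiled procedure from that session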

Conclusion – 

In-Memory OLTP is a very exciting feature, and there are a lot of dependencies which you should be aware of before deciding to deploy it in production.

Thanks for reading and keep watching this space for more !

Memory Optimization Advisor – SQL Server 2014 CTP2

SQL Server 2014 CTP2 came with an inbuilt tool called Memory Optimization Advisor which will help you migrate your normal tables to memory-optimized tables.

Where can I find this tool ?

All you need to do is right click the table you want to migrate, and choose the option Memory Optimization Advisor.


The tool will launch with a detailed description of what it’s capable of


I decided to play around with this tool, and here are my observations –

Note – This is still a CTP, so things can change during the RTM/GA phase.

I started off with a normal table named Employee which has 3 columns and 3 rows of data in it. Nothing big, pretty simple.
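
For reference, any simple table will do for this exercise. Mine was roughly along these lines (the column names and sample rows shown here are placeholders – only the 3 columns / 3 rows shape matters):

-- Sketch of the test table; column names and data are illustrative
CREATE TABLE dbo.Employee
(
    ID   INT NOT NULL PRIMARY KEY,
    Name NVARCHAR(100) NOT NULL,
    City NVARCHAR(100) NOT NULL
);

INSERT INTO dbo.Employee (ID, Name, City)
VALUES (1, N'John', N'Seattle'),
       (2, N'Jane', N'Redmond'),
       (3, N'Joe',  N'Bellevue');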

Launched the advisor, and the initial checks were all green. [Be sure to carefully analyze the checks, they are all interesting.]


The wizard also has the ability to export a report (Who doesn’t like a report these days !)

I clicked Next to proceed, and the wizard gave me some information about the limitations of memory-optimized objects, and a link which explains the limitations in detail.


I liked these warnings, because they tell me well in advance about the limitations so that I will be more careful about what I’m up to. (Everything has a cost associated with it !)

Next up is some interesting stuff. The wizard asks me to select the options for the memory-optimized objects. I have the option to specify the memory-optimized filegroup, its logical file name and the file path.

I also have the option to rename the original table, copy data from the original table to the memory-optimized table, and a check box to indicate whether the table needs to be migrated with no data durability (the default being both schema and data durability).

I decided to go with all defaults as this was a test case.
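
Behind the scenes, the filegroup portion of this screen boils down to something like the following (the database name, filegroup name and path are placeholders):

-- Sketch only: names and path are placeholders
ALTER DATABASE TestDB ADD FILEGROUP TestDB_mod CONTAINS MEMORY_OPTIMIZED_DATA;
ALTER DATABASE TestDB
    ADD FILE (NAME = N'TestDB_mod_container', FILENAME = N'C:\Data\TestDB_mod')
    TO FILEGROUP TestDB_mod;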


One of the other cool options which the above wizard window gave is this value –


I presume that this value is the estimated amount of memory which will be needed when the table is migrated as a memory-optimized object. I might not be right at this point, but I will update this post in case this is not true.

The next screen in the wizard talks about primary key and index creation. I decided to make column ID the primary key with a NONCLUSTERED HASH index and a bucket count of 1024.
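
That selection should translate to a table definition roughly like this (the columns other than ID are placeholders; the bucket count matches what I picked in the wizard):

-- Sketch of the memory-optimized table the wizard would generate; non-ID columns are illustrative
CREATE TABLE dbo.Employee
(
    ID   INT NOT NULL
         PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1024),
    Name NVARCHAR(100) NOT NULL,
    City NVARCHAR(100) NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);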


The final screen provided me a summary of my selections.


Yet another cool feature in the wizard is that it allowed me to script everything before I finalized my selections.

I decided to hit Migrate, and wanted to see how it goes.

Voila, all clean and green !


The wizard was smart enough to rename the old table, and it created a memory-optimized one for me.



I had scripted out everything before hitting migrate and that file looked like this –



This is a very neat, good-to-have tool, and those warnings and pre-checks will definitely help users iron out issues well before their tables are IN MEMORY !

Thanks for reading and keep watching this space for more !

Monitoring Memory Usage of Memory Optimized Objects – SQL Server 2014

Starting with SQL 2014, monitoring the memory usage of memory-optimized objects is super important to ensure that your instance doesn’t run out of memory and land you in a really bad situation.

One way of ensuring that the memory-optimized objects won’t utilize more than a certain amount of memory is by setting up Resource Governor. This gives a great level of control when you have multiple databases running on the same instance and you don’t want memory-optimized tables eating up the whole memory.
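
As a rough sketch (the pool name, database name and memory cap are placeholders), the setup is to create a resource pool with a memory limit and then bind the database containing the memory-optimized tables to it; the binding takes effect after the database is cycled offline/online:

-- Sketch only: names and the memory cap are placeholders
CREATE RESOURCE POOL Pool_InMemOLTP WITH (MAX_MEMORY_PERCENT = 50);
ALTER RESOURCE GOVERNOR RECONFIGURE;

EXEC sp_xtp_bind_db_resource_pool N'TestDB', N'Pool_InMemOLTP';

-- The binding takes effect only after the database comes back online
ALTER DATABASE TestDB SET OFFLINE WITH ROLLBACK IMMEDIATE;
ALTER DATABASE TestDB SET ONLINE;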

Is there an easy way to get a quick overview of the memory usage of the memory-optimized tables in SQL 2014?

Yes, there is a really good SSMS report just for this purpose. This report utilizes DMVs under the hood and provides you with some valuable information.


Let’s do a quick walkthrough of what this report gives –


At any point in time I will be interested in seeing the value of the Table Used Memory counter.

Index Used Memory is based on the bucket count which you specify during nonclustered hash index creation.

The query which the report runs under the hood to give you these usage values is –

SELECT t.object_id, t.name, 
ISNULL((SELECT CONVERT(decimal(18,2),(TMS.memory_used_by_table_kb)/1024.00)), 0.00) 
 AS table_used_memory_in_mb,
ISNULL((SELECT CONVERT(decimal(18,2),(TMS.memory_allocated_for_table_kb - TMS.memory_used_by_table_kb)/1024.00)), 0.00) 
 AS table_unused_memory_in_mb,
ISNULL((SELECT CONVERT(decimal(18,2),(TMS.memory_used_by_indexes_kb)/1024.00)), 0.00) 
 AS index_used_memory_in_mb,
ISNULL((SELECT CONVERT(decimal(18,2),(TMS.memory_allocated_for_indexes_kb - TMS.memory_used_by_indexes_kb)/1024.00)), 0.00) 
 AS index_unused_memory_in_mb
FROM sys.tables t JOIN sys.dm_db_xtp_table_memory_stats TMS 
ON (t.object_id = TMS.object_id)

This query leverages the power of a new DMV, sys.dm_db_xtp_table_memory_stats.


SSMS reports are a great way to get a quick overview of what is happening, and I expect more and more powerful reports to be incorporated into SQL 2014 in the coming days.

Thanks for reading and keep watching this space for more.