
Benchmarking Azure Key Vault Decryption

Azure Key Vault stores keys, certificates, and secrets and makes them available to applications securely. It can create and store asymmetric (RSA and EC) keys. These keys expose their public key material, but the private key never leaves Key Vault. To decrypt, an application makes a REST call to Key Vault, which returns the decrypted result. Libraries for various languages and frameworks (including .NET) let developers do this seamlessly. Integrating Azure Key Vault with .NET applications is straightforward, although not widely documented.
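With the Azure.Security.KeyVault.Keys package, a decrypt call looks roughly like this. This is a minimal sketch: the vault URI and key name are placeholders, and it assumes `DefaultAzureCredential` can authenticate in your environment.

```csharp
using System;
using System.Text;
using System.Threading.Tasks;
using Azure.Identity;
using Azure.Security.KeyVault.Keys.Cryptography;

class KeyVaultDecrypt
{
    static async Task Main()
    {
        // Key identifier is a placeholder; DefaultAzureCredential resolves
        // managed identity, environment, or developer credentials.
        var crypto = new CryptographyClient(
            new Uri("https://my-vault.vault.azure.net/keys/my-key"),
            new DefaultAzureCredential());

        byte[] secret = Encoding.UTF8.GetBytes("hello");
        EncryptResult encrypted = await crypto.EncryptAsync(
            EncryptionAlgorithm.RsaOaep, secret);

        // Decryption is a REST call: the ciphertext goes to Key Vault and the
        // plaintext comes back; the private key never leaves the vault.
        DecryptResult decrypted = await crypto.DecryptAsync(
            EncryptionAlgorithm.RsaOaep, encrypted.Ciphertext);
    }
}
```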

A frequently recommended technique for securing data is envelope encryption. It uses a Key Encryption Key (KEK), typically an RSA key stored in Azure Key Vault. A Data Encryption Key (DEK) is generated for each piece of data and used to encrypt it with a symmetric algorithm such as AES. The DEK is then itself encrypted with the KEK and stored alongside the data. When the data needs to be decrypted, the wrapped DEK is extracted, decrypted using the KEK, and the data is then decrypted with the recovered DEK.
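The flow above can be sketched with the SDK's wrap/unwrap operations. This is illustrative only: the vault URI and key name are placeholders, and error handling and storage of the wrapped DEK, IV, and ciphertext are omitted.

```csharp
using System;
using System.Security.Cryptography;
using System.Text;
using System.Threading.Tasks;
using Azure.Identity;
using Azure.Security.KeyVault.Keys.Cryptography;

class EnvelopeEncryptionExample
{
    static async Task Main()
    {
        // KEK: an RSA key in Key Vault (URI and key name are placeholders).
        var kek = new CryptographyClient(
            new Uri("https://my-vault.vault.azure.net/keys/my-kek"),
            new DefaultAzureCredential());

        byte[] data = Encoding.UTF8.GetBytes("sensitive payload");

        // 1. Generate a per-message DEK and encrypt the data locally with AES.
        using var aes = Aes.Create(); // fresh 256-bit key and IV
        using var encryptor = aes.CreateEncryptor();
        byte[] ciphertext = encryptor.TransformFinalBlock(data, 0, data.Length);

        // 2. Wrap the DEK with the KEK; persist wrapped DEK + IV + ciphertext.
        WrapResult wrapped = await kek.WrapKeyAsync(KeyWrapAlgorithm.RsaOaep, aes.Key);

        // 3. To read: unwrap the DEK (one REST call), then decrypt locally.
        UnwrapResult unwrapped = await kek.UnwrapKeyAsync(
            KeyWrapAlgorithm.RsaOaep, wrapped.EncryptedKey);
        using var aes2 = Aes.Create();
        aes2.Key = unwrapped.Key;
        aes2.IV = aes.IV;
        using var decryptor = aes2.CreateDecryptor();
        byte[] plaintext = decryptor.TransformFinalBlock(ciphertext, 0, ciphertext.Length);
    }
}
```

Note that only step 3's unwrap touches Key Vault; the bulk decryption of the data happens locally.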

In Azure Key Vault, there is no cost associated with encryption. Encrypt, wrap, and verify are public-key operations that can be performed without calling Key Vault at all, which not only reduces the risk of throttling but also improves reliability (as long as you properly cache the public key material). However, at the time of writing, operations against all keys (software-protected and HSM-protected), secrets, and certificates are billed at a flat rate of £0.023 per 10,000 operations. If you have millions of messages flowing through your system, you have to wonder what decryption will cost: at that rate, 1 million decrypt calls come to roughly £2.30. And since decrypting the DEK requires a REST call per message, time should be just as much a concern as money.
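The SDK supports this caching pattern: if you construct a `CryptographyClient` from the key's `JsonWebKey`, public-key operations run entirely locally. A sketch, with the vault URI and key name as placeholders:

```csharp
using System;
using System.Security.Cryptography;
using System.Threading.Tasks;
using Azure.Identity;
using Azure.Security.KeyVault.Keys;
using Azure.Security.KeyVault.Keys.Cryptography;

class LocalEncryptExample
{
    static async Task Main()
    {
        var keyClient = new KeyClient(
            new Uri("https://my-vault.vault.azure.net"),
            new DefaultAzureCredential());

        // One REST call to fetch and cache the public key material.
        KeyVaultKey kek = await keyClient.GetKeyAsync("my-kek");

        // Constructed from a JsonWebKey, this client operates purely locally.
        var localCrypto = new CryptographyClient(kek.Key);

        byte[] dek = new byte[32];
        RandomNumberGenerator.Fill(dek);

        // Public-key operation: no network round trip, no throttling risk.
        WrapResult wrapped = await localCrypto.WrapKeyAsync(
            KeyWrapAlgorithm.RsaOaep, dek);
    }
}
```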

In my quest to find out how long a decryption operation takes, I did not have much luck googling, so I decided to benchmark it myself using the Azure Key Vault libraries for .NET Core. For testing purposes, my Azure Key Vault is located in the UK South region. I tried two scenarios, running the decryption 1000 times per scenario:
  1. Run the benchmark on my local machine (in Hyderabad, India) over a decent internet connection.
  2. Run the same benchmark on an Azure VM located in the UK South region.
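The benchmark itself is a standard BenchmarkDotNet harness along these lines (a sketch, not my exact code; the vault URI and key name are placeholders, and BenchmarkDotNet controls the actual iteration count):

```csharp
using System;
using System.Security.Cryptography;
using Azure.Identity;
using Azure.Security.KeyVault.Keys.Cryptography;
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

public class KeyVaultDecryptBenchmark
{
    private CryptographyClient _crypto;
    private byte[] _wrappedDek;

    [GlobalSetup]
    public void Setup()
    {
        _crypto = new CryptographyClient(
            new Uri("https://my-vault.vault.azure.net/keys/my-kek"),
            new DefaultAzureCredential());

        // Encrypt a random 256-bit DEK once, so each benchmark iteration
        // measures only the decrypt round trip.
        var dek = new byte[32];
        RandomNumberGenerator.Fill(dek);
        _wrappedDek = _crypto.Encrypt(EncryptionAlgorithm.RsaOaep, dek).Ciphertext;
    }

    [Benchmark]
    public byte[] Decrypt() =>
        _crypto.Decrypt(EncryptionAlgorithm.RsaOaep, _wrappedDek).Plaintext;
}

class Program
{
    static void Main() => BenchmarkRunner.Run<KeyVaultDecryptBenchmark>();
}
```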
Let's see the results.

For scenario 1 (local machine):
BenchmarkDotNet=v0.12.1, OS=Windows 10.0.18363.836 (1909/November2018Update/19H2)
Intel Core i5-8265U CPU 1.60GHz (Whiskey Lake), 1 CPU, 8 logical and 4 physical cores
.NET Core SDK=3.1.300
  [Host]     : .NET Core 3.1.4 (CoreCLR 4.700.20.20201, CoreFX 4.700.20.22101), X64 RyuJIT
  DefaultJob : .NET Core 3.1.4 (CoreCLR 4.700.20.20201, CoreFX 4.700.20.22101), X64 RyuJIT


| Method  |     Mean |   Error |  StdDev |
|-------- |---------:|--------:|--------:|
| Decrypt | 203.0 ms | 3.33 ms | 2.95 ms |

For scenario 2 (Azure VM in same region):
BenchmarkDotNet=v0.12.1, OS=Windows 10.0.17763.1217 (1809/October2018Update/Redstone5)
Intel Xeon Platinum 8171M CPU 2.60GHz, 1 CPU, 2 logical cores and 1 physical core
.NET Core SDK=3.1.300
  [Host]     : .NET Core 3.1.4 (CoreCLR 4.700.20.20201, CoreFX 4.700.20.22101), X64 RyuJIT
  DefaultJob : .NET Core 3.1.4 (CoreCLR 4.700.20.20201, CoreFX 4.700.20.22101), X64 RyuJIT


| Method  |     Mean |    Error |   StdDev |
|-------- |---------:|---------:|---------:|
| Decrypt | 15.31 ms | 0.305 ms | 0.669 ms |

Decryption time drops drastically when running on an Azure VM in the same region as the Key Vault. This is expected: most of the latency on the local machine comes from internet connectivity. However, the time is not zero. At roughly 15 ms per call, decrypting the DEKs for 1 million messages sequentially would take over 4 hours, and that is before decrypting the actual data. Azure Key Vault is a fantastic piece of technology, but it helps to know its limitations too.
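The 4-hour figure is simple arithmetic, assuming strictly sequential calls at the measured mean:

```csharp
using System;

class SequentialEstimate
{
    static void Main()
    {
        // Back-of-the-envelope estimate at the measured ~15.31 ms per call.
        double perCallMs = 15.31;
        long messages = 1_000_000;
        double totalHours = messages * perCallMs / 1000.0 / 3600.0;
        Console.WriteLine($"{totalHours:F2} hours"); // ≈ 4.25 hours
    }
}
```

Parallelizing the calls would cut wall-clock time, but then the throttling limits mentioned earlier come into play.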

I hope this research helps when you are designing a system's disaster recovery response and trying to achieve your RTO and RPO targets.
