Azure API Management caching catch

It's a best practice to store secrets in Azure Key Vault, and when you need them in an Azure API Management policy, you can retrieve them using a managed identity.

Reading the secret from Key Vault takes just a small piece of policy:

<send-request mode="new" response-variable-name="keyvaultResponse" timeout="20" ignore-error="false">
    <set-url>https://didago-kv.vault.azure.net/secrets/my-secret/d24b7ce4e3a54343b9cf0da3b6bfe156/?api-version=7.0</set-url>
    <set-method>GET</set-method>
    <authentication-managed-identity resource="https://vault.azure.net" />
</send-request>

The response contains the secret, which can be read and stored in a local variable like this:

<set-variable name="myCachedSecret" value="@{ var secret=((IResponse)context.Variables["keyvaultResponse"]).Body.As<JObject>(); return secret["value"].ToString(); }" />

Now that the value is in a variable, we can use it elsewhere in the policy.
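For example (a sketch; the header name is hypothetical and depends on your backend), the secret could be forwarded to the backend as a header:

```xml
<!-- Hypothetical example: pass the cached secret to the backend as a function key header -->
<set-header name="x-functions-key" exists-action="override">
    <value>@((string)context.Variables["myCachedSecret"])</value>
</set-header>
```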

This piece of policy results in a call to Key Vault on every request, and calling Key Vault too frequently can be a problem. It can lead to throttling, and Key Vault is also priced per transaction (€0.026 per 10,000 transactions).

The solution to this problem is caching the secret value, which limits the load on Key Vault, improves performance, and saves money.

Caching

The simplest way to cache values is to use the out-of-the-box policies.

You have one for looking up a value in the cache:

<cache-lookup-value key="kvValue" variable-name="myCachedSecret" default-value="" />

And one for storing a value in the cache:

<cache-store-value key="kvValue" value="@((string)context.Variables["myCachedSecret"])" duration="3600" />

By adding these to the policy, we have a cached secret that expires and is refreshed every hour (3600 seconds). The full policy looks like this:

<cache-lookup-value key="kvValue" variable-name="myCachedSecret" default-value="" />
<choose>
    <when condition="@((string)context.Variables["myCachedSecret"] == "")">
        <send-request mode="new" response-variable-name="keyvaultResponse" timeout="20" ignore-error="false">
            <set-url>https://didago-kv.vault.azure.net/secrets/my-secret/d24b7ce4e3a54343b9cf0da3b6bfe156/?api-version=7.0</set-url>
            <set-method>GET</set-method>
            <authentication-managed-identity resource="https://vault.azure.net" />
        </send-request>
        <set-variable name="myCachedSecret" value="@{ var secret = ((IResponse)context.Variables["keyvaultResponse"]).Body.As<JObject>(); return secret["value"].ToString(); }" />
        <cache-store-value key="kvValue" value="@((string)context.Variables["myCachedSecret"])" duration="3600" />
    </when>
</choose>

The catch

We use this piece of policy in a lot of APIs, and while testing we ran into some unexpected behavior. With the tracing feature we could track when the cache was hit, and it seemed the wrong value was read from the cache. Since we reuse this piece of policy, including the cache keys, this led to the idea that cache keys are not isolated per API, but shared within a service instance.

After posting a question to the community, Microsoft confirmed that this is by design when using the built-in cache. If you need more control over your caching, you should look at an external cache.

So keep in mind, when you use the cache-lookup-value policy with caching-type ‘internal’, to use unique cache key names to avoid unexpected behavior.
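One way to keep keys unique (a sketch, relying on the fact that the key attribute accepts a policy expression) is to include the API id from the policy expression context in the key:

```xml
<!-- Prefix the cache key with the API id so policies in different APIs don't collide on 'kvValue' -->
<cache-lookup-value key="@("kvValue-" + context.Api.Id)" variable-name="myCachedSecret" default-value="" />
...
<cache-store-value key="@("kvValue-" + context.Api.Id)" value="@((string)context.Variables["myCachedSecret"])" duration="3600" />
```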

The benefit

However, shared cache keys also have an advantage: they give you an easy way to invalidate a cache key.

For example, you could create an API that takes the cache key as a parameter and executes the <cache-remove-value key="myCachedSecret" /> policy on that call to remove the key from the cache.
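A minimal sketch of such an operation's inbound policy (the query parameter name ‘key’ is an assumption) could look like this:

```xml
<inbound>
    <base />
    <!-- Remove the cache entry named in the ?key= query parameter -->
    <cache-remove-value key="@(context.Request.Url.Query.GetValueOrDefault("key", ""))" />
    <!-- Reply directly without calling a backend -->
    <return-response>
        <set-status code="200" reason="OK" />
    </return-response>
</inbound>
```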