Power Community

Some tips to improve the security of your Azure Blob Storage (especially with Dynamics 365 Business Central)

These days I'm happy to see that, after my latest Azure training at Microsoft Italy (three weeks ago), many partners are starting to use Azure Blob Storage in their Dynamics 365 Business Central projects.

Some of you asked about the security of Azure Blob Storage when storing important information from Dynamics 365 Business Central. Here is what I personally recommend if you want to increase security:

Disable anonymous access

This may seem obvious, but I sometimes see blob storage accounts with anonymous access enabled. Azure Blob Storage supports optional anonymous public read access to containers and blobs, but this presents a security risk.

To disable it, click on Configuration and then set Allow Blob public access to Disabled.

Enable infrastructure encryption

When you create a storage account, all data is automatically encrypted at the service level using 256-bit AES encryption by default.

Customers who require higher levels of assurance that their data is secure can also enable 256-bit AES encryption at the Azure Storage infrastructure level for double encryption. Double encryption of Azure Storage data protects against a scenario where one of the encryption algorithms or keys may be compromised. In this scenario, the additional layer of encryption continues to protect your data.

To do that, when you create a storage account, go to the Encryption tab and select the Enable infrastructure encryption option.

Enable storage key rotation

The most common way to access a storage account from Dynamics 365 Business Central is by using access keys. You should avoid distributing access keys to other users, hard-coding them in your code, or saving them in plain text anywhere that is accessible to others.
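As a minimal sketch of this principle, instead of hard-coding a key you can read it at runtime from an environment variable (populated by your deployment pipeline or a secret store such as Azure Key Vault). The variable name AZURE_STORAGE_KEY below is just an assumption for this illustration:

```python
import os

def get_storage_key() -> str:
    """Read the account access key from the environment instead of source code."""
    key = os.environ.get("AZURE_STORAGE_KEY")
    if not key:
        raise RuntimeError("AZURE_STORAGE_KEY is not set; configure it in a secret store")
    return key

# Demonstration only: in production the variable is set outside the code,
# e.g. by a deployment pipeline or a Key Vault reference.
os.environ["AZURE_STORAGE_KEY"] = "demo-key"
print(get_storage_key())  # demo-key
```

This keeps the key out of version control, so rotating it (see below) never requires a code change.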

To limit the impact of a compromised key, it's recommended to rotate those access keys periodically. A key expiration policy enables you to set a reminder for the rotation of the account access keys; the reminder is displayed if the specified interval has elapsed and the keys have not yet been rotated. After you create a key expiration policy, you can monitor your storage accounts for compliance to ensure that the account access keys are rotated regularly.
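The compliance check described above boils down to comparing each key's last-rotation date against the policy interval. This is a minimal illustration of the idea (not how Azure implements it); the key names and timestamps are made up:

```python
from datetime import datetime, timedelta, timezone

def keys_due_for_rotation(last_rotated: dict, max_age_days: int) -> list:
    """Return the names of access keys older than the expiration policy interval."""
    now = datetime.now(timezone.utc)
    limit = timedelta(days=max_age_days)
    return sorted(name for name, rotated in last_rotated.items()
                  if now - rotated > limit)

# Hypothetical last-rotation timestamps for the two account access keys.
rotations = {
    "key1": datetime.now(timezone.utc) - timedelta(days=120),
    "key2": datetime.now(timezone.utc) - timedelta(days=10),
}
print(keys_due_for_rotation(rotations, max_age_days=90))  # ['key1']
```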

To do that, you can select the Access keys blade and then click on Set rotation reminder.

Please note that if the button is grayed out, you need to rotate each of your access keys manually at least once in order to activate it.

In Set a reminder to rotate access keys, select the Enable key rotation reminders checkbox and set a frequency for the reminder.

To regenerate keys, just click on Rotate key.

Please remember that you then need to update the keys in all the applications that use them.

Monitor your Azure Storage account by configuring a Diagnostic Setting

The Overview page in the Azure portal for each Blob storage resource includes a brief view of the resource usage, such as requests and hourly billing. This information is useful, but only a small amount of the monitoring data is available.

Azure Blob Storage creates monitoring data by using Azure Monitor. Resource Logs are not collected and stored until you create a diagnostic setting and route them to one or more locations.

To collect resource logs, you must create a diagnostic setting. When you create the setting, choose blob as the type of storage that you want to enable logs for.

Then, specify one or more of the following categories of operations for which you want to collect logs:

  • StorageRead: read operations on objects.
  • StorageWrite: write operations on objects.
  • StorageDelete: delete operations on objects.

Then you can specify the destination.

Now you can set up alerts in Azure Monitor, or you can analyze all the activities on your storage account, for example with Log Analytics and KQL. As an example, this query generates a summary of operations in the last day:

StorageBlobLogs
| where TimeGenerated > ago(1d)
| summarize count() by OperationName
| sort by count_ desc

Use private endpoints with external clients

If you have applications or virtual machines that need to use the storage account, you can use private endpoints for your Azure Storage accounts to allow clients on a virtual network (VNet) to securely access data over a Private Link.

Use AAD-based access whenever possible for external applications

If you have external applications that need to interact with your storage account (for example, for reading or writing blobs), use Azure Active Directory authentication for them to access the storage account and assign permissions accordingly (Storage Blob Data Reader, Storage Blob Data Contributor, etc.).

More information on how to enable it can be found here.

Use Stored Access Policies with Shared Access Signatures

Shared Access Signatures (SAS) are a nice way to grant limited access to your storage account. They give you granular control over how a client can access your data, such as:

  • What resources the client may access.
  • What permissions they have to those resources.
  • How long the SAS is valid.

A stored access policy provides an additional level of control over service-level shared access signatures (SASs) on the server side. Establishing a stored access policy serves to group shared access signatures and to provide additional restrictions for signatures that are bound by the policy.

You can create Stored Access Policies for each container you want to give access to. Once the policy is created for the respective container, you can create a SAS token referencing that Access Policy. You can use a stored access policy to change the start time, expiry time, or permissions for a signature. You can also use a stored access policy to revoke a signature after it has been issued.
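To illustrate the mechanism, a SAS token is essentially an HMAC signature over the access parameters, computed with the account key, so a client cannot change permissions or expiry without invalidating the signature. With a stored access policy, the token only carries the policy identifier, and the permissions and expiry live server-side where they can be changed or revoked. The sketch below is a simplified conceptual illustration, not the actual Azure SAS string-to-sign format, and the key and policy name are made up:

```python
import base64
import hashlib
import hmac
from urllib.parse import urlencode

def sign_sas(account_key_b64: str, params: dict) -> str:
    """Simplified sketch: sign the sorted access parameters with the account key."""
    key = base64.b64decode(account_key_b64)
    string_to_sign = "\n".join(f"{k}={params[k]}" for k in sorted(params))
    signature = base64.b64encode(
        hmac.new(key, string_to_sign.encode(), hashlib.sha256).digest()
    ).decode()
    return urlencode({**params, "sig": signature})

demo_key = base64.b64encode(b"demo-account-key").decode()
# 'si' references the stored access policy on the container; permissions
# and expiry are defined by the policy, not embedded in the token itself.
token = sign_sas(demo_key, {"sv": "2021-08-06", "sr": "c", "si": "readonly-policy"})
print(token)
```

Because the policy identifier is part of the signed string, changing it (or any other parameter) produces a different signature, which the service would reject.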

To learn how to create stored access policies, you can check here.

Checking all of these things may sometimes seem like too much, but remember that in the cloud world security is never too much. I always suggest checking these points when using Azure Storage. I hope these tips can be useful for you too.

This post was originally published on this site
