When I was implementing an Azure File Sync solution, I found that it worked quite quickly for files uploaded from a server to cloud storage, because the agent installed on the server detects file changes. Unfortunately, when a file needs to be downloaded from cloud storage to on-premises, the agent follows its own schedule – once a day, as far as I can tell. When I delivered this message to the stakeholder, this is what I heard: “Can you synchronize files on your own schedule? Let’s say every 10 minutes.”
I thought there must be a way to make it happen. I didn’t know what to say, so I started looking through the documentation. I found that Microsoft made it clear they are working on service improvements, but that this type of task is a low priority for them. Nevertheless, there is an option for engineers to ask the Azure File Sync service to initiate the sync process on demand. I experimented with it for a while to understand how that piece of technology works and to find the best way forward in my situation. At the end of this research I had a short PowerShell script built around the Invoke-AzStorageSyncChangeDetection cmdlet.
A short PowerShell script to start a sync
Please see the result of my work below. You can run this script from a scheduled task as often as you want. I tested it with an every-minute schedule; it worked well, but felt like overkill.
The prerequisites are:
- Install the Az PowerShell module on the server
- Create a service principal
- Assign a certificate to the service principal for authentication
- Grant the service principal permissions to access the Storage Sync service and the storage account
- The server must have access to the Internet (at least to the Azure endpoints)
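The prerequisites above can be sketched roughly as follows. This is a hedged outline, not a definitive setup: the display name, certificate subject, role, and scope are example values, and parameter names (notably for New-AzADServicePrincipal) may differ between Az module versions, so check the reference for your installed version.

```powershell
# Sketch only - names, role and scope below are example assumptions.

# 1. Install the Az module on the server (run once, elevated)
Install-Module -Name Az -Scope AllUsers

# 2. Create a self-signed certificate the script will authenticate with
$cert = New-SelfSignedCertificate -Subject 'CN=AzFileSyncScript' `
    -CertStoreLocation 'Cert:\LocalMachine\My' -KeySpec Signature

# 3. Create a service principal with that certificate as its credential
#    (parameter names vary between Az module versions)
$sp = New-AzADServicePrincipal -DisplayName 'AzFileSyncScript' `
    -CertValue ([Convert]::ToBase64String($cert.GetRawCertData())) `
    -EndDate $cert.NotAfter

# 4. Grant the service principal access to the resource group that holds
#    the Storage Sync service and the storage account (example scope)
New-AzRoleAssignment -ApplicationId $sp.AppId -RoleDefinitionName 'Contributor' `
    -Scope '/subscriptions/<subscription-id>/resourceGroups/Azure-ResourceGroupName'
```

The certificate thumbprint ($cert.Thumbprint), the application ID and your tenant ID are the values the sync script below will prompt you for.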
This script saved a lot of time and made Azure File Sync service more useful for end users.
You can create a new file with the “.ps1” extension and paste the following content into it. When you run it, it will prompt you for all the details.
Param(
    # All parameters are mandatory, so PowerShell prompts for any value not supplied
    # on the command line; the comments show example values.
    [Parameter(Mandatory = $true, Position = 1)] [string]$CertThumbprint,         # e.g. 'application-auth-cert-thumbprint'
    [Parameter(Mandatory = $true, Position = 2)] [string]$ApplicationId,          # e.g. 'your-application-id'
    [Parameter(Mandatory = $true, Position = 3)] [string]$TenantId,               # e.g. 'your-tenant-id'
    [Parameter(Mandatory = $true, Position = 4)] [string]$Path,                   # e.g. 'DataExchange\CSV'
    [Parameter(Mandatory = $true, Position = 5)] [string]$ResourceGroupName,      # e.g. 'Azure-ResourceGroupName'
    [Parameter(Mandatory = $true, Position = 6)] [string]$StorageSyncServiceName, # e.g. 'Azure-StorageSyncServiceName'
    [Parameter(Mandatory = $true, Position = 7)] [string]$SyncGroupName           # e.g. 'Azure-SyncGroupName'
)
# Connect to Azure using the previously created service principal and certificate
Connect-AzAccount -CertificateThumbprint $CertThumbprint -ApplicationId $ApplicationId -ServicePrincipal -Tenant $TenantId
# Get the cloud endpoint name; a non-empty result also confirms the connection to Azure succeeded
$CloudEndpointName = (Get-AzStorageSyncCloudEndpoint -ResourceGroupName $ResourceGroupName -StorageSyncServiceName $StorageSyncServiceName -SyncGroupName $SyncGroupName).CloudEndpointName
# Ask the Azure File Sync service to scan the directory for changed files and start a sync
Invoke-AzStorageSyncChangeDetection -ResourceGroupName $ResourceGroupName -StorageSyncServiceName $StorageSyncServiceName -SyncGroupName $SyncGroupName -Name $CloudEndpointName -DirectoryPath $Path -Recursive
# Housekeeping - leave no open connections behind
Disconnect-AzAccount
Hope that helps you use Azure File Sync more effectively and synchronize files from the cloud on your own schedule.
The next step is to create a scheduled task that runs the script with the parameters you need, as often as you prefer.
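One way to register such a task is with the built-in ScheduledTasks module. This is a sketch under my assumptions: the script path, task name, and the 10-minute interval are examples, and because the script’s parameters are mandatory, they must all be passed on the command line – a non-interactive task cannot answer prompts.

```powershell
# Sketch: run the sync script every 10 minutes (example names and paths).
# All script parameters must be supplied here, since a scheduled task cannot
# respond to PowerShell's mandatory-parameter prompts.
$action  = New-ScheduledTaskAction -Execute 'powershell.exe' `
    -Argument ('-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Start-FileSync.ps1 ' +
               '-CertThumbprint <thumbprint> -ApplicationId <app-id> -TenantId <tenant-id> ' +
               '-Path "DataExchange\CSV" -ResourceGroupName Azure-ResourceGroupName ' +
               '-StorageSyncServiceName Azure-StorageSyncServiceName -SyncGroupName Azure-SyncGroupName')
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) `
    -RepetitionInterval (New-TimeSpan -Minutes 10)
Register-ScheduledTask -TaskName 'AzureFileSync-ChangeDetection' `
    -Action $action -Trigger $trigger -User 'SYSTEM' -RunLevel Highest
```

Running the task as SYSTEM is one choice; a dedicated service account with read access to the certificate’s private key works as well.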
See more posts about Azure services and their features here