MetaLocator supports importing location data directly from Microsoft Azure Blob Storage. This allows you to store your CSV data files in an Azure Storage container and import them into MetaLocator either on-demand or on a recurring schedule.
Getting Your Azure Credentials
Option A: SAS Token (Recommended)
A SAS token provides scoped, time-limited access to your storage container without exposing your full account key.
Sign in to the Azure Portal
Navigate to your Storage Account
In the left menu, select Shared access signature (under Security + networking)
Configure the SAS settings:
Allowed services: Blob
Allowed resource types: Container, Object
Allowed permissions: Read, List
Set an appropriate expiry date
Click Generate SAS and connection string
Copy the SAS token value (it starts with ?sv=)
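Before pasting a SAS token into MetaLocator, you can inspect its query parameters to confirm it is unexpired and grants the Read and List permissions required above. The sketch below is illustrative and not part of MetaLocator; it uses only the Python standard library and the standard SAS parameter names (se for expiry, sp for permissions), and the sample token is made up.

```python
from datetime import datetime, timezone
from urllib.parse import parse_qs

def check_sas_token(sas_token: str) -> dict:
    """Parse a SAS token's query string and report whether it is still
    valid and grants the read (r) and list (l) permissions needed for
    a MetaLocator import."""
    params = parse_qs(sas_token.lstrip("?"))
    # 'se' is the signed expiry, e.g. 2030-01-01T00:00:00Z
    expiry = datetime.fromisoformat(params["se"][0].replace("Z", "+00:00"))
    # 'sp' is the signed permissions string, e.g. 'rl'
    permissions = params.get("sp", [""])[0]
    return {
        "expired": expiry < datetime.now(timezone.utc),
        "can_read": "r" in permissions,
        "can_list": "l" in permissions,
    }

# Hypothetical token with read+list permissions, expiring in 2030:
token = "?sv=2022-11-02&se=2030-01-01T00%3A00%3A00Z&sp=rl&sig=abc123"
print(check_sas_token(token))
# → {'expired': False, 'can_read': True, 'can_list': True}
```

If the result shows expired: True or a missing permission, generate a new SAS token rather than troubleshooting the import itself.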
Option B: Account Key
The account key provides full access to all containers in your storage account.
Sign in to the Azure Portal
Navigate to your Storage Account
In the left menu, select Access keys (under Security + networking)
Click Show next to one of the keys
Copy the Key value (a Base64-encoded string)
Security Note: SAS tokens are recommended over account keys because they can be scoped to specific containers, limited to read-only access, and set to expire after a defined period.
Step-by-Step Import Guide
Step 1: Start the Import
Log in to your MetaLocator admin panel
Navigate to the Import section
Click the MS Azure Blob Storage button
Step 2: Enter Connection Details
Fill in the following fields on the Azure Blob Storage configuration form:
Field | Description | Example |
Storage Account Name | The name of your Azure storage account (not the full URL) | |
Container Name | The name of the blob container holding your files | |
Authentication Method | Choose between SAS Token or Account Key | |
SAS Token | Your SAS token (if using SAS authentication) | |
Account Key | Your account key (if using Account Key authentication) | (Base64-encoded string) |
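The "not the full URL" note matters because the blob endpoint host is derived from the account name. A minimal sketch of how the container URL is assembled (the account and container names here are hypothetical):

```python
def container_url(account_name: str, container: str, sas_token: str = "") -> str:
    """Assemble the blob endpoint URL for a container. The form asks for
    the bare account name because the '<account>.blob.core.windows.net'
    host is built from it."""
    url = f"https://{account_name}.blob.core.windows.net/{container}"
    if sas_token:
        url += "?" + sas_token.lstrip("?")
    return url

print(container_url("mystorageaccount", "locations", "?sv=2022-11-02&sig=abc"))
# → https://mystorageaccount.blob.core.windows.net/locations?sv=2022-11-02&sig=abc
```

If you paste the full URL into the Storage Account Name field, the derived host becomes invalid, which typically surfaces as a connection or DNS error during the connection test.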
Step 3: Choose a File Selection Mode
MetaLocator provides two ways to select which file to import:
Select File Mode
Use this mode when you want to choose a specific file from your container.
Select Select File as the file selection mode
Optionally enter a Path Prefix to filter files by folder (e.g., data/ to only list files inside the data folder)
Click Test Connection & Load Files
If the connection is successful, a dropdown will populate with the CSV files found in your container
Select the file you wish to import from the dropdown
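The filtering described above can be sketched in a few lines of Python. This is an illustration of the behavior, not MetaLocator's implementation, and the blob names are made up: the dropdown is assumed to contain blobs whose names start with the path prefix and end in .csv.

```python
def list_csv_files(blob_names, path_prefix=""):
    """Mimic the dropdown population: keep only .csv blobs whose names
    fall under the given path prefix."""
    return [
        name for name in blob_names
        if name.startswith(path_prefix) and name.lower().endswith(".csv")
    ]

# Hypothetical container contents:
blobs = ["data/stores.csv", "data/archive/old.csv", "readme.txt", "logo.png"]
print(list_csv_files(blobs, "data/"))
# → ['data/stores.csv', 'data/archive/old.csv']
```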
File Mask Mode
Use this mode when your file names follow a naming pattern and you always want to import the most recent matching file. This is especially useful for automated or recurring imports.
Select File Mask as the file selection mode
Enter a File Mask using * as a wildcard (e.g., partnerlocator*.csv matches any file that starts with partnerlocator and ends with .csv)
Optionally enter a Path Prefix to narrow the search to a specific folder
Click Test Connection & Load Files
If matching files are found, a preview table will show the files that match your pattern, sorted by most recent first
The most recent matching file will be used during import
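The selection logic above can be sketched as follows. This is an approximation under stated assumptions, not MetaLocator's code: Python's fnmatch stands in for the * wildcard, and the blob names and last-modified dates are hypothetical.

```python
from fnmatch import fnmatch

def pick_latest_match(blobs, mask, path_prefix=""):
    """blobs: list of (name, last_modified) pairs. Return the name of the
    most recently modified blob whose file name matches the mask, or None
    if nothing matches."""
    candidates = [
        (name, modified) for name, modified in blobs
        if name.startswith(path_prefix)
        # Match the mask against the file name, not the full blob path
        and fnmatch(name.rsplit("/", 1)[-1], mask)
    ]
    if not candidates:
        return None
    return max(candidates, key=lambda pair: pair[1])[0]

# Hypothetical container listing (ISO dates compare correctly as strings):
blobs = [
    ("data/partnerlocator_2024-01.csv", "2024-01-31"),
    ("data/partnerlocator_2024-02.csv", "2024-02-29"),
    ("data/other.csv", "2024-03-01"),
]
print(pick_latest_match(blobs, "partnerlocator*.csv", "data/"))
# → data/partnerlocator_2024-02.csv
```

Note that other.csv is ignored despite being newest, because only files matching the mask are considered.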
Step 4: Test the Connection
After entering your credentials and file selection settings, click Test Connection & Load Files to verify:
Your credentials are valid
The container is accessible
CSV files are available at the specified location
If the connection is successful, you will see a confirmation message indicating how many file(s) were found. If there is an error, a message will describe the issue (e.g., invalid credentials, container not found).
Step 5: Start the Import
Once your file is selected and the connection is verified, click Next
The import will begin as a background job
A progress indicator will show the status of your import
When complete, you will see a confirmation message
Keeping Data in Sync (Recurring Imports)
If your plan supports background processing, you will see a Keep Azure Blob Data In Sync? checkbox on the configuration form.
Enabling this option will:
Create a recurring background job that runs automatically every 24 hours
Download the latest file from your Azure container using the same connection settings and file selection criteria
Import the data into MetaLocator, keeping your location data up to date with changes in your source file
This is particularly useful when combined with File Mask mode, as it will always pick up the most recently modified file matching your pattern.
Troubleshooting
Connection Errors
Error | Cause | Solution |
Authentication failed (HTTP 403) | Invalid or expired credentials | Verify your SAS token has not expired and has read+list permissions, or verify your account key is correct |
Container not found (HTTP 404) | Incorrect container name or storage account name | Double-check the storage account name and container name for typos |
Connection error | Network or DNS issue | Verify the storage account name is correct and the account is accessible |
No Files Found
Ensure your CSV files have a .csv file extension
If using a path prefix, confirm the prefix matches the folder structure in your container (e.g., data/ not /data/)
If using file mask mode, verify your wildcard pattern matches the file naming convention
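A common cause of the prefix mismatch above is a leading slash, since blob names never start with /. A small, hypothetical normalization you could apply to a prefix before entering it into the form:

```python
def normalize_prefix(prefix: str) -> str:
    """Strip a leading slash (blob names never start with '/') and keep
    a trailing slash so 'data/' does not also match 'database/...'."""
    prefix = prefix.lstrip("/")
    if prefix and not prefix.endswith("/"):
        prefix += "/"
    return prefix

print(normalize_prefix("/data"))
# → data/
```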
Import Fails After Download
Verify that your CSV file is properly formatted with column headers in the first row
Check that the file is not empty or corrupted
Review the import job status in the Jobs section of the admin panel for detailed error messages
SAS Token Expiration
SAS tokens have an expiry date. If your recurring import stops working, check whether your SAS token has expired and generate a new one if needed. When updating the SAS token:
Navigate to the Import section
Select Azure Blob Storage
Enter the new SAS token
Click Test Connection & Load Files to verify
Proceed with the import to update the saved credentials
Supported File Format
Azure Blob Storage import supports CSV files only. Your CSV file should:
Have column headers in the first row
Use standard CSV formatting (comma-separated values)
Have a .csv file extension
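You can sanity-check a file against these requirements locally before uploading it. The sketch below uses Python's standard csv module; the checks are general CSV hygiene, not MetaLocator's exact validation rules.

```python
import csv
import io

def validate_csv(text: str) -> str:
    """Basic pre-import checks: the file is non-empty, the first row
    contains non-blank column headers, and every data row has the same
    number of columns as the header."""
    rows = [row for row in csv.reader(io.StringIO(text)) if row]
    if not rows:
        return "file is empty"
    header = rows[0]
    if any(not h.strip() for h in header):
        return "blank column header"
    for i, row in enumerate(rows[1:], start=2):
        if len(row) != len(header):
            return f"row {i} has {len(row)} columns, expected {len(header)}"
    return "ok"

# Hypothetical two-line location file:
sample = "Name,Address,City\nAcme Store,1 Main St,Springfield\n"
print(validate_csv(sample))
# → ok
```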
Best Practices
Use SAS tokens instead of account keys for better security
Set SAS token expiry dates that are reasonable for your use case and renew them before they expire
Create a container specifically for MetaLocator
Use file masks for recurring imports to automatically pick up the latest file
Organize files with path prefixes to keep your container organized and make file selection easier
Test the connection before starting each import to catch credential or access issues early