Importing from Azure Blob Storage

Import bulk data files directly from Azure Blob Storage

Written by Michael Fatica
Updated yesterday

MetaLocator supports importing location data directly from Microsoft Azure Blob Storage. This allows you to store your CSV data files in an Azure Storage container and import them into MetaLocator either on-demand or on a recurring schedule.

Getting Your Azure Credentials

Option A: SAS Token (Recommended)

A SAS token provides scoped, time-limited access to your storage container without exposing your full account key.

  1. Sign in to the Azure Portal

  2. Navigate to your Storage Account

  3. In the left menu, select Shared access signature (under Security + networking)

  4. Configure the SAS settings:

    • Allowed services: Blob

    • Allowed resource types: Container, Object

    • Allowed permissions: Read, List

    • Set an appropriate expiry date

  5. Click Generate SAS and connection string

  6. Copy the SAS token value (it starts with ?sv=)
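Before pasting a SAS token into MetaLocator, it can help to sanity-check its query parameters, since the permissions and expiry are encoded directly in the token. A minimal Python sketch using only the standard library (the token value below is a made-up placeholder, not a real signature):

```python
from urllib.parse import parse_qs

def describe_sas_token(sas_token: str) -> dict:
    """Parse the key fields of an Azure SAS token query string."""
    params = parse_qs(sas_token.lstrip("?"))
    return {
        "version": params.get("sv", [None])[0],      # signed service version
        "permissions": params.get("sp", [None])[0],  # e.g. "rl" = Read + List
        "expiry": params.get("se", [None])[0],       # ISO 8601 expiry time
    }

# A token generated with Read + List permissions should show "rl" here:
info = describe_sas_token(
    "?sv=2023-01-03&ss=b&srt=sco&sp=rl&se=2025-12-31T00:00:00Z&sig=abc123"
)
```

If `permissions` does not contain both `r` and `l`, regenerate the token with Read and List checked.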

Option B: Account Key

The account key provides full access to all containers in your storage account.

  1. Sign in to the Azure Portal

  2. Navigate to your Storage Account

  3. In the left menu, select Access keys (under Security + networking)

  4. Click Show next to one of the keys

  5. Copy the Key value (a Base64-encoded string)

Security Note: SAS tokens are recommended over account keys because they can be scoped to specific containers, limited to read-only access, and set to expire after a defined period.

Step-by-Step Import Guide

Step 1: Start the Import

  1. Log in to your MetaLocator admin panel

  2. Navigate to the Import section

  3. Click the MS Azure Blob Storage button

Step 2: Enter Connection Details

Fill in the following fields on the Azure Blob Storage configuration form:

  • Storage Account Name: the name of your Azure storage account, not the full URL (e.g., mystorageaccount)

  • Container Name: the name of the blob container holding your files (e.g., my-container)

  • Authentication Method: choose between SAS Token and Account Key

  • SAS Token: your SAS token, if using SAS authentication (e.g., ?sv=2023-01-03&ss=b&srt=sco&sp=rl...)

  • Account Key: your account key, if using Account Key authentication (a Base64-encoded string)
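MetaLocator asks for the bare account name because Azure derives the full blob endpoint from it. The relationship, sketched in Python (the account and container names are hypothetical examples):

```python
def blob_container_url(account_name: str, container_name: str, sas_token: str = "") -> str:
    """Build the container URL that Azure derives from the bare account name."""
    url = f"https://{account_name}.blob.core.windows.net/{container_name}"
    # A SAS token, when present, is appended as the query string.
    return url + sas_token if sas_token.startswith("?") else url

url = blob_container_url("mystorageaccount", "my-container", "?sv=2023-01-03&sp=rl")
```

If you find yourself pasting a string containing `blob.core.windows.net` into the Storage Account Name field, trim it back to just the account name.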

Step 3: Choose a File Selection Mode

MetaLocator provides two ways to select which file to import:

Select File Mode

Use this mode when you want to choose a specific file from your container.

  1. Choose Select File as the file selection mode

  2. Optionally enter a Path Prefix to filter files by folder (e.g., data/ to only list files inside the data folder)

  3. Click Test Connection & Load Files

  4. If the connection is successful, a dropdown will populate with the CSV files found in your container

  5. Select the file you wish to import from the dropdown

File Mask Mode

Use this mode when your file names follow a naming pattern and you always want to import the most recent matching file. This is especially useful for automated or recurring imports.

  1. Select File Mask as the file selection mode

  2. Enter a File Mask using * as a wildcard (e.g., partnerlocator*.csv matches any file that starts with partnerlocator and ends with .csv)

  3. Optionally enter a Path Prefix to narrow the search to a specific folder

  4. Click Test Connection & Load Files

  5. If matching files are found, a preview table will show the files that match your pattern, sorted by most recent first

  6. The most recent matching file will be used during import

Step 4: Test the Connection

After entering your credentials and file selection settings, click Test Connection & Load Files to verify:

  • Your credentials are valid

  • The container is accessible

  • CSV files are available at the specified location

If the connection is successful, you will see a confirmation message indicating how many file(s) were found. If there is an error, a message will describe the issue (e.g., invalid credentials, container not found).

Step 5: Start the Import

  1. Once your file is selected and the connection is verified, click Next

  2. The import will begin as a background job

  3. A progress indicator will show the status of your import

  4. When complete, you will see a confirmation message

Keeping Data in Sync (Recurring Imports)

If your plan supports background processing, you will see a Keep Azure Blob Data In Sync? checkbox on the configuration form.

Enabling this option will:

  • Create a recurring background job that runs automatically every 24 hours

  • Download the latest file from your Azure container using the same connection settings and file selection criteria

  • Import the data into MetaLocator, keeping your location data up to date with changes in your source file

This is particularly useful when combined with File Mask mode, as it will always pick up the most recently modified file matching your pattern.

Troubleshooting

Connection Errors

  • Authentication failed (HTTP 403): invalid or expired credentials. Verify your SAS token has not expired and has Read and List permissions, or verify your account key is correct.

  • Container not found (HTTP 404): incorrect container name or storage account name. Double-check both for typos.

  • Connection error: network or DNS issue. Verify the storage account name is correct and the account is accessible.

No Files Found

  • Ensure your CSV files have a .csv file extension

  • If using a path prefix, confirm the prefix matches the folder structure in your container (e.g., data/ not /data/)

  • If using file mask mode, verify your wildcard pattern matches the file naming convention
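Blob names in Azure never begin with a slash, which is why a leading slash in the path prefix matches nothing. A quick illustration of the prefix filter (blob names are hypothetical):

```python
blob_names = ["data/locations.csv", "data/archive/old.csv", "readme.txt"]

def filter_by_prefix(names: list[str], prefix: str) -> list[str]:
    """Mimic a blob prefix filter; blob names never start with '/'."""
    return [n for n in names if n.startswith(prefix)]

good = filter_by_prefix(blob_names, "data/")   # matches both CSV files
bad = filter_by_prefix(blob_names, "/data/")   # matches nothing
```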

Import Fails After Download

  • Verify that your CSV file is properly formatted with column headers in the first row

  • Check that the file is not empty or corrupted

  • Review the import job status in the Jobs section of the admin panel for detailed error messages
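Before re-running an import, you can verify the file locally. A minimal check, assuming a comma-delimited file with column headers in the first row (the sample rows are invented):

```python
import csv
import io

def validate_csv(text: str) -> tuple[bool, str]:
    """Check that CSV text is non-empty and has a header row plus data rows."""
    rows = list(csv.reader(io.StringIO(text)))
    if not rows or not any(cell.strip() for cell in rows[0]):
        return False, "missing header row"
    if len(rows) < 2:
        return False, "no data rows"
    return True, f"{len(rows[0])} columns, {len(rows) - 1} data rows"

ok, detail = validate_csv("name,lat,lng\nStore A,40.7,-74.0\n")
```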

SAS Token Expiration

SAS tokens have an expiry date. If your recurring import stops working, check whether your SAS token has expired and generate a new one if needed. When updating the SAS token:

  1. Navigate to the Import section

  2. Select Azure Blob Storage

  3. Enter the new SAS token

  4. Click Test Connection & Load Files to verify

  5. Proceed with the import to update the saved credentials

Supported File Format

Azure Blob Storage import supports CSV files only. Your CSV file should:

  • Have column headers in the first row

  • Use standard CSV formatting (comma-separated values)

  • Have a .csv file extension

Best Practices

  1. Use SAS tokens instead of account keys for better security

  2. Set SAS token expiry dates that are reasonable for your use case and renew them before they expire

  3. Create a container specifically for MetaLocator

  4. Use file masks for recurring imports to automatically pick up the latest file

  5. Organize files with path prefixes to keep your container organized and make file selection easier

  6. Test the connection before starting each import to catch credential or access issues early
