Loading data into Azure SQL (with Azure Data Studio and Azure Blob Storage)

When you want to start working with a database (no matter the type, flavor, or version), you need some data. As I started my journey with Azure SQL, this was one of my first objectives: get this data from spreadsheet X into database Y. As with all database flavors, there are many tools, UIs, and command-line utilities to accomplish this. For this post, we are going to keep it as simple as possible and use Azure Data Studio for all our needs. Azure Data Studio is built on Visual Studio Code, so it provides a very familiar environment for working with SQL databases, extensions and all.

Create the environment

To start, we need an Azure account, which you can get for free here. One of the great advantages of Azure is that you can use the same login you have for your Xbox Live account. Or, if you have an Office 365 account, that will work as well, making it super easy to get started.

Once you have the account and are logged into the Azure Portal, you need to create a new Azure SQL database as well as an Azure blob storage container. You can see how to quickly do that with the following links:

Quickstart: Create a single database – Azure SQL Database

Quickstart: Upload, download, and list blobs with the Azure portal

These two quickstarts will get you up and running in no time. You can also use the Azure CLI to create these resources:

Create and connect to an Azure SQL DB in 6 easy steps

Manage blob containers using Azure CLI
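If you prefer the CLI route, the whole environment can be sketched out in a handful of commands. This is just a sketch: the resource names (my-resource-group, mysqlserver42, bad-pets-db, and so on) are placeholders I made up for illustration; substitute your own, and pick a strong admin password.

```shell
# Resource group to hold everything (names below are placeholders)
az group create --name my-resource-group --location eastus

# Logical SQL server and a small single database
az sql server create --name mysqlserver42 --resource-group my-resource-group \
    --location eastus --admin-user sqladmin --admin-password '<StrongPassword!>'
az sql db create --name bad-pets-db --server mysqlserver42 \
    --resource-group my-resource-group --service-objective S0

# Storage account and a private blob container
az storage account create --name supercoolblobstorageaccount \
    --resource-group my-resource-group --location eastus --sku Standard_LRS
az storage container create --name datafiles \
    --account-name supercoolblobstorageaccount --public-access off
```

Note that storage account names must be globally unique and lowercase, so the example name above will need tweaking.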

Connect to the Database

Once the database is created, we need to connect to it so that we can start creating objects and loading data. The Azure SQL Database details page offers a few methods to get database connection information for various languages, drivers, and use cases, but we are going to use the magic connect with Azure Data Studio button.

While on the database details page, simply click the “Connect with…” drop-down and select Azure Data Studio.

This will open Azure Data Studio and prefill the database details. All you need to provide is a method of authentication, be it a SQL login, Windows authentication, or Azure Active Directory.

Once you add the credentials you want to use, click the Connect button and that’s it: you are connected to your Azure SQL instance.

One of the many cool features of Azure Data Studio is that when you attempt to connect to a database and a firewall change needs to be made to allow your client access, it just pops up a window asking if it can make the change for you. No need to run any CLI commands or log into the Azure Portal; just click OK and done!

Now that you are connected, just right click on the database in the Connections blade and select New Query.
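Before moving on, a quick sanity check in the new query window is an easy way to confirm the connection is live. This is just an optional check, not part of the loading process:

```sql
-- Returns the engine version and the database you are connected to
SELECT @@VERSION AS sql_version, DB_NAME() AS current_database;
```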

Loading the Data

Now that we have our query worksheet open, it’s time to create a table and add data to it via a CSV file. The table we are going to create is a pretty simple one; we don’t need to think about optimization at this time, but know that it’s not a good idea to make all your columns nvarchar(max).

Curious about how Azure SQL or SQL Server in general stores data? Take a look at the Pages and Extents Architecture Guide. Proper data modeling makes everyone happy!

Here is our table. Copy and paste this code into the SQL Query worksheet in Azure Data Studio and run it.

create table [dbo].[bad_pets] (
    id int not null,
    pet_type varchar(100),
    reason varchar(400)
);

Upload the CSV file to Azure Blob Storage

Table created! Now we need to fill this table with important information about bad pets to have, and we happen to have just the CSV for the job, filled with critical information we need to know.

Download the bad_pets.csv file to your local computer. We will need to upload this file to Azure Blob Storage.
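You can upload the file through the portal as described below, or, as a CLI alternative, push it up with a single command. The account and container names here are the example names used later in this post; substitute your own.

```shell
# Upload the CSV from the current directory to the blob container
az storage blob upload \
    --account-name supercoolblobstorageaccount \
    --container-name datafiles \
    --name bad_pets.csv \
    --file ./bad_pets.csv
```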

When you created your storage container in the setups section, you might have noticed a few options for the access level.

For this post, we are going to use the Private access level to show you the most secure way to access files from Azure Blob Storage. Just know that if you chose the Blob or Container options, the files would be accessible without authentication; that makes loading them easier, but it also allows anyone and their cousin to read the file.

After you upload this file to your blob storage container, we can start the process of loading it into our table.

Create a Master Key and Scoped Credential

With the following code, we are going to set a master password to protect the credentials we are going to create and then create a scoped credential to use to access blob storage from within the database.

CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'SuperSecretPassword';

CREATE DATABASE SCOPED CREDENTIAL AzureBlobStorageCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'SharedAccesstoken';

For more information on master keys and scoped credentials, refer to the following documentation:

CREATE MASTER KEY (Transact-SQL)

CREATE DATABASE SCOPED CREDENTIAL (Transact-SQL)

In the code block to create the master key, be sure to replace ‘SuperSecretPassword’ with a more complex password.

CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'SuperSecretPassword';

Replace ‘SuperSecretPassword’ and run the code in Azure Data Studio.

Next, we need to create the Scoped Credential.

CREATE DATABASE SCOPED CREDENTIAL AzureBlobStorageCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'SharedAccesstoken';

The clause IDENTITY = ‘SHARED ACCESS SIGNATURE’ is required when the credential will be used to access external data sources with a SAS token. For ‘SharedAccesstoken’, you need to generate one on the container details page in the Azure Portal. Navigate to the container details page and click the Shared access tokens property.

And on the Shared access tokens page, in the lower part of the main section, click the blue Generate SAS token and URL button. You might want to extend the Expiry date on this page out a day so that you have some time to work with the file.
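As an alternative to the portal button, the same token can be generated from the Azure CLI. Again, the account and container names below are the example names from this post:

```shell
# Generate a read/list SAS token for the container
# (pick an --expiry date far enough out to finish the exercise)
az storage container generate-sas \
    --account-name supercoolblobstorageaccount \
    --name datafiles \
    --permissions rl \
    --expiry 2030-01-01 \
    --output tsv
```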

Copy the Blob SAS token (without the leading ‘?’, if one is present) and replace ‘SharedAccesstoken’ with this string.

CREATE DATABASE SCOPED CREDENTIAL AzureBlobStorageCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'SharedAccesstoken';
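If you want to double-check that the credential exists before moving on, the database exposes it through a catalog view (the secret itself is never shown):

```sql
-- List the database scoped credentials in this database
SELECT name, credential_identity
FROM sys.database_scoped_credentials;
```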

Create an External Data Source

Master password and external credential created, next let’s create a link from the Azure SQL database to our CSV file in Azure Blob Storage. For this command, we need the blob storage URL. The URL has the following syntax:

https://STORAGE_ACCOUNT_NAME.blob.core.windows.net/CONTAINER_NAME

With STORAGE_ACCOUNT_NAME being the name you used when creating the Azure Blob Storage account and CONTAINER_NAME being what you named the container where the bad_pets.csv file is. In my account, the storage account is named supercoolblobstorageaccount and the container is named datafiles. That results in my storage URL being:

https://supercoolblobstorageaccount.blob.core.windows.net/datafiles
Find your storage account name and container name, use those values in the code snippet below, and run it in Azure Data Studio.

CREATE EXTERNAL DATA SOURCE ExternalAzureBlobStorage
WITH (TYPE = BLOB_STORAGE,
	LOCATION = 'https://STORAGE_ACCOUNT_NAME.blob.core.windows.net/CONTAINER_NAME',
        CREDENTIAL = AzureBlobStorageCredential);
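As with the credential, there is a catalog view you can query to confirm the external data source was created and points at the right location:

```sql
-- Confirm the external data source and its location
SELECT name, type_desc, location
FROM sys.external_data_sources;
```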

Bulk Insert the Data

Now, it’s the time you have been waiting for! Let’s load the data into the table.

BULK INSERT [dbo].[bad_pets]
FROM 'bad_pets.csv'
WITH (DATA_SOURCE = 'ExternalAzureBlobStorage',
	FORMAT = 'CSV',
	FIRSTROW = 2); -- skips the header row; remove if your CSV has no header

And we can now query the table to see that the 6 rows have been inserted.

select * from dbo.bad_pets;


That’s it; just a few short steps and you have a table with data. Now that the credential and external data source are created, you can reuse this setup to load more data and skip straight to the bulk insert step. Azure SQL and Azure Blob Storage make it very easy to stage data of all types and load it directly into a table. And not just CSV files; the same process can be used for JSON and binary objects such as images. In fact, watch for another post about using JSON with Azure SQL coming soon.
