Simple How-to: Upload a file to Azure Storage using the REST API

There are a lot of different ways to make this happen but, like before, I was looking for the “quick and easy way” to just get it done. So here is a condensed version. Please send me feedback if you find errors or need clarification on any part. I would also like to point you to the official Azure Storage API documentation.

Tools

For testing the REST APIs I recommend using Postman.

Create a file storage

First you need to create a file storage in Azure. More information can be found here.
For this I created a storage account called bip1diag306 (fantastic name, I know), added a file share called “mystore”, and lastly added a subdirectory called “mysubdir”. These names are important for understanding the HTTP URIs later in this post.

Create a SAS key

In order to give access to your files you can create a SAS key using the Azure Portal. The SAS key is very useful since it is secure, dependable, easy to use and can be set to expire if you need it.
At the moment, a SAS key created in the portal can only be set for the entire storage account. It is possible to create a key for a particular folder, but in that case you have to use code.
To create a SAS key using the portal, open the overview for the storage account and look in the menu to the left. Find “Shared Access Signature” and click it.

Select the access options you want, but at least make sure that the File service and the Create permission are selected. If you just want to get things working, select everything and make sure the start date and time are correct. Since I work from Stockholm, the default UTC time zone means I can easily end up creating keys that do not start working until an hour from now. I usually set the start date to “yesterday” just to be sure and then set the expiration to “next year”.

Click the “Generate SAS” button. The value in “SAS Token” is very important. Copy it for safekeeping until later.

Create and then upload

The thing that might be confusing is that the upload must happen in two steps. First you create the space for the file, and then you upload the content. I was initially looking for an “upload file” API, but this two-step approach is the way to do it.

There are a lot more things you can configure when calling this API. The full documentation can be found here. Note that the security model in that documentation differs from the one in this article.

Create

First you need to call the service to make room for your file.
Use Postman to issue a call configured like this:

VERB: PUT
URI: https://[storagename].file.core.windows.net/[sharename][/subdir]/[filename][Your SAS Key from earlier]
HEADERS:
x-ms-type:file
x-ms-content-length:[file size in bytes]

Example

So, if I was tasked with uploading a 102-byte file called myfile.txt to the share above, the call would look like this:

VERB: PUT
URI: https://bip1diag306.file.core.windows.net/mystore/mysubdir/myfile.txt?sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-06-01T21:27:52Z&st=2016-06-01T13:27:52Z&spr=https&sig=[signaturegoeshere]

HEADERS:
x-ms-type:file
x-ms-content-length:102
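
If you would rather make the same Create call from code than from Postman, here is a minimal Python sketch using the requests library. It reuses the example account, share and path from above, and the SAS token is the same placeholder value, so substitute your own before running it.

import os
import requests

# Placeholder values from the example above; replace them with your own.
sas_token = "sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-06-01T21:27:52Z&st=2016-06-01T13:27:52Z&spr=https&sig=signaturegoeshere"
file_path = "myfile.txt"
file_size = os.path.getsize(file_path)  # 102 in the example above
# Step 1, "Create": reserve space for the file on the share.
create_url = "https://bip1diag306.file.core.windows.net/mystore/mysubdir/myfile.txt?" + sas_token
response = requests.put(
    create_url,
    headers={
        "x-ms-type": "file",
        "x-ms-content-length": str(file_size),
    },
)
response.raise_for_status()  # the service answers 201 Created on success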

Upload

Now it is time to upload the file, or rather to fill the space we created in the previous call. Once again there is a lot more you can set when uploading a file. Consult the documentation.
Use Postman to issue a call configured like this:

VERB: PUT
URI: https://[storagename].file.core.windows.net/[sharename][/subdir]/[filename]?comp=range&[Your SAS Key from earlier] (remove the leading ?-sign from the SAS token you copied from the portal, since the URI already contains one).

Note that you have to add comp=range as an operation.

HEADERS:
x-ms-write:update
x-ms-range:bytes=[startbyte]-[endbyte]
content-length:[empty]

Looking at the headers, the first one means that we want to “update the data on the storage”.
The second one is a bit trickier. It tells the service what part of the space on the storage account to update, or what part of the file if you will. Usually this is the whole file, so you set the startbyte to 0 and the endbyte to the length of the file in bytes minus 1.
The last one is content-length. This is the length of the request body in bytes. In Postman, this value cannot be set manually; it is filled in automatically based on the size of the request body. If you are using some other method for sending the request, you have to calculate the value yourself.

Example

So, returning to the 102-byte file earlier, the call would look like this:

VERB: PUT
URI: https://bip1diag306.file.core.windows.net/mystore/mysubdir/myfile.txt?comp=range&sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-06-01T21:27:52Z&st=2016-06-01T13:27:52Z&spr=https&sig=[signaturegoeshere]

HEADERS:
x-ms-write:update
x-ms-range:bytes=0-101
content-length:

The request body is the file content itself, in clear text.
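
For completeness, here is the matching upload (Put Range) call as a Python sketch with the requests library, again using the example values from above as placeholders. Note that requests calculates the content-length header for you, just like Postman does.

import os
import requests

# Same placeholder values as in the Create sketch above; replace with your own.
sas_token = "sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-06-01T21:27:52Z&st=2016-06-01T13:27:52Z&spr=https&sig=signaturegoeshere"
file_path = "myfile.txt"
file_size = os.path.getsize(file_path)  # 102 in the example above
# Step 2, "Upload": write the bytes into the space created earlier.
# comp=range comes first, then the SAS token without its leading ?-sign.
upload_url = "https://bip1diag306.file.core.windows.net/mystore/mysubdir/myfile.txt?comp=range&" + sas_token
with open(file_path, "rb") as f:
    body = f.read()
response = requests.put(
    upload_url,
    headers={
        "x-ms-write": "update",
        "x-ms-range": "bytes=0-" + str(file_size - 1),  # whole file: 0 to length minus 1
    },
    data=body,  # requests sets content-length automatically
)
response.raise_for_status()  # the service answers 201 Created on success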

Limitations

There are limitations to the storage service, and one of them impacted me personally: you can only upload 4 MB “chunks” per upload. So if your file exceeds 4 MB, you have to split it into parts and upload each part as a separate range, as sketched below. If you are a good programmer you can make use of tasks and await to upload several parts in parallel. Please consult the Azure limits documentation to see if any other restrictions apply.
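
As a rough illustration, this is what a simple, sequential chunked upload could look like with the same requests-based approach as before. The file name bigfile.bin is made up and the SAS token is a placeholder; a parallel version would send several ranges at the same time instead of looping.

import os
import requests

# Placeholder values; replace with your own account, share, path and SAS token.
sas_token = "sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-06-01T21:27:52Z&st=2016-06-01T13:27:52Z&spr=https&sig=signaturegoeshere"
base_url = "https://bip1diag306.file.core.windows.net/mystore/mysubdir/bigfile.bin"
file_path = "bigfile.bin"
file_size = os.path.getsize(file_path)
chunk_size = 4 * 1024 * 1024  # at most 4 MB per Put Range call
# Create the file once with its full size...
requests.put(
    base_url + "?" + sas_token,
    headers={"x-ms-type": "file", "x-ms-content-length": str(file_size)},
).raise_for_status()
# ...then upload it one 4 MB range at a time.
with open(file_path, "rb") as f:
    offset = 0
    while offset < file_size:
        chunk = f.read(chunk_size)
        requests.put(
            base_url + "?comp=range&" + sas_token,
            headers={
                "x-ms-write": "update",
                "x-ms-range": "bytes=" + str(offset) + "-" + str(offset + len(chunk) - 1),
            },
            data=chunk,
        ).raise_for_status()
        offset += len(chunk)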

Conclusion

There are a lot of tools out there to help you upload files to your storage. This approach is useful when automating information flow; we used it to send data from IBM DataPower to Azure Storage. Integrate everything!