
Why Do I Integrate?

I got a question from a colleague: “Why should I go to Integrate? Give me a reason.”

First off: if you need convincing to go to London for three days, have fun, and meet new people, then you are not conference material. Bye bye, and see you when I get home.

News?

Once, we went to conferences to get a heads-up on the news: what was coming and what was important. Nowadays we get the news over Twitter or Yammer, so that is not the reason.

Educational?

Once, this was the only way to get information about how to use new features, and which features to use when. Nowadays the sessions are online within an hour, so that is not the reason.

Social?

Once, we were wary of speaking to “the competition”. We stayed within our designated groups, fearful of saying something that might reveal too much about a client or a project. I remember very well trying to get two guys who had “reprogrammed the ESB Toolkit” to say why and what. I might just as well have asked them for the nuclear launch codes.

But we are getting better at this, and after a while we realized we could talk about other things besides work. We did things together, had dinner and beers, and had a good time.

This is one of the reasons but not the main one.

The passion <3

I am, as some know, a passionate guy. I…love…doing what I do for work. I love people who feel the same, and at Integrate I know I will meet my fellows. It is the place where I can be myself for three days. The only place I can discuss the merits of JSON vs XML for an hour, hear a crazy Italian guy passionately talking about his latest project, and shake the hand of that Kiwi guy who helped me get on board the Logic Apps train.

Then, you meet the people from the team in Redmond and you realize: they are just like you. Just as passionate and just as social.

Integrate is News, Integrate is Educational and most certainly Social, but most of all: It is the passion.

Hope to see you there, I will be the guy in the front row, asking questions and arranging dinner.


Simple How-to: Upload a file to Azure Storage using the REST API

There are a lot of different ways to make this happen but, like before, I was looking for the “quick and easy way” to just get it done. So here is a condensed version. Please send me feedback if you find errors or need clarification in any areas. I would also like to point to the official Azure Storage API documentation.

Tools

For testing the REST APIs, I recommend using Postman.

Create a file storage

First you need to create a file storage in Azure. More information can be found here.
For this I created a storage account called bip1diag306 (fantastic name, I know), added a file share called “mystore”, and lastly added a subdirectory called “mysubdir”. This is important for understanding the HTTP URIs later in this post.

Create a SAS key

In order to give access to your files you can create a SAS key, using the Azure Portal. The SAS key is very useful since it is secure, dependable, easy to use, and can be set to expire if you need it to.
At the moment, a SAS key created in the portal can only be set for the entire storage account. It is possible to scope a key to a particular folder, but in that case you have to use code.
To create a SAS key using the portal, open the overview for the storage account and look in the menu to the left. Find “Shared Access Signature” and click it.

Select the access options you want, but at the very least make sure that the File service and the Create permission are selected. If you just want to get things working, select everything and make sure the start date and time are correct. Since I work from Stockholm, the default UTC time would make me create keys that start working an hour from now. I usually set the start date to “yesterday” just to be sure, and then set the expiration to “next year”.

Click the “Generate SAS” button. The value in “SAS Token” is very important. Copy it for safekeeping until later.

Create and then upload

The thing that might be confusing is that the upload must happen in two steps: first you create the space for the file, and then you upload the content. This was very confusing to me at first. I was looking for a single “upload file” API, but this is the way to do it.

There are a lot more things you can configure when calling this API. The full documentation can be found here. Note that the security model in that documentation differs from the one in this article.

Create

First you need to call the service to make room for your file.
Use Postman to issue a call configured like this:

VERB: PUT
URI: https://[storagename].file.core.windows.net/[sharename][/subdir]/[filename][Your SAS Key from earlier]
HEADERS:
x-ms-type:file
x-ms-content-length:[file size in bytes]

Example

So, if I were tasked with uploading a 102-byte file called myfile.txt to the share above, the call would look like this:

VERB: PUT
URI: https://bip1diag306.file.core.windows.net/mystore/mysubdir/myfile.txt?sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-06-01T21:27:52Z&st=2016-06-01T13:27:52Z&spr=https&sig=signaturegoeshere

HEADERS:
x-ms-type:file
x-ms-content-length:102
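If you would rather script the call than click through Postman, here is a minimal sketch in Python using the requests library. The account, share, and file names are the ones from the example above; the sas variable is assumed to hold your SAS token from the portal, without the leading question mark.

import requests

account = "bip1diag306"
share = "mystore"
path = "mysubdir/myfile.txt"
sas = "sv=2016-05-31&...&sig=..."  # your SAS token, without the leading "?"

data = open("myfile.txt", "rb").read()
url = f"https://{account}.file.core.windows.net/{share}/{path}"

# Step 1: create (reserve space for) the file
resp = requests.put(
    f"{url}?{sas}",
    headers={
        "x-ms-type": "file",
        "x-ms-content-length": str(len(data)),  # 102 for the example file
    },
)
resp.raise_for_status()  # expect 201 Created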

Upload

Now, it is time to upload the file, or to fill the space we created in the last call. Once again, there is a lot more you can set when uploading a file. Consult the documentation.
Use Postman to issue a call configured like this:

VERB: PUT
URI: https://[storagename].file.core.windows.net/[sharename][/subdir]/[filename]?comp=range&[Your SAS Key from earlier] (remove the ?-sign you got when copying from the portal).

Note that you have to add comp=range as an operation.

HEADERS:
x-ms-write:update
x-ms-range:bytes=[startbyte]-[endbyte]
content-length:[empty]

Looking at the headers, the first one means that we want to “update the data on the storage”.
The second one is a bit trickier. It tells the service what part of the space on the storage account to update, or what part of the file if you will. Usually this is the whole file, so you set the start byte to 0 and the end byte to the length of the file in bytes minus 1.
The last one is content-length. This is the length of the request body in bytes. In Postman, this value cannot be set manually but is filled in for you automatically based on the size of the request body. If you are using some other method to send the request, you have to calculate the value yourself.

Example

So, returning to the 102-byte file earlier, the call would look like this:

VERB: PUT
URI: https://bip1diag306.file.core.windows.net/mystore/mysubdir/myfile.txt?comp=range&sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-06-01T21:27:52Z&st=2016-06-01T13:27:52Z&spr=https&sig=[signaturegoeshere]

HEADERS:
x-ms-write:update
x-ms-range:bytes=0-101
content-length:

The request body is the file content in clear text.
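Continuing the Python sketch from the create step (same url, sas, and data variables), the upload call looks like this. Note that requests calculates the content-length header for you, just like Postman does.

# Step 2: fill the reserved space with the file content
resp = requests.put(
    f"{url}?comp=range&{sas}",
    headers={
        "x-ms-write": "update",
        "x-ms-range": f"bytes=0-{len(data) - 1}",  # 0-101 for a 102-byte file
    },
    data=data,  # requests sets the content-length header automatically
)
resp.raise_for_status()  # expect 201 Created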

Limitations

There are limitations to the storage service, one of which impacted me personally: you can only upload 4 MB “chunks” per upload. So if your file exceeds 4 MB, you have to split it into parts, as the sketch below shows. If you are a good programmer you can make use of tasks and await to start multiple threads. Please consult the Azure limits documentation to see if any other restrictions apply.
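To illustrate, here is a sequential sketch of splitting a larger file into 4 MB ranges, reusing the variables from the earlier snippets. Parallelizing the range uploads is left out for brevity.

CHUNK = 4 * 1024 * 1024  # the service accepts at most 4 MB per range upload

# the create step must already have reserved len(data) bytes
for offset in range(0, len(data), CHUNK):
    part = data[offset:offset + CHUNK]
    resp = requests.put(
        f"{url}?comp=range&{sas}",
        headers={
            "x-ms-write": "update",
            "x-ms-range": f"bytes={offset}-{offset + len(part) - 1}",
        },
        data=part,
    )
    resp.raise_for_status()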

Conclusion

There are a lot of tools out there to help you upload files to your storage, but the raw REST API is useful when automating information flow. We used it to send data from IBM DataPower to Azure Storage. Integrate everything!