
Securing passwords in Logic Apps

The history

In the beginning, storing passwords, usernames and so on in a Logic App was not very secure. You had to store the value in clear text in the Logic App “code behind”, or do some magic using a file in blob storage and a function. That was before the wonderful service called KeyVault.

If you have experience with BizTalk, you can view a KeyVault much the same way as the SSODB: an encrypted place to store keys and values. You are not limited to storing user credentials; you can also store things like endpoint addresses, settings and certificates, and even access it using REST, but this post focuses on using the KeyVault as storage for user credentials. More information about the KeyVault service can be found here, and here is how to get started.

NOTE: You do not need any KeyVault preconfigured. The instructions below will set up a basic one.

The scenario

You are calling an API that uses Basic Auth (aka username and password). The authentication is therefore per call and must be supplied every time the service is called. You do not want to store the username and password in clear text in the Logic App.

The Logic App is only a wrapper for the call to the service (for demo purposes).

You are using Visual Studio to develop and deploy your Logic App.

NOTE: It is possible to achieve the same result without using Visual Studio but this post does not cover that.

The Logic App without KeyVault


As you can see, the password and username are clearly visible to anyone accessing the Logic App, so any Logic App contributor can read them. The JSON for it can be downloaded here. Just paste it into a new project, replacing the old text in the LogicApp.json file.

This is how you do it

Logic Apps tooling in Visual Studio comes prepared to use KeyVault; the only tricky part is adding the parameters that make the tooling use it. We are going to make use of this together with the Logic Apps ARM template in Visual Studio. There is a lot of text here, but I am sure that if you do this once you will fully understand it. Take your time to get it right.

Add parameters

Open the Logic App as JSON and scroll to the top. The Logic App always starts with a parameter for the Azure Resource Manager: the name of the logic app. Here we will add two new parameters: ExternalSupplierUsr and ExternalSupplierPwd. Add the following JSON before the first parameter.

"ExternalSupplierUsr": { "type": "securestring" },
"ExternalSupplierPwd": { "type": "securestring" },

Note the type: securestring. This will tell the tooling that we would like to use the KeyVault.

The updated JSON file can be downloaded here.

Configure the parameters file and create the KeyVault

Next we need to make room for the keys. Save the Logic App, right-click the project in the Solution Explorer and choose Deploy, then the name of your project. The usual dialog appears. Fill it out and then click the Edit Parameters button. The new dialog should look something like this:

See those little key icons to the right? They are shown because we used the securestring type. Click the top one.

If you already have a KeyVault, you can use that; otherwise, let’s create a new one.

Click the link saying Create KeyVault using PowerShell. This will point you to this page on GitHub.

Open the PowerShell ISE and copy and paste the PowerShell code into it. Update the code to fit your needs. I will create a KeyVault called SuperSecretVault in West Europe, in its own resource group. I highly recommend this: use a separate resource group for your KeyVaults.

My finished script will look like this:

#Requires -Module AzureRM.Profile
#Requires -Module AzureRM.KeyVault

#Login and select the default subscription if needed
Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionName 'your subscription name goes here'

#Change the values below before running the script

#Globally unique name of the KeyVault
$VaultName = 'SuperSecretVault'

#Location of the KeyVault
$VaultLocation = 'West Europe'

#Name of the resource group for the vault
$ResourceGroupName = 'KeyVaultGroup'

#Location of the resource group if it needs to be created
$ResourceGroupLocation = 'West Europe'

New-AzureRmResourceGroup -Name $ResourceGroupName -Location $ResourceGroupLocation -Force
New-AzureRmKeyVault -VaultName $VaultName -ResourceGroupName $ResourceGroupName -Location $VaultLocation -EnabledForTemplateDeployment

Execute it and wait.

Use the KeyVault

Now go back to Visual Studio. Close all dialogs except the first one. Click the little key icon again, next to the parameter called ExternalSupplierUsr.

Select your subscription, select your vault and choose <Create New>.

Give it a name (I will use SecretExternalSupplierUsr) and set the value “SuperSecretUserName” for the username. Click OK and repeat the process for ExternalSupplierPwd (all the way back, then press the little key again). Name your Logic App SecurePasswordsInLogicApps and it should look something like this:

Click Save to save the configuration into the parameters.json file. We are not going to deploy it yet but you can look at it to see what was updated.
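
If you prefer scripting to the dialogs, the same secrets can also be created with PowerShell. A minimal sketch, using the vault and username secret from above; the password secret name and both values are placeholders you should replace with your own:

#Turn the plain text values into secure strings (placeholder values)
$usrSecret = ConvertTo-SecureString 'SuperSecretUserName' -AsPlainText -Force
$pwdSecret = ConvertTo-SecureString 'SuperSecretPassword' -AsPlainText -Force

#Create (or update) the secrets in the vault created earlier
Set-AzureRmKeyVaultSecret -VaultName 'SuperSecretVault' -Name 'SecretExternalSupplierUsr' -SecretValue $usrSecret
Set-AzureRmKeyVaultSecret -VaultName 'SuperSecretVault' -Name 'SecretExternalSupplierPwd' -SecretValue $pwdSecret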

Use the parameters in the Logic App

Here is the tricky part. You must add parameters in the JSON behind the Logic App. This is pretty hardcore, so make sure you know where to type what.

Start by opening the JSON file for the Logic App, not in the designer but the whole file. Scroll down to the bottom. Here you will find the first parameters clause. This is where you enter the values of the parameters you declared at the top. At deploy time, the resource manager takes the values of the parameters from the KeyVault and pastes them here. Since this part is never shown in the Logic App code behind, this is OK. Think of this as values being compiled into “the DLL of your Logic App”.

Make sure you use good names for these parameters. They do not have to be the same as the ones at the top, but they must stay the same from now on. I updated my JSON file to look like this:

"parameters": {
    "SupplierAPIUsr": {
        "value": "[parameters('ExternalSupplierUsr')]"
    },
    "SupplierAPIPwd": {
        "value": "[parameters('ExternalSupplierPwd')]"
    }
}


My updated JSON file can be downloaded here.

Setup the Logic App with parameters

If you simply pasted [parameters('ExternalSupplierUsr')] into your Logic App, a deployment would replace the parameter with its value, thereby making it visible in the Logic App code behind. Instead, we have to send the value into the Logic App as a secure string.

Scroll up to the next parameters clause. Mine is at row 87. Here you declare two new parameters, with the same names as the parameters you just declared at the bottom of the file. After the update, my file looks like this:

"parameters": {
    "SupplierAPIUsr": {
        "type": "SecureString"
    },
    "SupplierAPIPwd": {
        "type": "SecureString"
    }
},


My updated JSON file can be downloaded here.

We have now set up parameters that receive the values passed in by the two parameters at the bottom of the file.

Use the parameters

The last step is to use the parameters in the Logic App. This is very simple since the Logic App has a parameters collection you can reference directly.

Scroll up and find the username and password for the external API. Mine are at rows 69 and 70. Update the values to use the parameters.

I updated the file to look like this:

"authentication": {
    "type": "Basic",
    "username": "@parameters('SupplierAPIUsr')",
    "password": "@parameters('SupplierAPIPwd')"
}

The final file can be downloaded from here.

Deploy and test

Deploy your Logic App just like you usually do and then test it using Postman. We get an error back because the service being called does not exist.
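
If you prefer to deploy from PowerShell instead of the Visual Studio dialog, a minimal sketch could look like this. The resource group name and file names are assumptions based on the Visual Studio project defaults; use the ones from your project:

#Deploy the template and its parameters file to an existing resource group
New-AzureRmResourceGroupDeployment `
    -ResourceGroupName 'SecurePasswordsGroup' `
    -TemplateFile '.\LogicApp.json' `
    -TemplateParameterFile '.\LogicApp.parameters.json'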

Look at the results

If you look at the run, you will see that this has a downside. Not all values sent as secure strings are sanitized.

But at least the password is not in clear text.

Now open the code behind of the Logic App and you can see that the values of the parameters are never shown! This is awesome!

The good thing

This gives you and your team a uniform, and secure, way to keep your passwords in check. Use different KeyVaults for different environments (one for test and one for prod) and you will be good to go.

The bad thing

Since the values in the KeyVault are only read when the Logic App is deployed, you must redeploy the Logic App if you need to update a value in the KeyVault. For instance, say we need to update the password used here: first you update the secret in the KeyVault (use the portal) and then you redeploy the Logic App. That way the new value is picked up by Azure Resource Manager and the Logic App is updated.


Why Do I Integrate?

I got a question from a colleague: “Why should I go to Integrate? Give me a reason.”

First off: If you need convincing to go to London for three days, have fun and meet new people, then you are not conference material. Bye, bye and see you when I get home.

News?

Once, we went to conferences to get a heads-up on news: what is coming and what is important. Nowadays we get the news over Twitter or Yammer, so that is not the reason.

Educational?

Once this was the only way to get information about how to use new features and what features to use, when. Nowadays the sessions are online within an hour, so that is not the reason.

Social?

Once, we were wary of speaking to “the competition”. We stayed within our designated groups, fearful of saying something that might reveal too much about a client or a project. I remember very well trying to get two guys who had “reprogrammed the ESB Toolkit” to say why and what. I might just as well have asked them for the nuclear launch codes.

But we are getting better at this, and after a while we realized we could talk about other things besides work. We did things together: had dinner, beer and a good time.

This is one of the reasons but not the main one.

The passion <3

I am, as some know, a passionate guy. I…love…doing what I do for work. I love people who feel the same, and at Integrate I know I will meet my fellows. It is the place where I can be myself for three days. The only place where I can discuss the merits of JSON vs XML for an hour, hear a crazy Italian guy passionately talk about his latest project, and shake the hand of that Kiwi guy who helped me get onboard the Logic Apps train.

Then, you meet the people from the team in Redmond and you realize: they are just like you. Just as passionate and just as social.

Integrate is News, Integrate is Educational and most certainly Social, but most of all: It is the passion.

Hope to see you there, I will be the guy in the front row, asking questions and arranging dinner.


Simple How-to: Upload a file to Azure Storage using Rest API

There are a lot of different ways to make this happen but, like before, I was looking for the “quick and easy way” to just get it done. So here is a condensed version. Please send me feedback if you find errors or need clarification in any areas. I would also like to point to the official Azure Storage API documentation.

Tools

For testing the Rest APIs I recommend using Postman.

Create a file storage

First you need to create a file storage in Azure. More information can be found here.
For this I created a storage account called bip1diag306 (fantastic name, I know), added a file share called “mystore”, and lastly added a subdirectory called “mysubdir”. This is important for understanding the HTTP URIs later in this post.

Create a SAS key

In order to give access to your files you can create a SAS key using the Azure Portal. The SAS key is very useful since it is secure, dependable, easy to use and can be set to expire if you need it to.
At the moment, a SAS key created in the portal can only be set for the entire storage account. It is possible to create a key for a particular folder, but in that case you have to use code.
To create a SAS key using the portal, open the overview for the storage account and look in the menu to the left. Find “Shared Access Signature” and click it.

Select the access options you want, but at least make sure that the File service and the Create permission are selected. If you just want to get things working, select everything and make sure the start date and time are correct. Since I work from Stockholm, the default UTC would make me create keys that start working an hour from now. I usually set the start date to “yesterday” just to be sure, and then set the expiration to “next year”.

Click the “Generate SAS” button. The value in “SAS Token” is very important. Copy it for safekeeping until later.
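
For reference, an account SAS can also be generated from PowerShell. A hedged sketch, assuming the Azure.Storage module; the account key is a placeholder:

#Build a storage context from the account name and key
$context = New-AzureStorageContext -StorageAccountName 'bip1diag306' -StorageAccountKey 'your-account-key'

#Create an account SAS for the File service, valid from yesterday until next year
New-AzureStorageAccountSASToken -Service File -ResourceType "Service,Container,Object" `
    -Permission 'rwdlacup' `
    -StartTime (Get-Date).AddDays(-1) `
    -ExpiryTime (Get-Date).AddYears(1) `
    -Context $context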

Create and then upload

The thing that might be confusing is that the upload happens in two steps: first you create the space for the file, then you upload the content. This was very confusing to me at first. I was looking for a single “upload file” API, but this is the way to do it.

There are a lot more things you can configure when calling this API. The full documentation can be found here. Note that the security model in that documentation differs from the one in this article.

Create

First you need to call the service to make room for your file.
Use Postman to issue a call configured like this:

VERB: PUT
URI: https://[storagename].file.core.windows.net/[sharename][/subdir]/[filename][Your SAS Key from earlier]
HEADERS:
x-ms-type:file
x-ms-content-length:file size in bytes

Example

So, if I was tasked with uploading a 102-byte file, called myfile.txt to the share above, the call would look like this:

VERB: PUT
URI: https://bip1diag306.file.core.windows.net/mystore/mysubdir/myfile.txt?sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-06-01T21:27:52Z&st=2016-06-01T13:27:52Z&spr=https&sig=signaturegoeshere

HEADERS:
x-ms-type:file
x-ms-content-length:102
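
If you would rather script the call than use Postman, a minimal PowerShell sketch could look like this. The file path is a placeholder and the SAS token is the truncated example from above; use your own values:

#The SAS token copied from the portal, including the leading question mark
$sas  = "?sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-06-01T21:27:52Z&st=2016-06-01T13:27:52Z&spr=https&sig=signaturegoeshere"
$file = Get-Item 'C:\temp\myfile.txt'
$uri  = "https://bip1diag306.file.core.windows.net/mystore/mysubdir/$($file.Name)$sas"

$headers = @{
    "x-ms-type"           = "file"
    "x-ms-content-length" = "$($file.Length)"   #size of the file in bytes
}

#Create the (still empty) file on the share
Invoke-RestMethod -Method Put -Uri $uri -Headers $headers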

Upload

Now, it is time to upload the file, or rather to fill the space we created in the last call. Once again, there is a lot more you can set when uploading a file; consult the documentation.
Use Postman to issue a call configured like this:

VERB: PUT
URI: https://[storagename].file.core.windows.net/[sharename][/subdir]/[filename]?comp=range&[Your SAS Key from earlier] (remove the ?-sign you got when copying from the portal).

Note that you have to add comp=range as an operation.

HEADERS:
x-ms-write:update
x-ms-range:bytes=[startbyte]-[endbyte]
content-length:[empty]

Looking at the headers, the first one means that we want to “update the data on the storage”.
The second one is a bit trickier. It tells which part of the space on the storage account to update, or which part of the file if you will. Usually this is the whole file, so you set the start byte to 0 and the end byte to the length of the file in bytes minus 1.
The last one is content-length. This is the length of the request body in bytes. In Postman, this value cannot be set manually; it is filled in automatically based on the size of the request body. If you are using some other method of sending the request, you have to calculate the value yourself.

Example

So, returning to the 102-byte file earlier, the call would look like this:

VERB: PUT
URI: https://bip1diag306.file.core.windows.net/mystore/mysubdir/myfile.txt?comp=range&sv=2016-05-31&ss=bfqt&srt=sco&sp=rwdlacup&se=2018-06-01T21:27:52Z&st=2016-06-01T13:27:52Z&spr=https&sig=[signaturegoeshere]

HEADERS:
x-ms-write:update
x-ms-range:bytes=0-101
content-length:

The request body is the file content in clear text.
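
The same call from PowerShell, continuing the create sketch above (the $file and $sas variables are reused):

#Read the file content and build the upload URI (note the added comp=range operation)
$body      = [System.IO.File]::ReadAllBytes($file.FullName)
$uploadUri = "https://bip1diag306.file.core.windows.net/mystore/mysubdir/$($file.Name)?comp=range&" + $sas.TrimStart('?')

$headers = @{
    "x-ms-write" = "update"
    "x-ms-range" = "bytes=0-$($body.Length - 1)"   #the whole file: 0 to length minus 1
}

#content-length is calculated from the body by Invoke-RestMethod
Invoke-RestMethod -Method Put -Uri $uploadUri -Headers $headers -Body $body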

Limitations

There are limitations to the storage service, and one impacted me personally: you can only upload 4 MB “chunks” per upload. So if your file exceeds 4 MB you have to split it into parts, as sketched below. If you are a good programmer you can make use of tasks and await to upload several parts in parallel. Please consult the Azure limits documentation to see if any other restrictions apply.
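
A hedged sketch of uploading a larger file in 4 MB ranges, kept sequential for clarity. It assumes the empty file was already created with x-ms-content-length set to the full file size, and that $uploadUri from the sketch above points at it; the file path is a placeholder:

$bytes     = [System.IO.File]::ReadAllBytes('C:\temp\bigfile.dat')
$chunkSize = 4MB

for ($offset = 0; $offset -lt $bytes.Length; $offset += $chunkSize) {
    #Work out the last byte of this range and copy that slice of the file
    $end   = [Math]::Min($offset + $chunkSize, $bytes.Length) - 1
    $chunk = New-Object byte[] ($end - $offset + 1)
    [Array]::Copy($bytes, $offset, $chunk, 0, $chunk.Length)

    $headers = @{
        "x-ms-write" = "update"
        "x-ms-range" = "bytes=$offset-$end"
    }

    Invoke-RestMethod -Method Put -Uri $uploadUri -Headers $headers -Body $chunk
}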

Conclusion

There are a lot of tools out there to help you upload files to your storage. This approach is useful when automating information flow; we used it to send data from IBM DataPower to Azure Storage. Integrate everything!


Nullable types in JSON

Just as a personal reminder: This is the way to handle “nullable” types in JSON.

"properties": {
    "name": { "type": "string" },
    "twitterhandle": { "type": ["string", "null"] }
}

It is not really a nullable type. It is just an array of possible types. It might be ["number", "string"] but who uses that?


Installing Azure PowerShell


Thanks to everyone that attended my part of the Global Integration Bootcamp.

Also, sorry for omitting the prerequisite of Azure PowerShell in my lab number two. PowerShell is very useful for managing, deploying and automating things in Azure.

The easiest way to download and install Azure PowerShell is by using the Web Platform Installer, which you can download here.

Open it and search for PowerShell.


Look for Azure PowerShell and click Add.


Then click Install (far down to the right) and let it run.


Close any open PowerShell consoles and start a new one. The login now works.


The easiest way to connect to the SalesForce API

This post is a simple reminder of how to get going connecting to Salesforce using the API. The security model is very rich, and the documentation is sadly both lacking and, in some cases, wrong. I therefore took it upon myself to create a “How to get to the first successful message” post.
Any additional information can be found in the Salesforce documentation.

Note that this security setup might not be optimal for your needs down the line in your use of Salesforce. Personally, I used the API to upload Account and Contact data and then let the salespeople loose on the application.

The steps are these:

  1. Make sure you have access to Salesforce.
  2. Make sure you are a Systems Admin.
  3. Setup a Connected Application, this is the connection used for the API calls.
  4. Getting some user info.
  5. Getting your Security Token.
  6. Login and get an access token.
  7. Test the access token using a standard call.

I will use the simple Username and Password flow. There are others, but this seems to fit my needs the best.

Here we go.

Make sure you have access to Salesforce

You have been assigned a user and a path for login. Usually this is login.salesforce.com or test.salesforce.com if you are using the sandbox.

Make sure you are a Systems Admin

Access the Setup part of Salesforce. This is usually done by clicking the cogs up to the right of the screen.

A new tab will open with all the settings.

Access the “Users” setting by clicking the menu to the left under Administration. Click Users and then Users again.

In the list to the right, find your identity and make sure it is System Administrator.

Setup a connected Application

In the menu to the left, find the Platform Tools heading. Click Apps and then App Manager.

The list to the right contains all the currently connected apps. Ignore that and look for the button saying New Connected App. It is up to the right. Click it.

Time to fill in the fields.

Connected App name: A name you choose. Can be anything.

API Name: Auto fills. Do not touch it.

Contact e-mail: Fill in a valid e-mail that you have access to.

Scroll down and choose Enable OAuth Settings.

Now comes the tricky part: the callback URL. Looking at the documentation, you should fill in … well, it does not really say, but the path is https://login.salesforce.com/services/oauth2/callback. If you are using the sandbox (or test version) the address is https://test.salesforce.com/services/oauth2/callback.

Lastly, set the OAuth Scope to the level you need. To be sure it gets all the access it needs, simply choose Full Access and click Add to the right.

Now you are done. Click Save and then wait as instructed.

Getting some user info

In order to access the API you need the application’s Consumer Key and Consumer Secret. You can get them by looking at the app you just created.

Go back to the App Manager page and find your app in the list to the right. Look to the far right of that row, click the “down arrow” and choose View.

There are two values here that you need to copy: the Consumer Key (usually a very long string of gibberish) and the Consumer Secret (usually a string of numbers).

Getting your security token

This is a token that is used to verify your password when you log in to the API. There might be a way of getting it without resetting it, but resetting it (as per the instructions below) will at least work.

Open your own personal page (up to the right) and click settings.

In the menu to the left, find the item “Reset My Security Token”.

Click it and then click the Reset Security Token button.

A new token will be sent to you in a minute. Continue with the instructions here and wait for it.

Login and get an access token

Time to put all this to good use. I personally use Postman to test the API. Here is how you should configure the POST to make sure you get the access token back.

Method: POST

Headers

Content-Type: application/x-www-form-urlencoded

URL: https://login.salesforce.com/services/oauth2/token or https://test.salesforce.com/services/oauth2/token if you are using the Sandbox.

Then you need to add the following params to your URL string.

grant_type:password

client_id: The Consumer Key you copied above

client_secret: The Consumer Secret you copied above

username: Your username that you used to log into Salesforce. Note! If you are using an e-mail address you should escape the @-sign as %40. So, if your username is mikael_sand@salesforce.com it should be formatted as mikael_sand%40salesforce.com

password: The password you used to log into Salesforce, with the security token that was e-mailed to you appended to it.

Now you are ready to log in. Click Send in Postman and if it works, you will get back some nice JSON with an access token.
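
The same login call can be scripted. A minimal PowerShell sketch; the consumer key, secret, username and password are placeholders, and here the parameters are sent in the request body, which Invoke-RestMethod URL-encodes for you (so the @ does not need escaping):

$body = @{
    grant_type    = "password"
    client_id     = "your-consumer-key"
    client_secret = "your-consumer-secret"
    username      = "your.user@example.com"
    password      = "YourPasswordYourSecurityToken"   #password immediately followed by the security token
}

$tokenResponse = Invoke-RestMethod -Method Post `
    -Uri "https://login.salesforce.com/services/oauth2/token" `
    -ContentType "application/x-www-form-urlencoded" `
    -Body $body

#The two values you need for the next call
$tokenResponse.access_token
$tokenResponse.instance_url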

Test the access token using a standard call

Now to test that the access token works.

Simply configure Postman like this:

Method: GET

Headers

Authorization: Bearer [the access-token above] (Note that there is a space between “Bearer” and the token.)

URL: Here you need to know which instance of Salesforce you are running on. This is supplied in the authorization call above, in a JSON property called “instance_url”.

The path for getting information on the Account object is this: https://instance_url/services/data/v39.0/sobjects/Account/describe. The v39.0 may shift; it is the latest version at the time of writing.

Click send and you should get back some nice JSON describing the fields of the Account object.
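
The same test call from PowerShell, reusing $tokenResponse from the login sketch above:

$headers     = @{ Authorization = "Bearer $($tokenResponse.access_token)" }
$describeUri = "$($tokenResponse.instance_url)/services/data/v39.0/sobjects/Account/describe"

#Describe the Account object and list its field names and types
$describe = Invoke-RestMethod -Method Get -Uri $describeUri -Headers $headers
$describe.fields | Select-Object name, type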

Errors?

If you get back an error like “Session Expired or Invalid” make sure that:

  1. You send the call to the correct instance url (test vs prod got me here).
  2. You send the correct access token in the Authorization header (got me once).

An easier way to install Logic App Prereqs

Recently I have been doing some teaching work on Logic Apps. The sessions have focused on the basics, but also on how to use Visual Studio for development. The ALM functionality in VS makes it the preferred option at this point in time.

There were some questions on the installation though, so I thought I would post a more to-the-point solution here.

The official instructions can be found here.

If you find anything wrong with this guide, please provide feedback using my e-mail or by commenting below.

Install Visual Studio 2015

The software can be found here, or using your MSDN subscription.

Install Azure SDK and so on

The easier way is to simply get the Web Platform Installer (WebPI).

Using that you can simply check the things you need, start installation and go have a coffee.

Finding Azure PowerShell

Search for “azure powershell”, and add it. Use the latest version.

Finding Azure SDK

Then do a search for “azure sdk”. Find the one highlighted in the picture, and add it. If the version number is higher than the one in the picture, use that version.

The downside of screen grabs is that they do not update by themselves.

Installing

Now simply click Install and have yourself a well-deserved break.

Installing the Logic Apps Extension

Open Visual Studio 2015 and choose Tools/Extensions and Updates…

Select “Online” in the menu to the left.

Search for logic apps

Select “Azure Logic Apps Tools for Visual Studio” and choose Install. If you are missing any prereqs, the installer will point that out and you will not be able to install.

Further reading and testing it out

To make sure you have everything you need, and to start flexing your developer skills, you can follow this handy guide: “Build and Deploy Logic Apps in Visual Studio”.


InvalidTemplateDeployment in Azure RM

Using scripting when deploying Logic Apps and the surrounding bits is very useful. If you have set something up, it is very easy to just script it and save it locally or under your templates.

I stored mine locally and got the error above when deploying. My thoughts were: “An error in the template? The template that was generated for me? This is not good.”

I tried opening the template file and found some minor upper- and lowercase errors, but fixing them did not do it.

The solution was to get more information! You need to access your subscription’s activity logs. You can find it in the left side menu or by searching for “Activity Log” in the expanded menu.

The starting query should return your failed validation of the template.

Click on the row of the failed validation (strangely, it is not a link) and choose to show the info as JSON.

Scroll down to the end of the message. Under the tag “properties/statusMessage” you will find the full story. In my case (I am ashamed to say) the name of the storage account was invalid.


CaseSensitiveDeploymentParameterNamesFound

I got this error when deploying a Logic App. Since I could not find anything on it I just thought I would do a quick post about it.

If you Google it, you get zero hits and instead you get pointed to a page on keeping parameters secret. Not a bad idea but it did not solve anything for me.

The real error was easy to fix. I had simply input two parameters with the same name but with different casing. This was interpreted as me trying to use case-sensitive parameter names in my deployment. That is not how it’s done: keep parameter names unique without relying on case.


SQL Server Edition Upgrade might fail

What happened?

A while back I tasked myself with automating an SQL server edition upgrade using PowerShell.

I ran into some problems. I made sure the upgrade was as silent (/s) as possible, and so I only got a very rudimentary progress bar. The upgrade seemed to take a long time, and after two hours of waiting I decided that it had “hung”. I repeated the upgrade but kept an eye on the log file.

What was wrong?

Looking into the log file I found that the thing that seemed to hang was this row:

Waiting for nt event ‘Global\sqlserverRecComplete’ to be created

How to solve it?

Searching for it online, I found several explanations, and one (unsupported) option stood out: simply skip the rules check.

If the upgrade fails in this way, simply add the following to your PowerShell string:

/SkipRules=Engine_SqlEngineHealthCheck
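
For context, a hedged sketch of what a full, unattended edition upgrade command line could look like with the rule skipped. The setup path, instance name and product key are placeholders; check the SQL Server setup documentation for your version:

#Unattended edition upgrade with the failing rule check skipped (unsupported)
& 'D:\SQLServerSetup\setup.exe' /Q /ACTION=EditionUpgrade `
    /INSTANCENAME="MSSQLSERVER" `
    /PID="your-developer-edition-product-key" `
    /IACCEPTSQLSERVERLICENSETERMS `
    /SkipRules=Engine_SqlEngineHealthCheck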

The implications

Some images on Azure have SQL Server Evaluation Edition installed by default. You usually want to upgrade these to Developer Edition, using the built-in Edition Upgrade functionality.

If you run into the “hang” issue, you have to upgrade SQL Server without checking the Engine_SqlEngineHealthCheck rule.