Installing Azure PowerShell


Thanks to everyone who attended my part of the Global Integration Bootcamp.

Also, sorry for omitting Azure PowerShell as a prerequisite in my lab number two. Using PowerShell to manage, deploy and automate things in Azure is very useful.

The easiest way to download and install Azure PowerShell is by using the Web Platform Installer, which you can download here.

Open it and search for PowerShell.

Look for Azure PowerShell and click Add.

Then click Install (far down to the right) and let it run.

Close any open PowerShell consoles and start a new one. The login should now work.
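Once installed, a quick way to verify that the login works is to run a couple of commands in the new console. A minimal sketch, assuming the AzureRM module that WebPI installs:

# Log in interactively and list the subscriptions the account has access to
Login-AzureRmAccount
Get-AzureRmSubscription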

The easiest way to connect to the Salesforce API

This post is a simple reminder of how to get going connecting to Salesforce using the API. The security model is very rich, and the documentation is sadly both lacking and, in some cases, wrong. I therefore took it upon myself to write a “How to get to the first successful message” post. Any additional information can be found in the Salesforce documentation.

Note that this security setup might not be optimal for your needs down the line in your use of Salesforce. Personally, I used the API to upload Account and Contact data and then let the salespeople loose on the application.

The steps are these:

  1. Make sure you have access to Salesforce.
  2. Make sure you are a System Administrator.
  3. Set up a Connected Application; this is the connection used for the API calls.
  4. Get some user info.
  5. Get your security token.
  6. Log in and get an access token.
  7. Test the access token using a standard call.

I will use the simple Username and Password flow. There are others, but this seems to fit my needs the best.

Here we go.

Make sure you have access to Salesforce

You have been assigned a user and a login URL. Usually this is login.salesforce.com, or test.salesforce.com if you are using the sandbox.

Make sure you are a System Administrator

Access the Setup part of Salesforce. This is usually done by clicking the cog at the top right of the screen.

A new tab will open with all the settings.

Access the “Users” setting using the menu to the left, under Administration. Click Users and then Users again.

In the list to the right, find your identity and make sure it is System Administrator.

Set up a Connected Application

In the menu to the left, find the Platform Tools heading. Click Apps and then App Manager.

The list to the right contains all the currently connected apps. Ignore that and look for the New Connected App button at the top right. Click it.

Time to fill in the fields.

Connected App name: A name you choose. Can be anything.

API Name: Auto fills. Do not touch it.

Contact e-mail: Fill in a valid e-mail that you have access to.

Scroll down and choose Enable OAuth Settings.

Now comes the tricky part: the Callback URL. Looking at the documentation, you should fill in … well, it does not really say, but the path is https://login.salesforce.com/services/oauth2/callback. If you are using the sandbox (or test version), the address is https://test.salesforce.com/services/oauth2/callback.

Lastly, set the OAuth Scope to the level you need. To be sure it gets all the access it needs, simply choose Full Access and click Add to the right.

Now you are done. Click Save and then follow the instructions telling you to wait a short while for the changes to take effect.

Getting some user info

In order to access the API you need the application’s Consumer Key and Consumer Secret. You can get them by looking at the app you just created.

Go back to the App Manager page and find your app in the list to the right. Look to the far right of that row, click the “down arrow”, and choose View.

There are two values here that you need to copy: the Consumer Key (usually a very long string of gibberish) and the Consumer Secret (usually a string of numbers).

Getting your security token

This is a token that is used together with your password when you log in to the API. There might be a way of getting it without resetting it, but the instructions below (which reset it) will at least work.

Open your own personal page (up to the right) and click settings.

In the menu to the left, find the item “Reset My Security Token”.

Click it and then click the Reset Security Token button.

A new token will be sent to you in a minute. Continue with the instructions here and wait for it.

Login and get an access token

Time to put all this to good use. I personally use Postman to test the API. Here is how you should configure the POST to make sure you get the access token back.

Method: POST

Headers

Content-Type: application/x-www-form-urlencoded

URL: https://login.salesforce.com/services/oauth2/token or https://test.salesforce.com/services/oauth2/token if you are using the Sandbox.

Then you need to add the following params to your URL string.

grant_type: password

client_id: The Consumer Key you copied above

client_secret: The Consumer Secret you copied above

username: Your username that you used to log into Salesforce. Note! If you are using an e-mail address you should escape the @-sign as %40. So, if your username is mikael_sand@salesforce.com it should be formatted as mikael_sand%40salesforce.com

password: The password you used to log into Salesforce, with the security token that was e-mailed to you appended to the end.

Now you are ready to log in. Click Send in Postman and, if it works, you will get back some nice JSON with an access token.
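If you prefer scripting the call, here is a minimal sketch of the same request using PowerShell's Invoke-RestMethod. All values are placeholders you need to swap for your own; note that Invoke-RestMethod form-encodes the body for you, so no manual %40 escaping is needed here.

# Placeholders - replace with the Consumer Key/Secret, username, password and security token from above
$body = @{
    grant_type    = 'password'
    client_id     = '<consumer key>'
    client_secret = '<consumer secret>'
    username      = 'someone@example.com'
    password      = '<password><security token>'
}

# Use https://test.salesforce.com/services/oauth2/token instead if you are on the sandbox
$tokenResponse = Invoke-RestMethod -Method Post -Uri 'https://login.salesforce.com/services/oauth2/token' -Body $body

# The two values you need for the next step
$tokenResponse.access_token
$tokenResponse.instance_url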

Test the access token using a standard call

Now to test that the access token works.

Simply configure Postman like this:

Method: GET

Headers

Authorization: Bearer [the access token above] (note that there is a space between “Bearer” and the token).

URL: Here you need to know what instance of Salesforce you are running on. This is supplied in the authorization call above in a JSON property called “instance_url”.

The path for getting information on the Account object is this: https://instance_url/services/data/v39.0/sobjects/Account/describe. The v39.0 may differ; it is the latest version at the time of writing.

Click send and you should get back some nice JSON describing the fields of the Account object.
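The same test call can be scripted as well; a sketch that reuses the $tokenResponse from the login sketch above:

# Call the describe endpoint for the Account object using the access token
$headers = @{ Authorization = "Bearer $($tokenResponse.access_token)" }
Invoke-RestMethod -Method Get -Uri "$($tokenResponse.instance_url)/services/data/v39.0/sobjects/Account/describe" -Headers $headers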

Errors?

If you get back an error like “Session Expired or Invalid” make sure that:

  1. You send the call to the correct instance URL (test vs. prod got me here).
  2. You send the correct access token in the Authorization header (got me once).

An easier way to install Logic App Prereqs

Recently I have been doing some teaching work on Logic Apps. The sessions have focused on the basics, but also on how to use Visual Studio for development. The ALM functionality in Visual Studio is preferred at this point in time.

There were some questions about the installation though, so I thought I would post a more to-the-point guide here.

The official instructions can be found here.

If you find anything wrong with this guide, please provide feedback using my e-mail or by commenting below.

Install Visual Studio 2015

The software can be found here, or using your MSDN subscription.

Install Azure SDK and so on

The easiest way is to simply get the Web Platform Installer (WebPI).

Using that you can simply check the things you need, start installation and go have a coffee.

Finding Azure PowerShell

Search for “azure powershell”, and add it. Use the latest version.

Finding Azure SDK

Then do a search for “azure sdk”. Find the one highlighted in the picture, and add it. If the version number is higher than the one in the picture, use that version.

The downside of screen grabs is that they do not update by themselves.

Installing

Now simply click Install and have yourself a well-deserved break.
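If you would rather script the WebPI part, the installer also ships with a command-line tool, WebpiCmd.exe. A rough sketch, assuming the default install path; the product ID below is a placeholder, so list the available products first and use the exact names you find there:

# Assumption: the default WebPI install path - adjust if yours differs
$webpi = 'C:\Program Files\Microsoft\Web Platform Installer\WebpiCmd.exe'

# List the available products to find the exact IDs for Azure PowerShell and the Azure SDK
& $webpi /List /ListOption:Available

# Install by ID (placeholder below - use the exact name from the list)
$productId = '<product ID from the list>'
& $webpi /Install /Products:$productId /AcceptEula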

Installing the Logic Apps Extension

Open Visual Studio 2015 and choose Tools/Extensions and Updates…

Select “Online” in the menu to the left.

Search for logic apps

Select “Azure Logic Apps Tools for Visual Studio” and choose install. If you miss any prereqs the installer will point that out and you will not be able to install.

Further reading and testing it out

To make sure you have everything you need and start flexing your developer skills, you can follow this handy guide: “Build and Deploy Logic Apps in Visual Studio”.

InvalidTemplateDeployment in Azure RM

Using scripting when deploying Logic Apps and the surrounding bits is very useful. If you have set something up, it is very easy to just script it and save it locally or under your templates.

I stored mine locally and got the error above when deploying. My thoughts were “An error in the template? The template that was generated for me? This is not good”.

I tried opening the template file and found some minor upper- and lowercase errors, but that did not do it.

The solution was to get more information! You need to access your subscription’s Activity Log. You can find it in the left-side menu or by searching for “Activity Log” in the expanded menu.

The starting query should return your failed validation of the template.

Click on the row of the failed validation (strangely, it is not a link) and choose to show the info as JSON.

Scroll down to the end of the message. Under the tag “properties/statusMessage” you will find the full story. In my case (I am ashamed to say) the name of the storage account was invalid.
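Another way to surface the full validation error, without going through the portal, is to validate the template from PowerShell before deploying. A sketch, assuming the AzureRM module and hypothetical file and resource group names:

# Validate the template against the target resource group; if validation fails,
# the returned records contain the detailed error message
$result = Test-AzureRmResourceGroupDeployment -ResourceGroupName 'MyResourceGroup' -TemplateFile .\template.json -TemplateParameterFile .\parameters.json
$result | Select-Object Code, Message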

CaseSensitiveDeploymentParameterNamesFound

I got this error when deploying a Logic App. Since I could not find anything on it I just thought I would do a quick post about it.

If you Google it, you get zero exact hits; instead you get pointed to a page on keeping parameters secret. Not a bad idea, but it did not solve anything for me.

The real error was easy to fix. I had simply entered two parameters with the same name but with different casing. This was interpreted as me trying to use case-sensitive names in my deployment, and that is not allowed: parameter names must be unique regardless of case.

SQL Server Edition Upgrade might fail

What happened?

A while back I tasked myself with automating a SQL Server edition upgrade using PowerShell.

I ran into some problems. I made sure the upgrade was as /s (silent) as possible, and so I only got a very rudimentary progress bar. The upgrade seemed to take a long time, and after two hours of waiting I decided that the upgrade had “hung”. I repeated the upgrade but kept an eye on the log file.

What was wrong?

Looking into the log file I found that the thing that seemed to hang was this row:

Waiting for nt event ‘Global\sqlserverRecComplete’ to be created

How to solve it?

Searching for it online I found several reasons for this, and one (unsupported) option stood out: simply skip the rules check.

If the upgrade fails in this way, simply add the following to your PowerShell string:

/SkipRules=Engine_SqlEngineHealthCheck
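For context, here is a sketch of what the full silent edition upgrade can look like from PowerShell. The media path, instance name and product key are assumptions that you need to replace with your own:

# Run SQL Server setup silently as an edition upgrade, skipping the health check rule
$setup = 'D:\setup.exe'   # path to the SQL Server installation media
$setupArgs = '/Q /ACTION=EditionUpgrade /INSTANCENAME=MSSQLSERVER ' +
             '/PID=<your Developer Edition product key> /IACCEPTSQLSERVERLICENSETERMS ' +
             '/SkipRules=Engine_SqlEngineHealthCheck'
Start-Process -FilePath $setup -ArgumentList $setupArgs -Wait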

The implications

Some images on Azure have SQL Server Evaluation Edition installed by default. You usually want to upgrade these to Developer Edition, using the built-in Edition Upgrade functionality.

If you run into the “hang” issue, you have to upgrade SQL Server without checking the Engine_SqlEngineHealthCheck rule.

How to only poll data on specific weekdays using the WCF-SQL adapter

There are a lot of solutions to this particular question. The need is that we only poll data from a database on Sundays. This might be solved using a stored procedure that only returns data on Sundays. It might also be solved by using the famous Scheduled Task adapter to schedule the poll for Sundays. You can also do some cool coding using a custom pipeline component that rejects data on all days but Sundays. Your scenario might be very well suited to one of these solutions; the scenario presented by my colleague Henrik Wallenberg did not fit any of them.

The scenario

A database is continuously updated throughout the week, but we need to export data from a specific table every Sunday at 6 pm. We cannot use the Scheduled Task adapter or stored procedures. We decided to try to trick BizTalk using the PolledDataAvailableStatement in the WCF-SQL adapter on a receive port. Turns out it works! Here is how.

Please note that this does not work if you cannot use ambient transactions.

According to this post, you must set Use Ambient Transaction = true if you need to use a PolledDataAvailableStatement. This seems really odd to me, but after receiving feedback about this article I know that it is true.

 

The solution

  1. Create the receive location and polling statement.
  2. Find the setting PolledDataAvailableStatement.
  3. Set it to: SELECT CASE WHEN DATEPART(DW, GETDATE()) = 1 THEN 1 ELSE 0 END
  4. Set the polling interval to 3600 seconds (once an hour).
  5. Apply your settings.
  6. Set the service window to only enable the receive location between 6 pm and 6:30 pm.
  7. Now the receive location will only poll once a day and will only execute the polling statement on Sundays.

More information

How does this work? It is very simple really. The property PolledDataAvailableStatement (more info here) needs to return a result set (i.e. a SELECT). The top leftmost (first, if you will) cell of this result set must be a number. If a positive number is returned, the polling statement will be executed; otherwise not. The SQL statement uses a built-in SQL function called DATEPART with a parameter value of “dw”, which returns the day of the week. More information here. Day 1 is by default a Sunday in SQL Server, because Americans treat days and dates in a very awkward way. Your statement might need some tweaking if your server treats Sunday as the 7th day of the week. So the statement SELECT CASE WHEN DATEPART(DW, GETDATE()) = 1 THEN 1 ELSE 0 END returns a 1 if it is day 1 (Sunday). This means that the polling statement will only be executed on Sundays. We then set the polling interval to only execute once an hour. This, together with the service window, makes sure the statement only executes once a day (at 6 pm), as the receive location is not enabled the next hour (7 pm). You could update the SQL statement to take the hour of the day into consideration as well, but I think it is better to not even execute the statement.

The downside

This is not a very reliable solution though. What if the database is unavailable that one time during the week when data is transported? Then you have to either wait for the next week, or manually update the PolledDataAvailableStatement to return a 1, make sure the data is transported, and then reset the PolledDataAvailableStatement again.

In conclusion

It is a very particular scenario in which this solution is viable and even then it needs to be checked every week. Perhaps you should consider another solution. Thanks to Henrik for making my idea a reality and testing it out. If you want to test it out for yourself, some resources to help you can be found here: InstallApp Script

Connecting to Wi-Fi without verifying certificate

I love Windows 10! One reason is that it simplifies a lot of things that I do not want to care about. Usually you just click an icon and things just work. There is a downside to this though: sometimes you want to access the “advanced properties”, and that can be tricky. This is a reminder article for me (and perhaps someone else) on how you can alter settings for a wireless network connection in Windows 10.

The problem

The thing was this: I had just returned to the office after spending about 18 months at a client. I wanted to connect my computer to the “BYOD network” at the office. So I just clicked the icon, but got an error message: the not very informative “cannot connect to network”. This time I needed to access the advanced options, but in Windows 10 you cannot get to those simply by right-clicking.

If you try to right-click any of the icons nothing happens, so you cannot get or change any information about the Wi-Fi access point. So I did not even know what was wrong. I did remember being able to connect my phone to the network, so I tried that, and I got it to work by not verifying the server certificate. All I needed to do was make Windows do the same. This, as it turns out, is not easy.

The solution

Basically, I needed to manually create a connection to the Wi-Fi network. There are some steps to this. At an overview level, you need to:

  1. Delete the existing connection from the Windows “known connections”.
  2. Create a connection to the wireless network manually, adding settings as you go.
  3. Update the credentials.

Delete connection

I tried to connect to the wireless network the usual way. When I did, the network got added to the “Known Networks” even though I failed to connect. In order to add it manually, you need to remove it first. This might not apply to you, but make sure it is not in the known networks list.

  1. Open the Wi-Fi settings using the icon in the lower right corner and clicking network settings.
  2. Scroll down to “Manage Wi-Fi Settings” and click it.
  3. Scroll down to manage known networks and find the network in question, click it and then click the Forget button.
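As an alternative to the GUI route, the same clean-up can be done from a PowerShell (or command) prompt using netsh; the network name below is a placeholder:

# List the wireless profiles Windows knows about
netsh wlan show profiles

# Forget the network so it can be re-created manually
netsh wlan delete profile name="YourNetworkName"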

Manually create the connection

In order to change anything other than the standard settings, you have to set the connection up yourself.

  1. Open the old Control Panel.
  2. Choose “Network and Internet”.
  3. Choose “Network and Sharing Center”.
  4. Click “Set up a new connection or network”.
  5. Choose according to the picture and click Next:
  6. In the “Network name” box you have to enter the full name of the network as it was displayed in the network list earlier. This is usually the SSID as well.
  7. The security type can be different from what is shown in the picture. Choose what is most likely for you and click Next.
  8. On the next page, choose Change connection settings. If you get a message that the network already exists, you must remove it first (see above). It cannot be changed.
  9. The following page appears:
  10. Click the Security tab.
  11. Click the Settings button indicated by the picture.
  12. Untick the box indicated in the picture if you need to remove the certificate verification.

    This is the setting that removes the certificate check that I needed. Click OK to close.
  13. Now click Advanced settings.
  14. Select Specify authentication mode and pick the one that applies to you.

    In my case it was “User authentication”, as I do not use domain logon credentials. If you do, select “User or computer authentication”.
  15. If you selected “User authentication” you can opt to save your credentials now by clicking that button and entering your username and password. Click OK to close.

Update credentials

You have now created a new connection. Simply select it by clicking the wireless icon. If you have configured everything correctly, you can now connect. Perhaps you need to enter your credentials.

The BTSWCFServicePublishing tool

The basis for this post is mostly to serve as a reminder of the steps you need to go through to use this tool.

Usage and why

Why do I need this tool? It is, simply put, the best way to deploy WCF services to the local IIS. Even if BtsWcfServicePublishingWizard.exe can do it on the dev machine, you still have to deploy the service on the QA and Prod machines. It also automates the deployment process, as all that is needed is the WcfServiceDescription file generated by the wizard.

So in short: use the BtsWcfServicePublishingWizard.exe on the dev box and the BTSWCFServicePublishing tool on all the others.

Getting the tool

The tool is strangely not part of the BizTalk installation. You have to download and install it separately. The tool can be downloaded here. The package is a self-extracting zip; just unpack it at a location of your choice. I usually have a “Tools” folder somewhere.

Configuring the tool

I don’t know why, but Microsoft left some configuration out when publishing this download. In order to make the tool work on BizTalk 2013 and 2013 R2, you have to update the configuration to use version 4.0 of the .NET Framework; otherwise it will not be able to load the BizTalk schema DLLs as intended. The fix is simple though. Just open BtsWcfServicePublishing.exe.config from the newly extracted package and add the following settings at the top, just under the configuration tag.

<startup useLegacyV2RuntimeActivationPolicy="true">
  <supportedRuntime version="v4.0" />
</startup>

Now the tool will work properly. If you don’t do this, you will get the error: Error publishing WCF service. Could not load file or assembly ‘file:///C:\Windows\Microsoft.Net\assembly\GAC_MSIL\ProjectName\v4.0_1.0.0.0__keyhere\ProjectName.dll’ or one of its dependencies. This assembly is built by a runtime newer than the currently loaded runtime and cannot be loaded.

(Side note: The “-signs in the xml above can be jumbled due to the blog’s theme. Just make sure they are the straight kind).

Running the tool

Simple as can be. Open a command prompt, run the program, and give it the WcfServiceDescription.xml file as input. The program will deploy the website/service as configured in the file.

This file is located under the App_Data/Temp folder when you use the BtsWcfServicePublishingWizard to publish the site locally.
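As a sketch of what the call can look like (the path is an assumption; check the command-line reference below for the exact switches):

BtsWcfServicePublishing.exe C:\Deploy\MyService\WcfServiceDescription.xml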

More information

A command-line reference can be found here.