
Getting the DateTime from Unix epoch in Logic Apps

Relating to my other post, Get Unix epoch from DateTime in Logic Apps, there is a much simpler way to calculate a DateTime from a Unix timestamp. There is no real math involved. All you do is make use of the built-in function addToTime.

Here is the expression: addToTime('1970-01-01', 1508852960,'second')

So if you receive a JSON body with a tag called UnixTimeStamp containing the Unix timestamp, the expression will look like
addToTime('1970-01-01', int(triggerBody()?['UnixTimeStamp']),'second')
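If you want to double-check the result outside Logic Apps, here is a minimal C# sketch (assuming .NET 4.6 or later, where DateTimeOffset.FromUnixTimeSeconds is available) that does the same conversion:

using System;

class EpochToDateTime
{
    static void Main()
    {
        // Same conversion as addToTime('1970-01-01', 1508852960, 'second')
        DateTimeOffset result = DateTimeOffset.FromUnixTimeSeconds(1508852960);
        Console.WriteLine(result.UtcDateTime); // 2017-10-24 13:49:20 UTC
    }
}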

Hope you can make use of it.

Get Unix epoch from DateTime in Logic Apps

This was posted as a question on the forums a while back. I thought it was a very interesting question, as dates, math and the combination of the two intrigue me.
There is a very easy way to achieve this using C# and Azure Functions:

Int32 unixTimestamp = (Int32)(DateTime.UtcNow.Subtract(new DateTime(1970, 1, 1))).TotalSeconds; 

But I wanted to solve it using only Logic Apps functionality or at least see if it was possible.

How to get the value (math)

To make it work we need to use the functionality called ticks. Ticks are part of the Windows OS and .NET: a tick count is a large number denoting the number of 100-nanosecond intervals that have passed since January 1st of year 1 (0001-01-01) UTC (in the western Christian calendar). Unix time is the same idea, but counts the number of seconds that have passed since January 1st 1970 UTC. These fixed points in time, and their relation to each other, can be used to calculate the value we need.

One second is 10 000 000 ticks.

TTN is the number of ticks from that start date until now. TT1970 is the number of ticks from the start date until January 1st 1970. That constant is 621355968000000000.

The calculation looks like (TTN-TT1970) / 10 000 000.

Calculating the Unix value for “now” (October 24th 2017, around 13:29 UTC) looks like

(636444485531778827 – 621355968000000000) = 15088517531778827
15088517531778827 / 10 000 000 = 1508851753 (integer division)
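For reference, here is a small C# sketch of the same calculation, using the constant and divisor described above (a sanity check, not part of the Logic App):

using System;

class TickMath
{
    // Ticks from 0001-01-01 to 1970-01-01 (TT1970)
    const long TicksTo1970 = 621355968000000000L;
    // One second is 10 000 000 ticks
    const long TicksPerSecond = 10000000L;

    static void Main()
    {
        long ticksToNow = DateTime.UtcNow.Ticks; // TTN
        long unixSeconds = (ticksToNow - TicksTo1970) / TicksPerSecond;

        Console.WriteLine(unixSeconds);
        // Cross-check against the built-in conversion
        Console.WriteLine(DateTimeOffset.UtcNow.ToUnixTimeSeconds());
    }
}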

How to get the value (Logic App)

  1. Set up a new Logic App that can be triggered easily. I usually use a HTTP Request / Response.
  2. You need two variables, so create two “Initialize variable” actions.
  3. Name the first TicksTo1970, set the type to Integer and set the value to ticks('1970-01-01').
  4. Name the second TicksToNow, set the type to Integer and set the value to ticks(utcNow()).
  5. Now you are ready to do the calculation. If you have used a Request / Response, set the Response Body to
    div(sub(variables('TicksToNow'),variables('TicksTo1970')),10000000)
  6. Save your flow, execute it to receive the value and validate it against https://www.unixtimestamp.com/
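If you do not need the variables for anything else, the same calculation can also be collapsed into a single expression, using the same functions:

div(sub(ticks(utcNow()),ticks('1970-01-01')),10000000)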

Authenticating an API call to Azure

This is more for me personally, rather than trying to put something new out there. A while back I struggled with getting something simple and basic to work. The reason is that there is usually too much information about “options” and “you have to decide”. I took it upon myself to document the simplest of authentication flows, when authenticating your call to an Azure service.

Note that not all Azure services use this way of authenticating. Azure Key Vault does its own thing, and so does Azure Storage.

This article is not a full walkthrough, but a condensed “walk this way”.

The call should look like this

HTTP POST
https://login.microsoftonline.com/{AzureTenantId}/oauth2/token
BODY Encoding type: application/x-www-form-urlencoded

Keys and values:
grant_type : client_credentials
client_id : {your azure client ID}
client_secret : {your azure client secret}
resource : {the service you want a token for, e.g. https://management.azure.com/}

Successful response

{
"token_type": "Bearer",
"expires_in": "3599",
"ext_expires_in": "0",
"expires_on": "[numeric value]",
"not_before": "[numeric value]",
"resource": "[guid]",
"access_token": "[loooong secure string]"
}
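If you want to make the same call from code rather than Postman, here is a minimal C# sketch using HttpClient. The tenant ID, client ID and client secret placeholders are the same as above, and the management API as the resource is just an example; replace them with your own values:

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

class GetToken
{
    static async Task Main()
    {
        var client = new HttpClient();
        var body = new FormUrlEncodedContent(new Dictionary<string, string>
        {
            ["grant_type"] = "client_credentials",
            ["client_id"] = "{your azure client ID}",
            ["client_secret"] = "{your azure client secret}",
            // resource is the service you want a token for;
            // the management API is used here as an example
            ["resource"] = "https://management.azure.com/"
        });

        var response = await client.PostAsync(
            "https://login.microsoftonline.com/{AzureTenantId}/oauth2/token", body);

        // The access_token property of the JSON below goes into the
        // Authorization: Bearer header of your calls to the Azure service.
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}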

From Postman

The collection can be found here.

What is all this?

Here I can fill in some information. Basically, you need three things:

  1. The Tenant ID of the subscription you want to access.
  2. The Client ID
  3. The Client Secret.

Getting the Tenant Id

There are a lot of ways to do this. My favorite way is to use an API call. The call will fail, but the tenant ID can be found in the response headers.

Issue a GET to https://management.azure.com/subscriptions/{AzureSubscriptionID}?api-version=2015-01-01

In the result, look at the headers and find WWW-Authenticate. The value of that header contains a GUID; that GUID is the tenant ID. The call can be found in the Postman collection I uploaded for this post.
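As a sketch, the same trick from C# could look like this (the subscription ID placeholder is your own; the call is expected to fail with 401):

using System;
using System.Net.Http;
using System.Threading.Tasks;

class FindTenantId
{
    static async Task Main()
    {
        var client = new HttpClient();
        // Replace {AzureSubscriptionID} with your subscription ID
        var response = await client.GetAsync(
            "https://management.azure.com/subscriptions/{AzureSubscriptionID}?api-version=2015-01-01");

        // The call fails with 401 Unauthorized, but the WWW-Authenticate header
        // contains an authorization URI that ends with the tenant ID GUID.
        Console.WriteLine(response.Headers.WwwAuthenticate);
    }
}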

Getting the Client ID

This is a bit hairy, as there are several steps and some concepts you need to understand. The short version is this: you create a “client” in Azure. This “client” is an identity (much like a regular user); the old “service user” might be a good way of describing it. In the end you will have a GUID. That is the client ID. The best instructions on how to create a client in Azure can be found here.

Getting the Client Secret

This is just a bit further down the page on how to create a client. Make sure you save the key (secret) properly.

Full information

If you need more information on how to authenticate an API call, a very good place to start is the Azure REST API reference page.

BizTalk 2020 quiet release

There is a new version of BizTalk out. This time it is called BizTalk 2020 and it has some really nice new features. I was surprised that the release contains new and interesting features, and not just a platform alignment.

This time it was released without any marketing whatsoever, which is called a quiet release. I can only speculate why this is, but my guess is that this is the last version and Microsoft does not feel the need to onboard new customers.

So, what do you get? Here is the complete list, but let me go through the ones that are most interesting, and do not forget that some previously key features have actually been removed.

My top 5 new features

1 – Operational Data Monitoring and Analytics

You can send tracking data to Azure and get a Power BI dashboard out of the box(!), without any additional monitoring software. You also get access to the storage capabilities available in Azure, and can store years of data rather than days.

2 – API Management

People know of my love for API Management, and being able to publish BizTalk orchestrations as APIs directly, and also use APIM policies to alter the messages before sending them to BizTalk, is very powerful.

3 – Auditing

Finally! The age-old question of “who stopped the receive port” can be answered by simply looking into logs. As it should always have been.

4 – XSLT 3.0

Building powerful maps using custom XSLT will be easier and better than ever.

5 – Support for Always Encrypted

Built on SQL Server, of course, but the support will make sure that BizTalk remains an on-prem force and an integration tool for those very, very secret things.

My top 3 good riddances

There are also some things that have been removed from BizTalk, and these are my top three good riddances. Some are marked as deprecated, so “in the release, but don’t use them”.

1 – SOAP Adapter

If you built something new with it, shame on you. It is 32-bit old school, with its functionality covered by the WCF-BasicHttp adapter.

2 – BAM Portal

I was once forced to present the BAM portal as the viable option to a client. I still cringe.

3 – Samples

Have you heard of “The Internet”? You do not need to download static versions of it anymore.

Happy 2020 and Happy 2020 version.

Using the HTTP connector for other things

There are a lot of connectors in Logic Apps, and they usually make your life a lot easier, but sometimes there might be even better ways to connect to an Azure service.

The “problem”

This is not a fix-that-bug post, so there really is no problem. However, I think you should consider using another approach sometimes. This became evident when the team could not use the Azure Table Storage connector some weeks ago. Due to security reasons we had to use the HTTP connector and call the Table Storage API directly, and in the end it solved a very big problem for us.

Azure Services APIs

A lot of Azure services have APIs. You can find documentation for them here. They include Cosmos DB, MySQL, maintenance, subscriptions and much more. If there is no connector for the thing you need to do in Azure, perhaps there is an API that you can call. Sometimes the APIs are much more granular and have a little more finesse than the connector.

I therefore suggest you check out the possibilities when using Logic Apps (and even Functions). If you feel the connector lacks a bit of refinement, or behaves in unwanted ways, take a look at the APIs.

Azure Table storage

I will use Azure Table Storage as an example. There is a Table Storage connector that does the job, but it does not do it very well. Take a look at this flow that was built using the original connector:

The original flow has been lost to time, but the important thing here is the removal of metadata. Every call to the storage responds with three additional properties: odata.etag, PartitionKey and RowKey. We did not want to return that data to the caller, so it was removed. However, this was done using the “RemoveProperty” operation, and for some strange reason the combination of that and the “Add to response” at the bottom took between 2 and 5 seconds(!) for every row. When returning rowsets of 30 rows, we were talking minutes to respond.

What can be done using the connector?

First off, you have to ask: what can I do using just the connector? In the case above, the developer could use the parameter called Select Query to return only the columns needed, omitting the PartitionKey and RowKey, but the adapter would still return the odata.etag. Therefore one “remove metadata” operation and the “Add to response” action would still be needed, and those were the most time consuming.

Sequential vs parallel

The next thing you can look at is the flow control. In this case the data manipulation was done in a loop. Try changing the degree of parallelism to one and run the flow again, and then try the max value. In our case it made little to no difference.

Using the API directly

To start off, there is an inherent problem with using the API directly, and that is the security model and the recycling of SAS keys. You have to be aware of it; that is basically it.

Going into this part of the plan we knew we had one issue: to return only the data we needed to send back to the caller. This meant only the columns they wanted, with the “odata.etag” removed.

Looking at the documentation for querying Table Storage for entities, we found three things to use:

Authorization

According to the documentation this was supposed to be a header, but you can just as easily use the query string, i.e. the SAS string you copy from the storage account to give you access.

$select

The API supports asking for only a subset of the columns, thereby giving you the same capability as the connector.

Ask for no metadata back

By setting the Accept header to “application/json;odata=nometadata” you make the service omit all metadata from the response.

The resulting call

The screenshot of the resulting call has been lost, but the thing worth noting in it was the URL escaping. We did not succeed in using the Queries part of the HTTP action, and I think that is due to how the queries are URL encoded when sent to the service. So we had to put everything in the URI field.
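To give an idea, the request could look something like this (hypothetical storage account, table and column names; the SAS token is the query string you copy from the storage account):

GET https://{storageaccount}.table.core.windows.net/{TableName}()?$select=Column1,Column2&{SAS token}
Accept: application/json;odata=nometadata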

Result

By combining these we could make sure the caller would get the correct data and we did not have to manipulate it before returning the payload. This resulted in calls that responded in milliseconds instead of a full minute.