A better alternative to Custom Connectors in Azure

What is this?

I have found a new connector which is much better suited for calling OnPrem services from Logic Apps.

The use of a custom connector

A custom connector is meant to be a bridge between your Logic App and an on premise service, handling JSON or XML/SOAP. It can also be used as a way of minimizing the clutter of a confusing API, exposing only the necessary settings to your Logic App developer. You do not even have to use it for on premise services.

I have some experience in setting up OnPrem integrations and have posted about planning the installation of the On Premise Gateway and, finally, solving how to deploy the Custom Connector using ARM.

A better connector

My use of the Custom Connector has always been to call OnPrem services that are either SOAP or JSON based. For that there is a great alternative, if you know how to create XML envelopes or JSON bodies. The only thing you will lose using this connector is the nice interface in the Logic App.

Let me point you to the HTTP with Azure AD connector. Yes, you can use it with Azure AD, but this post is about replacing your Custom Connectors.

The scenario

I have an OnPrem service that uses old school SOAP and XML to provide information about Customers. The service uses Windows Authentication and will respond with the given customer as an XML response.

If you don't know SOAP: it is basically an HTTP POST with a header named SOAPAction and an XML body. I will show you a JSON call at the end as well.
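On the wire, the call in this post will end up looking something like this (the host, path and action come from the scenario below):

POST /WebServices/GetCustomers.asmx HTTP/1.1
Host: erpsystem
SOAPAction: GetCustomer
Content-Type: text/xml
Accept: text/xml

<soap:Envelope ...>
   <soap:Body>...</soap:Body>
</soap:Envelope>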

Using the connector

Prerequisite: In order to access OnPrem services, you must install the OnPrem data gateway.

Add the connector to a Logic App

Add a new Action the same way as always and search for HTTP with Azure AD. I know, it says Azure AD. It is strange, but trust me.
[screenshot]

Now look at the available Actions.
[screenshot]
The first one will only use the GET HTTP verb. Choose the other one.
You will now get this scary image.
[screenshot]
I know it says Azure AD but here comes the payoff. Click the Connect via On-premises data gateway checkbox.
[screenshot]
TADAAAAA! You can now start filling in the settings for your service call, directly to the OnPrem Service without using a Custom Connector.

Configure the connector

Authentication Type

Start by choosing the Authentication Type. Note that the Username and Password fields are still visible, even if you choose Anonymous.
I will use Windows Authentication, which, by the way, is not supported in the Custom Connector.

Base Resource URL

This is the start path to your service URL as if it was called from within the OnPrem network.
A full service URL might look like this: http://webapiprod/EmployeeService/GetEmployee.asmx. The base would then be http://webapiprod/EmployeeService or http://webapiprod, depending on how you want to slice it. The full service URL will be defined later. Also note that you should not end the base URL with a /.

I will call a service with the full path http://erpsystem/WebServices/GetCustomers.asmx so I opt for http://erpsystem/WebServices

Windows Authentication

This is very simple: enter the username (including domain) and password of the OnPrem user you need for authentication. I have entered the username and domain like this: companydomain\erpwebusr

Gateway

Choose the appropriate subscription and gateway. I have to censor mine for obvious reasons.

Done

The final product looks like this:
[screenshot]
Just click Create to start using it.

Start using the connector

I test a lot of APIs in their raw format, using Postman or the REST Client extension in VS Code, or even SOAPUI back in the day. I know how to format a message. I think you do too. I will now configure the connector to execute a call to the SOAP service to get customer data.

Choose a Method

You have to choose the HTTP verb you need. For SOAP you always use a POST, but your service might need to use a GET.

Url of the request

Here you can either enter the full url or the last part depending on how you feel about either.
My full path was http://erpsystem/WebServices/GetCustomers.asmx and I opted for http://erpsystem/WebServices as the base path. Therefore I can either enter the full path, or be fancy and just enter /GetCustomers.asmx. I am fancy.
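Put differently, the connector simply joins the two parts:

Base Resource URL:    http://erpsystem/WebServices
Url of the request:   /GetCustomers.asmx
Actual call made:     http://erpsystem/WebServices/GetCustomers.asmx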

Headers

You need to add headers to your call. In the case of SOAP, you need SOAPAction, Content-type and lastly an Accept header.
The last one seems to tell the connector what data to expect back. If you do not set this to text/xml, the connector will expect a JSON message and give you a 400 Bad Request back.

If you call a REST service, you will only need a Content-type header if you supply a body.

You might also need to send additional headers, like API-keys and such. Simply add the ones you need.

Add headers by clicking the Add new parameter dropdown.
[screenshot]

Select Headers and fill in the ones you need. For SOAP, these are located in the WSDL file.

I need to set SOAPAction: GetCustomer, Content-type: text/xml and Accept: text/xml

[screenshot]

Body of the request

If you are calling a SOAP service or sending data to a rest service, you need to set the body. This Connector even supports XML in the designer. So simply supply the XML you need to execute your call. I must send this:

<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope" xmlns:get="http://erpsystem/webservices/GetCustomers" xmlns:cus="http://myns.se/customer" xmlns:man="http://www.erpsystem.com/Manager">
   <soap:Header/>
   <soap:Body>
      <get:Execute>
         <get:request>
            <get:_requests>
               <get:GetCustomers>
                  <cus:customer_Id>XXXX</cus:customer_Id>
               </get:GetCustomers>
            </get:_requests>
         </get:request>
      </get:Execute>
   </soap:Body>
</soap:Envelope>

I know. SOAP was not aimed at being lightweight.
The XXXX part will be replaced by the customer ID sent to the Logic App.
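In the designer you can do that substitution with an inline expression; a sketch, assuming the Logic App is triggered by a request whose JSON body contains a customerId field (the field name is an assumption):

<cus:customer_Id>@{triggerBody()?['customerId']}</cus:customer_Id>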

The end product looks like this:
[screenshot]
The Accept header is missing in the picture, which is from an earlier version of this post.

Calling the backend service

A call to the service might look like this (I blocked out some sensitive things):
[screenshot]

Looking at the call

Please note that the Body of the request is always sent Base64 encoded. This means that you need access to a decoder to read what you actually sent. It also means that the service you are calling must accept Base64 encoded payloads. Binary payloads are not supported.
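If you need to inspect a payload from the run history, a couple of lines of PowerShell will decode it (the value of $content below is a placeholder for the string you copied):

# Paste the Base64 string copied from the run history
$content = '<Base64 string from the run>'
# Decode it back into readable XML or JSON
[Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($content))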

Looking at a JSON Call

Here is an Action configured for JSON and the resulting execution. The use of POST and sending the customer ID in the message body is due to the design of the service. I would also add a content-type to the call, just to be sure.
[screenshot]
[screenshot]

Conclusion

As long as you do not need to send binary data to a backend service, you should really start using this connector and not bother with the Custom Connector.

Removing DTA orphans in BizTalk

What is this?

This is a re-release of an old article about BizTalk. I wrote the original in 2014 and removed it in a blog migration. However, a colleague asked about it so I am re-writing it.

What are Orphans?

Tracking in BizTalk is performed in several steps. The first is to create a row in the dta_ServiceInstances table in the DTA database. There is a bug in BizTalk that prevents the row from being removed by the DTA Purge and archive cleaning job. This creates an orphan, and if there are many, the table is filled with unnecessary junk.

Why?

When tracking starts, the field dtStartTime is set to the current UTC time. In the same row, the field dtEndTime will be set once the flow is completed. Sometimes it is not. This seems to be related to a high number of exceptions or the use of request response with a SendPort group (do not do that!).

The row should be removed by the clean up job, but that job only removes completed rows, i.e. rows with a dtEndTime that is not null.

How many Orphans?

I think about 1000 is too many. To find out the number of orphans in your BizTalk installation, run the script below. It should be run against BizTalkDTADb.

select
    count(*) as 'NoOfOrphans'
from 
    [BizTalkDTAdb].[dbo].[dta_ServiceInstances]
where 
    dtEndTime is NULL and [uidServiceInstanceId] NOT IN 
    (
    SELECT 
        [uidInstanceID]
    FROM 
        [BizTalkMsgBoxDb].[dbo].[Instances] WITH (NOLOCK)
    UNION
    SELECT 
        [StreamID]
    FROM 
        [BizTalkMsgBoxDb].[dbo].[TrackingData] with (NOLOCK)
    )

The query clearly shows that you not only need to look in the DTA database, but also check for instances still in the MessageBox, meaning active messages. If you want more information on the instances, just do a SELECT * instead of SELECT count(*). You can clearly see that the data has a dtStartTime but not a dtEndTime.

[screenshot]

How to remove them

The BizTalk Terminator tool

I do not know if this exists or is supported anymore. Be advised! The text below is from 2014.

This is the supported way of removing orphans. It is, however, important to point out that the entire BizTalk environment needs to be stopped for it to work, and if that is not a useful alternative, you can run the actual script that the Terminator tool runs.

The T-SQL Script from the terminator tool

This simple script updates the orphaned rows and sets dtEndTime to the current UTC time, making it possible for the cleanup job to remove the row.

BEGIN TRAN
USE [BizTalkDTADb]

UPDATE
    [dbo].[dta_ServiceInstances]
SET 
    [dtEndTime] = GetUTCDate()
WHERE
    dtEndTime is NULL 
    AND 
    [uidServiceInstanceId] NOT IN
    (
    SELECT 
        [uidInstanceID]
    FROM
        BizTalkMsgBoxDb.[dbo].[Instances] WITH (NOLOCK)
    UNION 
    SELECT 
        [StreamID]
    FROM
        BizTalkMsgBoxDb.[dbo].[TrackingData] WITH (NOLOCK)
    )
-- If it works, uncomment the following row and run it. It commits the query to the database
-- Commit tran
-- If it DOES NOT work: uncomment the following row and run it. It undoes the query.
-- Rollback tran

Note that the script uses the same query for finding orphans as the counting query above.
In order to perform this update by the book, follow these steps:

  1. Run the first query to find the number of orphans.
  2. Run the second script and note the number of rows that were updated.
  3. If the number of rows matches you can uncomment and run the Commit Tran row. Done.
  4. If the number of rows does not match, something is wrong and you must run the Rollback Tran row. Not done.

NOTE! You must run both the query and the Rollback or Commit in the same query window.

If you end up at step 4, check the scripts and make sure you are running them against the same database. A constantly increasing number of orphans in your database might also be the reason.

How are the orphans removed?

The rows can now be deleted by the DTA Purge and archive job, in accordance with the job settings. This might take a while, as your settings might keep older tracking data in the database for several days. Also, the job runs once every minute, so you might just need to wait 60 seconds.

How to use Logic Apps Custom Connectors with ARM and CI/CD

What is this?

The documentation on how to deploy Logic Apps Connectors in a proper CI/CD scenario is … lacking.
Being able to use Connectors and to deploy them automatically is a must in today's enterprise landscape. I have worked with Connectors for two years and have only recently been able to have everything fully automated.

Also: A huge shout-out to Maxim Zhizhin for solving the last part of the problem and sharing it with me.

Who is this for?

This is not a full walkthru of how to use Connectors, and it mostly focuses on how to use a Connector to connect to an on-premise service, i.e. using the On Premise Data Gateway.
If you are looking at how to incorporate a Custom Connector into your CI/CD setup using ARM, this is for you.

The TLDR

If you know what you are doing and are just looking for the ARM template file, you can get it here.

The Steps

These are the steps:

  1. Finish configuring your connector.
  2. Export the Starting ARM Template
  3. Examining the ARM
  4. Adding the Swagger/OpenAPI Definition to the ARM template
  5. Adding additional settings
  6. The Parameter File
  7. Trying to deploy

Finish configuring your connector

You have created a Logic App Custom Connector and know that it is working properly. In my case I built a SOAP Passthru (the best way to connect to SOAP services). The SOAP service is located on-premise. It looks like this:

[screenshot]
[screenshot]
The connector sends a POST with a header for the SOAPAction (which a SOAP service needs) and another header for Content-type, signaling to the SOAP service that we are using XML and in what encoding.

Export the Starting ARM Template

I put emphasis on starting because this ARM Template that will be exported is in NO WAY complete.

  • Find the Export Template in the starting page of the Connector (it is under Automation).
    [screenshot]
  • Click it and after the template has been generated, find Download at the top of the screen.
    [screenshot]
  • Download and open the file using your preferred tool. I am using VS Code.

Examining the ARM

Looking from the top to the bottom, you first have a parameter for the connector name. Then you have basic information about the region. Further down you have information about authentication (basic or none) and, funnily enough, some information about how to fill in the UI for the auth part.
Then we come to the use of the gateway, which repeats the pattern of the auth settings. It should look like this:

"gateway": {
  "type": "gatewaySetting",
  "gatewaySettings": {
    "dataSourceType": "CustomConnector",
    "connectionDetails": []
  },
  "uiDefinition": {
    "constraints": {
      "tabIndex": 4,
      "required": "true",
      "capability": [
        "gateway"
      ]
    }
  }
}

This only defines the UI for use of the gateway; it does not actually implement it.

Scroll further and you have some basic settings, like the name and description. If you have used an icon, it will be exported as a base64 encoded string.

Adding the Swagger/OpenAPI definition

The ARM Template you have saved is not complete. You have to manually add the Open API definition to it.
To find the information you go back to the portal, and the start page (Overview) of the Connector. Find the Download link at the top.
[screenshot]
Click to download the file. Note! The file has no file extension and therefore you have to choose a text tool to open it.
[screenshot]
Create a new property called swagger and paste the text from the file as its value. The end result should look like this:

"swagger": {
        "swagger": "2.0",
        "info": {
            "title": "SOAP pass-through",
            "description": "Get Info SOAP passthru",
            "version": "1.0"
        },
        ...
    }
}

Adding Additional Settings

There are four additional settings you need to manually enter in order for this to work properly.
All these settings should be on the same level as "iconUri", so just add these settings before that property.

apiType

Set this property to

"apiType": "Soap"

backendService

This property is the path to the on-premise service, as if it was called from within the OnPrem network. In my case this is

"backendService":{ "serviceUrl": "http://onpremservice/webservices"}

wsdlDefinition

This is mostly used if you are not using the SOAP passthru method. If you are, you should set it to:

"wsdlDefinition": {
   "importMethod": "SoapPassThrough"
}

capabilities

Setting this tells the deployment to actually use the on premise data gateway, not only have the UI show it (see above). To make this happen, add this property:

"capabilities": ["gateway"]

The parameter file

This file was downloaded with your initial export of the ARM template and is located in the same zip file. You do not need a parameter file to make this work, it is just good practice. With a parameter file you can also have different files for different environments, such as TEST or PROD.
The generated file only contains a single parameter, but you should parameterize additional settings to fully integrate the ARM files into your CI/CD pipeline. Here are the settings I usually parameterize (a sketch of such a file follows the list):

  • Location: make sure that the location is the same as the resource group's, and set it from the CI/CD pipeline.
  • BackendService ServiceUrl: this might differ between TEST and PROD, for instance.
  • Swagger: within the swagger there might be multiple things that differ between environments. I usually make sure that Host is set by a parameter.
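Such a parameter file might look like this sketch (the parameter names are illustrative; use whatever names you declared in your template):

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "customApis_name": { "value": "MyOnPremConnector" },
        "location": { "value": "westeurope" },
        "serviceUrl": { "value": "http://onpremservice/webservices" }
    }
}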

Trying to Deploy

If you do not want to (or cannot) go thru the whole DevOps deploy pipeline to test this new ARM template, you can use built-in functionality in the Azure Portal.

In the search box at the top of the portal, search for deploy and select "Deploy a custom template".
[screenshot]
This lands you on a page where you can enter your ARM-template and parameter files and deploy them.

  1. Select "Build your own template in the editor".
  2. Paste the ARM template JSON in the window.
    [screenshot]
  3. Click Save at the bottom left part of the page.
  4. If you are using a parameter file, click the "Edit parameters" link at the top right of the page.
  5. Paste the content of your parameter file and click Save at the bottom left.
  6. Done. Click Review+Create at the bottom left.
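If you prefer the command line, the same test deployment can be done with the Az PowerShell module; a minimal sketch (the resource group and file names are assumptions):

# Deploy the connector ARM template together with its parameter file
New-AzResourceGroupDeployment `
    -ResourceGroupName 'rg-connectors-test' `
    -TemplateFile '.\customConnector.json' `
    -TemplateParameterFile '.\customConnector.parameters.json'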

If my template can help you, I have uploaded it here.

In conclusion

The Logic App Custom Connector is an underestimated, underdone and under-documented feature in Azure, but if you know how to configure it and where to get the info, it is very useful.

Any questions or feedback regarding this post can be sent to my Twitter.
Hope this helps.

Custom Connector returns 410 Gone?

What is this?

Some time ago, I got this message when executing a Custom Connector from a Logic App. The response was very strange and was returned in that object form that an HTTP action sometimes uses:

{
'content-type': 'application/octet-stream',
'content': 'eyJjb2RlIjogIkFwaURpc2FibGVkIiwgIm1lc3NhZ2UiOiAiQVBJIGhhcyBiZWVuIGRpc2FibGVkIGR1ZSB0byBpbmFjdGl2aXR5LiBQbGVhc2UgdXBkYXRlIHRoZSBDdXN0b20gQ29ubmVjdG9yIHRvIGVuYWJsZSBpdCBhZ2Fpbi4ifQ=='
}

Running the $content thing thru a Base64 decoder I got back:

{'code': 'ApiDisabled', 'message': 'API has been disabled due to inactivity. Please update the Custom Connector to enable it again.'}

The connector had been … disabled?!!!
Does this seem familiar?

What to do (short term)

The very short version is to update the connector. To basically force Azure to deploy it again. To solve this I simply added a space in the description-field and clicked Update Connector. Done.

Why did this happen?

I raised a ticket with MS Support and got a very clear answer: They used to disable connectors that had not been used for 30 days. Why? They did not say, only that they no longer have that policy. I think it was due to the datacenter being close to overloaded (thanks Covid-19) and they wanted to remove unused resources.

What to do (long term)

You need to look at your connectors and find all those that have not been used for a while. The Enfo support team checked the run history of all Logic Apps that use Custom Connectors. If a Logic App had not been run for a long time, they updated the connector behind it. I think you should do this as well.
In our case it was easy: two Logic Apps had not been run for two months, and all the others run on a daily basis.

Get to updating your connectors.

OAUTH with Azure – The just make it work edition

What is this?

I do not know how many times I have looked for an article explaining the just make it work part of how to authenticate to Azure from an application calling an Azure API. I usually find myself in a very long article on scope and OAUTH vs OAUTH2 vs OpenID.

This is more for me as documentation, and perhaps for you as well. It will not go thru why you should configure anything in a particular way. It is just about making it work.

If you do not know how to create an App Registration (necessary for login) or how to get the information used below, I have created a post here.

The three stages of logging in

  1. Get the information you need.
  2. Login to get a Token
  3. Use the Token in an API-call.

Getting the information you need

You will need:

  1. A Client ID
  2. A Client Secret
  3. The Azure Tenant ID
  4. The resource you are using

The 1, 2 and 3

You can get the Client ID and the Tenant ID from the App Registration Overview page of your app.
[screenshot]

The client secret is either something you previously saved or something you created. Take a look at my post. Click on "Create the App Secret" in the Table of contents at the top if you need more information on how to create a secret.

The resource

This is the only tricky part.

  • If you need to manipulate about 90% of Azure, you use https://management.azure.com/
  • If you are logging into a Storage Account, you should/could use https://storage.azure.com/

Postman

The number one client for calling and testing APIs.

Login to get a Token (with Postman)

Gather all the information you have above and let's get to configuring.
You can use variables and environment settings for these.

  • Set the URI to: https://login.microsoftonline.com/[Tenant ID Goes here]/oauth2/token
  • Set the verb to POST.
  • Set the format of the body to form-data
  • Fill in the data:
    • client_id: Your Client ID
    • client_secret: Your client secret (password)
    • resource: see heading just above this
    • grant_type: client_credentials
  • Click Send and receive your access-token.
    [screenshot]

Bonus content for Postman

If you want to be fancy, add a script to the Tests part of the request and assign the token to an environment variable for use in other calls.

pm.test(pm.info.requestName, () => {
    pm.response.to.not.be.error;
    pm.response.to.not.have.jsonBody('error');
});
pm.environment.set("<your variable>", pm.response.json().access_token);

Use the token in the API-call (with Postman)

Now that you have your token, you can use it in other calls.

  • Simply click Authorization.
  • In the dropdown select Bearer Token and paste your token in the token field to the right.
    If you used the fancy script, use the variable instead.
    [screenshot]
  • Done!

PowerShell

If clients scare you and you like using scripts to call APIs and execute stuff, you can use PowerShell.

Login to get a token (with PowerShell)

Doing this with PowerShell is even easier once you know what to call and how. All the code below is located in the same file.
Gather all the information you have above.

# Fill in the data in a collection
$authBody = @{
    'client_id'     = 'Your Client ID'
    'client_secret' = 'Your client secret (password)'
    'resource'      = 'see previous heading about this'
    'grant_type'    = 'client_credentials'
}
$tenantId = 'Your Tenant GUID'
# Set the URI
$tokenUri = "https://login.microsoftonline.com/$($tenantId)/oauth2/token"

# Login to get a Token
# Notice -ContentType and -Form
$result = Invoke-RestMethod -Uri $tokenUri -ContentType "multipart/form-data" -Form $authBody -Method Post

# A token must be a SecureString when used in later API-calls.
$secureToken = ConvertTo-SecureString $result."access_token" -AsPlainText -Force

Are you using Windows PowerShell?

If you need to use Windows PowerShell, aka 5.1, you need to replace the Invoke-RestMethod line with:

$result = Invoke-RestMethod -Uri $tokenUri -Method Post -Body $authBody

Note the lack of the -Form and -ContentType parameters.

Use the token in the API-call (with PowerShell)

When you have your $secureToken you can use it in any API call as a bearer-token.

# Use the token in the API-call
$uri = 'https://your api call'
$response = Invoke-RestMethod -Authentication Bearer -Token $secureToken -Uri $uri

Done!

Bonus content on the Token

Did you know that the Token contains information that you can parse? I sure did not.
Visit either https://www.jsonwebtoken.io/ or https://jwt.io/ to see the information in the token. You simply paste your token and see what it contains.
Here is an example of a payload for the token I got in Postman:

{
 "aud": "https://management.azure.com/",
 "iss": "https://sts.windows.net/<tenantGUID>/",
 "iat": 1591799017, <-- Issued At
 "nbf": 1591799017, <-- nbf means not before
 "exp": 1591802917, <-- The expiration time in Unix timestamp
 "aio": "42dgYDhp4Pl5Eccb7me1ixxx",
 "appid": "<client ID>",
 "appidacr": "1",
 "idp": "https://sts.windows.net/<tenantGUID>/",
 "oid": "ad049d62-472f-4835-90be-qqqwwwee",
 "rh": "0.AQwAHo4e6q_ta0SWTzChaFpEhgeZB<<<<>>>>>.",
 "sub": "ad049d62-472f-4835-90be-<<<<>>>>",
 "tid": "<tenantGUID>",
 "uti": "HZ0eFQf0akCeUE0hJPgjAA",
 "ver": "1.0"
}

This information can be very useful. The aud (Audience) should be the same as the resource setting, and that might be different in your scenario.
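If you would rather not paste a real token into a website, you can decode the payload locally; a PowerShell sketch (assuming the token response from the earlier call is still in $result):

# The payload is the second, base64url-encoded part of the JWT
$payload = $result.access_token.Split('.')[1].Replace('-', '+').Replace('_', '/')
# Pad to a length divisible by 4, as FromBase64String requires it
$payload += '=' * ((4 - $payload.Length % 4) % 4)
# Decode and parse the JSON payload
[Text.Encoding]::UTF8.GetString([Convert]::FromBase64String($payload)) | ConvertFrom-Json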