
Third party BizTalk monitoring tools

> **Update 2012-03-28:** When looking into BizTalk monitoring and overall governance tools for BizTalk, don’t miss [BizTalk360](http://www.biztalk360.com/). BizTalk360 wasn’t available when this post was written and has some really compelling features.
>
> **Update 2010-09-30:** BizMon is now owned and developed by [Communicate Norway](http://www.communicate.no). They have renamed, and further developed, the product to IPM (Integration Monitoring Platform) – check it out [here](http://ipm.communicate.no/exciting-news).

I’ll start this post by clarifying two important things:

  1. I am involved in the development and marketing of “BizMon”. Therefore I am biased and you have to decide for yourself if that affects the content of the post. As always it is best to try it for yourself and see if it is useful for you.

  2. I have talked about BizTalk monitoring tools in a previous post, where my goal was to start an open source project. That did not happen, and you can read why in the update to that post.

Why “monitoring” for BizTalk?

I have worked as a BizTalk developer for many years, but it was not until I really got into maintaining a large integration solution that I realized that the tools I really needed were not there. I found myself using the following “tools” and techniques over and over again.

  1. Open the BizTalk Administration Console and query for suspended messages, running instances, routing errors etc, etc.

But as I had to pull for this information it took time and discipline (two things I’m short of) to quickly find out when errors occurred.

  2. I used HAT to try and find out when the last messages were sent and received by the different applications. This gave me a “guarantee” that things worked as I expected and that the solution had a “pulse” – messages at least moved back and forward.

The problem is that the HAT tool is bad and it is hard to find what one is looking for (it is a bit better in BizTalk 2009, but it is still tricky to get useful information out of it).

  3. Some of the integrations in our environment used BAM to track messages and their state.

The problem was that the solutions were developed by either myself or different consultants. This made it hard to get everyone to use the same tracking. It was also hard to convince management to go back and try and “instrument” old, working integrations with BAM tracking.

At the same time as we had the “tools” and techniques mentioned above available, management had the following requirements for us.

  1. Start working on fixing an error within 10 minutes after it occurred, 24/7, 365 days a year.

  2. Be able to delegate simple monitoring tasks to support personnel (a help desk).

  3. Not have to actively “pull” for information, but be quickly alerted of errors and get the information “pushed” to us.

The idea was that this would save time, as people don’t have to look for errors when everything is working fine. Time that people can use for other tasks …

  4. Enable reporting so we can provide system owners and other interested people with information on how much data has been sent to and received from the systems and parties they care about.

All the above led up to the realization that we needed some sort of tool.

What are the existing options for BizTalk monitoring tooling?

At the time we started looking for options, all we could find was System Center Operations Manager (SCOM). We looked at the SCOM BizTalk Management Pack and decided that this was not the right solution for us. It was too big, too complicated, and it would be too hard to get it to do what we wanted.

> The decision not to use SCOM was, I think, right for us. We wanted something leaner and more specialized. I am however **not saying** that it is the right decision for you.
>
> If you are successfully using SCOM to monitor BizTalk I would love to [hear about it](mailto:richard.hallgren@gmail.com)!

What we ended up with

We ended up building BizMon. It does what we need, and our help desk can now basically monitor about 100 different BizTalk applications themselves, alongside all the other support tasks they have to do. When something happens (and it does …) they are the first to know. Some easy tasks they can solve themselves; otherwise they make sure to notify the users and quickly call the developer that knows more and can help them.

Support personnel can now also set up custom reports that users can subscribe to, all based on BAM tracking points that they now easily can inject into existing solutions – both new and old ones.

As I said, this worked out well and helped us. If you think that it could work for you as well – give it a try.

I am also really interested in hearing how you have solved similar requirements, with your own tools or other solutions.

What else is there?

Recently FRENDS released a beta version of their FRENDS Helium product that looks promising and could potentially solve a lot of the same issues that BizMon does and that I have discussed in this post.

Check it out and let us know what you think.


What kind of integration patterns do you implement in BizTalk?


I like to divide BizTalk based integrations into three different patterns.

  1. Point-to-point integration

  2. Broker-based integration

  3. ESB-based integration

I am sure someone could come up with fancier names, but in this post I will try to dig into each of them and explain what I mean by them. I will also try to highlight some issues with each of the patterns.

Point-to-point integration

This is where a lot of us started. The idea is that the sending system has information and knowledge about the receiving system.

This usually means that the messages are exported in a message format that is tailored to the needs of the particular receiving system. It also means that we usually get one integration process for each receiving system there is.

The example in the figure on the right shows a sending system that sends information about invoices to two different receiving systems (“System A” and “System B”). Using a point-to-point pattern in this scenario, we end up with two different types of exported messages, each one tailored to the format of its receiving system.

Issues

The main problem with this kind of point-to-point integration, where the sending system has detailed knowledge about the receiving systems, is that this knowledge translates into a coupling between the systems. When we tie the export message format from the sending system to the format of the receiving system, all changes in the receiving system will also cause changes in the sending system.

Suppose there is a sudden need to change the format in the receiving system - as we use that format as our export format, we now also have to change it in the sending system.

Another problem is the lack of agility. In this invoice scenario, any other system that needs invoice information has to get it by going all the way back to the sending system and developing a whole new, specific integration - separate from the existing ones.

Broker-based integration

In the broker-based scenario the integration platform is used as a broker of messages.

This means that only one canonical format is exported from the sending system. The broker then handles the routing to the different systems and the transformation to the message formats that the receiving systems expect.

The main advantage of this approach - where the sending system does not know anything about the receiving systems – over the point-to-point pattern is agility.

If there now is a need for invoice information in a third, new system, we do not have to change the export from the sending system (as long as it contains all the information we need, that is) or develop a whole new integration. All we have to do is route the invoices so that they are also sent to the third system and transformed into the format that that system expects. A new BizTalk map and port and we are done!

Issues

From my point of view this approach to integration has a lot of advantages over the point-to-point integrations previously discussed. And in a lot of simpler, stable scenarios it works just fine and is the way to go.

But in some scenarios it kind of breaks down and becomes hard to work with. The problems are related to configuration and how we define the routing information in BizTalk. In BizTalk we can either create an orchestration and “hardcode” the process that defines which systems the messages should be sent to, or we can create a “messaging” scenario where we configure the routing information in the different BizTalk port artifacts by setting up filters.
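In the pure messaging case those routing rules end up as filter expressions (subscriptions) on the send ports. Conceptually, a subscription routing all invoices to a specific system could look something like the line below (the message type namespace is a made-up example):

BTS.MessageType == http://mycompany.com/schemas/invoice/1.0#Invoice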

Regardless of whether we choose a “messaging” or orchestration based solution, the routing information becomes hard to update as the solution grows in size. We either get very complicated orchestrations or loads of ports to maintain.

Furthermore, the whole process is very coupled to the one canonical schema, which makes versioning and updates hard. If the canonical schema needs to be updated and we still need to be able to send information using the old schema (so we have a “version 1” and a “version 2” of the schema side-by-side), all artifacts need to be more or less duplicated, and the complexity of the solution grows fast.

This makes business agility hard, and any change takes a long time to develop and deploy correctly.

> I guess these issues have more to do with how BizTalk works than with the integration pattern itself - but this is a BizTalk blog!

ESB-based integration

ESB-based integration in BizTalk is today achieved using the ESB Toolkit from Microsoft.

One of the ideas of ESB-based integration is that the integration platform should not have hardcoded routing information. These routing rules should either be looked up at run time or travel with the actual payload of the message (called “dynamic routing”).

This, in combination with having generic on- and off-ramps instead of loads of separate ports to maintain, promises to create more agile and configurable solutions (called “Loosely coupled service composition” and “Endpoint run-time discovery and virtualization”).

The figure again shows the invoice scenario used previously. In this case we export an “Invoice 1.0” format from the sending system. The format is not directly bound to a schema in the port (as we are used to in BizTalk); it is basically possible to send any message format to the on-ramp.

The on-ramp then identifies what kind of message it has received (for example using the XML namespace) and looks up the routing rules for that specific message (the message “itinerary”) in a repository.

In this scenario there could be rules to route the message to an orchestration or directly to a send port. _All the configuration for this port is however configured as part of the itinerary and applied at run time on the generic off-ramp port. And as itineraries are simple XML documents they are super light to update and deploy in the repository!_

So if we now wanted to update the “Invoice” format to “version 2.0”, all we would have to do is create a new XML itinerary for the “Invoice 2.0” message type and possibly create a new orchestration to handle the new message type. No new ports, no new bindings, etc. The configuration and deployment would be a lot simpler than before! We would end up with a lot fewer artifacts to maintain.

And the itinerary for “Invoice 1.0” would still be applied to all incoming Invoice 1.0 messages. Thus we have achieved what used to be so hard with a lot less development power!

Issues

With agility and dynamic capabilities comes complexity in debugging when something bad happens … I also feel that we give up a lot of the validation and control capabilities that we had in the more schema-coupled patterns previously discussed.

I am new to this ESB-based model and I would love to hear your experiences, problems, stories etc!

I am however sure that this is the way to go, and I would not be surprised if ESB-like patterns play an ever bigger role in future BizTalk versions!

BAM tracking data not moved to BAM Archive database


There are a few really good blog posts that explain BAM – like this one from Saravana Kumar and this one by Andy Morrison. They both do a great job of explaining the complete BAM process in detail.

This post will however focus on some details in the last step of the process that has to do with archiving the data. Let’s start with a quick walk-through of the whole process.

BAM Tracking Data lifecycle


  1. The tracked data is intercepted in the BizTalk process and written to the “BizTalk MsgBox” database.

  2. The TDDS service reads the messages and moves them to the correct table in the “BAM Primary Import” database.

  3. The SSIS package for the current BAM activity has to be triggered (this is a manual job, or something that one has to schedule). When executed, the job will do a couple of things:

    1. Create a new partitioned table with a name that is a combination of the active table name and a GUID.

    2. Move data from the active table to this new table. The whole point is of course to keep the active table as small and efficient as possible for writing new data to.

    3. Add the newly created table to a database view definition. It is this view we can then use to read all tracked data (including data from the active and partitioned tables).

    4. Read from the “BAM Metadata Activities” table to find out the configured time to keep data in the BAM Primary Import database. This value is called the “online window”.

    5. Move data that is older than the online window to the “BAM Archive” database (or delete it if you have that option).

Sounds simple, doesn’t it? I was however surprised to see that my data was not moved to the BAM Archive database, even though it was clearly outside of the configured online window.

So, what data is moved to the BAM Archive database then?

Below is a deployed tracking activity called “SimpleTracking” with an online window of 7 days. Ergo, all data that is older than 7 days should be moved to the BAM Archive database when we run the SSIS job for the activity.

[Screenshot: the “SimpleTracking” activity configuration]

If we then look at the “BAM Completed” table for this activity, we see that all the data is much older than 7 days, as today’s date is 13-11-2009.

So if we run the SSIS job, these rows should be moved to the archive database, right? Let’s run the SSIS job.

[Screenshot: the BAM Completed table for the activity]

But when we execute the SSIS job, the BAM Archive database is still empty! All we see are the partitioned tables that were created as part of the first steps of the SSIS job. All data from the active table has been moved to the new partitioned table, but nothing has been moved to the Archive database.


It turns out that the SSIS job does not look at the “Last Modified” values of the individual rows at all, but at the “Creation Time” of the partitioned table in the “BAM Metadata Partitions” table shown below.

[Screenshot: the BAM Metadata Partitions table]

The idea behind this is of course to not have to read through potentially huge tables to find the rows that should be moved. But it also means that it will take another 7 days before the data in the partitioned table is actually moved to the archive database.

This might actually be a problem if you have not scheduled the SSIS job to run from day one, and your BAM Primary Import database is starting to get too big and you quickly have to move data over to archiving. All you then have to do is of course to change the “Creation Time” value in the BAM Metadata Partitions table so that it is outside of the online window for the activity.
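If you need to force an immediate archive run this way, a small script against the BAM Primary Import database can back-date that value. Below is a minimal sketch of the idea – it assumes the table is dbo.bam_Metadata_Partitions with ActivityName and CreationTime columns, so verify the names against your own database (and take a backup) before running anything like it:

using System;
using System.Data.SqlClient;

class ForceBamArchiveWindow
{
    static void Main()
    {
        // Assumed object names - check your own BAMPrimaryImport database first
        using (var connection = new SqlConnection(
            "Integrated Security=SSPI;Data Source=.;Initial Catalog=BAMPrimaryImport"))
        {
            connection.Open();
            var command = new SqlCommand(
                @"UPDATE dbo.bam_Metadata_Partitions
                  SET CreationTime = DATEADD(DAY, -8, GETDATE())
                  WHERE ActivityName = @activity", connection);

            // 8 days is just outside the 7 day online window used in the example above
            command.Parameters.AddWithValue("@activity", "SimpleTracking");
            Console.WriteLine("{0} partition(s) back-dated", command.ExecuteNonQuery());
        }
    }
}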

Checking if a BizTalk binding file is up-to-date during deployment


As all of you know, the number one time consuming task in BizTalk is deployment. How many times have you worked your way through the steps below (and, even more interesting, how much time have you spent on them …)?

  • Build

  • Create application

  • Deploy schemas

  • Deploy transformations

  • Deploy orchestration

  • Deploy components

  • Deploy pipelines

  • Deploy web services

  • Create the external databases

  • Change config settings

  • GAC libraries

  • Apply bindings on applications

  • Bounce the host instances

  • Send test messages

  • Etc, etc …

Not only is this time consuming, it’s also drop dead boring and therefore very error prone - small mistakes take ages to find and fix.

The good news is however that the steps are quite easy to script. We use a combination of a couple of different open-source MsBuild libraries (like this and this) and have created our own little build framework. There is however the BizTalk Deployment Framework by Scott Colescott and Thomas F. Abraham that looks great and is very similar to what we have (ok, ok, it’s a bit more polished …).

Binding files problem

Keeping the binding files in a source control system is of course a super-important part of the whole build concept. If something goes wrong and you need to roll back, or even rebuild the whole solution, having the right version of the binding file is critical.

A problem is however that if someone has made changes to the configuration via the administration console and missed exporting these bindings to source control, we’ll deploy an old version of the bindings when redeploying. This can be a huge problem when, for example, addresses have changed on ports and we redeploy old configurations.

> “What!? I fixed that configuration issue in production last week and now it’s back …”

_So how can we ensure that the binding file is up-to-date when deploying?_

One solution is to do an export of the current binding file and compare it to the one we’re about to deploy in a “pre-deploy” step, using a custom MsBuild target.

Custom build task

Custom build tasks in MsBuild are easy to write; a good explanation of how to write one can be found here. The custom task below does the following.

  1. Requires a path to the binding file being deployed.

  2. Requires a path to the old, deployed binding file to compare against.

  3. Uses a regular expression to strip out the timestamp in the files, as this is the time the file was exported and would otherwise differ between the files.

  4. Compares the content of the files and returns a boolean saying if they are equal or not.

    using System.IO;
    using System.Text.RegularExpressions;
    using Microsoft.Build.Framework;
    using Microsoft.Build.Utilities;

    public class CompareBindingFiles : Task
    {
        string _bindingFileToInstallPath;
        string _bindingFileDeployedPath;
        bool _value = false;

        [Required]
        public string BindingFileToInstallPath
        {
            get { return _bindingFileToInstallPath; }
            set { _bindingFileToInstallPath = value; }
        } 
    
        [Required]
        public string BindingFileDeployedPath
        {
            get { return _bindingFileDeployedPath; }
            set { _bindingFileDeployedPath = value; }
        } 
    
        [Output]
        public bool Value
        {
            get { return _value; }
            set { _value = value; } 
    
        } 
    
        public override bool Execute()
        {
            _value = GetStrippedXmlContent(_bindingFileDeployedPath).Equals(GetStrippedXmlContent(_bindingFileToInstallPath));
            return true; //successful
        } 
    
        private string GetStrippedXmlContent(string path)
        {
            // Strip the export timestamp so two otherwise identical
            // binding files compare as equal
            using (StreamReader reader = new StreamReader(path))
            {
                string content = reader.ReadToEnd();
                Regex pattern = new Regex("<Timestamp>.*</Timestamp>");
                return pattern.Replace(content, string.Empty);
            }
        }
    
    }
    

Using the build task in MsBuild

After compiling the task above, we have to reference the dll in a UsingTask element like below.

<UsingTask AssemblyFile="My.Shared.MSBuildTasks.dll" TaskName="My.Shared.MSBuildTasks.CompareBindingFiles"/>

We can then do the following in our build script!

<!--
    This target will export the current binding file, save it as a temporary binding file and use the custom task to compare the exported file against the one we’re about to deploy.
    A boolean value will be returned as IsValidBindingFile, telling us if they are equal or not.
-->
<Target Name="IsValidBindingFile" Condition="$(ApplicationExists)=='True'">
    <Message Text="Comparing binding file to the one deployed"/> 

    <Exec Command='BTSTask ExportBindings /ApplicationName:$(ApplicationName) "/Destination:Temp_$(BindingFile)"'/> 

    <CompareBindingFiles BindingFileToInstallPath="$(BindingFile)"
                         BindingFileDeployedPath="Temp_$(BindingFile)">
        <Output TaskParameter="Value" PropertyName="IsValidBindingFile" />
    </CompareBindingFiles> 

    <Message Text="Binding files is equal: $(IsValidBindingFile)" /> 

</Target>

<!--
    This pre-build step runs only if the application already exists. If so, it will check if the binding file we are trying to deploy is equal to the deployed one. If not, this step will break the build.
-->
<Target Name="PreBuild" Condition="$(ApplicationExists)=='True'" DependsOnTargets="ApplicationExists;IsValidBindingFile">
    <!--We'll break the build if the deployed binding files doesn't match the one being deployed-->
    <Error Condition="$(IsValidBindingFile) == 'False'" Text="Binding file is not equal to the deployed one" />

<!--All other pre-build steps goes here-->

</Target>

So we now break the build if the binding file being deployed isn’t up-to-date!

This is far from rocket science but can potentially save you from making some stupid mistakes.

Streaming pipeline and using context ResourceTracker to avoid disposed streams


Recently there have been a few really good resources on streaming pipeline handling published. You can find some of them here and here.

The Optimizing Pipeline Performance article on MSDN has two great examples of how to use some of the Microsoft.BizTalk.Streaming.dll classes. The Execute method of the first example looks something like below.

public IBaseMessage Execute(IPipelineContext context, IBaseMessage message)
{
    try
    {
        ...
        IBaseMessageContext messageContext = message.Context;
        if (string.IsNullOrEmpty(xPath) && string.IsNullOrEmpty(propertyValue))
        {
            throw new ArgumentException(...);
        }
        IBaseMessagePart bodyPart = message.BodyPart;
        Stream inboundStream = bodyPart.GetOriginalDataStream();
        VirtualStream virtualStream = new VirtualStream(bufferSize, thresholdSize);
        ReadOnlySeekableStream readOnlySeekableStream = new ReadOnlySeekableStream(inboundStream, virtualStream, bufferSize);
        XmlTextReader xmlTextReader = new XmlTextReader(readOnlySeekableStream);
        XPathCollection xPathCollection = new XPathCollection();
        XPathReader xPathReader = new XPathReader(xmlTextReader, xPathCollection);
        xPathCollection.Add(xPath);
        bool ok = false;
        while (xPathReader.ReadUntilMatch())
        {
            if (xPathReader.Match(0) && !ok)
            {
                propertyValue = xPathReader.ReadString();
                messageContext.Promote(propertyName, propertyNamespace, propertyValue);
                ok = true;
            }
        }
        readOnlySeekableStream.Position = 0;
        bodyPart.Data = readOnlySeekableStream;
    }
    catch (Exception ex)
    {
        if (message != null)
        {
            message.SetErrorInfo(ex);
        }
        ...
        throw; // rethrow, preserving the original stack trace
    }
    return message;
}

We used this example as a base when developing something very similar in a recent project. At first everything worked fine, but after a while we started getting an error saying:

> Cannot access a disposed object. Object name: DataReader

It took us a while to figure out the real problem here; everything worked fine when sending in simple messages, but as soon as we used the code in a pipeline where we also debatched messages, we got the “disposed object” problem.

It turns out that when we debatched messages, the Execute method of the custom pipeline ran multiple times, once for each sub-message. This forced the .NET Garbage Collector to run.

The GC found the XmlTextReader that we used to read the stream to be unreferenced and decided to destroy it.

The problem is that this also disposes the readOnlySeekableStream that we connected to our message data object!


It’s then the BizTalk End Point Manager (EPM) that throws the error as it hits a disposed stream object when trying to read the message body and save it to the BizTalkMsgBox!

ResourceTracker to the rescue!

It turns out that the BizTalk message context object has a nice little class connected to it called the ResourceTracker. This object has an “AddResource” method that makes it possible to add an object that the context will then hold a reference to – and this tells the GC not to dispose it!

So after adding the line below before the end of the Execute method, everything works fine - even when debatching messages!

context.ResourceTracker.AddResource(xmlTextReader);
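In the context of the Execute method from the sample above, the call goes in just before the message is returned. A sketch of the method’s tail:

readOnlySeekableStream.Position = 0;
bodyPart.Data = readOnlySeekableStream;

// Tie the reader's lifetime to the message context so the GC
// can't dispose the stream chain before EPM has read the body
context.ResourceTracker.AddResource(xmlTextReader);

return message;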

Using XSLT 1.0 to summarize a node-set with comma separated values


Pure XSLT is very powerful, but it definitely has its weaknesses (I’ve previously written about how to extend XSLT used in BizTalk mapping here) … One of them is handling numbers that use a different decimal separator than a point (“.”).

Take for example the XML below

<Prices>
  <Price>10,1</Price>
  <Price>10,2</Price>
  <Price>10,3</Price>
</Prices>

Just using the XSLT sum function on these values will give us a “NaN” value. To solve it we’ll have to use recursion, as in the sample below.

The sample selects the node-set to summarize and sends it to the “SummarizePrice” template. The template adds the value of the first Price tag after transforming the comma to a point. It then checks if this is the last value, and if not, it recursively calls into itself with the next value, adding to the total amount until it reaches the last value of the node-set.

<xsl:template match="Prices">
  <xsl:call-template name="SummurizePrice">
    <xsl:with-param name="nodes" select="Price" />
  </xsl:call-template>
</xsl:template>

<xsl:template name="SummurizePrice">
  <xsl:param name="index" select="1" />
  <xsl:param name="nodes" />
  <xsl:param name="totalPrice" select="0" />

  <xsl:variable name="currentPrice" select="translate($nodes[$index], ',', '.')"/>

  <xsl:choose>
    <xsl:when test="$index=count($nodes)">
      <xsl:value-of select="$totalPrice + $currentPrice"/>
    </xsl:when>
    <xsl:otherwise>
      <xsl:call-template name="SummurizePrice">
        <xsl:with-param name="index" select="$index + 1" />
        <xsl:with-param name="totalPrice" select="$totalPrice + $currentPrice" />
        <xsl:with-param name="nodes" select="$nodes" />
      </xsl:call-template>
    </xsl:otherwise>
  </xsl:choose>

</xsl:template>

Simple, but a bit messy – and nice to have around for future cut and paste ;) (For the sample Prices document above, the output is the sum 30.6 – give or take the usual XPath floating point rounding.)

Is there a bug in BizTalk 2006 R2 SP1?

> **Update 2012-03-28:** This bug was fixed in the CU2 update for BizTalk 2006 R2 SP1. Further details can be found [here](http://support.microsoft.com/kb/2211420) and [here](http://support.microsoft.com/kb/983185).
>
> **Update 2010-04-12:** Seems like there is a patch coming that should fix all the bugs in SP1 … I’ve been told it should be public within a week or two. I’ll make sure to update the post as I know more. Our problem is still unsolved.

Late Thursday night last week we decided to upgrade one of our largest BizTalk 2006 R2 environments to the recently released Service Pack 1. The installation went fine and everything looked good.

… But after a while we started to see loads of error messages looking like below.

> Unable to cast COM object of type 'System.__ComObject' to interface type 'Microsoft.BizTalk.PipelineOM.IInterceptor'. This operation failed because the QueryInterface call on the COM component for the interface with IID '{24394515-91A3-4CF7-96A6-0891C6FB1360}' failed due to the following error: Interface not registered (Exception from HRESULT: 0x80040155).

After lots of investigation we found out that we got the errors on ports with the following criteria:

  • Send port

  • Uses the SQL Server adapter

  • Has a mapping on the port

  • Has a BAM tracking profile associated with the port
    In our environment the tracking on the port is on “SendDateTime” from the “Messaging Property Schema”. We haven’t looked further into whether just any BAM tracking associated with the port causes the error, or if it only has to do with some specific properties.

Reproduce it to prove it!

I’ve set up a really simple sample solution to reproduce the problem. Download it here.

The sample receives an XML file and maps it on the send port to a schema made to match the stored procedure. It also uses a dead simple tracking definition and profile to track a milestone on the send port.

Sample solution installation instructions

  1. Create a database called “Test”.

  2. Run the two SQL scripts (“TBL_CreateIds.sql” and “SP_CreateAddID.sql”) in the solution to create the necessary table and stored procedure.

  3. Deploy the BizTalk solution using a simple deploy from Visual Studio.

  4. Apply the binding file (“Binding.xml”) found in the solution.

  5. Run the BM.exe tool to deploy the BAM tracking definition. Should look something like:

     bm.exe deploy-all -definitionfile:\BAM\SimpleTestTrackingDefinition.xml

  6. Start the Tracking Profile Editor, open the “SimpleTestTrackingProfile.btt” that you’ll find in the solution, and apply the profile.

  7. Drop the test file (“InSchema_output.xml”) in the receive folder.

The sample solution fails in an environment with SP1 but works just fine in a “clean” BizTalk 2006 R2 environment.

What about you?

I haven’t had time to test this on a BizTalk 2009 environment but I’ll update the post as soon as I get around to it.

We also currently have a support case with Microsoft on this, and I’ll make sure to let you know as soon as something comes out of that. But until then I’d be really grateful to hear from you if any of you see the same behavior in your BizTalk 2006 R2 SP1 environment.

Use code blocks to extend your BizTalk custom XSLT maps

**Update 2010-04-13:** Grant Samuels commented and made me aware of the fact that inline scripts might in some cases cause memory leaks. He has some further information [here](http://linderalex.blogspot.com/2008/06/memory-leak-using-biztalk-mapper.html) and you’ll find a KB article [here](http://support.microsoft.com/kb/918643).

I’ve posted a few times before on how powerful I think it is, in complex mappings, to be able to replace the BizTalk Mapper with a custom XSLT script (here’s how to). The BizTalk Mapper is nice and productive in simpler scenarios, but in my experience it breaks down in more complex ones and maintaining a good overview is hard. I’m however looking forward to the new version of the tool in BizTalk 2010 – but until then I’m using custom XSLT when things get complicated.

Custom XSLT however lacks a few things one has gotten used to having - such as scripting blocks, clever functoids etc. In some previous posts (here and here) I’ve talked about using EXSLT as a way to extend the capabilities of custom XSLT when used in BizTalk.

Bye, bye external libraries – heeeello inline scripts ;)

Another way to achieve much of the same functionality, even more easily, is to use the embedded scripting supported by the XslTransform class. Using a script block in XSLT is easy, and it is also the way the BizTalk Mapper makes it possible to include C# snippets right in your maps.

Have a look at the following XSLT sample:

<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet
    version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
    xmlns:msxsl="urn:schemas-microsoft-com:xslt"
    xmlns:code="http://richardhallgren.com/Sample/XsltCode"
    exclude-result-prefixes="msxsl code">
    <xsl:output method="xml" indent="yes"/>

    <xsl:template match="@* | node()">
        <Test>
            <UniqueNumber>
                <xsl:value-of select="code:GetUniqueId()" />
            </UniqueNumber>
            <SpecialDateFormat>
                <xsl:value-of select="code:GetInternationalDateFormat('11/16/2003')" />
            </SpecialDateFormat>
            <IncludesBizTalk>
                <xsl:value-of select="code:IncludesSpecialWord('This is a text with BizTalk in it', 'BizTalk')" />
            </IncludesBizTalk>
        </Test>
    </xsl:template>

    <msxsl:script language="CSharp" implements-prefix="code">
        //Gets a unique id based on a guid
        public string GetUniqueId()
        {
            return Guid.NewGuid().ToString();
        }

        //Formats US based dates to standard international
        public string GetInternationalDateFormat(String date)
        {
            return DateTime.Parse(date, new System.Globalization.CultureInfo("en-US")).ToString("yyyy-MM-dd");
        }

        //Use regular expression to look for a pattern in a string
        public bool IncludesSpecialWord(String s, String pattern)
        {
            Regex rx = new Regex(pattern);
            return rx.Match(s).Success;
        }
    </msxsl:script>
</xsl:stylesheet>

All one has to do is define a code block, reference the XML namespace used, and start coding! Say goodbye to all those external library dlls!

It’s possible to use a few core namespaces without the full .NET namespace path, but all namespaces are available as long as they are fully qualified. MSDN has a great page with all the details here.
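A side note if you want to test a stylesheet like this outside BizTalk: the newer XslCompiledTransform class disables script blocks by default, so they have to be explicitly enabled. A minimal test harness (the file names are just examples):

using System.Xml;
using System.Xml.Xsl;

class RunScriptedXslt
{
    static void Main()
    {
        // Script blocks are off by default in XslCompiledTransform -
        // the second argument to XsltSettings turns them on
        var settings = new XsltSettings(false /*document()*/, true /*scripts*/);

        var transform = new XslCompiledTransform();
        transform.Load("SampleWithScript.xslt", settings, new XmlUrlResolver());
        transform.Transform("Input.xml", "Output.xml");
    }
}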


Promote properties in an EDI schema using the EDI Disassembler


I’ve been doing a lot of EDI related work in BizTalk lately, and I have to say that I’ve really enjoyed it! EDI takes a while to get used to (see the example below), but once one starts to understand it, it turns out to be a really nice, strict standard - with some cool features built into BizTalk!

UNB+IATB:1+6XPPC+LHPPC+940101:0950+1'
UNH+1+PAORES:93:1:IA'
MSG+1:45'
IFT+3+XYZCOMPANY AVAILABILITY'
ERC+A7V:1:AMD'
IFT+3+NO MORE FLIGHTS'
ODI'
TVL+240493:1000::1220+FRA+JFK+DL+400+C'
...

There are however some things that don’t work as expected …

Promoting values

According to the MSDN documentation, the EDI Disassembler by default promotes the following EDI fields: UNB2.1, UNB2.3, UNB3.1, UNB11; UNG1, UNG2.1, UNG3.1; UNH2.1, UNH2.2, UNH2.3.

There are however situations where one would like other values promoted.

In my case I wanted the C002/1001 value in the BGM segment. This is a value identifying the purpose of the document, and I needed to route the incoming message based on it.

The short version is that creating a property schema, promoting the field in the schema and having the EDI Disassembler promote the value will not work (as it does with the XML Disassembler). To do this you’ll need a custom pipeline component to promote the value. Rikard Alard seems to have come to the same conclusion here.

Promote pipeline component to use

If you don’t want to spend time writing your own pipeline component for this, you can find a nice “promote component” on CodePlex here, by Jan Eliasen.
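For reference, the core of what such a promote component does is small. Below is a minimal sketch (not the CodePlex component itself) that loads the whole message body into an XmlDocument – fine for reasonably sized messages. The XPath, property name and namespace are made-up examples, and the namespace must match a deployed property schema:

using System.IO;
using System.Xml;
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

public class PromoteBgmValueComponent
{
    // NOTE: a real component also needs the IBaseComponent, IComponentUI and
    // IPersistPropertyBag plumbing - omitted here for brevity
    public IBaseMessage Execute(IPipelineContext context, IBaseMessage message)
    {
        // Load the whole body - simple, but not streaming-friendly
        var document = new XmlDocument();
        document.Load(message.BodyPart.GetOriginalDataStream());

        // Hypothetical XPath picking the document purpose out of the BGM
        // segment - adjust to the actual XML representation of your schema
        XmlNode node = document.SelectSingleNode("//*[local-name()='BGM']/*[1]");
        if (node != null)
        {
            // The namespace must match a deployed property schema
            message.Context.Promote("DocumentPurpose",
                "http://example.com/schemas/edi-properties", node.InnerText);
        }

        // Put back a fresh, rewound stream and keep it alive for the
        // lifetime of the message (see the ResourceTracker post above)
        var buffer = new MemoryStream();
        document.Save(buffer);
        buffer.Position = 0;
        message.BodyPart.Data = buffer;
        context.ResourceTracker.AddResource(buffer);

        return message;
    }
}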

If you however expect to receive lots and lots of big messages, you might want to look into changing the component to use XPathReader and the custom stream implementations in Microsoft.BizTalk.Streaming.dll. You can find more detailed information on how to do that in this MSDN article.

End-to-end tracking using BAM services and the BAM Service Generator tool


An integration between systems can often be viewed as a chain of systems and services working together to move a message to its final destination.


One problem with this loosely-coupled way of dealing with message transfers is that it’s hard to see if and where something has gone wrong. Usually all we know is that the sending system has sent the message, but the final system never received it. The problem could then be within any of the components in between. Tracking messages within BizTalk is one thing, but achieving end-to-end tracking within the whole “chain” is much harder.

One way of solving the end-to-end tracking problem is to use BAM. BAM is optimized for these kinds of scenarios, where we might have to deal with huge amounts of messages and data – several of the potential problems around these issues are solved out-of-the-box (write optimized tables, partitioning of tables, aggregated views, archiving jobs and so on).

And even though BAM is a product that in theory isn’t tied to BizTalk, it’s still easier to use in the context of BizTalk than outside of it, due to tools like the BizTalk Tracking Profile Editor and the built-in BAM interceptor patterns within BizTalk. It is however fully possible to track BAM data and take advantage of the BAM infrastructure even outside of BizTalk, using the BAM API.

The BAM API is a .NET based API used for writing tracking data to the BAM infrastructure from any .NET based application. There are however a few issues with this approach.

The application sending the tracking data has to be .NET based and has to use the loose, string based API. Writing to the “ApprovedInvoices” activity, for example, would look something like this – lots of untyped strings.

public void Log(ApprovedInvoicesServiceType value) 
{ 
    string approvedInvoicesServiceActivityID = Guid.NewGuid().ToString();

    var es = new DirectEventStream("Integrated Security=SSPI;Data Source=.;Initial Catalog=BAMPrimaryImport", 1);

    es.BeginActivity("ApprovedInvoices", approvedInvoicesServiceActivityID);

    es.UpdateActivity("ApprovedInvoices", approvedInvoicesServiceActivityID, 
        "ApprovedDate", "2011-01-18 12:02", 
        "ApprovedBy", "Richard", 
        "Amount", 122.34, 
        "InvoiceId", "Invoic123");

    es.EndActivity("ApprovedInvoices", approvedInvoicesServiceActivityID); 
}

One way of getting around the problem with the usage of strings is to use the “Generate Typed BAM API” tool. The tool reads the BAM definition file and generates a dll containing strong types that correspond to the fields in the definition. By referencing the dll in the sending application we get a strongly typed .NET call. The fact remains, however, that this still requires a .NET based application.

The obvious way to solve the limitation of a .NET based client - and to get one step closer to an end-to-end tracking scenario - is of course to wrap the call to the API in a service.

BAM Service Generator

BAM Service Generator is very similar to the Generate Typed BAM API tool mentioned above, with the difference that instead of generating a .NET dll it generates a WCF service for each activity.

c:\Tools\bmsrvgen.exe /help 

BAM Service Generator Version 1.0 

Generates a WCF service based on a BizTalk BAM definition file.

 -defintionfile: Sets path to BAM definition file.
 -output: Sets path to output folder.
 -namespace: Sets namespace to use.

Example: bmsrvgen.exe -defintionfile:c:\MyFiles\MyActivityDef.xml 
-output:c:\tempservices -nampespace:MyCompanyNamespace

The BAM Service Generator is a command line tool that will read the BAM definition file and generate a compiled .NET 4.0 WCF service. The service is configured with a default basicHttp endpoint and is ready to go straight into AppFabric or similar hosting.


This service-approach makes it possible to take advantage of the BAM infrastructure from all different types of systems, even in cases where they aren’t behind the same firewall! As shown in the figure below, this could take us one step closer to the end-to-end tracking scenario.

[Figure: end-to-end tracking using BAM services]

“BAM Service Generator” is open-source and can be found here.

Update to BAM Service Generator


I have previously written about how to generate a typed set of WCF services to achieve end-to-end BAM based tracking – making BAM logging possible from outside potential firewalls and/or from non .NET based clients.

In that post I referenced an open source tool that generates the services based on the BAM definition files, making typed BAM API calls possible. I called the tool BAM Service Generator and it’s published on CodePlex.

The first version of the tool only made it possible to make simple activity tracking calls, even missing the option to define a custom activity id. All advanced scenarios like continuation, activity and data references, or even updating an activity, weren’t possible. That has now been fixed, and the second version of the tool enables the following operations.

  • LogCompleteActivity
    Creates a simple logging line and is the most efficient option if one only wants to begin, update and end an activity logging as soon as possible.

  • BeginActivity
    Starts a new activity with a custom activity id.

  • UpdateActivity
    Updates an already started activity and accepts a typed object as a parameter. The typed object is of course based on what has been defined in the BAM definition file. Multiple updates can be made until EndActivity is called.

  • EnableContinuation
    Enables continuation. This is used if you have multiple activity loggings that belong together but cannot use the same activity id. It enables you to tie those related activities together using some other id (could for example be the invoice id).

  • AddActivityReference
    Adds a reference to another activity and enables you to, for example, display a link or read that whole activity when reading the main one.

  • AddReference
    Adds a custom reference to something that isn’t necessarily an activity. Could for example be a link to a document of some sort. Limited to 1024 characters.

  • AddLongReference
    Similar to AddReference but also enables you to store a blob of data using the LongReferenceData parameter – limited to 512 KB.

  • EndActivity
    Ends a started activity.

Let me know if you think that something is missing, if something isn’t working as you would expect, etc.

Using BizTalk Web Documenter as a new way of documenting your BizTalk solutions


**Update 2012-09-30:** I’ve renamed the whole project and moved it to GitHub. Hope to see you there!

As most BizTalk developers/architects, you have probably been in a situation where documentation of your integration solution has been discussed. And you have probably been just as happy as me when you first found BizTalk Documenter. BizTalk Documenter is a great tool for extracting all that configuration data that is so vital to BizTalk and creating a nice technical documentation from it.

There are of course several levels of documentation in a BizTalk Server based solution, where the technical information is at the lowest level. Above that you would have your integration process descriptions, and then possibly another level above that where you fit the integration processes into bigger business processes, and so on – but let’s stick to the technical documentation for now.

BizTalk Documenter does a nice job of automating the creation of technical documentation, instead of basically having to do all that configuration work a second time when documenting. Doing technical documentation by hand usually also breaks down after a while, as solution setup and configuration change a lot and the work is really time consuming. And as documentation that isn’t up to date frankly is completely worthless, people soon lose interest in both reading and maintaining it.

Using BizTalk Documenter however has a few problems …

  • The tool generates either a CHM file or a Word file with the documentation. As more and more organizations move their documentation online, new output formats are needed.

  • The latest 3.4 version of the tool throws OutOfMemoryExceptions on bigger solutions.

BizTalk Web Documenter

As we started trying to find ways to solve the issues above we had a few ideas we wanted the new solution to handle better.

  • Web based documentation
    A web based documentation is both much easier to distribute and more accessible to read – sending a link and reading a web page rather than sending around a CHM file.

  • Easy way of seeing changes between versions
    Sometimes one wants to see how things have changed and how things used to be configured. The tool should support “jumping back in time” to see previous versions.


Check out an example of a Config Explorer yourself.

Getting started

Config Explorer is an open source tool published on GitHub - a good place to start understanding the workflow of creating documentation and to download the tool itself.

Contribute!

This is a 1.0 release and there are many features we haven’t had time for, but a few of them are planned for future releases. Unfortunately there will however also be bugs and obvious things we’ve missed. Please use the GitHub Issue Tracker to let us know what you’d like us to add and what possible issues you’ve found.

I’ll of course also check for any comments here ;)

Reborn as BizTalk Web Documenter!


About a year ago I published a project that helps automate documentation for BizTalk Server – much like BizTalk Documenter, but with the difference that this tool instead generates a dynamic web site. I called it Config Explorer, a name I knew was sh*t from the start but the best I could come up with at the time.

After a while, when enough people had told me that they also thought the name was crap, I decided to change it to BizTalk Web Documenter. I’ve also moved the whole project to GitHub.

I'll try and spend a bit more time on the project than I have the last year and will update the documentation, wiki and roadmap as well as implement a few new features.

I hope you’ll help me – fork the project today!

Presenting on “Efficient system integration documentation” at BizTalk User Group Sweden


I’ve talked a lot about efficient documentation previously on this blog, both when it comes to splitting the documentation up into different layers, as here, and in relation to the tools BizTalk Documenter and BizTalk Web Documenter for automatically generating technical documentation, as here.

Last week I had the pleasure of presenting at the BizTalk User Group Sweden meeting and got to talk about documentation for a whole hour :). You can find the PPT from the meeting here.

I ended the presentation by presenting the ten commandments for efficient system integration documentation:

I. Thou shall not manually document anything that can be automatized!

II. Thou shall keep it simple & make it look nice

III. Thou shall use a wiki based platform

IV. Thou shall use pictures whenever appropriate

V. Thou shall have well defined guidelines for your documentation

VI. Thou shall have a well defined target audience for your documentation

VII. Thou shall document continuously in your project

VIII. Thou shall have a common vocabulary and common icons defined

IX. Thou shall test your documentation with target audience

X. Thou as the developer of an integration should document it

Export BizTalk Server MSI packages directly from Visual Studio using BtsMsiTask


Getting a full Continuous Integration (CI) process working with BizTalk Server is hard!

One of the big advantages of a working CI process is to always have tested and verified artifacts from the build server to deploy into test and production. Packaging these build resources into a deployable unit is however notoriously hard in BizTalk Server, as a Visual Studio build will not provide a deployable artifact (only raw dlls). The only way to get a deployable MSI package for BizTalk Server has been to first install everything into the server and then export – until now.

Why Continuous Integration?

Continuous Integration is a concept first described by Martin Fowler back in 2006. At its core it’s about team communication and fast feedback, but it also often leads to better quality software and more efficient processes.

[Figure: a typical CI process]

A CI process usually works something like the above picture.

  1. A developer checks in code to the source control server.

  2. The build server detects that a check in has occurred, gets all the new code and initiates a new build while also running all the relevant unit tests.

  3. The result from the build and the tests is sent back to the team of developers and provides them with an up to date view of the “health” of the project.

  4. If the build and all the tests are successful, the built and tested resources are written to a deploy area.

As one can see, the CI build server acts as another developer on the team, but always builds everything on a fresh machine and bases everything on what is actually checked in to source control – guaranteeing that nothing is built using artifacts that for some reason are not in source control, or that some special setting is required to achieve a successful build.

In step 4 above the CI server also writes everything to a deploy area. A golden rule in a CI workflow is to use the artifacts and packages from this area for further deployment to test and production environments – and never to build and move artifacts directly from developer machines!
As all resources from each successful build are stored safely and labeled, one automatically gets versioning and the possibility to roll back to previous versions and packages if needed.

What is the problem with CI and BizTalk?

It is important to have the build and feedback process as efficient as possible, to enable frequent check-ins and to catch possible errors and mistakes directly. As mentioned, it is equally important that the resources written to the deploy area are the ones used to deploy to test and production, so that one gets all the advantages of versioning, roll back possibilities and so on.

The problem with BizTalk Server is however that just building a project in Visual Studio does not give us a deployable package (only raw dlls)!

There are a number of different ways to get around this. One popular option is to automate the whole installation of the dlls generated in the build. This not only requires a whole lot of scripting and work, it also requires a full BizTalk Server installation on the build server. The automated installation process also takes time and slows down the feedback loop back to the development team. There are however great frameworks, such as the BizTalk Deployment Framework, to help with this (this solution of course also enables integration testing using BizUnit and other frameworks).

Some people would also argue that the whole script package and the raw dlls could be moved onto test and production and be viewed as a deployment package. But MSI is a powerful packaging tool and BizTalk Server has a number of specialized features around MSI. As MSI is also simple and flexible, it is usually the solution preferred by IT operations.

A final possibility is of course to add the resources one by one directly, using the BizTalk Server Administration console. In more complex solutions this however takes time and requires deeper knowledge of the solution, as one manually has to know in what order the different resources should be added.

Another option in BtsMsiTask

Another option is to use BtsMsiTask to generate a BizTalk Server MSI directly from the Visual Studio build using MsBuild.


BtsMsiTask uses the same approach and tools as the MSI export process implemented in BizTalk Server, but extracts it into an MsBuild task that can be executed directly as part of the build process.

BtsMsiTask enables the CI server to generate a deployable MSI package directly from the Visual Studio based build without having to first install into BizTalk Server!
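For the build server, the packaging step then becomes just another MsBuild invocation, along the lines of the command below (the project and target names are hypothetical):

msbuild BizTalkBuild.proj /t:Build;CreateMsi /p:Configuration=Release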


Talking Application Lifecycle Management and BizTalk in Gothenburg


Back in February my colleague Robin and I did a presentation on ALM and BizTalk for the BizTalk User Group in Stockholm. In the two hour presentation we talked about things like:

  • Pros and cons of distributed version control systems and BizTalk (more specifically Git).

  • Identifying the right level of automated testing – what are the differences between integration and unit tests, and how and what should one use when, in a BizTalk context?

  • Using NuGet and BizTalk to handle dependencies and packages. How can we use the NuGet infrastructure to handle all BizTalk dependencies and also distribute artifacts like pipeline components within our company and teams?

  • How, in our opinion, Continuous Integration and Delivery are best handled when developing with BizTalk. In the presentation we look at everything from build servers to how to handle BizTalk automated builds and packaging with a minimal amount of work.

So, now we are doing the presentation all over again for the newly started BizTalk User Group in Gothenburg on the 24th of March. The presentation will be in Swedish and packed with demos.

Hope to see you there!

Why full NuGet support for BizTalk projects is important!


Let’s start with a summary for those who don’t feel like reading the full post.

Using NuGet to handle BizTalk dependencies for shared schemas, pipeline components and so on works fine today.

However, as .btproj files aren’t supported by NuGet (as shown in this pull request) and are not in the current white list of allowed project types, Package Restore will not work (issue closed as by design here).

Not having Package Restore is of course a problem, as one is then forced to check in all packages as part of the solutions, something that in the end leads to bloated and messy solutions.

So please reach out to your Microsoft contacts and let’s get this fixed!

NuGet

As most people know, NuGet is the package management solution from Microsoft for .NET. It started off as an initiative to further boost open source within the .NET community, and NuGet packages uploaded to the NuGet.org platform are open and available directly within Visual Studio through the NuGet add-in. Currently there are well over 20 000 open packages for everyone to download and use.

Lately there have however been lots of discussions within the community about using NuGet as a package manager for internal shared resources as well (by Hanselman and others). Solutions like MyGet allow for private NuGet feeds – only available to those within your organization, but still leveraging all the ease and control offered by NuGet.

Using NuGet for references has a number of advantages:

  • Communication
    All available resources are directly visible in Visual Studio, and when an update to a used library is available a notification is shown. No more spam mails about changes and never-read lists of available libraries.

  • Versioning
    A NuGet package has its own versioning. This is useful as it isn’t always optimal to change the dll version, but by using the NuGet package version one can still indicate that something has changed.
    As you also reference a specific version of a NuGet package from your solution, you always have full control of exactly what version you’re targeting and where to find the built and ready bits.

  • Efficiency
    When starting to work on a project with many references, one first has to get the source code of the references from source control and build these (hopefully in the right version … hopefully you have your tags and labels in order …) until all the broken references are fixed.
    With NuGet references this just works straight away, and you can be sure you get the right version, as the resource isn’t the latest from source control but the actual built dlls that are part of the referenced NuGet package.

NuGet Feeds

As mentioned, NuGet feeds can be public or private. A NuGet feed is basically an RSS feed of available resources and their versions. A feed and a NuGet server can be a hosted web based solution, or something as simple as a folder that you write your NuGet packages to. The NuGet documentation covers these options in depth.
The point being: creating your own private NuGet feed is very simple!
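For example, registering a plain file share as a feed is a one-liner (the share path is a made-up example):

nuget sources add -Name "InternalPackages" -Source \\fileserver\nuget\packages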

So if you haven’t already realized it by now - NuGet is not only a great tool for managing public external dependencies, it can add a lot of value for internal references as well.

A couple of relevant NuGet features

  • NuGet Package Restore
    NuGet Package Restore enables NuGet to download the referenced packages from the package area. The goal is to avoid having to check in the actual references in source control, as this would bloat the version control system and in the end create a messy solution.

  • NuGet Specification (nuspec) metadata token replacements
    All packages are based on a nuspec file that dictates the version, package description and other meta information.
    By using replacement tokens (such as $version$) NuGet can read some of this information from the AssemblyInfo files.
    This is far from a critical feature, but it’s nice to have as it avoids having to repeat oneself and keep the same information in a number of places.
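A minimal nuspec using token replacement could look like the sketch below; $id$, $version$, $author$ and $description$ are replaced with values from the project and its AssemblyInfo when running nuget pack against the project file:

<?xml version="1.0"?>
<package>
  <metadata>
    <!-- Tokens are replaced from the project and AssemblyInfo at pack time -->
    <id>$id$</id>
    <version>$version$</version>
    <authors>$author$</authors>
    <description>$description$</description>
  </metadata>
</package>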

BizTalk and NuGet?

A typical BizTalk solution has a number of shared resources such as shared schemas, shared libraries and pipeline components. As these resources usually are shared between a number of projects they often live in a separate development cycle. So when opening a BizTalk project with such resources it’s not only a lot of work getting the referenced code and building the references, there’s also the nagging feeling that the references might not be in the right version and that the source might have changed since the reference was first added.

Another reference issue occurs when using a build server for a solution with references. As the solution has a dependency on the referenced projects, one has to make sure not only that the main solution is fetched to the build workarea by the build server, but also that all the referenced projects are fetched from version control – and again, hopefully in the intended version …
This kind of works using TFS Build Service and common TFS Source Control. If however one’s using Git and has resources in separate repositories this becomes impossible, as TFS Build Service currently only supports fetching a single repository per build definition to the build workarea …
(This issue does not apply for TeamCity that has a number of different options for dependency management)

All these issues are actually solved when using NuGet references instead of traditional references, as we can be sure we’re getting the built dlls that are part of the NuGet package in the version we referenced – not the latest checked-in version.
A NuGet reference also makes things a bit easier when it comes to managing the workarea for the TFS Build Service, as one only has to make sure the NuGet package is available (either checked in as part of the solution or by using Package Restore).
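As a sketch of what Package Restore looks like from the command line (NuGet 2.7+) for supported project types, the build server only needs a restore step before building the solution (solution name is an example):

nuget.exe restore MyBizTalkSolution.sln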

But …

NuGet doesn’t support BizTalk projects!

As discussed here, NuGet currently doesn't support .btproj files. As a BizTalk project file is basically a standard .NET project file with some extra information, a two line change to the NuGet white list is all that's needed, as in this pull request.


So the main issue is that without full support for .btproj files Package Restore won't work, and we're for now forced to check in all the NuGet packages as part of our solutions.
Another minor issue is that the token replacement feature doesn't work either. I also think that if we could actually get full BizTalk support we'd see more BizTalk specific open shared packages for things like useful libraries and pipeline components.

Call to action: Reach out to the NuGet team or other Microsoft connections and let's get those two lines added to the white list!

Getting digital certificates right in BizTalk using a third party root certificate in combination with client certificate security and WCF BasicHttp


Digital certificates and asymmetric security are notoriously hard to get right in a Windows environment. Getting it right in a BizTalk context isn't exactly any easier.

In this scenario BizTalk Server acts as a client and communicates with a service over HTTPS. The service also requires a client certificate for client authentication.


Long story short

Third party root certificates always need to be placed under "Third-Party Root Certification Authorities" or directly under the "Trusted Root Certification Authorities" folder on the Local Machine level in Windows. When however also configuring the "WCF-BasicHttp" adapter to use client certificate authentication, the BizTalk Administration Console requires the thumbprint id of a specific server certificate (in addition to the client certificate thumbprint). This makes the runtime also look for the public certificate in the "Other People" folder, and causes an error if we don't place it in that folder as well.

In the end this requires us to add the public root certificate in two different places.

Server certificate

Let's start by getting the server certificate right.

After configuring everything in BizTalk using a standard WCF-BasicHttp port and selecting Transport security I encountered the following error message.

A message sent to adapter "WCF-BasicHttp" on send port "SP1" with URI "https://skattjakt.cloudapp.net/Service1.svc" is suspended.

Error details: System.ServiceModel.Security.SecurityNegotiationException: Could not establish trust relationship for the SSL/TLS secure channel with authority 'skattjakt.cloudapp.net'. ---> System.Net.WebException: The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel. ---> System.Security.Authentication.AuthenticationException: The remote certificate is invalid according to the validation procedure.

The error message is pretty straightforward: Could not establish trust relationship for the SSL/TLS secure channel with authority.

The first thing that happens when trying to establish an SSL channel is that the public server certificate is sent down to the client, for the client to use when encrypting further messages to the server. This certificate is validated to ensure that it hasn't been revoked, that its Valid to date hasn't passed, that the Issued to name actually matches the service's domain and so on.

But to be able to trust the information in the certificate it needs to be issued by someone we trust, a certificate authority (CA).

Take a request to Google as an example: we don't actually trust the information in the Google server certificate itself, nor do we trust the intermediate certificate used to sign the public server certificate. The root certificate that issued the intermediate Google certificate is however one of the preinstalled trusted certificate authorities in Windows.


Which authorities and certificates to trust is, in Windows, based on the certificates in the Certificate Store under the Trusted Root Certification Authorities folder.


In our case the service didn't use a certificate from one of the trusted authorities, but had based its certificate on a root certificate they had created themselves.


Further, the Certificate Manager in Windows has three different levels: "Local Machine", "Service" and "Current User". The top level is "Local Machine", and certificates added on this level are available to all users. "Service" and "Current User" are more specific and only available to specific services and users. From a BizTalk perspective it's important to place the certificate so that it's accessible to the user running the BizTalk host instance.

So after requesting the root certificate in use and placing it in the trusted authorities folder on the Local Machine level, we're able to successfully establish an SSL session!

Client certificate

As the server however required a client certificate for authentication, I reconfigured the send port to use Certificate as client credential type.

The BizTalk Administration Console then requires one to enter the thumbprint of the private client certificate to use. When browsing for the client certificate, the console looks for certificates in the "Personal" folder on the "Current User" level. So for the certificate to show up, one has to add the client certificate to the "Personal" folder of the user that will eventually hit the browse button in the console; adding it only to the "Personal" folder of "Local Machine" will not make it show up in the console. As the "Current User" level is separate for each user, it's also very important to add the certificate to the "Personal" folder of the user that will eventually run the BizTalk process, as this user otherwise won't find the certificate at runtime. In this case just pasting the thumbprint id from the certificate will work fine.

When selecting Certificate as client credential type, the BizTalk Administration Console also requires one to pick which public server certificate to use – even though we just want to validate against the same root certificate we added to the trusted store on machine level ..? When locating server certificates to display, the console looks in the "Other People" folder on the "Local Computer" level, so for our root certificate to show up in the console we also have to add it to this folder. It turns out that when a specific server certificate has been pinpointed, the BizTalk runtime will throw an error if that certificate is not placed in the "Other People" folder. Likewise, an error will be thrown if the certificate is placed only in one of the trusted authorities folders.

A message sent to adapter "WCF-BasicHttp" on send port "SP1" with URI "https://skattjakt.cloudapp.net:444/Service42.svc" is suspended. Error details: System.InvalidOperationException: Cannot find the X.509 certificate using the following search criteria: StoreName 'AddressBook', StoreLocation 'LocalMachine', FindType 'FindByThumbprint', FindValue '70A9899E6CF89B014E6195ADE6E1BA12BEA58728'.

So in this case we need to add the public CA certificate in two different places for the communication to work.
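Assuming the root certificate is available as rootca.cer, a sketch of the two imports using certutil from an elevated prompt could look like this ("Root" is the Trusted Root Certification Authorities store, and "AddressBook" is the system name of the Other People store, as seen in the error message above):

certutil -addstore Root rootca.cer
certutil -addstore AddressBook rootca.cer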

Frankly I don't see the point of having to point out a server certificate at all in this case – all I want is to configure which client certificate to use for authentication, and have the runtime validate the server certificate against the CAs I have in the trusted folders.

Calling an on premise Microsoft Dynamics CRM using BizTalk and Active Directory Federated Security (ADFS)


This post was originally posted here.

Federated security is a great way of accomplishing single-sign-on (SSO) security for your applications. It’s also a technique that is becoming increasingly relevant as things move to the cloud and we’re starting to see more hybrid situations, with some applications on premise and some in the cloud.

Active Directory Federation Services (ADFS) is an implementation of federated security and is used by a number of Microsoft applications, Microsoft Dynamics CRM being one of them.

Windows Communication Foundation (WCF) has a few techniques to simplify federated security communication and this post will show an example of using Microsoft BizTalk Server and WCF to communicate with an ADFS secured CRM installation.

What is federated security?

Federated security at its core is pretty simple. In our scenario BizTalk Server (the client) wants to log in and authenticate itself with the CRM system.

Traditionally the CRM system would then have to manage the credentials for the client and verify them at login. There are a number of drawbacks to this, the main one being that it doesn’t scale well when many separate systems need access to each other, as login information and login logic get spread out across a number of systems.

When using federated security each party instead chooses to trust a common party (in this case ADFS and AD), and as long as someone provides a token that can be validated by the trusted party, the CRM system will trust that the caller has already been authenticated and that everything is ok.

Data flow

  1. Authentication and requesting token
  2. Authentication against AD
  3. Login authenticated
  4. ADFS token response
  5. ADFS token based authentication
  6. Response from CRM

So basically a federated security model allows for separating all authentication and authorization out to a separate system.

As mentioned, ADFS is just an implementation of federated security where Active Directory acts as the main repository with a Security Token Service implementation on top of it.

BizTalk and ADFS

As BizTalk has great WCF support we can use the WCF stack to handle all of the communication with ADFS and CRM. But it does involve a fair bit of configuration. BizTalk and Visual Studio will help in most mainstream WCF scenarios, where one can point Visual Studio at the WSDL location and a basic binding file is generated for us. In the case of an ADFS based WSDL however, this just results in an empty binding file that doesn’t help much.

Lots of projects I’ve seen make the decision at this point to use custom code and create a facade to solve the authentication. As Microsoft Dynamics CRM comes with a nice SDK, including a C# library that handles authentication, it’s easy to understand how one would end up using that. The problem is however that this creates another code base that needs to be maintained over time. One could also argue that using custom code called by BizTalk further complicates the overall solution and makes it harder to maintain.

So let’s configure the authentication from scratch using Windows Communication Foundation.

Choosing the right WCF binding

First thing to do is to choose the right WCF binding. Let’s create a WCF-Custom Static Solicit-Response send port and choose the ws2007FederationHttpBinding.

Choose binding

Adding the Issuer

First thing we need to add is information on how to connect to the Issuer. The Issuer is the party issuing the token – in our case that’s the ADFS.

Issuer details

First we need to add information about the address of the Issuer. The WSDL tells us that the mex endpoint for the ADFS server is located at https://adfs20.xxx.yy/adfs/trust/mex.

Mex file

Browsing the WSDL for the ADFS server shows a number of different endpoints. Which one to use depends on what kind of authentication is being used when requesting the token. In our case we’re using a simple username and password, so we’re using the usernamemixed endpoint (https://adfs20.xxx.yy/adfs/services/trust/2005/usernamemixed).

Secondly we need to add information about the binding and the binding configuration for communication with the ADFS service.

What this basically means is that we need to add the information as a second, or inner, binding configuration. The BizTalk Server WCF configuration GUI doesn’t provide a way to set this, so the only way to configure it is to use one of the relevant configuration files (“machine.config” or the BizTalk config file) and add the binding manually.

<bindings>
 <ws2007HttpBinding>
  <clear/>
  <!-- Inner binding used when requesting a token from the ADFS STS -->
  <binding name="stsBinding">
   <security mode="TransportWithMessageCredential">
    <transport clientCredentialType="None"/>
    <message clientCredentialType="UserName" establishSecurityContext="false"/>
   </security>
  </binding>
 </ws2007HttpBinding>
</bindings>

Once this is set up we can point our BizTalk Server WCF configuration to the correct URL and reference the inner binding we just configured.

Issuer configured
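For reference, expressed directly in WCF configuration the issuer part of the ws2007FederationHttpBinding corresponds roughly to the sketch below (the address is the usernamemixed endpoint from above, "stsBinding" is the inner binding we just added, and the binding name is made up; the security mode settings are covered later in the post):

<ws2007FederationHttpBinding>
 <binding name="crmFederationBinding">
  <security>
   <message>
    <!-- Points to the ADFS STS, using the inner binding configured above -->
    <issuer address="https://adfs20.xxx.yy/adfs/services/trust/2005/usernamemixed"
            binding="ws2007HttpBinding"
            bindingConfiguration="stsBinding"/>
   </message>
  </security>
 </binding>
</ws2007FederationHttpBinding>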

Finally we need to provide the username and password to authenticate ourselves to the ADFS server.

Login details

We now have communication set up to the ADFS service and should be able to get a valid token that we can then use to authenticate ourselves to the CRM system!

We now however also need to provide information on how to connect to the actual CRM system.

Configure communication to CRM

The rest is easy. Let’s start with adding the URL of the end service we want to call. As with any other service call we’ll also add the SOAP Action Header, which in this case is the Update operation (http://schemas.microsoft.com/xrm/2011/Contracts/Services/IOrganizationService/Update) of the OrganizationService service.

Service details

As our service also uses SSL for encryption we need to tell the binding to use TransportWithMessageCredential.

Transport encryption

Establishing a Security Context – or not

Finally there is a little tweak needed. WCF supports establishing a Security Context: this caches the token and avoids asking the STS for a new token for each call to the CRM system. BizTalk Server however doesn’t seem to support this, so we need to turn it off.

Security Context
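In configuration terms this corresponds to setting establishSecurityContext to false on the message element of the federation binding – a sketch extending the binding from earlier:

<security mode="TransportWithMessageCredential">
 <message establishSecurityContext="false">
  <!-- issuer configuration as shown earlier -->
 </message>
</security>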

Conclusion

Understanding federated security is important and will become increasingly important as we move over to systems hosted in the cloud – federated security is the de facto authentication standard used by hosted systems in the cloud. Avoiding custom code and custom facades is a key factor in building maintainable large scale BizTalk based systems over time. BizTalk has great WCF support, and taking full advantage of it is important to be able to build solutions that are easy to oversee and possible to maintain, not just by those familiar and comfortable with custom code.

Build and generate MSI for BizTalk Server using Team Foundation Build Services


Using a build server and leveraging continuous integration is good practice in any software development project. The idea behind automated builds and continuous integration is to have a server that monitors one's source code repository and builds the solution as changes occur. This separate build activity alone ensures that all artifacts are checked in and that a successful build doesn't depend on any artifacts or settings on the development machines.

Build server

Today build servers do a lot more as part of the build – the build process usually involves execution of tests, labeling the source, as well as packaging the solution into a deployable artifact.

In this post we'll see how a build process can be achieved using Team Foundation (TFS) Build Services, building a BizTalk project that results in a deployable MSI artifact.

TFS Build Services

TFS Build Services is a component that is part of the standard TFS install media. Each TFS Build Controller controls a number of "Build Agents" that perform the actual build process. For each solution to build one has to define a build process. These processes are described in a "Build Template" that tells the agent what steps to go through and in what order.

"Build Templates" in TFS Build Services are defined using Visual Studio. The image below shows a build template accessed through Visual Studio Team Explorer.

Visual Studio Team Explorer Build Templates

Major steps in a build template

As one creates a new build template for a solution one has to go through the following major steps:

1. Define a trigger

Decides what should trigger the build. Should it be triggered manually, should it be a scheduled build or should it be triggered at a check-in of new code?

2. Source Settings

This tells the build process what part of the source tree the build template is relevant for. When queuing a new build this is the part of the source tree that will be downloaded to the staging area. It also tells the build service where on disk the source should be downloaded to.

3. Process

This is where all the steps and activities that the build service should perform are defined. Team Foundation Build Services comes with a number of standard templates and custom ones can be added. In this post we'll however stick with the default one.

Build your first BizTalk solution

Building a BizTalk Server solution using TFS Build Services is straightforward.

In this post I will use this sample BizTalk solution. After checking it into Team Foundation Source Control (I'll use TFS Source Control in this post, but it'll work similarly using Git) I’ll create a new build template for the solution. All that needs to change is the MsBuild platform setting property, so that we're using x86 when executing MsBuild, as shown below.

Build process and the MsBuild platform setting property

After queuing a build we can see a successful build in the TFS Build Explorer!

Successful build in TFS Build Explorer

We can also download the output from the build where we can see all our build artifacts!

First artifact drop

Using BtsMsiTask to create a MSI as part of the build

So far so good, but we started the article by saying that what we wanted was a deployable artifact. In the case of BizTalk this means a BizTalk MSI. Let’s see what we need to change to also have the build process create an MSI.

1. Install BtsMsiTask

Download and install BtsMsiTask. This installs an MsBuild task for generating the MSI.

2. Add a MsBuild project file

Add a MsBuild project file (build.proj) to the solution.

The project file tells the BtsMsiTask process what artifacts to include. Check the created project file in as part of the solution (the full file is sketched at the end of this post).

3. Add the MsBuild project file to the TFS build template

Add the created MsBuild project file to the TFS build template by adding it to the list of projects to build.

Adding the build file to the build process

After another successful build we can see that we also created an MSI as part of the build!

Successful build with msi

Adding build information to the MSI

File name

As we can see, the MSI we just created ended up with the default BtsMsiTask file name, which is a combination of the BizTalk application name property and the current date and time. Wouldn't it be nice if we instead could use the build number as part of the name? BtsMsiTask has an optional property called FileName that we for example can set to <FileName>$(TF_BUILD_BUILDNUMBER).msi</FileName>

Source location

When installing the artifact to BizTalk Server we can see that the source location property in the BizTalk Administration Console is set to where the artifact was built in the staging area.

Source Location without build number

It'd be nice to also have information about which build produced these artifacts. That gives us what we need to know exactly which builds are behind all the installed artifacts.

We can change what is set in the source location by using the SourceLocation property of BtsMsiTask: <SourceLocation>c:\$(TF_BUILD_BUILDNUMBER)</SourceLocation>

So after setting the property as above, queuing another build and reinstalling using the MSI, we get the following result with the build number in the source location property.

Source Location with build number

And finally this is the MsBuild project file we ended up with in our example.
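A sketch of such a file is shown below. Note the assumptions: the UsingTask assembly path and the ApplicationName property are illustrative and may differ from the actual BtsMsiTask names – check the BtsMsiTask documentation for the authoritative ones; FileName and SourceLocation are the two properties discussed above.

<?xml version="1.0" encoding="utf-8"?>
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003" DefaultTargets="Build">
  <!-- Assumed install path for BtsMsiTask.dll; adjust to your machine -->
  <UsingTask TaskName="BtsMsiTask" AssemblyFile="C:\Program Files (x86)\BtsMsiTask\BtsMsiTask.dll" />
  <Target Name="Build">
    <!-- FileName and SourceLocation are the properties discussed in this post;
         the list of BizTalk artifacts to include is omitted here for brevity -->
    <BtsMsiTask
      ApplicationName="MyBtsApplication"
      FileName="$(TF_BUILD_BUILDNUMBER).msi"
      SourceLocation="c:\$(TF_BUILD_BUILDNUMBER)" />
  </Target>
</Project>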
