I wanted to take a few minutes this morning and blog about my latest experience with Tibco (Scribe) and using it to push data into Dynamics 365 in the cloud. Tibco is a collection of connectors for various software solutions. The idea is that most of these solutions don't have something like a Swagger/OpenAPI definition for moving data; they usually have some form of API that requires specific knowledge of that particular architecture. Tibco solves this by creating a normalized way of moving all the data around. You create your account, install the appropriate connector, and then start dragging blocks onto the flow screen to design your integration/migration. This all sounds wonderful in theory. The practical, hands-on experience, however, was quite different.
Flows
Tibco has a concept of an application. An application is made up of one or more flows, and each flow is made up of a collection of blocks that can be dragged onto the screen to tell the Tibco engine what the desired behavior for the flow is. Depending on the type of connector, different kinds of blocks are available.
Flows are a really nice way to quickly build out a migration. Each entity that you are migrating data into will have a flow built out for it. In some cases there will be multiple flows per entity; this happens when, for example, you are pulling data from multiple locations to create accounts. Tibco doesn't function as an ETL (Extract, Transform and Load) tool. While you could use it as the transport/insert layer for one, Tibco doesn't provide a bucket for the data to flow into; you would need to set up a database to coalesce the data in.
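If you do need that staging bucket, a minimal sketch of loading source rows into a SQL Server staging table with SqlBulkCopy might look like the following. The connection string, table name and columns are all assumptions for illustration:

using System;
using System.Data;
using System.Data.SqlClient;

//Staging table shape is an assumption; create it ahead of time in your staging database
var staging = new DataTable("Staging.Account");
staging.Columns.Add("AccountId", typeof(Guid));
staging.Columns.Add("AccountName", typeof(string));
//... fill staging.Rows from each source system ...

using (var conn = new SqlConnection("Server=.;Database=MigrationStage;Integrated Security=true"))
{
    conn.Open();
    using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "Staging.Account" })
    {
        //Bulk insert is far faster than row-by-row inserts for coalescing source data
        bulk.WriteToServer(staging);
    }
}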
The process of setting up the flows is quite easy. You need to define a source connection and a target connection. Once that is done you drag in a loop block and start building out the behavior you are looking for.
Challenges
The biggest challenge I ran into using Tibco as a migration tool, moving data from an on-premise source instance to a cloud target instance, was the performance of the tool. Our data migration took most of a weekend to run. We were averaging about six to eight records per second; it was very slow. At first I thought this had to do with the Dynamics 365 cloud instance: all the logging was shut off, and all the workflows and plugins were disabled. It turns out that there is more than one way to connect to a cloud instance now, and Tibco is still using the old process of connecting to the API. When I connected to the cloud using the new process, without using the affinity token, the number of processed records was quite good, even in the Sandbox instance. Here is the connection setup and the .NET tuning I used:
//Requires the Microsoft.CrmSdk.XrmTooling.CoreAssembly package (Microsoft.Xrm.Tooling.Connector namespace)

//Change max connections from .NET to a remote service (default: 2)
System.Net.ServicePointManager.DefaultConnectionLimit = 65000;

//Bump up the min threads reserved for this app to ramp connections faster - minWorkerThreads defaults to 4, minIOCP defaults to 4
System.Threading.ThreadPool.SetMinThreads(100, 100);

//Turn off the Expect: 100-continue message - 'true' will cause the caller to wait until it round-trip confirms a connection to the server
System.Net.ServicePointManager.Expect100Continue = false;

//Can decrease overall transmission overhead but can cause delay in data packet arrival
System.Net.ServicePointManager.UseNagleAlgorithm = false;

//Connection to Dynamics
var connectionString = @"AuthType=OAuth;Url=https://you.crm.dynamics.com;Username=user@company.com;Password=awesome;AppId=51f81489-12ee-4a9e-aaae-a2591f45987d;RedirectUri=app://58145B91-0C36-4500-8554-080854F2AC97;LoginPrompt=Auto";

using (var connTarget = new CrmServiceClient(connectionString))
{
    //Do work
}
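As an aside, the "affinity token" mentioned above is the server affinity cookie that Dataverse issues so all of your requests land on the same server; Microsoft's throughput guidance is to drop it for parallel loads. If you call the Web API directly with HttpClient instead of going through CrmServiceClient, a minimal sketch of that (the URL is a placeholder) looks like this:

//Disable cookies so the server affinity cookie is never stored or replayed,
//which lets parallel requests spread across servers
var handler = new System.Net.Http.HttpClientHandler { UseCookies = false };
var httpClient = new System.Net.Http.HttpClient(handler)
{
    BaseAddress = new System.Uri("https://you.crm.dynamics.com/api/data/v9.2/")
};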
Now that you have the connection properly configured you can do some work, creating or updating entities in Dynamics. To do that, create a collection of entities and then pass them off to a method that will handle all the API interaction.
/// <summary>
/// Update the account entities in parallel, one cloned connection per thread
/// </summary>
/// <param name="svc">An open CrmServiceClient connection</param>
/// <param name="entityList">The account entities to update</param>
private static void UpdateAccountRecords(CrmServiceClient svc, List<Entity> entityList)
{
    Parallel.ForEach(
        entityList,
        new ParallelOptions() { MaxDegreeOfParallelism = svc.RecommendedDegreesOfParallelism },
        () =>
        {
            //Clone the CrmServiceClient for each thread
            return svc.Clone();
        },
        (entity, loopState, index, threadLocalSvc) =>
        {
            //In each thread, update the entities
            try
            {
                System.Console.Write("+");
                //Do stuff
                threadLocalSvc.Update(entity);
            }
            catch (Exception)
            {
                //Swallow the failure and keep going; a production version should log it
            }
            return threadLocalSvc;
        },
        (threadLocalSvc) =>
        {
            //Dispose the cloned CrmServiceClient instance
            if (threadLocalSvc != null)
            {
                threadLocalSvc.Dispose();
            }
        });
}
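For completeness, a hypothetical call site could look like the sketch below. The sourceRows collection and its fields stand in for however you staged your source data, and the attribute name is just an example:

using (var connTarget = new CrmServiceClient(connectionString))
{
    var entityList = new List<Entity>();
    foreach (var row in sourceRows) //sourceRows: your staged source data (assumed)
    {
        //For an update, Entity needs the logical name and the existing record id
        var account = new Entity("account", row.AccountId);
        account["name"] = row.AccountName; //"name" is the standard account name attribute
        entityList.Add(account);
    }

    UpdateAccountRecords(connTarget, entityList);
}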
I wrote all this code to demonstrate how fast the cloud instance of Dynamics 365 can be. After seeing the obvious speed difference between what Tibco was capable of and what I could get with a properly written solution, I opened a ticket with Tibco. The response was not what I wanted: they have not upgraded their code base yet (spring 2022). So even though I was able to provide them a working example of how to properly write this, they didn't want to fix it. I was not happy, to say the least. For more detail see: Maximize API throughput – Finance & Operations | Dynamics 365 | Microsoft Docs
Also check out the Task Parallel Library: Web API CDSWebApiService Async Parallel Operations Sample (C#) (Microsoft Dataverse) – Power Apps | Microsoft Docs
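If you go down that Web API route, the same throttled, fan-out idea can be sketched with plain HttpClient and Task.WhenAll. Everything here (the token acquisition, the payloads, the concurrency cap of 16) is an assumption you would tune for your own org:

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading;
using System.Threading.Tasks;

public static class ParallelWebApi
{
    public static async Task CreateAccountsAsync(
        string orgUrl, string accessToken, IEnumerable<string> accountJsonPayloads)
    {
        //As above, drop the affinity cookie so requests fan out across servers
        var handler = new HttpClientHandler { UseCookies = false };
        using (var client = new HttpClient(handler))
        {
            client.BaseAddress = new Uri(orgUrl.TrimEnd('/') + "/api/data/v9.2/");
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);

            //Cap in-flight requests; tune against the service protection limits
            var throttle = new SemaphoreSlim(16);
            var tasks = new List<Task>();

            foreach (var json in accountJsonPayloads)
            {
                await throttle.WaitAsync();
                tasks.Add(Task.Run(async () =>
                {
                    try
                    {
                        var content = new StringContent(json, Encoding.UTF8, "application/json");
                        var response = await client.PostAsync("accounts", content);
                        response.EnsureSuccessStatusCode();
                    }
                    finally
                    {
                        throttle.Release();
                    }
                }));
            }

            await Task.WhenAll(tasks);
        }
    }
}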
And lastly, don't use the Organization Service; use the CrmServiceClient, which can be cloned per thread as shown above.
Summary
Even though the UI of Tibco was easy to use and intuitive, the lack of performance was really disappointing. I would strongly recommend not using their product for a migration, as it is way too slow for anything except the smallest (sub one million record) migrations. Check out KingswaySoft instead; I've had much better results with their product.
Caleb Skinner
I enjoy writing code and driving engagements. I've been in the game for about 22 years, and have been around C#, Java, SQL, Ubuntu, Windows and the rest of the gang for a while.