Biml, Microsoft Technologies, SSIS

Biml for a Task Factory Dynamics CRM Source Component

I recently worked on a project where a client wanted to use Biml to create SSIS packages to stage data from Dynamics 365 CRM. My first attempt, using a script component, failed with an error that I think is related to a bug in how the Biml engine currently generates script components, so I had to find a different way to accomplish my goal. (If you have run into this issue with Biml, please comment so I know it’s not just me! I have yet to get Varigence to confirm it.) This client owned the Pragmatic Works Task Factory, so we used its Dynamics CRM source to retrieve data.

I was ultimately creating a bunch of packages that looked like the one below.

There are two pieces to using the Task Factory Dynamics CRM Source: the source component and the connection manager. The code for both is below.

Things to note for the source component:

  • This is pulling data for an entity in Dynamics 365. If you were to populate this by hand, the Task Factory UI would ask you which entity. There are several entities that are visible but do not allow you to retrieve their data. You can find a list of internal entities here. You won’t want to stage data from those tables. I generated all of my packages at once with Biml, using a blacklist to exclude the internal entities.
  • The entity name that should be the value on line 21 of the gist is the entity logical name.
  • I’m iterating through columns in tables in a separate file and holding them in memory, hence the references to Table.columns. You’ll need to write some code to get the entity metadata to feed that, or find another way to provide the table names, column names, and data types.
  • You must set ErrorRowDisposition and TruncateRowDisposition to “NotUsed”, as I did on line 10, for each column, or your package will not work.
  • In the OutputColumns and ExternalColumns collections, I just have if statements to check data types (see the sketch below). There may be a more elegant way to do this, but it worked for me and I went with it. Feel free to leave suggestions in the comments if you have a better idea.
  • The Connection element on line 145 should keep the name “DYNAMICSCONNECTION”. If you change it, the connection won’t work. In my actual project, my connection manager is in a separate file and is named “TF_DynamicsCRM”. You can set the ConnectionName property to whatever you want, as long as it matches your connection definition (line 153 in the gist).
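
Since every project’s metadata differs, here is a trimmed sketch of the general shape of the source component. The ComponentTypeName value, the Entity property name, and the table/column objects are illustrative placeholders; confirm the real values against a package you’ve configured through the Task Factory UI.

<!-- Sketch only: ComponentTypeName and the Entity property name are placeholders -->
<CustomComponent Name="TF Dynamics CRM Source" ComponentTypeName="(copy from a hand-built package)">
  <CustomProperties>
    <!-- the entity logical name, e.g. "account" -->
    <CustomProperty Name="Entity" DataType="String"><#=table.Name#></CustomProperty>
  </CustomProperties>
  <OutputPaths>
    <OutputPath Name="Output">
      <OutputColumns>
        <# foreach (var column in table.Columns) { #>
        <# if (column.DataType == "String") { #>
        <!-- ErrorRowDisposition and TruncateRowDisposition must be NotUsed -->
        <OutputColumn Name="<#=column.Name#>" DataType="String" Length="<#=column.Length#>" ErrorRowDisposition="NotUsed" TruncateRowDisposition="NotUsed" />
        <# } else { #>
        <OutputColumn Name="<#=column.Name#>" DataType="<#=column.DataType#>" ErrorRowDisposition="NotUsed" TruncateRowDisposition="NotUsed" />
        <# } #>
        <# } #>
      </OutputColumns>
    </OutputPath>
  </OutputPaths>
  <Connections>
    <!-- Name must stay DYNAMICSCONNECTION; ConnectionName matches your connection definition -->
    <Connection Name="DYNAMICSCONNECTION" ConnectionName="TF_DynamicsCRM" />
  </Connections>
</CustomComponent>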

Things to note for the Connection Manager:

  • I tried reverse engineering the connection manager in BimlStudio, and that got me close, but it didn’t get the ObjectData property quite right. I ended up having to create one manually, view the code for it in SSDT, and copy it into my Biml file. I have my packages and project set to EncryptSensitiveWithPassword, which is why you see the p4:Salt value on line 158. Your connection manager will have different values for lines 158, 159, and 160. If you set your project to EncryptSensitiveWithPassword, these values will stay consistent across developers. But if you use EncryptSensitiveWithUserKey, they may change, which will be fun when regenerating with Biml. So make sure you plan for that if you have multiple developers or multiple computers on which you are working.
  • This connection manager is set to connect to Dynamics 365, hence ServerHost=disco.crm.dynamics.com. If you have an on-premises or hosted environment that isn’t Dynamics 365, you’ll have to find the correct ServerHost value and put it in line 165 (see the skeleton below).
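
For reference, the connection manager definition ends up shaped roughly like the fragment below. The CreationName and ObjectData values here are placeholders; as described above, create a connection manager manually, view its code in SSDT, and copy the real values (including the p4:Salt and encrypted password) into your Biml.

<!-- Skeleton only: copy CreationName and the full ObjectData from SSDT -->
<CustomSsisConnection Name="TF_DynamicsCRM"
    CreationName="(copy from SSDT)"
    ObjectData="&lt;... ServerHost=disco.crm.dynamics.com ... p4:Salt=... /&gt;" />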

I hope that helps someone looking to generate the Task Factory Dynamics Source with Biml.

Excel, Microsoft Technologies, Power BI, SSIS

Installing the Microsoft.ACE.OLEDB.12.0 Provider for Both 64-bit and 32-bit Processing

I recently got a new laptop and had to go through the ritual of reinstalling all my programs and drivers. I sometimes work with SSIS locally to import data from Excel, and I occasionally do demos with Power BI where I read from an Access database, so I needed to install the ACE OLE DB provider. If you try to do either of those things without the ACE OLE DB provider, you will get a message like the one below.

The ‘Microsoft.ACE.OLEDB.12.0’ provider is not registered on the local machine.

I needed both the 32-bit and the 64-bit providers to cover my various use cases, so I started by installing the 64-bit Microsoft Access Database Engine 2010 Redistributable. If you have ever attempted this, you know there is a secret: you cannot install the 2010 Access database engine with a different processing architecture (“bittedness”) from your Office install, or install both versions of the 2010 Access database engine, without a workaround. If you try, you will get an error message.

[Image: ACE engine bittedness error]

There are two options for getting the other version installed.

Option A: Install the 2007 Office System Driver: Data Connectivity Components

The 2007 Access database engine can read the same file formats, but it was never released in a 64-bit version. It doesn’t check for other Office products on your machine, so it can coexist peacefully with the 64-bit 2010 Access engine.

Option B: Install from the Command Line and Use the Passive Switch

  1. Open the command prompt.
  2. Type the file path and file name for your Access Database Engine install file.
  3. Follow it with a space and then “/passive”, and then hit Enter (see the example below).
  4. Delete or rename the mso.dll registry value in the following registry key: HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Office\14.0\Common\FilesPaths
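
For example, if the 64-bit installer were saved in your Downloads folder, steps 2 and 3 might look like this (the path and file name are just examples; use wherever you saved the download):

C:\Users\MyUser\Downloads\AccessDatabaseEngine_X64.exe /passive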

[Image: ACE install cmd]

I wrote up a document for this a while back for a client, but I can never remember where I put it. So I’m posting this on my blog to make this info easier to find for me as well as you.

Azure Data Factory, Biml, Conferences, SSIS

I’m Speaking at IT/Dev Connections 2017

I’m pleased to say that I am speaking at IT/Dev Connections 2017. This year the conference will be held in San Francisco October 23-26. I had a great experience speaking at IT/Dev Connections in 2015, so I am excited to return this year.

This conference is special to me because of its focus on providing great content for developers and IT pros – the conference website describes it as the “anti-keynote” conference with no forced marketing content.

I also enjoy it because it is more than just SQL Server/Data Platform (they have tracks for Cloud & Data Center, Enterprise Collaboration, Development & Dev Ops, and Enterprise Mobility and Security), and it’s nice to get out of my comfort zone a bit.

I will deliver two sessions at the conference.

Azure Data Factory in A Nutshell

If you have been wanting to get into Azure Data Factory (ADF) development, join me for this demo-filled overview. In this session, we’ll go over the basic anatomy of an ADF solution. You’ll learn what ADF is and isn’t as we walk through a solution to pull data from an on-premises SQL Server database to blob storage and then populate an Azure SQL Data Warehouse. You’ll learn tips for creating ADF solutions in Visual Studio, and I’ll show you how to make ADF development less tedious with a free Visual Studio add-in called BimlExpress. You’ll leave with a basic understanding of ADF and a list of tools and skills you’ll want to acquire as you begin your ADF development.

Improve Data Warehouse ETL Delivery with a Patterns-Based Approach

What if I told you that 90% of your data integration development in SQL Server could be automated? In 5 years, you will be “old fashioned” if you are hand coding SSIS packages. Developers with different skill levels and design preferences create databases and SSIS packages however they see fit to get the job done. Documentation is frequently omitted. Maintenance and small enhancements consume too much development time while manual errors and inconsistencies slip through the testing and release process. You can use tools and frameworks to rearrange the development process and alleviate these common problems. The implementation and automation of design patterns leads to improved efficiency and communication. Join me in this session to learn how to use Business Intelligence Markup Language (Biml) and Excel to facilitate metadata-driven SSIS development. I’ll use database schema information plus Excel inputs to implement a small data mart from staging through the dimensional model.

I hope you will join me in San Francisco in October!

Microsoft Technologies, SSAS, SSIS

Process Compatibility Level 1200 SSAS Tabular Model from SSIS 2014

A client wanted to upgrade their SSAS model to SSAS 2016 to take advantage of some of the features of the new 1200 compatibility level. But they weren’t yet ready to upgrade their SSIS server from SQL Server 2014. This presented a problem because they had been using the Analysis Services Processing Task to process their tabular model nightly. This processing task in SSIS 2014 uses the old Analysis Management Objects (AMO), which aren’t compatible with the new SSAS tabular models.

Attempting to use the AS Processing Task results in the following error: “[Analysis Services Execute DDL Task] Error: This command cannot be executed on database ‘MySSASDB’ because it has been defined with StorageEngineUsed set to TabularMetadata. For databases in this mode, you must use Tabular APIs to administer the database”

The reason for keeping SSAS processing in an SSIS package was because it kept consistent logging throughout their data refresh process. So we set out to find another solution.

The new SSAS tabular models are scripted with the Tabular Model Scripting Language (TMSL), which is JSON-based, rather than with XMLA. A simple process-full command in TMSL might look like this:

{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "MySSASDB"
      }
    ]
  }
}

Just pasting the JSON command to process the model into an Analysis Services Execute DDL Task didn’t work; it returned the error “DDL is not valid”.

I asked around and received a couple of solutions that worked.

Option 1 (my less preferred option): Create an OLE DB connection manager (rather than an Analysis Services connection manager) and use an Execute SQL Task.

You can set up an OLE DB connection that looks like the one below. Create a new connection manager and choose the OLE DB connection manager type. Change the provider to Microsoft OLE DB Provider for Analysis Services 13.0 and fill in your connection information.
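
The connection string it builds looks something like the following (server and database names are placeholders; MSOLAP is the version-independent name for the provider):

Provider=MSOLAP;Data Source=MySSASServer;Initial Catalog=MySSASDB;Integrated Security=SSPI;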

[Image: Microsoft OLE DB Provider for Analysis Services connection manager setup]

You’ll see the connection manager show up in the Connection Managers pane looking like this:

[Image: SSAS OLE DB provider connection manager]

Drag in an Execute SQL task, use the previously defined connection manager and paste in the JSON. This works just fine, but feels a bit too much like a workaround rather than a solid solution to me.

Option 2: Use an Analysis Services Execute DDL task and wrap the JSON in XMLA

Drag in an Analysis Services Execute DDL task. Create a new connection manager by choosing New Analysis Services Connection.

[Image: new Analysis Services connection in SSIS]

Edit your connection information, click OK, and you will end up with a connection manager that looks like this:

[Image: SSAS connection manager in SSIS]

Use that connection manager in the AS Execute DDL task. We can use the same JSON from earlier and wrap it in XMLA as shown below.

<Statement xmlns="urn:schemas-microsoft-com:xml-analysis">
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "MySSASDB"
      }
    ]
  }
}
</Statement>

The XMLA/JSON command can be a direct source statement or placed in a variable and referenced from the task.

To test that the model is successfully processed, you can execute the SSIS task or package and then run the following query against the DMV.

SELECT [catalog_name], [date_modified], [compatibility_level]
FROM $SYSTEM.DBSCHEMA_CATALOGS
WHERE [catalog_name] = 'MySSASDB'

So if you are caught between versions in SSIS and SSAS, do not despair. You can still process your new SSAS Tabular model from an SSIS package.

Biml, Microsoft Technologies, SSIS

BimlScript – Get to Know Your Control Nuggets

This is post #2 of my BimlScript – Get to Know Your Code Nuggets series. To learn about text nuggets, see my first post.  

The next type of BimlScript code nugget I’d like to discuss is the control nugget. Control nuggets allow you to insert control logic that determines what Biml is generated and used to create your BI artifacts (packages, cubes, etc.). Control nuggets can be used to:

  • Add conditional logic so that specified Biml fragments are only generated when certain criteria are met
  • Add loops to repeat Biml fragments multiple times
  • Define variables that will be used later
  • Access external data from databases, flat files or other sources that will be used in the generation of the final Biml code

Control nuggets start with <# and end with #>. Just like text nuggets, control nuggets can be a single line or multiple lines. And they can contain simple or complex logic.
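
For example, this minimal sketch (the names are made up) uses a control nugget containing a foreach loop to generate three packages, with a text nugget plugging the loop variable into each package name:

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Packages>
    <# foreach (var name in new[] { "Customer", "Product", "Sales" }) { #>
    <!-- the text nugget below fills in the loop variable -->
    <Package Name="Load_<#=name#>" ConstraintMode="Linear" />
    <# } #>
  </Packages>
</Biml>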

There were actually control nuggets in the text nugget example from the previous post. The variable declarations at the top of the file (lines 4 – 10) are control nuggets.

Below is another example file that uses code nuggets. I like to separate my design patterns, project connections, and package generation into separate files. The BimlScript shown below is from a slightly altered “caller file”, the file that I would execute to create the SSIS packages that should follow a specified design pattern. This caller file generates Type 1 slowly changing dimensions.

In my Biml framework, I store the data I need in order to generate my packages in SQL tables. You can see the control nugget in lines 14 – 40 retrieving data from my database and storing it in a data table for later use.

On line 46, there is a control nugget containing a foreach loop. For each row in the data table, it calls a BimlScript file that creates a package using my Type 1 SCD design pattern with the variables from the data table (see the sketch below). Notice that the closing curly brace for my loop is in a separate control nugget.
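
A stripped-down caller file following this approach might look like the sketch below; the connection string, metadata table, and pattern file name are placeholders for whatever your framework uses.

<#@ import namespace="System.Data" #>
<#@ import namespace="System.Data.SqlClient" #>
<#
    // Pull package metadata from a SQL table into a DataTable (names are illustrative)
    var metadata = new DataTable();
    using (var conn = new SqlConnection("Data Source=.;Initial Catalog=BimlMeta;Integrated Security=SSPI;"))
    using (var adapter = new SqlDataAdapter("SELECT PackageName, SourceTable, DestinationTable FROM dbo.PackageMetadata", conn))
    {
        adapter.Fill(metadata);
    }
#>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Packages>
    <# foreach (DataRow row in metadata.Rows) { #>
    <!-- each iteration calls the pattern file to emit one package -->
    <#=CallBimlScript("Dim1Pattern.biml", row)#>
    <# } #> <!-- the loop's closing brace lives in its own control nugget -->
  </Packages>
</Biml>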

There are many different ways to use control nuggets that aren’t covered here.  From these two examples you can start to see how I might combine text nuggets and control nuggets to automate my SSIS package creation and employ consistent design patterns. But there are still a few missing pieces that will be filled in when I cover the remaining code nugget types.

Biml, Microsoft Technologies, SSIS

BimlScript – Get to Know Your Code Nuggets

In BimlScript, we embed nuggets of C# or VB code into our Biml (XML) in order to replace variables and automate the creation of our BI artifacts (databases, tables, SSIS packages, SSAS cubes, etc.). Code nuggets are a major ingredient in the magic sauce that is metadata-driven SSIS development using BimlScript.

There are 5 different types of code nuggets in BimlScript:

  • Text nuggets
  • Control nuggets
  • Class nuggets
  • Comments
  • Directives

Over the next several posts I’ll cover each type of code nugget and provide examples of their use.

Text Nuggets

Text nuggets evaluate the expression they contain and then replace the text nugget with the string representation of the value of the expression. I use them often to switch out names of packages, tasks, and components as well as source and destination tables in SSIS development when creating packages based upon a design pattern.

Text nuggets start with <#= and end with #>. Notice there is an equals sign at the beginning of the text nugget but not at the end.

Text nuggets are very useful. You can include complex business logic in the expressions. And the expression result can be any data type. The BimlScript compiler will automatically convert it to text before replacing the code nugget with the result. Like all code nuggets, text nuggets can be a single line or multiple lines.

Let’s take a look at some examples.

Note: The Biml script below has variables, connections, and a package all in one file for ease of demonstration. Normally I would split these out into separate files. 

This BimlScript creates an SSIS package that truncates and reloads data in a single table.
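
A simplified version of that file looks like the sketch below (connection strings, schema, and table names are placeholders):

<#
    // Variable declarations (stand-ins for real metadata)
    var PackageName = "STG_Customer";
    var SourceTable = "Customer";
    var DestinationSchema = "stg";
    var DestinationTable = "Customer";
#>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Connections>
    <OleDbConnection Name="Source" ConnectionString="(source connection string)" />
    <OleDbConnection Name="Target" ConnectionString="(target connection string)" />
  </Connections>
  <Packages>
    <Package Name="<#=PackageName#>" ConstraintMode="Linear">
      <Tasks>
        <ExecuteSQL Name="SQL Truncate <#=DestinationTable#>" ConnectionName="Target">
          <DirectInput>Truncate Table [<#=DestinationSchema.ToUpper()#>].[<#=DestinationTable.ToUpper()#>];</DirectInput>
        </ExecuteSQL>
        <Dataflow Name="DFT Stage_<#=DestinationTable#>">
          <Transformations>
            <OleDbSource Name="OLESRC <#=SourceTable#>" ConnectionName="Source">
              <DirectInput>SELECT * FROM dbo.<#=SourceTable#>;</DirectInput>
            </OleDbSource>
            <OleDbDestination Name="OLEDST <#=DestinationTable#>" ConnectionName="Target">
              <ExternalTableOutput Table="[<#=DestinationSchema#>].[<#=DestinationTable#>]" />
            </OleDbDestination>
          </Transformations>
        </Dataflow>
      </Tasks>
    </Package>
  </Packages>
</Biml>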

Notice that you can use the text nuggets alone, together, or with literal text.

The <#=PackageName#> text nugget simply replaces the value for the name of the package with the current value of the PackageName variable I set at the top of the file.

The DataFlow Task name is a combination of text and a code nugget.

<Dataflow Name="DFT Stage_<#=DestinationTable#>">

This comes in handy if you would like to enforce certain naming standards. My personal preference is to use naming conventions suggested by Jamie Thomson, so I build them into my Biml and then use text nuggets as shown above to change the rest of the name to reflect the nature of the current task.

The truncate table statement in the Execute SQL Task contains text and two code nuggets that reference a variable and use a function to convert the variable value into all uppercase text:

Truncate Table [<#=DestinationSchema.ToUpper()#>].[<#=DestinationTable.ToUpper()#>];

Hopefully you can see how text nuggets are useful when creating multiple packages at the same time. They allow you to switch out the values that change for each package and create consistent naming standards.

To get to know other types of code nuggets, see Get To Know Your Control Nuggets.

Biml, Conferences, SSIS

Bimling in the Northeast

I’m expanding my speaking to different SQL Saturdays this year, and I’m very excited to say that I will be speaking at SQLSaturday Boston on March 19th and SQLSaturday Maine on June 4th.

My session at both SQL Saturdays will focus on using BimlScript to create good ETL patterns. SSIS has been around for a while now, but I still see people making the same mistakes, many of which are easily avoided. One of the biggest mistakes is simple inconsistency. BimlScript supports consistency and, when used correctly, forces the developer to recognize patterns and reuse them. This helps us solve (and share the solution to) known problems, makes it easier for junior developers to solve those problems and learn why our solution works, and lets us move on to new and more interesting challenges. And if we find a flaw in our pattern, it’s a quick update to the Biml pattern and a regeneration of packages to apply the change to all applicable packages.

Everyone involved in creating and supporting your SSIS projects can appreciate consistent design patterns. They make your patterns better tested (through reuse in multiple situations) and reduce the learning curve for development and support. BimlScript plus a good ETL framework facilitates quicker starts and completion of SSIS projects. As a consultant, I appreciate the consistency when I take over a project (usually, I’m working on phase 2 when someone else built phase 1) and already know how it works. And if the framework is slightly different from what I’m used to (there are valid reasons to build things differently on occasion), I can look at the Biml from the previous work and easily read what’s going on. Have you ever tried to read the XML code of an SSIS package? It’s ugly. Biml is much easier to read because it doesn’t include position information for the UI or many other attributes that aren’t helpful. And because the code is all on one page, I have less chance of missing a setting that was hiding in the advanced properties somewhere in an existing package.

If you are an SSIS architect and you aren’t familiar with Biml, I urge you to look into it. BimlScript.com has a great learning plan to help you get started and become comfortable using BimlScript. In my opinion, it is currently the best (and in many cases free!) way to create your SSIS framework and design patterns.