BIDS Helper, Microsoft Technologies, SSIS

Using a Variable to Populate the Query in a Lookup in SSIS

I encountered a situation on my last SSIS project in which I needed to populate the query in a lookup with a where clause that referenced a project parameter. This wasn’t something I had ever needed to do in the past, so I had to do a bit of digging to figure it out. Luckily, I found this post by John Welch, which led me in the right direction. I’m posting my steps in detail here both to help anyone else trying to figure this out and so I have it for reference should I need to do this again.

The Situation

I had warranty registration data that contained addresses. Earlier in the project, we used Melissa Data to do some address cleansing and to retrieve MSA and county FIPS codes for each address. Melissa Data returns results with a confidence score that reflects how well the address matched. We determined that we needed a confidence score of .8 in order to use the data. The client asked that we make an environment variable for the required confidence score so it could be easily changed without having to re-deploy the SSIS project. Later in the project, we had some warranty registration data for which we needed to look up the MSA. But we only wanted to return the MSA if the confidence score tied to the address was at or above the required confidence score we had set in the project parameter.

I already had a project parameter for the required confidence score.

[Image: project parameter]

So I made a package-level variable to hold the query for the lookup that referenced the project parameter.  Then I set the lookup property to use that query.

The SSIS Setup

I created a variable in my package to hold the query for the SSIS lookup, which referenced the project parameter in its where clause. (The scope was the package and the data type was String.)

[Image: SSIS variable holding the lookup query]
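The variable’s EvaluateAsExpression property is set to True so the query string is rebuilt from the parameter value at runtime. As a sketch (the parameter, table, and column names here are illustrative, not the actual project’s), the expression looks something like this:

 "SELECT AddressID, MSA, CountyFIPS FROM dbo.AddressMSA WHERE ConfidenceScore >= " + (DT_WSTR, 12) @[$Project::MinConfidenceScore]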

I already had my data flow populated with the lookup for MSA. I set it to full cache and entered a query in the connection to initially populate the fields that would be returned (simply my lookup query without the where clause).

Next, I opened the data flow properties, located Expressions, and clicked the ellipsis to open the Property Expressions Editor. I found the [SqlCommand] property for my MSA lookup and set it to the package variable that contained my query.
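In the Property Expressions Editor, the lookup’s command property is paired with the variable along these lines (the lookup and variable names are illustrative):

 Property:   [LKP MSA].[SqlCommand]
 Expression: @[User::LookupQuery]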

[Image: data flow task property expressions]

Since I had BIDS Helper installed, I could see that I had an expression set for the lookup, as denoted by the pink glyph.

[Image: data flow with the expression glyph on the lookup]

And that’s all there is to it.

I could have avoided the extra variable and put the lookup query directly into the property expression, but I prefer having the variable to reference instead of digging through the properties.

BIDS Helper, Biml, Data Warehousing, Microsoft Technologies, SQL Server, SSIS

Type 6 or Hybrid Type 2 Slowly Changing Dimension with Biml

In my previous post, I provided the design pattern and Biml for a pure Type 2 Slowly Changing Dimension (SCD). When I say “pure Type 2 SCD”, I mean an ETL process that adds a new row for a change in any field in the dimension and never updates a dimension attribute without creating a new row. In practice, I tend to create more hybrid Type 2 SCDs, where updates to some attributes require a new row and others update the value on the existing rows. A similar pattern that I implement more often than a pure Type 2 is a Type 6 SCD. A Type 6 SCD builds on the Type 2 technique by adding current attributes alongside the historical attributes, so related measures can be grouped by either the historical or current dimension attribute values. The only difference between what I call a hybrid Type 2 and a Type 6 is that in a Type 6, every attribute that is updated in place also has a Type 2 counterpart in the dimension to capture the historical values.

Design Pattern

Control Flow

If you are comfortable with my design pattern for a pure Type 2 SCD in which a change of value in any column causes a new row, this pattern is quite similar. And the control flow is exactly the same. This pattern, as with my pure Type 2, assumes that rows are not deleted from the source system. You could easily modify this to check for deleted rows if needed.

[Image: Control Flow for a Hybrid Type 2 or Type 6 Dimension]

The steps in the control flow are:

  1. Log package start.
  2. Truncate the update table.
  3. Execute the data flow.
  4. Execute the update statements to update columns and insert new rows.
  5. Log package end.

The update statements are different in this pattern, and I’ll explain those in detail below.

Data Flow

The data flow looks like a pure Type 2 SCD, with the exception of an added derived column transformation and minor changes to the lookup and conditional split. Again, I use views to integrate the data, apply business logic, and add hashkeys for change detection. Then I use SSIS to perform the mechanics of loading the data.

The steps in this data flow task are:

  1. Retrieve data from my source view.
  2. Count the rows for package logging purposes.
  3. Perform a lookup to see if the entity already exists in the dimension table.
  4. If the entity doesn’t exist at all in the dimension table, it goes into the left path where I count the number of rows, add a derived column that sets the row start date to “01/01/1900 00:00:00”, and then insert the row into the dimension table.
  5. If the entity does exist in the table, I check it for changes.
  6. If there are changes to the entity, I count the number of rows, use a derived column to flag the type(s) of changes to make, and then insert the row into an update table.
  7. Entities with no changes are simply counted for audit purposes.

The Source View

This SSIS pattern requires three hashed values for change detection:

  • HistoricalHashKey: the unique identifier of the entity, the natural key that ties the historical rows together
  • ChangeHashKey: the columns on the dimension that cause a new row to be created and the current row to be expired
  • UpdateHashKey: the columns on the dimension that should be updated in place

In my example view below, the Route ID and Warehouse identify a unique route. The supervisor, route description, and route type are all important attributes of the route. The route area identifies the metro area in which a route is located. If the route area changes, there is no need to see previous values; we just want to see the current value on every row.

 CREATE VIEW [dbo].[StgDimRoute] AS
 WITH [Routebase] AS (
     SELECT [RouteID]
       , [RouteDescription]
       , [Supervisor]
       , [RouteType]
       , [Warehouse]
       , [RouteArea]
       , 1 AS RowIsCurrent
     FROM Stg.Routes R
     LEFT JOIN dbo.StgWarehouse W ON W.WarehouseID = R.WarehouseID
     UNION
     SELECT -1 AS RouteID, 'Unknown' AS RouteDescription, 'Unknown' AS Supervisor,
       'Unknown' AS RouteType, 'Unknown' AS Warehouse, 'Unknown' AS RouteArea,
       1 AS RowIsCurrent
 ),
 Routedata AS (
     SELECT RouteID, RouteDescription, Supervisor, RouteType, Warehouse,
       RouteArea, RowIsCurrent
       , CONVERT(VARBINARY(20), HASHBYTES('MD5', CONCAT([RouteID], ' ', Warehouse)))
           AS HistoricalHashKey
       , CONVERT(VARBINARY(20), HASHBYTES('MD5', CONCAT(Supervisor, [RouteDescription],
           RouteType))) AS ChangeHashKey
       , CONVERT(VARBINARY(20), HASHBYTES('MD5', CONCAT(RouteArea, ' '))) AS UpdateHashKey
     FROM Routebase sb
 )
 SELECT RouteID, RouteDescription, Supervisor, RouteType, Warehouse,
   RouteArea, RowIsCurrent
   , HistoricalHashKey, ChangeHashKey, UpdateHashKey
   , CONVERT(VARCHAR(34), HistoricalHashKey, 1) AS HistoricalHashKeyASCII
   , CONVERT(VARCHAR(34), ChangeHashKey, 1) AS ChangeHashKeyASCII
   , CONVERT(VARCHAR(34), UpdateHashKey, 1) AS UpdateHashKeyASCII
   , DATEADD(MINUTE, -1, CURRENT_TIMESTAMP) AS RowEndDate
   , CURRENT_TIMESTAMP AS RowStartDate
 FROM Routedata ds

The RowEndDate value in this view will be used for routes that require a current row to be expired, since my pattern is to leave the end date of the current row null.

The Change Detection Lookup

The lookup in my DFT compares the HistoricalHashKeyASCII column from the source view with the varchar version of the HistoricalHashKey from the dimension table and adds two lookup columns: lkp_ChangeHashKeyASCII and lkp_UpdateHashKeyASCII to the data set.

[Image: Type 6 historical hash key lookup]
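For reference, the query behind this lookup (the same one emitted by the Biml below) selects the varchar versions of the hash keys from only the current rows:

 SELECT
   CONVERT(VARCHAR(34), ChangeHashKey, 1) AS ChangeHashKeyASCII
 , CONVERT(VARCHAR(34), HistoricalHashKey, 1) AS HistoricalHashKeyASCII
 , CONVERT(VARCHAR(34), UpdateHashKey, 1) AS UpdateHashKeyASCII
 FROM dbo.DimRoute
 WHERE RowIsCurrent = 1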

Rows that do not match are new rows; i.e., that route has never been in the dimension table before.  Rows that do match have a row in the dimension table and will then be evaluated to see if there are any changes in the values for that route.

Derived Column for New Rows

The no match output of the lookup contains new rows for routes that are not in the dimension table. Since this is the first row in the table for that route, we want this row to be effective from the beginning of time until the end of time. The beginning of time in this data mart is 01/01/1900. Since the data is loaded multiple times per day, I use a date/time field rather than a date. If you only need the precision of a day, you can cut your row end date/time back to just a date. In my pattern, the current row has a null row end date, but you could easily add a derived column to set the end date to 12/31/9999 if you prefer.

[Image: derived column setting the row start date]
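The derived column simply replaces RowStartDate with the beginning-of-time value using this SSIS expression (as seen in the Biml below):

 (DT_DBTIMESTAMP)"01/01/1900 00:00:00"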

Conditional Split for Change Detection

This time, I have to check to see if both the ChangeHashKeyASCII and the UpdateHashKeyASCII match in my conditional split.

[Image: conditional split for change detection]

If both hashed values from the source view match the hashed values from the dimension table, then no changes are required and the row is simply counted for audit purposes.

If either hashed value doesn’t match, there is an update to be made.
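The expression on the ChangedRows output of the conditional split (shown in the Biml below) is:

 (ChangeHashKeyASCII != lkp_ChangeHashKeyASCII) || (UpdateHashKeyASCII != lkp_UpdateHashKeyASCII)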

Derived Column to Identify Change Types

We again compare the UpdateHashKeyASCII value from the source view with that of the dimension. If they don’t match, we set the UpdateInPlace flag to true. If the ChangeHashKeyASCII values don’t match, we set the UpdateNewRow flag to true. If a row has both types of changes, both types of updates will be made.

My update table contains the UpdateInPlace and UpdateNewRow columns, so I can reference these flags in my update statements.
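The two flag columns are set with simple SSIS expressions (as emitted by the Biml below):

 UpdateInPlace:  UpdateHashKeyASCII != lkp_UpdateHashKeyASCII ? (DT_BOOL)1 : (DT_BOOL)0
 UpdateNewRow:   ChangeHashKeyASCII != lkp_ChangeHashKeyASCII ? (DT_BOOL)1 : (DT_BOOL)0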

The Update Statements

The update statements in the control flow take the changes from the update table and apply them to the dimension table. Three statements are executed in the Execute SQL Task labeled SQL Update DimRoute.

 UPDATE A   
 SET  A.RouteArea = UA.RouteArea,   
      A.UpdateHashKey = UA.UpdateHashKey,   
      A.UpdtDtTm = CURRENT_TIMESTAMP   
 FROM  Updt.UpdtRoute AS UA   
     JOIN Dbo.DimRoute AS A   
      ON UA.HistoricalHashKey = A.HistoricalHashKey   
 WHERE UpdateInPlace = 1; 
  
UPDATE A   
 SET  RowIsCurrent = 0,   
     A.RowEndDate = UA.RowEndDate,   
     A.UpdtDtTm = CURRENT_TIMESTAMP   
 FROM  Updt.UpdtRoute UA   
     JOIN Dbo.DimRoute AS A   
      ON UA.HistoricalHashKey = A.HistoricalHashKey   
 WHERE A.RowIsCurrent = 1   
     AND UA.UpdateNewRow = 1;   

 INSERT INTO Dbo.DimRoute   
       (RouteID,   
        RouteDescription,   
        Supervisor,   
        RouteType,   
        Warehouse,   
        RouteArea,   
        RowIsCurrent,   
        RowStartDate,   
        RowEndDate,   
        HistoricalHashKey,   
        ChangeHashKey,   
        UpdateHashKey)   
 SELECT RouteID,   
        RouteDescription,   
        Supervisor,   
        RouteType,   
        Warehouse,   
        RouteArea,   
        1,   
        RowStartDate,   
        null,   
        HistoricalHashKey,   
        ChangeHashKey,   
        UpdateHashKey
FROM Updt.UpdtRoute WHERE UpdateNewRow = 1; 

The first statement updates the values for the columns designated to be updated in place by joining the update table to the dimension table based on the HistoricalHashKey column. This is the same as performing updates in a Type 1 SCD.

The second statement expires all the rows for which a new row will be added. The third statement inserts the new rows with the RowIsCurrent value set to 1 and the RowEndDate set to null.

The Biml

If you are using Biml, you know that you can create a design pattern for this type of dimension load and reuse it across multiple projects. This speeds up development and ensures that your Type 2 Hybrid or Type 6 dimensions are implemented consistently.

As usual, I have 3 Biml files that are used to create the SSIS package:

  • ProjectConnections.biml – contains all the project-level connections for the project
  • Dim2Hybrid.biml – contains the SSIS design pattern with code nuggets that parameterize it to make it reusable
  • CreateDim2Hybrid.biml – calls Dim2Hybrid.biml and passes along the values to be used for each package

I’ve pasted Dim2Hybrid and CreateDim2Hybrid below.

Dim2Hybrid.Biml

 <#@ template language="C#" tier="2" #>  
 <#@ property name="PackageName" type="String" #>  
 <#@ property name="DstSchemaName" type="String" #>  
 <#@ property name="DstTableName" type="String" #>  
 <#@ property name="DstConnection" type="String" #>  
 <#@ property name="DataFlowSourceName" type="String" #>  
 <#@ property name="SrcConnection" type="String" #>  
 <#@ property name="SourceQuery" type="String" #>     
 <#@ property name="UpdateSchemaName" type="String" #>     
 <#@ property name="UpdateTableName" type="String" #>     
 <#@ property name="UpdateConnection" type="String" #>        
 <#@ property name="UpdateSQLStatement" type="String" #>      
     <Package Name="<#=PackageName#>" Language="None">  
   <Parameters>  
     <Parameter DataType="String" Name="ParentPackageID">00000000-0000-0000-0000-000000000000</Parameter>  
   </Parameters>  
   <Variables>  
     <Variable EvaluateAsExpression="true" DataType="String" IncludeInDebugDump="Exclude" Name="QualifiedTableSchema">"[" + @[User::SchemaName] + "].[" + @[User::TableName] + "]"</Variable>  
     <Variable DataType="String" IncludeInDebugDump="Exclude" Name="QueryAuditStart">EXECUTE [dbo].[PackageControlStart] @PackageName = ?, @PackageId = ?, @ParentPackageId = ?, @ExecutionId = ?, @StartTime = ?;</Variable>  
     <Variable DataType="String" IncludeInDebugDump="Exclude" Name="QueryAuditUpdate">EXECUTE [dbo].[PackageControlStop] @PackageId = ?, @ExecutionId = ?, @InsertRowQuantity = ?, @UpdateRowQuantity = ?, @UnchangedRowQuantity=?;</Variable>  
     <Variable DataType="Int32" IncludeInDebugDump="Include" Name="RowCountChanged">0</Variable>  
     <Variable DataType="Int32" IncludeInDebugDump="Include" Name="RowCountNew">0</Variable>  
     <Variable DataType="Int32" IncludeInDebugDump="Include" Name="RowCountSource">0</Variable>  
     <Variable DataType="Int32" IncludeInDebugDump="Include" Name="RowCountUnchanged">0</Variable>   
     <Variable DataType="String" IncludeInDebugDump="Exclude" Name="SchemaName"><#=DstSchemaName#></Variable>  
     <Variable DataType="String" IncludeInDebugDump="Exclude" Name="TableName"><#=DstTableName#></Variable>  
   </Variables>  
   <Tasks>    
     <Dataflow Name="DFT_Insert<#=DstTableName#>">  
       <Transformations>  
         <RowCount Name="CNT_Changed_Rows" VariableName="User.RowCountChanged">  
           <InputPath OutputPathName="CSPL Check For Changes.ChangedRows" />  
     </RowCount>  
         <ConditionalSplit Name="CSPL Check For Changes">  
           <InputPath OutputPathName="LKP Historical Key.Match" />  
           <OutputPaths>  
             <OutputPath Name="ChangedRows">  
               <Expression>(ChangeHashKeyASCII != lkp_ChangeHashKeyASCII) || (UpdateHashKeyASCII != lkp_UpdateHashKeyASCII)</Expression>  
     </OutputPath>  
   </OutputPaths>  
     </ConditionalSplit>  
         <RowCount Name="CNT_New_Rows" VariableName="User.RowCountNew">  
           <InputPath OutputPathName="LKP Historical Key.NoMatch" />  
     </RowCount>  
       <DerivedColumns Name="Identify Change Types">  
           <InputPath SsisName="Derived Column Input" OutputPathName="CNT_Changed_Rows.Output" />  
           <Columns>  
             <Column Name="UpdateInPlace" DataType="Boolean">UpdateHashKeyASCII != lkp_UpdateHashKeyASCII ? (DT_BOOL)1 : (DT_BOOL)0</Column>  
             <Column Name="UpdateNewRow" DataType="Boolean">ChangeHashKeyASCII != lkp_ChangeHashKeyASCII ? (DT_BOOL)1 : (DT_BOOL)0</Column>  
   </Columns>  
     </DerivedColumns>  
         <Lookup Name="LKP Historical Key" NoMatchBehavior="RedirectRowsToNoMatchOutput" OleDbConnectionName="<#=DstConnection#>">  
             <DirectInput>SELECT  
   CONVERT(VARCHAR(34), ChangeHashKey, 1) AS ChangeHashKeyASCII  
 , CONVERT(VARCHAR(34), HistoricalHashKey, 1) AS HistoricalHashKeyASCII  
  ,CONVERT(VARCHAR(34), UpdateHashKey, 1) as UpdateHashkeyAscii  
 FROM  
  <#=DstSchemaName#>.<#=DstTableName#>  
     where rowiscurrent = 1  
  </DirectInput> 
           <Parameters>  
             <Parameter SourceColumn="HistoricalHashKeyASCII" />  
   </Parameters>  
            <ParameterizedQuery>select * from (SELECT  
  CONVERT(VARCHAR(34), ChangeHashKey, 1) AS ChangeHashKeyASCII  
 , CONVERT(VARCHAR(34), HistoricalHashKey, 1) AS HistoricalHashKeyASCII  
  ,CONVERT(VARCHAR(34), UpdateHashKey, 1) as UpdateHashkeyAscii  
 FROM  
  <#=DstSchemaName#>.<#=DstTableName#>  
     where rowiscurrent = 1) [refTable]  
 where [refTable].[HistoricalHashKeyASCII] = ?</ParameterizedQuery>  
            <InputPath OutputPathName="CNT_Source_Rows.Output" />  
           <Inputs>  
             <Column SourceColumn="HistoricalHashKeyASCII" TargetColumn="HistoricalHashKeyASCII" />  
   </Inputs>  
           <Outputs>  
             <Column SourceColumn="ChangeHashKeyASCII" TargetColumn="lkp_ChangeHashKeyASCII" />  
                   <Column SourceColumn="UpdateHashkeyAscii" TargetColumn="lkp_UpdateHashKeyASCII" />  
   </Outputs>  
     </Lookup>  
         <DerivedColumns Name="NewItemRowStartDate">  
        <InputPath OutputPathName="CNT_New_Rows.Output" />  
          <Columns>  
         <Column ReplaceExisting="true" Name="RowStartDate" DataType="DateTime">  
          (DT_DBTIMESTAMP)"01/01/1900 00:00:00"  
         </Column>  
        </Columns>  
       </DerivedColumns>  
            <OleDbDestination Name="OLE_DST_New_Rows" ConnectionName="<#=DstConnection#>">  
           <InputPath OutputPathName="NewItemRowStartDate.Output" />  
            <ExternalTableOutput Table="<#=DstSchemaName#>.<#=DstTableName#>" />   
                <Columns>  
                   <Column SourceColumn="RowEndDate" IsUsed="false"/>  
                </Columns>  
     </OleDbDestination>  
         <RowCount Name="CNT_Source_Rows" VariableName="User.RowCountSource">  
           <InputPath OutputPathName="<#=DataFlowSourceName#>.Output" />  
     </RowCount>  
         <OleDbSource Name="<#=DataFlowSourceName#>" ConnectionName="<#=SrcConnection#>"> 
                <DirectInput><#=SourceQuery#></DirectInput>  
     </OleDbSource>  
         <RowCount Name="CNT_Unchanged_Rows" VariableName="User.RowCountUnchanged">  
           <InputPath OutputPathName="CSPL Check For Changes.Default" />  
     </RowCount>  
         <OleDbDestination Name="OLE_DST Update Table" ConnectionName="<#=DstConnection#>">   
           <InputPath OutputPathName="Identify Change Types.Output" />  
            <ExternalTableOutput Table="[<#=UpdateSchemaName#>].[<#=UpdateTableName#>]" />  
     </OleDbDestination>  
   </Transformations>  
        <PrecedenceConstraints>  
           <Inputs>  
              <Input OutputPathName="SQL Truncate <#=UpdateTableName#>.Output" />   
           </Inputs>  
        </PrecedenceConstraints>  
     </Dataflow>  
     <ExecuteSQL Name="SQL Begin Audit" ConnectionName="Audit">  
       <VariableInput VariableName="User.QueryAuditStart" />  
       <Parameters>  
         <Parameter Name="0" VariableName="System.PackageName" DataType="String" Length="-1" />  
         <Parameter Name="1" VariableName="System.PackageID" DataType="Guid" Length="-1" />  
         <Parameter Name="2" VariableName="ParentPackageID" DataType="Guid" Length="-1" />  
         <Parameter Name="3" VariableName="System.ServerExecutionID" DataType="Int64" Length="-1" />  
         <Parameter Name="4" VariableName="System.StartTime" DataType="Date" Length="-1" />  
   </Parameters>  
     </ExecuteSQL>  
     <ExecuteSQL Name="SQL Close Audit" ConnectionName="Audit">  
       <VariableInput VariableName="User.QueryAuditUpdate" />  
       <Parameters>  
         <Parameter Name="0" VariableName="System.PackageID" DataType="Guid" Length="-1" />  
         <Parameter Name="1" VariableName="System.ServerExecutionID" DataType="Int64" Length="-1" />  
         <Parameter Name="2" VariableName="User.RowCountNew" DataType="Int32" Length="-1" />  
         <Parameter Name="3" VariableName="User.RowCountChanged" DataType="Int32" Length="-1" />  
         <Parameter Name="4" VariableName="User.RowCountUnchanged" DataType="Int32" Length="-1" />  
   </Parameters>  
        <PrecedenceConstraints>  
           <Inputs>  
              <Input OutputPathName="SQL Update <#=DstTableName#>.Output" />  
           </Inputs>  
        </PrecedenceConstraints>  
     </ExecuteSQL>    
     <ExecuteSQL Name="SQL Truncate <#=UpdateTableName#>" ConnectionName="<#=UpdateConnection#>">  
       <DirectInput>Truncate Table [<#=UpdateSchemaName#>].[<#=UpdateTableName#>]</DirectInput>  
        <PrecedenceConstraints>  
           <Inputs>  
              <Input OutputPathName="SQL Begin Audit.Output" />  
           </Inputs>  
        </PrecedenceConstraints>  
     </ExecuteSQL>  
     <ExecuteSQL Name="SQL Update <#=DstTableName#>" ConnectionName="<#=DstConnection#>">  
          <DirectInput><#=UpdateSQLStatement#></DirectInput>  
        <PrecedenceConstraints>  
           <Inputs>  
              <Input OutputPathName="DFT_Insert<#=DstTableName#>.Output" /> 
           </Inputs>  
        </PrecedenceConstraints>  
     </ExecuteSQL>  
   </Tasks>  
 </Package>  

CreateDim2Hybrid.Biml

 

 <#@ template language="C#" hostspecific="true" #>  
 <Biml xmlns="http://schemas.varigence.com/biml.xsd">  
 <#@ include file="ProjectConnections.biml" #>  
 <!--  
    <#  
    string PackageName = "LoadDimRoute";  
    string DstSchemaName = "dbo";  
   string DstTableName = "DimRoute";  
    string DstConnection = "AWBIML";     
    string DataFlowSourceName = "OLE_SRC_StgDimRoute";  
    string SrcConnection = "AWBIML";  
    string SourceQuery = @"Select RouteID, RouteDescription, Supervisor, RouteType, Warehouse,   
   RouteArea, RowIsCurrent   
  , HistoricalHashKey, ChangeHashKey, UpdateHashKey   
  , HistoricalHashKeyASCII   
  , ChangeHashKeyASCII   
  , UpdateHashKeyASCII   
  , RowEndDate  
  , RowStartDate    
  FROM [dbo].[StgDimRoute];" ;  
    string UpdateSchemaName = "Updt";     
    string UpdateTableName = "UpdtRoute";  
    string UpdateConnection = "AWBIML";     
 string UpdateSQLStatement = @" UPDATE A    
  SET A.RouteArea = UA.RouteArea,    
    A.UpdateHashKey = UA.UpdateHashKey,    
    A.UpdtDtTm = CURRENT_TIMESTAMP    
  FROM Updt.UpdtRoute AS UA    
    JOIN Dbo.DimRoute AS A    
    ON UA.HistoricalHashKey = A.HistoricalHashKey    
  WHERE UpdateInPlace = 1;   
  UPDATE A    
  SET RowIsCurrent = 0,    
    A.RowEndDate = UA.RowEndDate,    
    A.UpdtDtTm = CURRENT_TIMESTAMP    
  FROM Updt.UpdtRoute UA    
    JOIN Dbo.DimRoute AS A    
    ON UA.HistoricalHashKey = A.HistoricalHashKey    
  WHERE A.RowIsCurrent = 1    
    AND UA.UpdateNewRow = 1;    
  INSERT INTO Dbo.DimRoute    
     (RouteID,    
     RouteDescription,    
     Supervisor,    
     RouteType,    
     Warehouse,    
     RouteArea,    
     RowIsCurrent,    
     RowStartDate,    
     RowEndDate,    
     HistoricalHashKey,    
     ChangeHashKey,    
     UpdateHashKey)    
  SELECT RouteID,    
     RouteDescription,    
     Supervisor,    
     RouteType,    
     Warehouse,    
     RouteArea,    
     1,    
     RowStartDate,    
     null,    
     HistoricalHashKey,    
     ChangeHashKey,    
     UpdateHashKey  
  FROM Updt.UpdtRoute WHERE UpdateNewRow = 1;" ;     
 #>  
 -->  
   <Packages>  
     <#=CallBimlScript("Dim2Hybrid.biml", PackageName, DstSchemaName, DstTableName, DstConnection, DataFlowSourceName, SrcConnection, SourceQuery, UpdateSchemaName, UpdateTableName, UpdateConnection, UpdateSQLStatement)#>  
   </Packages>  
 </Biml>  

Once I have my source view, dimension table, and update table in the database, the 3 Biml files added to my project, and BIDS Helper installed, all I have to do is right-click the CreateDim2Hybrid.biml file and choose Generate SSIS Packages to create my package.

BIDS Helper, Biml, Microsoft Technologies, SSIS

Demystifying the Type 2 Slowly Changing Dimension with Biml

Most data warehouses have at least a couple of Type 2 Slowly Changing Dimensions. We use them to keep history so we can see what an entity looked like at the time an event occurred. From an ETL standpoint, I think Type 2 SCDs are the most commonly over-complicated and under-optimized design pattern I encounter.

There is a Slowly Changing Dimension Transformation built into SSIS, but most people recommend against using it because it isn’t very efficient. I think many of the people who do use it do so simply because they feel it’s easier than digging in and understanding the operations needed to roll your own Type 2 SCD processing.

The most common mistake I see in SCD 2 packages, whether using the built-in transformation or creating your own data flow, is using OLE DB Commands to perform updates one row at a time rather than writing updates to a staging table and performing a set-based update on all rows. If your dimension is small, the performance of row-by-row updates may be acceptable, but the overhead associated with using a staging table and performing a set-based update will probably be negligible. So why not keep a consistent pattern for all Type 2 dimensions and require no changes if the dimension grows?

I am here to encourage you: don’t be scared of slowly changing dimensions. Once you learn the design pattern, you will find they aren’t that difficult to understand. And once you have the Biml, creating them is fairly painless.

My Design Pattern

BimlScript allows me to create a reusable design pattern for SSIS that I can employ for each Type 2 dimension that I create. Despite the need to keep history, my Type 2 SCD doesn’t look that much different from my Type 1 SCD. In fact the control flow is exactly the same.

I start with an audit step to log the beginning of package execution. Then I truncate the staging table used to contain my updates. Next I have the Data Flow Task to insert new rows into the dimension table and updates into my update table. Then I run an Execute SQL Task to perform the updates. Finally I log the end of package execution.

The Data Flow Task looks a lot like a Type 1 SCD until the end, where I have a Derived Column Transformation before I insert new rows into my dimension table.

As with my Type 1 pattern, the combination and transformation of the data (the business logic) is performed in a view in SQL Server, and the mechanics of capturing the history are performed in SSIS.

The steps in this data flow task are:

  1. Retrieve data from my source view.
  2. Count the rows for package logging purposes.
  3. Perform a lookup to see if the entity already exists in the dimension table (more info on that below).
  4. If the entity doesn’t exist at all in the dimension table, it goes into the left path where I count the number of rows, add a derived column that sets the row start date to “01/01/1900 00:00:00”, and then insert the row into the dimension table.
  5. If the entity does exist in the table, I check it for changes.
  6. If there are changes to the entity, I count the number of rows and then insert the row into an update table.
  7. Entities with no changes are simply counted for audit purposes.

The Details

Source View with Hashkeys

I’m using product data from the AdventureWorks database for an example.  I create a view based upon my staging table that holds the data from the source system. In addition to providing the data values from the source data, my view does the following:

  • Adds in the unknown row
  • Adds the hashkeys for change detection
  • Sets the row start date
 CREATE VIEW [dbo].[StgProduct2]  
      AS   
with Productbase as (     
 SELECT [productid],   
     [name],   
     [productnumber],   
     [makeflag],   
     [finishedgoodsflag],   
     [color],   
     [safetystocklevel],   
     [reorderpoint],   
     [standardcost],   
     [listprice],   
     [size],   
     [sizeunitmeasurecode],   
     [weightunitmeasurecode],   
     [weight],   
     [daystomanufacture],   
     [productline],   
     [class],   
     [style],   
     [sellstartdate],   
     [sellenddate],   
     [discontinueddate],   
           1 as RowIsCurrent,   
           0 as isDeleted  
  FROM  [Staging].[Product]  
 UNION   
  Select -1 as productid, 'unknown' as name, null as productnumber, null as makeflag, 
   null as finishedgoodsflag, null as color, null as safetystocklevel,   
   null as reorderpoint, null as standardcost, null as listprice, null as size, 
   null as sizeunitmeasurecode, null as weightunitmeasurecode, null as weight,   
   null as daystomanufacture, null as productline, null as class, null as style, 
   null as sellstartdate, null as sellenddate, null as discontinueddate,   
   1 as RowIsCurrent, 0 as isDeleted          
        ) ,  
productdata as   
 (  
 Select [productid], [name], [productnumber], [makeflag], [finishedgoodsflag], [color], 
 [safetystocklevel], [reorderpoint], [standardcost], [listprice], [size], 
 [sizeunitmeasurecode], [weightunitmeasurecode], [weight], [daystomanufacture], 
 [productline], [class], [style], [sellstartdate], [sellenddate], [discontinueddate], 
 rowiscurrent, isdeleted  
  , CONVERT(VARBINARY(20), HASHBYTES('MD5', CONCAT([ProductID], ' '))) AS HistoricalHashKey  
  , CONVERT(VARBINARY(20), HASHBYTES('MD5', CONCAT([name], [productnumber], [makeflag], 
 [finishedgoodsflag], [color], [safetystocklevel], [reorderpoint], [standardcost], [listprice], 
 [size], [sizeunitmeasurecode], [weightunitmeasurecode], [weight], [daystomanufacture], 
 [productline], [class], [style], [sellstartdate], [sellenddate], [discontinueddate],
 rowiscurrent, isDeleted ))) AS ChangeHashKey  
  from Productbase sb  
  )   
Select [productid], [name], [productnumber], [makeflag], [finishedgoodsflag], [color], 
 [safetystocklevel], [reorderpoint], [standardcost], [listprice], [size], [sizeunitmeasurecode],  
 [weightunitmeasurecode], [weight], [daystomanufacture], [productline], [class], [style],  
 [sellstartdate], [sellenddate], [discontinueddate], rowiscurrent, isdeleted  
  ,HistoricalHashKey, ChangeHashKey  
  , CONVERT(VARCHAR(34), HistoricalHashKey, 1) AS HistoricalHashKeyASCII  
  , CONVERT(VARCHAR(34), ChangeHashKey, 1) AS ChangeHashKeyASCII  
  , dateadd(MINUTE, -1, current_timestamp) as rowenddate, CURRENT_TIMESTAMP as rowstartdate   
  from productdata ds  

The Change Detection Lookup

The lookup in my DFT compares the HistoricalHashKeyASCII column from the source view with the varchar version of the HistoricalHashKey from the dimension table and adds the lookup column lkp_ChangeHashKeyASCII to the data set.

 

[Image: historical hash key lookup]
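For reference, the query behind this lookup (the same one emitted by the Biml below) is:

 SELECT
   CONVERT(VARCHAR(34), ChangeHashKey, 1) AS ChangeHashKeyASCII
 , CONVERT(VARCHAR(34), HistoricalHashKey, 1) AS HistoricalHashKeyASCII
 FROM dbo.Product2
 WHERE RowIsCurrent = 1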

Rows that do not match are new rows; i.e., that product has never been in the dimension table before.  Rows that do match have a row in the dimension table and will then be evaluated to see if there are any changes in the values for that product.

Derived Column for New Rows

The no match output of the lookup contains new rows for products that are not in the dimension table. Since this is the first row in the table for that product, we want this row to be effective from the beginning of time until the end of time. The beginning of time in the data warehouse is 01/01/1900. Since I often load the data multiple times per day, I use a date/time field rather than a date. If you only need the precision of a day, you can cut your row end date/time back to just a date. In my pattern, the current row has a null row end date, but you could easily add a derived column to set the end date to 12/31/9999 if you prefer.

[Image: derived column setting the row start date]

Conditional Split to Check Existing Products For Changes

For existing products, we only want to update the products for which some attribute has changed. We check for changes using a conditional split. When we performed the lookup against the HistoricalHashKeyASCII value, we added the lookup column lkp_ChangeHashKeyASCII from the dimension table.

The match output from the lookup flows into the conditional split. Then we check to see if there is a difference between the change hash from the source view and the change hash in the dimension table that we have added to our data set.

[Image: conditional split to detect changes]

If the change hash values are equal, the row hasn’t changed and we can pass it on to get the count of unchanged rows. If the change hash values are not equal, we know some column has a value to update and we can pass the row on to the update staging table.
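The expression on the ChangedRows output (as emitted by the Biml below) is simply:

 ChangeHashKeyASCII != lkp_ChangeHashKeyASCII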

Update Commands

After the data flow task completes, there is an Execute SQL Task to update the changed rows.  There are two commands in the task:

  1. Set the row end date and row is current flag on all existing rows in the dimension for which there is an update.
  2. Insert the new rows for the changed products.
 UPDATE A   
  SET  RowisCurrent = 0,   
    A.rowenddate = UA.rowenddate,   
    A.updatedttm = CURRENT_TIMESTAMP   
  FROM  [Updt].UpdtProduct2 AS UA   
    JOIN dbo.Product2 AS A   
    ON UA.historicalhashkey = A.historicalhashkey   
  WHERE A.rowiscurrent = 1;   
  
INSERT INTO dbo.Product2   
    ([productid]  
     ,[name]  
     ,[productnumber]  
     ,[makeflag]  
     ,[finishedgoodsflag]  
     ,[color]  
     ,[safetystocklevel]  
     ,[reorderpoint]  
     ,[standardcost]  
     ,[listprice]  
     ,[size]  
     ,[sizeunitmeasurecode]  
     ,[weightunitmeasurecode]  
     ,[weight]  
     ,[daystomanufacture]  
     ,[productline]  
     ,[class]  
     ,[style]  
     ,[sellstartdate]  
     ,[sellenddate]  
     ,[discontinueddate]  
     ,[rowiscurrent]  
     ,[isdeleted]  
     ,[HistoricalHashKey]  
     ,[ChangeHashKey]  
     ,[rowenddate]  
     ,[rowstartdate])   
  SELECT [productid]  
     ,[name]  
     ,[productnumber]  
     ,[makeflag]  
     ,[finishedgoodsflag]  
     ,[color]  
     ,[safetystocklevel]  
     ,[reorderpoint]  
     ,[standardcost]  
     ,[listprice]  
     ,[size]  
     ,[sizeunitmeasurecode]  
     ,[weightunitmeasurecode]  
     ,[weight]  
     ,[daystomanufacture]  
     ,[productline]  
     ,[class]  
     ,[style]  
     ,[sellstartdate]  
     ,[sellenddate]  
     ,[discontinueddate]  
     ,[rowiscurrent]  
     ,[isdeleted]  
     ,[HistoricalHashKey]  
     ,[ChangeHashKey]  
     ,null  
     ,[rowstartdate]   
  FROM  updt.updtProduct2;  

The BimlScript

I have a Dim2.Biml file that just contains the design pattern with code blocks that allow me to pass parameters to it for the particular dimension I’m making. I also have a Biml file that contains the project level connection managers. Then I have a Biml file that obtains the package specific parameters and passes them to the Dim2.Biml file.

ProjectConnections.biml

 <#@ template language="C#" tier="1" #>       
 <Connections>  
  <OleDbConnection Name="AWBIML" ConnectionString ="Data Source=localhost\SQL2014;  
   Initial Catalog=AWBIML;Integrated Security=SSPI;Provider=SQLNCLI11.1;" CreateInProject="true"/>  
  <OleDbConnection Name="Audit" ConnectionString ="Data Source=localhost\SQL2014;  
   Initial Catalog=AWBIML;Integrated Security=SSPI;Provider=SQLNCLI11.1;" CreateInProject="true"/>       
 </Connections>  

The ProjectConnections file doesn’t begin with Biml tags because it gets included in the middle of the CreateDim2Packages.biml file. AWBIML is a local database I created to contain my data mart.

Dim2.biml

This is the file that contains my design pattern.

 <#@ template language="C#" tier="2" #>  
   
 <#@ property name="PackageName" type="String" #>  
 <#@ property name="SourceConnection" type="String" #>  
 <#@ property name="DestinationConnection" type="String" #>  
 <#@ property name="DestinationSchemaName" type="String" #>  
 <#@ property name="DestinationTableName" type="String" #>  
 <#@ property name="DestinationTable" type="String" #>  
 <#@ property name="DataFlowName" type="String" #>  
 <#@ property name="DataFlowSourceName" type="String" #>  
 <#@ property name="DataFlowQuery" type="String" #>       
 <#@ property name="QueryOutputPathName" type="String" #>  
 <#@ property name="DestinationName" type="String" #>  
 <#@ property name="UpdateSchemaName" type="String" #>       
 <#@ property name="UpdateTableName" type="String" #>       
 <#@ property name="UpdateConnection" type="String" #>            
 <#@ property name="UpdateSQLStatement" type="String" #>       
        
        
   <Package Name="<#=PackageName#>" Language="None">  
   <Parameters>  
     <Parameter DataType="String" Name="ParentPackageID">00000000-0000-0000-0000-000000000000</Parameter>  
   </Parameters>  
   <Variables>  
     <Variable EvaluateAsExpression="true" DataType="String" IncludeInDebugDump="Exclude" Name="QualifiedTableSchema">"[" + @[User::SchemaName] + "].[" + @[User::TableName] + "]"</Variable>  
     <Variable DataType="String" IncludeInDebugDump="Exclude" Name="QueryAuditStart">EXECUTE [Audit].[PackageControlStart] @PackageName = ?, @PackageId = ?, @ParentPackageId = ?, @ExecutionId = ?, @StartTime = ?;</Variable>  
     <Variable DataType="String" IncludeInDebugDump="Exclude" Name="QueryAuditUpdate">EXECUTE [Audit].[PackageControlStop] @PackageId = ?, @ExecutionId = ?, @InsertRowQuantity = ?, @UpdateRowQuantity = ?, @UnchangedRowQuantity=?;</Variable>  
     <Variable DataType="Int32" IncludeInDebugDump="Include" Name="RowCountChanged">0</Variable>  
     <Variable DataType="Int32" IncludeInDebugDump="Include" Name="RowCountNew">0</Variable>  
     <Variable DataType="Int32" IncludeInDebugDump="Include" Name="RowCountSource">0</Variable>  
     <Variable DataType="Int32" IncludeInDebugDump="Include" Name="RowCountUnchanged">0</Variable>  
             
     <Variable DataType="String" IncludeInDebugDump="Exclude" Name="SchemaName"><#=DestinationSchemaName#></Variable>  
             
     <Variable DataType="String" IncludeInDebugDump="Exclude" Name="TableName"><#=DestinationTableName#></Variable>  
   </Variables>  
   <Tasks>  
             
     <Dataflow Name="DFT_Insert<#=DestinationTableName#>">  
       <Transformations>  
         <RowCount Name="CNT_Changed_Rows" VariableName="User.RowCountChanged">  
           <InputPath OutputPathName="CSPL Check For Changes.ChangedRows" />  
     </RowCount>  
         <ConditionalSplit Name="CSPL Check For Changes">  
           <InputPath OutputPathName="LKP Historical Key.Match" />  
           <OutputPaths>  
             <OutputPath Name="ChangedRows">  
               <Expression>ChangeHashKeyASCII != lkp_ChangeHashKeyASCII</Expression>  
     </OutputPath>  
   </OutputPaths>  
     </ConditionalSplit>  
         <RowCount Name="CNT_New_Rows" VariableName="User.RowCountNew">  
           <InputPath OutputPathName="LKP Historical Key.NoMatch" />  
     </RowCount>  
         <Lookup Name="LKP Historical Key" NoMatchBehavior="RedirectRowsToNoMatchOutput" OleDbConnectionName="<#=DestinationConnection#>">  
               <DirectInput>SELECT  
  CONVERT(VARCHAR(34), ChangeHashKey, 1) AS ChangeHashKeyASCII  
 , CONVERT(VARCHAR(34), HistoricalHashKey, 1) AS HistoricalHashKeyASCII  
 FROM  
  <#=DestinationSchemaName#>.<#=DestinationTableName#>  
  where RowIsCurrent = 1  
  </DirectInput>  
           <Parameters>  
             <Parameter SourceColumn="HistoricalHashKeyASCII" />  
   </Parameters>  
              <ParameterizedQuery>select * from (SELECT  
  CONVERT(VARCHAR(34), ChangeHashKey, 1) AS ChangeHashKeyASCII  
 , CONVERT(VARCHAR(34), HistoricalHashKey, 1) AS HistoricalHashKeyASCII  
 FROM  
 <#=DestinationSchemaName#>.<#=DestinationTableName#> where RowIsCurrent = 1) [refTable]  
 where [refTable].[HistoricalHashKeyASCII] = ?</ParameterizedQuery>  
              <InputPath OutputPathName="CNT_Source_Rows.Output" />  
           <Inputs>  
             <Column SourceColumn="HistoricalHashKeyASCII" TargetColumn="HistoricalHashKeyASCII" />  
   </Inputs>  
           <Outputs>  
             <Column SourceColumn="ChangeHashKeyASCII" TargetColumn="lkp_ChangeHashKeyASCII" />  
   </Outputs>  
     </Lookup>  
                      <DerivedColumns Name="NewItemRowStartDate">  
        <InputPath OutputPathName="CNT_New_Rows.Output" />  
                <Columns>  
         <Column ReplaceExisting="true" Name="RowStartDate" DataType="DateTime">  
          (DT_DBTIMESTAMP)"01/01/1900 00:00:00"  
         </Column>  
        </Columns>  
       </DerivedColumns>  
         <OleDbDestination Name="OLE_DST_New_Rows" ConnectionName="<#=DestinationConnection#>">  
           <InputPath OutputPathName="NewItemRowStartDate.Output" />  
              <ExternalTableOutput Table="<#=DestinationSchemaName#>.<#=DestinationTableName#>" />   
                               <Columns>  
                               <Column SourceColumn="RowEndDate" IsUsed="false"/>  
                          </Columns>  
     </OleDbDestination>  
         <RowCount Name="CNT_Source_Rows" VariableName="User.RowCountSource">  
           <InputPath OutputPathName="<#=DataFlowSourceName#>.Output" />  
     </RowCount>  
         <OleDbSource Name="<#=DataFlowSourceName#>" ConnectionName="<#=SourceConnection#>">        
                          <DirectInput><#=DataFlowQuery#></DirectInput>  
     </OleDbSource>  
         <RowCount Name="CNT_Unchanged_Rows" VariableName="User.RowCountUnchanged">  
           <InputPath OutputPathName="CSPL Check For Changes.Default" />  
     </RowCount>  
         <OleDbDestination Name="OLE_DST Update Table" ConnectionName="<#=DestinationConnection#>">  
           <InputPath OutputPathName="CNT_Changed_Rows.Output" />  
              <ExternalTableOutput Table="[<#=UpdateSchemaName#>].[<#=UpdateTableName#>]" />  
     </OleDbDestination>  
   </Transformations>  
          <PrecedenceConstraints>  
               <Inputs>  
                    <Input OutputPathName="SQL Truncate <#=UpdateTableName#>.Output" />   
               </Inputs>  
          </PrecedenceConstraints>  
     </Dataflow>  
     <ExecuteSQL Name="SQL Begin Audit" ConnectionName="Audit">  
       <VariableInput VariableName="User.QueryAuditStart" />  
       <Parameters>  
         <Parameter Name="0" VariableName="System.PackageName" DataType="String" Length="-1" />  
         <Parameter Name="1" VariableName="System.PackageID" DataType="Guid" Length="-1" />  
         <Parameter Name="2" VariableName="ParentPackageID" DataType="Guid" Length="-1" />  
         <Parameter Name="3" VariableName="System.ServerExecutionID" DataType="Int64" Length="-1" />  
         <Parameter Name="4" VariableName="System.StartTime" DataType="Date" Length="-1" />  
   </Parameters>  
     </ExecuteSQL>  
     <ExecuteSQL Name="SQL Close Audit" ConnectionName="Audit">  
       <VariableInput VariableName="User.QueryAuditUpdate" />  
       <Parameters>  
         <Parameter Name="0" VariableName="System.PackageID" DataType="Guid" Length="-1" />  
         <Parameter Name="1" VariableName="System.ServerExecutionID" DataType="Int64" Length="-1" />  
         <Parameter Name="2" VariableName="User.RowCountNew" DataType="Int32" Length="-1" />  
         <Parameter Name="3" VariableName="User.RowCountChanged" DataType="Int32" Length="-1" />  
         <Parameter Name="4" VariableName="User.RowCountUnchanged" DataType="Int32" Length="-1" />  
   </Parameters>  
          <PrecedenceConstraints>  
               <Inputs>  
                    <Input OutputPathName="SQL Update <#=DestinationTableName#>.Output" />   
               </Inputs>  
          </PrecedenceConstraints>  
     </ExecuteSQL>  
     <ExecuteSQL Name="SQL Truncate <#=UpdateTableName#>" ConnectionName="<#=UpdateConnection#>">   
       <DirectInput>Truncate Table [<#=UpdateSchemaName#>].[<#=UpdateTableName#>]</DirectInput>  
          <PrecedenceConstraints>  
               <Inputs>  
                    <Input OutputPathName="SQL Begin Audit.Output" />  
               </Inputs>  
          </PrecedenceConstraints>  
     </ExecuteSQL>  
   
     <ExecuteSQL Name="SQL Update <#=DestinationTableName#>" ConnectionName="<#=DestinationConnection#>">  
                <DirectInput><#=UpdateSQLStatement#></DirectInput>  
          <PrecedenceConstraints>  
               <Inputs>  
                    <Input OutputPathName="DFT_Insert<#=DestinationTableName#>.Output" />   
               </Inputs>  
          </PrecedenceConstraints>  
     </ExecuteSQL>  
   </Tasks>  
 </Package>  
   

CreateDim2Packages.Biml

This file simply feeds values into the code nuggets in the Dim2.biml file. In practice, I store these values in a table and change this file to pull the values from a table so when I generate SSIS packages from this file, all my type 2 dimension packages are generated rather than just one.

 <#@ template language="C#" hostspecific="true" #>  
 <#@ import namespace="System.Data" #>  
 <#@ import namespace="System.Data.SqlClient" #>  
 <#@ import namespace="System.IO" #>  
   
 <Biml xmlns="http://schemas.varigence.com/biml.xsd">  
             
 <!--  
      <#  
      string PackageName = "LoadDimProduct2";  
      string SourceConnection = "AWBIML";  
      string DestinationConnection = "AWBIML";       
      string DestinationSchemaName = "dbo";  
      string DestinationTableName = "Product2";  
      string DestinationTable = "dbo.Product2";  
      string DataFlowName = "DFT_Product2";  
      string DataFlowSourceName = "OLE_SRC_StgProduct";  
      string QueryOutputPathName = "OLE_SRC_StgProduct.Output";  
      string DestinationName = "OLE_DST_Product2";  
      string UpdateSchemaName = "Updt";  
      string UpdateTableName = "UpdtProduct2";  
      string UpdateConnection = "AWBIML";  
      string DataFlowQuery = @"SELECT [productid]  
                                     ,[name]  
                                     ,[productnumber]  
                                     ,[makeflag]  
                                     ,[finishedgoodsflag]  
                                     ,[color]  
                                     ,[safetystocklevel]  
                                     ,[reorderpoint]  
                                     ,[standardcost]  
                                     ,[listprice]  
                                     ,[size]  
                                     ,[sizeunitmeasurecode]  
                                     ,[weightunitmeasurecode]  
                                     ,[weight]  
                                     ,[daystomanufacture]  
                                     ,[productline]  
                                     ,[class]  
                                     ,[style]  
                                     ,[sellstartdate]  
                                     ,[sellenddate]  
                                     ,[discontinueddate]  
                                     ,[rowiscurrent]  
                                     ,[isdeleted]  
                                     ,[HistoricalHashKey]  
                                     ,[ChangeHashKey]  
                                     ,[HistoricalHashKeyASCII]  
                                     ,[ChangeHashKeyASCII]  
                                     ,[rowenddate]  
                                     ,[rowstartdate]  
                                FROM [dbo].[StgProduct2];" ;  
           string UpdateSQLStatement = @"UPDATE A   
           SET  RowisCurrent = 0,   
                  A.rowenddate = UA.rowenddate,   
                  A.updatedttm = CURRENT_TIMESTAMP   
           FROM  [Updt].UpdtProduct2 AS UA   
                  JOIN dbo.Product2 AS A   
                      ON UA.historicalhashkey = A.historicalhashkey   
           WHERE A.rowiscurrent = 1;   
   
           INSERT INTO dbo.Product2   
                          ([productid]  
                       ,[name]  
                       ,[productnumber]  
                       ,[makeflag]  
                       ,[finishedgoodsflag]  
                       ,[color]  
                       ,[safetystocklevel]  
                       ,[reorderpoint]  
                       ,[standardcost]  
                       ,[listprice]  
                       ,[size]  
                       ,[sizeunitmeasurecode]  
                       ,[weightunitmeasurecode]  
                       ,[weight]  
                       ,[daystomanufacture]  
                       ,[productline]  
                       ,[class]  
                       ,[style]  
                       ,[sellstartdate]  
                       ,[sellenddate]  
                       ,[discontinueddate]  
                       ,[rowiscurrent]  
                       ,[isdeleted]  
                       ,[HistoricalHashKey]  
                       ,[ChangeHashKey]  
                       ,[rowenddate]  
                       ,[rowstartdate])   
           SELECT          [productid]  
                       ,[name]  
                       ,[productnumber]  
                       ,[makeflag]  
                       ,[finishedgoodsflag]  
                       ,[color]  
                       ,[safetystocklevel]  
                       ,[reorderpoint]  
                       ,[standardcost]  
                       ,[listprice]  
                       ,[size]  
                       ,[sizeunitmeasurecode]  
                       ,[weightunitmeasurecode]  
                       ,[weight]  
                       ,[daystomanufacture]  
                       ,[productline]  
                       ,[class]  
                       ,[style]  
                       ,[sellstartdate]  
                       ,[sellenddate]  
                       ,[discontinueddate]  
                       ,[rowiscurrent]  
                       ,[isdeleted]  
                       ,[HistoricalHashKey]  
                       ,[ChangeHashKey]  
                       ,null  
                       ,[rowstartdate]   
           FROM  updt.updtProduct2;" ;  
        
      #>  
      -->  
            <#@ include file="ProjectConnections.biml" #>  
                            
                     <Packages>  
                          <#=CallBimlScript("Dim2.biml", PackageName, SourceConnection, DestinationConnection, DestinationSchemaName, DestinationTableName, DestinationTable, DataFlowName, DataFlowSourceName, DataFlowQuery, QueryOutputPathName, DestinationName, UpdateSchemaName, UpdateTableName, UpdateConnection, UpdateSQLStatement)#>  
                     </Packages>  
   
 </Biml>  

Get the Biml

You can download the Biml and SQL files necessary to use this pattern here.  This zip file contains:

  • 3 Biml files (ProjectConnections, Dim2, CreateDim2Packages)
  • 1 Word doc with a brief explanation of the files and design pattern
  • 8 SQL files to make the schemas, tables, views, and stored procedures for the product dimension.

This pattern is for a pure Type 2 SCD. I’ll post again soon with a pattern that handles hybrid Type 2 and Type 6 SCDs.

For more info on Biml, see my other Biml posts or check out BimlScript.com.

BIDS Helper, Biml, Data Warehousing, Microsoft Technologies, SSIS, T-SQL

Biml for a Type 1 Slowly Changing Dimension

I’ve been working on building my Biml library over the last few months. One of the first design patterns I created was a Type 1 Slowly Changing Dimension where all fields except the key fields that define the level of granularity are overwritten with updated values.  It assumes I have a staging table, but it could probably be easily modified to pull directly from the source table if needed.

For simplicity, I’m going to create DimSalesReason from the AdventureWorks database as my example. Please pardon the large amount of code in this post. I wanted to provide a fairly complete picture of how the pieces fit together, so I’m providing all of the SQL and Biml needed to generate the package.

Required Database Objects

My staging table is a copy of the source table and looks like this:

CREATE TABLE [Staging].[SalesReason](
    [SalesReasonID] [int] NOT NULL,
    [Name] [nvarchar](50) NOT NULL,
    [ReasonType] [nvarchar](50) NOT NULL,
    [ModifiedDate] [datetime] NOT NULL
)

Here is the DDL for my dimension table:

CREATE TABLE [dbo].[DimSalesReason](
    [SalesReasonKey] int IDENTITY(1,1) NOT NULL Primary Key
    ,[SalesReasonID] int not null
    ,[SalesReasonName] nvarchar(50) not null
    ,[SalesReasonType] nvarchar(50) not null
    ,[HistoricalHashKey] varbinary(20) NOT NULL
    ,[ChangeHashKey] varbinary(20) NOT NULL
    ,[InsertDttm] datetime not null
    ,[UpdateDttm] datetime NULL
)

You’ll notice I have some audit fields in my table. UpdateDttm is the datetime when the row was last updated. InsertDttm is the datetime when the row was initially inserted. The HistoricalHashKey and ChangeHashKey are used for change detection. I may not need them in a dimension this simple, but I use them in larger/wider dimensions and I like my dimensions to be built consistently (unless they need to be tweaked for performance). The HistoricalHashKey represents the business key of the table that defines the level of uniqueness for the row in the dimension table. The ChangeHashKey represents all the other fields that are used in the dimension row. This makes it easier to determine if values have changed since I can compare just the ChangeHashKey instead of each individual field.

I also have an update table. This allows me to update in bulk rather than updating in place row by row. With a very small table, you might not notice a performance difference, but this pattern becomes much more efficient as the table grows.

CREATE TABLE [Updt].[UpdtSalesReason](
	 [SalesReasonID] int not null
	,[SalesReasonName] nvarchar(50) not null
	,[SalesReasonType] nvarchar(50) not null
	,[HistoricalHashKey] varbinary(20) NOT NULL
	,[ChangeHashKey] varbinary(20) NOT NULL
	,[InsertDttm] datetime not null
)
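This update table is what makes the bulk update possible: the package’s Execute SQL Task can apply every pending change in one set-based statement instead of one OLE DB Command per row. A minimal sketch, using the tables defined above (the actual statement is supplied to the package, so treat this as illustrative):

UPDATE A
SET  A.SalesReasonName = UA.SalesReasonName
    ,A.SalesReasonType = UA.SalesReasonType
    ,A.ChangeHashKey = UA.ChangeHashKey
    ,A.UpdateDttm = CURRENT_TIMESTAMP
FROM Updt.UpdtSalesReason AS UA
    JOIN dbo.DimSalesReason AS A
    ON UA.HistoricalHashKey = A.HistoricalHashKey;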

I use a view to do most of the transformation work (joins, business logic, etc.) for my Type 1 SCD package. Here’s my view for the SalesReasonDimension:

Create View [Staging].[DimSalesReason] as
With SalesReasonData as (
	Select	 [SalesReasonID] 
		,[Name] as [SalesReasonName] 
		,[ReasonType] as [SalesReasonType] 
		,CONVERT(VARBINARY(20), HASHBYTES('MD5', CONCAT(SalesReasonID, ' '))) 
                                                               AS [HistoricalHashKey]
		,CONVERT(VARBINARY(20), HASHBYTES('MD5', CONCAT(Name, ' ', ReasonType))) 
                                                                   AS [ChangeHashKey]
		,CURRENT_TIMESTAMP as InsertDttm
		,CURRENT_TIMESTAMP as UpdtDttm
		from 
		[Staging].[SalesReason]
	UNION 
	Select	-1 as [SalesReasonID]
	   , 'Unknown' as [SalesReasonName]
	   , 'Unknown' as [SalesReasonType]
	   ,CONVERT(VARBINARY(20), HASHBYTES('MD5', CONCAT(-1, ' '))) 
                                                               AS [HistoricalHashKey]
	   ,CONVERT(VARBINARY(20), HASHBYTES('MD5', CONCAT('Unknown', ' ', 'Unknown')))
                                                               AS [ChangeHashKey]
	   ,CURRENT_TIMESTAMP as InsertDttm
           ,CURRENT_TIMESTAMP as UpdtDttm
)
Select SalesReasonID
	   , SalesReasonName
	   , SalesReasonType
	   , HistoricalHashKey
	   , ChangeHashKey
	   , CONVERT(VARCHAR(34), HistoricalHashKey, 1) AS HistoricalHashKeyASCII
	   , CONVERT(VARCHAR(34), ChangeHashKey, 1) AS ChangeHashKeyASCII 
	   , InsertDttm
	   , UpdtDttm
from SalesReasonData

Using a view to do the transformation work allows me to use the SQL Server engine to do the things it is good at (joins, case statements, conversions) and SSIS to do the things it is good at (controlling the flow of data). It also makes it easy to create a design pattern in Biml that is abstract enough that it easily fits most SCD 1 scenarios. The hashkey fields allow me to do a lookup on one field to determine if the row already exists in the dimension table and a quick comparison to determine if any of the values for that row have changed. I chose to use an MD5 hash because I think it is a good balance of speed/size and collision risk. You may feel differently (others prefer SHA1 due to lower collision risk). On a dimension this size, the difference probably isn’t noticeable.

You can also see that I add my unknown member row in my view. Some people prefer to set the surrogate key of the unknown value to -1. I prefer to set the business key to -1 and let the surrogate key be set to any value in the load process. I’m not a fan of having to turn off the identity insert to add the unknown row. When I do the dimension key lookup for fact tables, I just look for the row where the business key = -1. This also means I don’t have to check my dimension tables to see if someone remembered to insert the unknown rows after deployment to a new environment because I know the unknown rows will be inserted when the package is run.
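
To sketch what that looks like during a fact load (the surrogate key name SalesReasonKey and the staging table Staging.FactSource with its columns are hypothetical here, just for illustration; in practice this is done with a lookup transformation):

-- Hypothetical sketch of a fact load defaulting to the unknown member.
SELECT	 f.SalesOrderID
	,ISNULL(d.SalesReasonKey, unk.SalesReasonKey) as SalesReasonKey
FROM Staging.FactSource f
LEFT JOIN dbo.DimSalesReason d
	ON d.SalesReasonID = f.SalesReasonID
CROSS JOIN (SELECT SalesReasonKey FROM dbo.DimSalesReason WHERE SalesReasonID = -1) unk;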

I have audit steps in my package that write to the Audit.Package_Control table using stored procedures.

CREATE TABLE [Audit].[Package_Control](
	[Package_NM] [varchar](100) NOT NULL,
	[Package_ID] [uniqueidentifier] NOT NULL,
	[Parent_Package_ID] [uniqueidentifier] NULL,
	[Execution_ID] [bigint] NULL,
	[Start_TS] [datetime] NOT NULL,
	[Stop_TS] [datetime] NULL,
	[Insert_Row_QT] [int] NULL,
	[Update_Row_QT] [int] NULL,
	[Unchanged_Row_QT] [int] NULL,
	[Deleted_Row_QT] [int] NULL,
	[Duration_s]  AS (datediff(second,[Start_TS],[Stop_TS])),
	[PackageLogID] [int] IDENTITY(1,1) NOT NULL
)
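
The computed Duration_s column makes run-history queries simple. For example, to see the most recent executions (illustrative):

-- Most recent package executions with row counts and durations.
SELECT TOP (10)
	 Package_NM
	,Start_TS
	,Stop_TS
	,Duration_s
	,Insert_Row_QT
	,Update_Row_QT
	,Unchanged_Row_QT
FROM [Audit].[Package_Control]
ORDER BY Start_TS DESC;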

CREATE PROCEDURE [Audit].[PackageControlStart]
(
    @PackageName varchar(100)
,   @PackageId uniqueidentifier
,   @ParentPackageId uniqueidentifier = NULL
,   @ExecutionId bigint
,   @StartTime DATETIME
,   @StopTime datetime = NULL
,   @InsertRowQuantity int = NULL
,   @UpdateRowQuantity int = NULL
,   @UnchangedRowQuantity int = NULL
)
AS
BEGIN
    SET NOCOUNT ON;
	DECLARE @PackageLogId int 
    INSERT INTO [Audit].[Package_Control]
    (
        [Package_NM]
    ,   [Package_ID]
    ,   [Parent_Package_ID]
    ,   [Execution_ID]
    ,   [Start_TS]
    ,   [Stop_TS]
    ,   [Insert_Row_QT]
    ,   [Update_Row_QT]
    ,	[Unchanged_Row_QT]
    )
    SELECT
        @PackageName 
    ,   @PackageId 
    ,   @ParentPackageId 
    ,   @ExecutionId 
    ,   CURRENT_TIMESTAMP
    ,   @StopTime 
    ,   @InsertRowQuantity 
    ,   @UpdateRowQuantity 
    ,	@UnchangedRowQuantity
  SELECT @PackageLogID = SCOPE_IDENTITY()
  SELECT  @PackageLogID as PackageLogID
END

CREATE PROCEDURE [Audit].[PackageControlStop]
(
    @PackageId uniqueidentifier
,   @ExecutionId bigint
,   @InsertRowQuantity int = NULL
,   @UpdateRowQuantity int = NULL
,	@UnchangedRowQuantity int = NULL
)
AS
BEGIN
    SET NOCOUNT ON;
    -- Close out the execution.
    UPDATE PC
    SET [Stop_TS] = CURRENT_TIMESTAMP  
    ,   [Insert_Row_QT] = @InsertRowQuantity
    ,   [Update_Row_QT] = @UpdateRowQuantity
	,	[Unchanged_Row_QT] = @UnchangedRowQuantity
    FROM  [Audit].[Package_Control] AS PC
    WHERE PC.Package_ID = @PackageId
        AND PC.Execution_ID = @ExecutionId
        AND PC.[Stop_TS] IS NULL;  
END
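
For reference, here is roughly what the package’s Execute SQL Tasks run. The parameter values are supplied by SSIS variables at runtime; the literals below are just placeholders:

-- At package start:
EXECUTE [Audit].[PackageControlStart]
	 @PackageName = 'LoadDimSalesReason'
	,@PackageId = '00000000-0000-0000-0000-000000000000'
	,@ParentPackageId = NULL
	,@ExecutionId = 1
	,@StartTime = '20150101 00:00:00';

-- At package completion:
EXECUTE [Audit].[PackageControlStop]
	 @PackageId = '00000000-0000-0000-0000-000000000000'
	,@ExecutionId = 1
	,@InsertRowQuantity = 10
	,@UpdateRowQuantity = 2
	,@UnchangedRowQuantity = 88;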

And Now for the Biml

My Biml library usually contains 3 files for each package type:

  • ProjectConnections – I like to keep my connection managers in a separate file so I only have to update one place if I need to add or change a connection.
  • Dim1 – This contains my actual design pattern.
  • CreateDim1 – This is the Biml file I run to generate the package. It gets separated so I can pull values from databases and pass in variables to my design pattern using BimlScript. For this example I have hardcoded my variables into this file rather than pulling from a database.

ProjectConnections.Biml

<#@ template language="C#" tier="1" #>	
<Connections>
    <OleDbConnection Name="AWBIML" ConnectionString="Data Source=localhost\SQL2014;Initial Catalog=AWBIML;Integrated Security=SSPI;Provider=SQLNCLI11.1;" CreateInProject="true"/>
    <OleDbConnection Name="Audit" ConnectionString="Data Source=localhost\SQL2014;Initial Catalog=AWBIML;Integrated Security=SSPI;Provider=SQLNCLI11.1;" CreateInProject="true"/>
</Connections>

The ProjectConnections file doesn’t begin with Biml tags because it gets included in the middle of the CreateDim1.biml file. AWBIML is a local database I created to contain my data mart.

Dim1.Biml

<#@ template language="C#" tier="2" #>

<#@ property name="PackageName" type="String" #>
<#@ property name="DstSchemaName" type="String" #>
<#@ property name="DstTableName" type="String" #>
<#@ property name="DstConnection" type="String" #>
<#@ property name="DataFlowSourceName" type="String" #>
<#@ property name="SrcConnection" type="String" #>
<#@ property name="SourceQuery" type="String" #>	
<#@ property name="UpdateSchemaName" type="String" #>	
<#@ property name="UpdateTableName" type="String" #>	
<#@ property name="UpdateConnection" type="String" #>		
<#@ property name="UpdateSQLStatement" type="String" #>	
	
<Package Name="<#=PackageName#>" Language="None">
    <Parameters>
        <Parameter DataType="String" Name="ParentPackageID">00000000-0000-0000-0000-000000000000</Parameter>
    </Parameters>
    <Variables>
        <Variable EvaluateAsExpression="true" DataType="String"  
          Name="QualifiedTableSchema">"["+@[User::SchemaName]+"].["+@[User::TableName]+"]"
        </Variable>
        <Variable DataType="String" Name="QueryAuditStart">
          EXECUTE [Audit].[PackageControlStart] @PackageName=?, @PackageId=?, 
          @ParentPackageId=?,@ExecutionId=?, @StartTime=?;
        </Variable>
        <Variable DataType="String" Name="QueryAuditUpdate">
          EXECUTE [Audit].[PackageControlStop] @PackageId=?, @ExecutionId=?, 
          @InsertRowQuantity=?, @UpdateRowQuantity=?, @UnchangedRowQuantity=?;</Variable> 
        <Variable DataType="Int32"  Name="RowCountChanged">0</Variable>
        <Variable DataType="Int32"  Name="RowCountNew">0</Variable>
        <Variable DataType="Int32"  Name="RowCountSource">0</Variable>
        <Variable DataType="Int32"  Name="RowCountUnchanged">0</Variable>
        <Variable DataType="String" Name="SchemaName"><#=DstSchemaName#></Variable>
        <Variable DataType="String" Name="TableName"><#=DstTableName#></Variable>
    </Variables>
    
    <Tasks>
      <ExecuteSQL Name="SQL Begin Audit" ConnectionName="Audit">
       <VariableInput VariableName="User.QueryAuditStart" />
        <Parameters>
          <Parameter Name="0" VariableName="System.PackageName" DataType="String" Length="-1"/>
          <Parameter Name="1" VariableName="System.PackageID" DataType="Guid" Length="-1"/>
          <Parameter Name="2" VariableName="ParentPackageID" DataType="Guid" Length="-1"/>
          <Parameter Name="3" VariableName="System.ServerExecutionID" DataType="Int64" 
            Length="-1"/>
          <Parameter Name="4" VariableName="System.StartTime" DataType="Date" Length="-1"/>
        </Parameters>
      </ExecuteSQL>

        <ExecuteSQL Name="SQL Truncate <#=UpdateTableName#>" 
          ConnectionName="<#=UpdateConnection#>"> 
            <DirectInput>Truncate Table [<#=UpdateSchemaName#>].[<#=UpdateTableName#>]
            </DirectInput>
          <PrecedenceConstraints>
            <Inputs>
              <Input OutputPathName="SQL Begin Audit.Output" />
            </Inputs>
          </PrecedenceConstraints>
        </ExecuteSQL>

        <Dataflow Name="DFT Insert<#=DstTableName#>">
            
            <Transformations>
            
                <RowCount Name="CNT Changed_Rows" VariableName="User.RowCountChanged">
                    <InputPath OutputPathName="CSPL Check For Changes.ChangedRows" />
                </RowCount>
                
                <ConditionalSplit Name="CSPL Check For Changes">
                    <InputPath OutputPathName="LKP Historical Key.Match" />
                    <OutputPaths>
                        <OutputPath Name="ChangedRows">
                          <Expression>ChangeHashKeyASCII != lkp_ChangeHashKeyASCII</Expression>
                        </OutputPath>
                    </OutputPaths>
                </ConditionalSplit>
                
                <RowCount Name="CNT New_Rows" VariableName="User.RowCountNew">
                    <InputPath OutputPathName="LKP Historical Key.NoMatch" />
                </RowCount>
        
                <Lookup Name="LKP Historical Key" NoMatchBehavior="RedirectRowsToNoMatchOutput" 
                  OleDbConnectionName="<#=DstConnection#>">
                    <DirectInput>SELECT
                      CONVERT(VARCHAR(34), ChangeHashKey, 1) AS ChangeHashKeyASCII
                      , CONVERT(VARCHAR(34), HistoricalHashKey, 1) AS HistoricalHashKeyASCII
                      FROM
                      <#=DstSchemaName#>.<#=DstTableName#></DirectInput>
                   <Parameters>
                     <Parameter SourceColumn="HistoricalHashKeyASCII" />
                   </Parameters>
	           <ParameterizedQuery>select * from (SELECT
                     CONVERT(VARCHAR(34), ChangeHashKey, 1) AS ChangeHashKeyASCII
                     , CONVERT(VARCHAR(34), HistoricalHashKey, 1) AS HistoricalHashKeyASCII
                     FROM
                     <#=DstSchemaName#>.<#=DstTableName#>) [refTable]
                     where [refTable].[HistoricalHashKeyASCII] = ?</ParameterizedQuery>
                  <InputPath OutputPathName="CNT Source_Rows.Output" />
                  <Inputs>
                    <Column SourceColumn="HistoricalHashKeyASCII" 
                      TargetColumn="HistoricalHashKeyASCII" />
                  </Inputs>
                  <Outputs>
                    <Column SourceColumn="ChangeHashKeyASCII" 
                    TargetColumn="lkp_ChangeHashKeyASCII" />
                  </Outputs>
                </Lookup>
       
            <OleDbDestination Name="OLE_DST New_Rows" ConnectionName="<#=DstConnection#>">
              <InputPath OutputPathName="CNT New_Rows.Output" />
	      <ExternalTableOutput Table="<#=DstSchemaName#>.<#=DstTableName#>" /> 
            </OleDbDestination>
            
            <RowCount Name="CNT Source_Rows" VariableName="User.RowCountSource">
              <InputPath OutputPathName="<#=DataFlowSourceName#>.Output" />
            </RowCount>
            
            <OleDbSource Name="<#=DataFlowSourceName#>" ConnectionName="<#=SrcConnection#>">
	      <DirectInput><#=SourceQuery#></DirectInput>
            </OleDbSource>
               
            <RowCount Name="CNT Unchanged_Rows" VariableName="User.RowCountUnchanged">
              <InputPath OutputPathName="CSPL Check For Changes.Default" />
            </RowCount>
                
            <OleDbDestination Name="OLE_DST Update Table" ConnectionName="<#=DstConnection#>"> 
              <InputPath OutputPathName="CNT Changed_Rows.Output" />
	      <ExternalTableOutput Table="[<#=UpdateSchemaName#>].[<#=UpdateTableName#>]" />
            </OleDbDestination>
    
     </Transformations>
    
     <PrecedenceConstraints>
      <Inputs>
          <Input OutputPathName="SQL Truncate <#=UpdateTableName#>.Output" /> 
      </Inputs>
     </PrecedenceConstraints>
    </Dataflow>
            
    <ExecuteSQL Name="SQL Update <#=DstTableName#>" 
    ConnectionName="<#=DstConnection#>">
      <DirectInput><#=UpdateSQLStatement#></DirectInput>
        <PrecedenceConstraints>
          <Inputs>
            <Input OutputPathName="DFT Insert<#=DstTableName#>.Output" /> 
          </Inputs>
        </PrecedenceConstraints>
    </ExecuteSQL>
    
    <ExecuteSQL Name="SQL Close Audit" ConnectionName="Audit">
      <VariableInput VariableName="User.QueryAuditUpdate" />
        <Parameters>
          <Parameter Name="0" VariableName="System.PackageID" DataType="Guid" Length="-1"/>
          <Parameter Name="1" VariableName="System.ServerExecutionID" DataType="Int64"
            Length="-1"/>
          <Parameter Name="2" VariableName="User.RowCountNew" DataType="Int32" Length="-1"/>
          <Parameter Name="3" VariableName="User.RowCountChanged" DataType="Int32" 
            Length="-1"/>
          <Parameter Name="4" VariableName="User.RowCountUnchanged" DataType="Int32" 
            Length="-1"/>
        </Parameters>
        <PrecedenceConstraints>
          <Inputs>
            <Input OutputPathName="SQL Update <#=DstTableName#>.Output" /> 
          </Inputs>
        </PrecedenceConstraints>
    </ExecuteSQL>
				    
  </Tasks>

</Package>

CreateDim1.Biml

<#@ template language="C#" hostspecific="true" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
<#@ include file="ProjectConnections.biml" #>

	<!--
	<#
	string PackageName  = "LoadDimSalesReason";
	string DstSchemaName = "dbo";
	string DstTableName = "DimSalesReason";
	string DstConnection = "AWBIML";
	string DataFlowSourceName = "OLE_SRC StgDimSalesReason";
	string SrcConnection = "AWBIML";
	string SourceQuery = @"SELECT [SalesReasonID]
					  ,[SalesReasonName]
					  ,[SalesReasonType]
					  ,[HistoricalHashKey]
					  ,[ChangeHashKey]
					  ,[HistoricalHashKeyASCII]
					  ,[ChangeHashKeyASCII]
					  ,[InsertDttm]
					  ,[UpdtDttm]
				  FROM [AWBIML].[Staging].[DimSalesReason]";
	string UpdateSchemaName = "Updt";
	string UpdateTableName = "UpdtSalesReason";
	string UpdateConnection = "AWBIML";
	string UpdateSQLStatement = @"Update d 
					set 
					d.SalesReasonName = u.SalesReasonName,
					d.SalesReasonType = u.SalesReasonType,
					d.changehashkey = u.changehashkey 
					FROM Updt.UpdtSalesReason u 
					inner join dbo.DimSalesReason d
                                        on u.HistoricalHashKey = d.HistoricalHashKey;";
	#>
	-->
	
    <Packages>
        <#=CallBimlScript("Dim1.biml", PackageName, DstSchemaName, DstTableName, DstConnection,
        DataFlowSourceName, SrcConnection, SourceQuery, UpdateSchemaName, UpdateTableName, 
        UpdateConnection, UpdateSQLStatement)#>
    </Packages>
	
</Biml>

Resulting SSIS Package

Executing the CreateDim1.Biml file generates a package called LoadDimSalesReason.dtsx. Here is the control flow:
BIML Dim1 ControlFlow
I start the package by logging the package start in my Package_Control table. Then I truncate my update table to prepare for the new data load. Next, I have a data flow task, which inserts data into either the dbo.DimSalesReason table or the Updt.UpdtSalesReason table (see below). The SQL Update task updates the DimSalesReason table with the rows that were inserted into UpdtSalesReason. And finally, I log my package completion.

Here’s my data flow:
BIML Dim1 DataFlow
First, I retrieve my data from my staging view. Then I count the number of rows coming from my source so I can log it in my Package_Control table. The lookup on the HistoricalHashKey field sends rows with no match to be inserted into the dimension table. If a row does have a match, its ChangeHashKey is compared to the ChangeHashKey of the existing row in the dimension table. If the ChangeHashKey values match, the row is counted and nothing else is done. If the ChangeHashKey values don’t match, the row is counted and written to the update table.

And that is all you need to create a Type 1 dimension load with Biml.

BIDS Helper, Biml, Microsoft Technologies, SSIS

Beginning With Biml

I have been learning and using Biml for several months, but I neglected to blog about it until now. If you are using SSIS and are not familiar with Biml, you need to check it out. Apologies if this post sounds like an ad for Biml, but if you read down to the Benefits of Biml section, you will see how learning just a bit of Biml was extremely useful for me. It is a technology that I find exciting and effective.

What is Biml?

Biml, created in 2008, is a domain-specific language for describing business intelligence objects. It’s essentially XML that you can write to build SSIS projects faster, with fewer mouse clicks, while ensuring consistent design patterns across packages and projects. Biml can describe the following objects:

  • SSIS packages
  • databases
  • schemas
  • tables
  • columns
  • SSAS cubes, facts, and dimensions (Mist only)

Most people who write Biml extend it with BimlScript. BimlScript allows you to use C# or VB.Net to extend your Biml to perform tasks such as:

  • quickly import database table schemas
  • replace static values with expressions
  • include text from another Biml file or text file
  • turn tedious, repetitive work into reusable scripts

While Biml is useful without BimlScript, it is much more powerful with it.

Get Biml

To use Biml, you need either the BIDS Helper add-in for SSDT-BI or Mist. BIDS Helper is free and has everything you need to write Biml for SSIS. This is a great way to get started. If you already have BIDS Helper, make sure it is version 1.7 or later, as there were several breaking changes related to Biml in version 1.7 (and you don’t want to learn the old way and then have to learn the new way for the current version).

Mist is not free, but there is a free 14-day trial available. Mist has more Biml capabilities, such as reverse engineering Biml from existing SSIS packages and describing SSAS databases. Mist also offers a GUI if you want the benefits of Biml without having to write all the XML yourself.

Start Small

Once you have BIDS Helper or Mist, you are ready to write some Biml. I am not nor have I ever been a .NET developer, although I think I pretended to be one for a couple of semesters in college. I have made a few simple .NET websites and written some C# for SSIS script tasks, but that is not really where my strengths or interest lie. Yet even I have learned to love writing Biml, and I am working on getting better at BimlScript. I fully admit I have spent a few hours banging my head against the wall trying to figure out how Biml works, but that is how I learn. I was able to make progress because I started small, getting the hang of the basic building blocks and then adding new pieces of knowledge to my foundation. If you need a plan with some quick “aha” moments to make you feel accomplished on your journey of learning Biml, you can follow my initial learning path.

  1. Create a blank package in Biml and generate it using BIDS Helper
  2. Create a project level connection in Biml and generate it using BIDS Helper
  3. Create a package in Biml with BIDS Helper that populates a staging table: it simply truncates the table, selects data from a single table in another database, and inserts the data into the newly truncated table.
  4. Use Mist (the free trial, if necessary) to reverse engineer a dimension package and a fact package that follows your desired SSIS design patterns. Examine the Biml that is created, and then try recreating them by hand.
  5. Create a Biml library for all of your common design patterns (Import from flat file to staging table, import from database table to staging table, populate a fact table, populate a type 1 SCD, populate a type 2 SCD, etc.)

If you made it through step 5, you’re probably ready to jump into BimlScript. Try creating BimlScript that copies the data from a specified database to another location (see this blog post by Bill Fellows for help). This will come in handy any time you need to refresh data in a dev/test environment by copying it from production (or another environment).

From there the SSIS world is your oyster, and I’m sure you will see many great uses where you can apply Biml to increase your productivity and reduce time spent on manual, repetitive tasks.

The Benefits of Biml

I was able to realize the benefits of Biml on the very first project in which I used it, and I hadn’t gotten into BimlScript at that point at all. I was the architect on a data warehouse project, working with a developer who didn’t have strong SSIS skills. But my time was split across multiple projects, and I didn’t have time to coach him through every package or review every bit of his work. This happened to be for a client with whom we were working on multiple projects. And once the data mart was in production, another team would be responsible for providing support (responding to SSIS package execution failures, performing small enhancements) for it and the other data marts belonging to that client. It was important to me that all the data marts and associated ETL packages follow the same design patterns to ensure consistency and quality of work, in addition to making them easier to support. As usual, we had tight deadlines and needed a way to ensure we met them despite me being stretched across multiple projects and the developer being relatively new to SSIS.

Since I had just finished building my Biml library I decided to put it to use. I added the Biml files to source control, created and published the database project for my data mart, and created the project parameters and project-level connections.  I used my Biml to create one staging package (all of my source systems for this project were SQL Server).  Then I provided a list of staging packages to my developer that looked something like this (example values changed to protect the innocent…and the guilty):

Staging Table      | Source System      | Source Table   | Filters or Other Logic
StageSalesCustomer | AdventureWorksOLTP | Sales.Customer | Straight pull – select all fields, no filters

I then sent over instructions on what to change in the Biml and showed the developer how to generate a package with BIDS Helper. The instructions were really just a list that corresponded with the comments I included in the Biml files. For example:

  • Line 10: Package Name
  • Line 20: Schema Name Variable
  • Line 21: Table Name Variable
  • Line 41: Data Flow Name
  • Line 43: OLEDB Source Name
  • Line 43: OLEDB Connection Name
  • Line 45: Direct Input Source Query
  • Line 64: Output Path Name
  • Line 68: External Table Output Table

The developer would change the text in the Biml file at the specified locations and then generate the package. Once we finished the staging packages, I did something similar for the dimensions and then the facts. With this process, I didn’t have to worry about the developer creating the package by hand or copying another package and missing a setting or running into issues with corrupted metadata. I was confident that my design patterns were being followed. The packages were created quickly, giving us a bit more cushion with our deadlines. And I think both the developer and I came out of the project with a better understanding of SSIS and a greater appreciation for Biml. Looking back on it now, I know there was a better way to use Biml and BimlScript to create the SSIS packages for that project. But I was happy with my first attempt at using Biml on a real project, and I used the lessons I learned to create a revised plan to use Biml on my next projects.

For More Information

I intend to write more technical posts about Biml in the future, but for now I’m perfectly happy to provide the sources of my knowledge.

Happy Bimling!

BIDS Helper, SSAS

Webucator Made a Video of My Blog Post

Back in July, I wrote a blog post about My Favorite BIDS Helper features for SSAS development. Webucator contacted me about creating a video based upon it, and it’s now available.  They are doing a free series called SQL Server Solutions from the Web where they highlight different SQL Server solutions found on blog posts around the web, and I’m pleased and honored that my blog is included.

Webucator provides technical and business training via private classes for your organization, public classes in a live virtual environment, and self-paced online programs. They are a Microsoft Certified Partner for Learning Solutions, and they offer several classes on SQL Server, including SSAS, SSIS, and SSRS.

Without further ado, here’s the video for My Favorite BIDS Helper Features for SSAS Development:

You can check out another blog post that Webucator made into a video here. The post by Koen Verbeeck is a great explanation of why it’s important to be aware of the defaults in T-SQL windowing functions.

BIDS Helper, Microsoft Technologies, SSAS

My Favorite BIDS Helper Features for SSAS Development

Bill Fellows and I presented Somebody Got BIDS Helper in My Data Tools at Mile High Tech Con in Denver last weekend, and it reminded me how much I love BIDS Helper.  I use it to develop all of my SSIS and SSAS projects, but I realized I haven’t blogged much about it.  So here are my favorite BIDS Helper features for SSAS development. If you haven’t tried BIDS Helper yet, I highly recommend it.  It’s free and there is great documentation for it, so it’s easy to get started.

Printer Friendly Dimension Usage for SSAS Multidimensional

Printer Friendly Dimension Usage provides you with a nice Bus Matrix for your documentation or design reviews. You can choose between a summarized view and a detailed view.  When I have clients that I know will do a lot of ad hoc reporting in Excel using their cube, I like to create an Excel workbook with a data glossary, summary bus matrix, and a few pre-populated pivot tables that can be used to answer the most common questions.  With BIDS Helper 1.6.5 or later, I no longer have to create my bus matrix by hand. When you click the Printer Friendly Dimension Usage button, it will ask if you want the detailed or summary version.

PrinterFriendlyDimUsage

The report will open up in a new window, and you can choose to print it or export to Word, PDF, or Excel.

Bus Matrix report

Tri-State Perspectives and Deploy Perspectives for SSAS Multidimensional

The Tri-State Perspectives and Deploy Perspectives go together nicely. Tri-State Perspectives put a red box around any dimension or measure group in which not all children are included in the perspective. This can be very helpful as you update and enhance your cube.  New measures or dimension attributes are not added to perspectives by default when they are added to a cube, so this is a great way to double-check that you have added any new fields to the appropriate perspectives before you re-deploy.

TriStatePerspective

If you need to make changes only to your perspectives (perhaps you realized you forgot to add a new field to a perspective), the Deploy Perspectives functionality allows you to deploy only the perspectives rather than the entire cube. This feature will delete any perspectives on the destination database that aren’t in the source in addition to deploying perspectives.

DeployPerspective

Tabular Actions Editor

Tabular Actions Editor fills a feature gap in the Tabular SSAS world. Multidimensional Analysis Services has a nice UI for creating drillthrough actions, but Tabular does not. By default, tabular models have drillthrough actions that simply return the fields in the underlying fact table for that measure, which isn’t very useful or friendly to your users since it is mostly dimension keys and a few aggregatable values.

defaulttabulardrillthrough

Without BIDS Helper, the only way to alter that behavior is to manually change the .bim file, which can completely corrupt your tabular model if you aren’t careful. The Tabular Actions Editor provides a nice UI to create actions in a manner similar to multidimensional SSAS development so you can create a helpful detail data set that is analytically relevant.  With BIDS Helper you can also add actions other than drillthroughs, such as Go to URL or Go to Report. With drillthrough actions, you choose:

  • action type
  • target type
  • target measure group
  • default setting
  • maximum number of rows
  • method of invocation
  • perspectives in which it should be available

tabular actions editor

There are lots of good examples in the documentation.  You will also notice this warning:

Warning: While actions work in Tabular models, they are not officially supported by Microsoft. If you encounter a bug in how Tabular handles actions and open a support case, Microsoft may not provide support.

I have not run into a situation where this has been an issue, but I wanted to make sure I called this out so you knowingly accept the risks.

I love BIDS Helper

If you can’t tell, I’m a fan of BIDS Helper.  I like it so much that I recruited a friend to help me create and deliver a presentation about it.  If you are developing in SSAS or SSIS and you aren’t familiar with BIDS Helper, I encourage you to check it out.  I believe it is one of the best free tools for MSBI developers available today.