Accessibility, Data Visualization, Power BI

Viridis color palettes in Power BI theme files

I am a fan of the viridis color palettes available in Python and R, so I decided to make Power BI theme files for each of the 4 color maps (viridis, inferno, magma, plasma). These color palettes are not only lovely to look at; they are also colorblind/CVD friendly and perceptually uniform (or close to it).

The screenshots below show the colors you’ll get when you use my theme files.

Viridis

The Power BI color picker for data colors in a column chart. It shows white, black, and then the 8 colors from the viridis color palette, which range from dark purple to blue to green.
Viridis theme colors in Power BI

Plasma

The Power BI color picker for data colors in a column chart. It shows white, black, and then the 8 colors from the plasma color palette, which range from dark purple to pink to yellow.
Plasma theme colors in Power BI

Magma

The Power BI color picker for data colors in a column chart. It shows white, black, and then the 8 colors from the magma color palette, which range from dark purple to pink to orange.
Magma theme colors in Power BI

Inferno

The Power BI color picker for data colors in a column chart. It shows white, black, and then the 8 colors from the inferno color palette, which range from dark purple to red to yellow.
Inferno theme colors in Power BI

I generated a palette of 10 colors and then dropped the darkest and lightest colors to help you get good color contrast without inadvertently highlighting a data point. I chose the second darkest of the remaining 8 colors as the first/main color, which should work well on light backgrounds.

You’ll also notice that the theme sets the minimum, center, and maximum colors for use in a diverging color palette. The diverging palette includes the darkest and lightest colors to give you a wider scale.
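
For illustration, here is a minimal sketch of what one of these theme files looks like, following the documented Power BI report theme JSON format (name, dataColors, and the minimum/center/maximum divergent colors). The hex values below are approximate viridis samples; the exact values and ordering I used are in the GitHub project.

{
  "name": "Viridis",
  "dataColors": [
    "#482878", "#3E4989", "#31688E", "#26828E",
    "#1F9E89", "#35B779", "#6ECE58", "#B5DE2B"
  ],
  "minimum": "#440154",
  "center": "#21918C",
  "maximum": "#FDE725"
}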

Give the themes a try

If you don’t enjoy choosing colors and just want something that looks good, feel free to hop over to the GitHub project and download the JSON files. You can learn more about the method I used to choose the colors and my suggestions for usage in the project documentation.

If you do use the themes, feel free to let me know how they worked and if you have suggestions for improvements.

Microsoft Technologies, Power BI, Power Query

Calling the Intercom API with Power Query and Refreshing in the Power BI Service

I needed to pull some user data for an app that uses Intercom. While I will probably import the data using Data Factory or a function in the long term, I needed to pull some quick data in a refreshable manner to combine with other data already available in Power BI.

I faced two challenges in getting this code to work:

  1. Intercom’s API uses cursor-based pagination when retrieving contacts
  2. I needed this query to be refreshable in PowerBI.com so I could schedule a daily refresh.

If you have worked with the Web.Contents function in Power Query, you may be familiar with all the various ways you can use it that aren’t supported in a refresh on PowerBI.com. Chris Webb’s blog is a great source for this info, if you find yourself stuck with a query that works in Power BI Desktop but not in the service.

The Intercom API

In version 2.4 of the Intercom API, users are a type of contact. You must make an HTTP GET call to https://api.intercom.io/contacts and pass an access token (as a bearer token in the Authorization header) and Accept: application/json in the headers.

In the query string, you can specify the number of contacts per page by specifying per_page=x where x is a number less than or equal to 150. If you have more than 150 contacts, you will need to handle the cursor-based pagination.
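
Putting that together, the initial request looks something like this (the access token is passed as a bearer token, matching the Power Query function later in this post):

GET https://api.intercom.io/contacts?per_page=150
Accept: application/json
Authorization: Bearer <your access token>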

When you make the initial call, the API will return a JSON object that contains a total count of contacts and a record for pages. Expanding the pages record shows the current page, the number of contacts per page, and the total number of pages.

type: pages
next: Record
page: 1
per_page: 150
total_pages: 3
Data returned in the Power Query Editor when retrieving contacts from the Intercom API

Expanding the next record gives you page 2 with a starting_after ID.

The cursor used for pagination in the Intercom API as retrieved using Power Query

To get the next page of contacts, the API call would be https://api.intercom.io/contacts?per_page=150&starting_after=[the starting_after ID listed above].
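
For reference, here is an abridged sketch of the response shape based on the fields described above (values are illustrative, and fields not used in this post are omitted):

{
  "type": "list",
  "data": [ ... ],
  "total_count": 389,
  "pages": {
    "type": "pages",
    "next": {
      "page": 2,
      "starting_after": "<starting_after ID>"
    },
    "page": 1,
    "per_page": 150,
    "total_pages": 3
  }
}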

The Power Query Queries

This blog post from Gil Raviv gave me some ideas about where to start, but the code in that blog will not refresh in PowerBI.com. You cannot put the iterations, the Web.Contents call, and the generated list all in one query if you want to use scheduled refresh.

I ended up creating one query and two functions to accomplish my goal. The second function is optional, but you may find it useful as the time values in the API response are listed as Unix timestamps and you probably want to convert them to datetime values.

The first function contains the API call with an input parameter for the starting_after ID.

//Web API call function
(startafter) as record =>
let
    Source = Json.Document(
        Web.Contents(
            "https://api.intercom.io/contacts",
            [
                Query = [per_page = "150", starting_after = startafter],
                Headers = [
                    Accept = "application/json",
                    Authorization = "Bearer <your access token goes here>"
                ]
            ]
        )
    ),
    //Return null if the call errors so the calling query knows to stop
    data = try Source[data] otherwise null,
    pages = Source[pages],
    ttlpages = pages[total_pages],
    //The last page has no next record, so this errors and becomes null
    nextkey = pages[next][starting_after],
    next = try nextkey otherwise null,
    res = [Data = data, Next = next, TotalPages = ttlpages]
in
    res

It’s important that the URL in the Web.Contents function is a static string. If it is concatenated or dynamic in any way, the query will not be able to refresh in the Power BI service. All the query string parameters can go in the Query argument of the Web.Contents function. If you have multiple query string parameters, you can put them in the brackets of the record with commas separating them. You can do the same with multiple headers.

This function attempts the API call and returns null for the data if it encounters an error. If the call succeeds, it retrieves the specific JSON objects needed from the response. The next two lines retrieve the starting_after value, returning null when there are no more pages. The function returns a record containing the contact data, the starting_after value, and the total_pages value.

I used the following query to call that function.

let
    //Call the API until a page returns no data, collecting each page's contact list
    GeneratedList = List.Generate(
        () => [i = 0, res = fnGetOnePage("")],
        each [res][Data] <> null,
        each [i = [i] + 1, res = fnGetOnePage([res][Next])],
        each [res][Data]),
    #"Converted to Table" = Table.FromList(GeneratedList, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
    #"Expanded Column1" = Table.ExpandListColumn(#"Converted to Table", "Column1"),
    #"Expanded Column2" = Table.ExpandRecordColumn(#"Expanded Column1", "Column1", {"id", "workspace_id", "external_id", "role", "email", "name", "has_hard_bounced", "marked_email_as_spam", "unsubscribed_from_emails", "created_at", "updated_at", "signed_up_at", "last_seen_at", "last_replied_at", "last_contacted_at", "last_email_opened_at", "last_email_clicked_at", "browser", "browser_version", "browser_language", "os", "location", "custom_attributes", "tags", "notes", "companies", "opted_out_subscription_types"}, {"id", "workspace_id", "external_id", "role", "email", "name", "has_hard_bounced", "marked_email_as_spam", "unsubscribed_from_emails", "created_at", "updated_at", "signed_up_at", "last_seen_at", "last_replied_at", "last_contacted_at", "last_email_opened_at", "last_email_clicked_at", "browser", "browser_version", "browser_language", "os", "location", "custom_attributes", "tags", "notes", "companies", "opted_out_subscription_types"}),
    #"Expanded location" = Table.ExpandRecordColumn(#"Expanded Column2", "location", {"type", "country", "region", "city", "country_code"}, {"location.type", "location.country", "location.region", "location.city", "location.country_code"}),
    #"Expanded custom_attributes" = Table.ExpandRecordColumn(#"Expanded location", "custom_attributes", {"role", "brand", "Subdomain"}, {"custom_attributes.role", "custom_attributes.brand", "custom_attributes.Subdomain"}),
    #"Removed Columns" = Table.RemoveColumns(#"Expanded custom_attributes",{"opted_out_subscription_types", "tags", "notes", "companies", "role"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Removed Columns",{{"id", type text}, {"workspace_id", type text}, {"external_id", type text}, {"email", type text}, {"name", type text}, {"has_hard_bounced", type logical}, {"marked_email_as_spam", type logical}, {"unsubscribed_from_emails", type logical}, {"created_at", Int64.Type}, {"updated_at", Int64.Type}, {"signed_up_at", Int64.Type}, {"last_seen_at", Int64.Type}, {"last_replied_at", Int64.Type}, {"last_contacted_at", Int64.Type}, {"last_email_opened_at", Int64.Type}, {"last_email_clicked_at", Int64.Type}}),
    #"Replace Null with 0" = Table.ReplaceValue(#"Changed Type",null,0,Replacer.ReplaceValue,{"created_at","updated_at","signed_up_at","last_replied_at","last_seen_at","last_contacted_at","last_email_opened_at","last_email_clicked_at"
}),
    #"Invoked Custom Function" = Table.AddColumn(#"Replace Null with 0", "created_at_dt", each fnUnixToDateTime([created_at])),
    #"Invoked Custom Function1" = Table.AddColumn(#"Invoked Custom Function", "updated_at_dt", each fnUnixToDateTime([updated_at])),
    #"Invoked Custom Function2" = Table.AddColumn(#"Invoked Custom Function1", "signed_up_at_dt", each fnUnixToDateTime([signed_up_at])),
    #"Invoked Custom Function3" = Table.AddColumn(#"Invoked Custom Function2", "last_seen_at_dt", each fnUnixToDateTime([last_seen_at])),
    #"Invoked Custom Function4" = Table.AddColumn(#"Invoked Custom Function3", "last_replied_at_dt", each fnUnixToDateTime([last_replied_at])),
    #"Invoked Custom Function5" = Table.AddColumn(#"Invoked Custom Function4", "last_contacted_at_dt", each fnUnixToDateTime([last_contacted_at])),
    #"Invoked Custom Function6" = Table.AddColumn(#"Invoked Custom Function5", "last_email_opened_at_dt", each fnUnixToDateTime([last_email_opened_at])),
    #"Invoked Custom Function7" = Table.AddColumn(#"Invoked Custom Function6", "last_email_clicked_at_dt", each fnUnixToDateTime([last_email_clicked_at])),
    #"Fix Null Values" = Table.ReplaceValue(#"Invoked Custom Function7",DateTime.From("1970-01-01"),null,Replacer.ReplaceValue,{"created_at_dt","updated_at_dt","signed_up_at_dt","last_replied_at_dt","last_seen_at_dt","last_contacted_at_dt","last_email_opened_at_dt","last_email_clicked_at_dt"
}),
    #"Removed Columns1" = Table.RemoveColumns(#"Fix Null Values",{"created_at", "updated_at", "signed_up_at", "last_seen_at", "last_replied_at", "last_contacted_at", "last_email_opened_at", "last_email_clicked_at"})
in
    #"Removed Columns1"

The List.Generate call generates the rows I need to call my first function. It sets an iterator variable to 0 and then calls the function, which returns the first page of data along with the total pages and the starting_after ID. As long as data is returned, it makes the API call again with the previously returned starting_after ID. This creates a list of lists that can be converted into a table of records. Then the records can be expanded to fields.

I expanded several columns out into multiple columns. Then I adjusted the data types of my columns to the correct types (they came back as Any data type).

Several columns contained Unix timestamps. The custom function calls return the datetime version of those values. I needed to handle null values in the timestamp conversion, so I replaced all the null timestamps with 0, converted them, and then converted the resulting datetime value of 1-Jan-1970 back to null. I did the replace for all 8 columns in one step. Then I removed the original columns that contained the Unix timestamps, as they were not analytically relevant for me.

Below is my Unix timestamp to datetime conversion function.

(UnixTime as number) as datetime =>
let
    //Add the Unix seconds to the epoch of 1-Jan-1970
    DT = #datetime(1970, 1, 1, 0, 0, 0) + #duration(0, 0, 0, UnixTime)
in
    DT
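
As a quick check: fnUnixToDateTime(0) returns 1-Jan-1970 00:00:00 (the Unix epoch, which is why the earlier step converts that datetime back to null), and fnUnixToDateTime(1609459200) returns 1-Jan-2021 00:00:00.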

Advice and warnings

When using an API key that must be passed in the headers, it is safest to use a custom connector. Otherwise, you have to embed your API key in the M code, as shown above. When the query is sent to Intercom, it is encrypted using HTTPS. But anyone who opens your PBIX file would have access to it, your key will be captured in the Power BI logs, and anyone who manages to intercept your web request and decrypt your traffic would have access to it. This is not ideal. But creating a custom connector requires more advanced code, plus a gateway to make it usable in the Power BI service. With either option, you will choose Anonymous authentication for the data source.

Be sure to use the RelativePath and Query options in the Web.Contents call. This is necessary to make the query refreshable in the service. The value passed to the first parameter of Web.Contents must be a static string and must be valid in itself (no errors returned).
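
In sketch form, that pattern looks like the following (a minimal example with hypothetical variable names; the function earlier in this post passes the whole static URL as the first argument instead, which also refreshes because nothing in it is dynamic):

//Static base URL; dynamic path pieces go in RelativePath and query string parameters go in Query
Source = Json.Document(
    Web.Contents(
        "https://api.intercom.io",
        [
            RelativePath = "contacts",
            Query = [per_page = "150", starting_after = startafter],
            Headers = [Accept = "application/json"]
        ]
    )
)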

After publishing your report, you’ll need to set the credentials for the dataset before you can refresh it. Be sure to check the Skip test connection box. Otherwise, your credentials update will fail.

url: https://api.intercom.io/contacts
Authentication method: anonymous
Privacy level: organizational 
Skip test connection: yes
The Skip test connection option for the web data source in the dataset settings in PowerBI.com

Even though we are using anonymous authentication, you can still choose an Organizational privacy level.

If you want only app users from your Intercom contacts list, you need to filter the results on role = “user”, since contacts also include leads. The role attribute was in the custom attributes returned by the API.

It took a bit of experimentation to get this working, so I thought I would share what I found to work. As always, Power BI is constantly changing, and a better way to do this may become available in the future.

If you would like this development experience to improve, here are some Power BI Ideas to vote for:

Accessibility, Data Visualization, Microsoft Technologies, Power BI

What are those new buttons under tab order in Power BI?

If you’ve visited the Tab order area of the Selection Pane in Power BI in the last couple of months, you might have noticed some new buttons.

The selection pane in Power BI desktop with Tab order selected. There are three buttons underneath the Tab order heading.
Three new buttons for managing tab order in Power BI

The hover text on the first button says “Expand All”. This button is useful if you have grouped visuals. Groups are indicated by a caret to the left of the item in the tab order list.

The selection pane with tab order showing. The fourth item is a group titled Summary Cards. The items within the group are not shown.
Tab order for a report page containing one group

Selecting the Expand All button shows the individual objects within a group.

The selection pane with tab order showing. The fourth item is a group titled Summary Cards. The items within the group are shown in an indented list under the group name.
Tab order with the group expanded

The second button is the Collapse All button. It will collapse the groups so only the name of the group is shown and not the individual objects within the group.

The third button is a great new addition: Have tab order match visual order.

The Selection pane is shown with Tab order selected. The third button has hover text that reads "Have tab order match visual order".
The option to have tab order match visual order is the third button under Tab order.

This button will set the tab order for the visuals on the page to sort ascending by Y and then X coordinates. Let’s look at an example.

I have a report page containing 7 textboxes.

A Power BI report page with a box in the top left corner and a grid of 6 boxes underneath, spanning the entire width of the report. The order of the boxes appears random, but it matches the order in which they were added to the page.
The X,Y coordinates of each box are shown in the box. The original tab order is indicated by the numbers in the circle.

After clicking the button to have tab order match visual order, the tab order is changed as shown below.

A Power BI report page with a box in the top left corner and a grid of 6 boxes underneath, spanning the entire width of the report. The order of the boxes matches the Y and X coordinates of each visual, starting at the top left and moving down to the bottom right.
Tab order set with the top left visual being first and the bottom right visual being last

This is often the correct tab order that matches how we read the report visually. This little button can increase keyboard/screen reader accessibility in one second instead of taking a couple of minutes per page.

There will be times when this tab order is not what you want. One exception is when visuals have different amounts of space inside their containers, so the containers are intentionally misaligned (according to the X,Y coordinates) in order for the content to appear visually aligned. Then you might need to customize your tab order a bit. Another exception might be buttons or links at the top right of the report page that you want a user to visit last, after the content of the report. In that case, you would customize your tab order to make the button last.

But the majority of the time, this option to make tab order match visual order is exactly what you need. I applaud the Power BI team for taking this step to make creating accessible reports a little easier.

Microsoft Technologies, Power BI, Workout Wednesday

Power BI, Maps, and Publish to Web

October 2021 is mapping month over at Workout Wednesday for Power BI. As part of our challenges, we build a sample report and use the Publish to Web functionality to share it on the website. While this has worked well all year, there are some visuals, including maps, that either do not support Publish to Web or require a different license to be used with it.

It’s frustrating to build a Power BI report that you plan to share, only to find that you can’t share it. So I thought it would be helpful to consolidate what I have found about the various map visuals and their support of Publish to Web.

Disclaimer: This information is correct as of October 14, 2021. This could change over time. This is not an exhaustive list of all the map visuals available for Power BI.

Map Visuals

6 map visuals on a Power BI report: a bubble map, filled map, shape map, ArcGIS map, Azure Map, and Mapbox map.
Examples of the 6 map visuals tested with Publish to Web

Note: I also tested several other AppSource visuals, but they failed to render in Power BI desktop. I may update this post if they start working again.

I hope this helps you plan your visuals when you need to publicly share a report that contains a map.

Excel, Microsoft Technologies, Power BI

Connect Excel to a Power BI Dataset in a Premium Workspace with a B2B User

Power BI offers the ability for users who have access to a dataset in the Power BI service (PowerBI.com) to connect to the dataset using Excel. Normally, this feature is referred to as Analyze in Excel. Once you connect Excel to your dataset, you can create Pivot Table reports or use Cube Functions.

There are currently limitations that mean this functionality isn’t supported for B2B (external) users. An external user is an Azure AD user that is based in another tenant and has been guested into the local AAD tenant. If you go to your dataset in PowerBI.com, choose Analyze in Excel, and then try to open the downloaded file and connect to the dataset, you will be met with connection errors.

But if you have your dataset in a workspace backed by Premium Per User or Premium capacity, you can use the XMLA endpoint to connect, even if you are using a B2B user!

Instead of using the Analyze in Excel functionality, you can connect to your dataset as if it were Analysis Services, using the XMLA endpoint. B2B users just need to make one adjustment to the server name they enter to make this work.

In Excel, locate the Get Data button. Select From Database and then From Analysis Services.

Get Data menu in Excel with the options From Database and From Analysis Services selected.
Connecting to a Power BI dataset using the XMLA endpoint in Excel is done in a similar manner as connecting to an Analysis Services database

Open a browser window and go to the settings of the Power BI workspace that contains the dataset to which you want to connect.

Settings for a Power BI workspace called Demo Reports.
Power BI Premium workspaces of any kind should have the workspace connection string listed in the settings pane

If your workspace is backed by Premium capacity, you will be able to see this in the settings, and the workspace connection will be available for you to copy. If you are a member user (not external), you can copy this info into the Server Name box of the Data Connection Wizard and go on your way.

If you are a B2B user, you need to make an adjustment as noted in Microsoft Docs. You need to replace “myorg” in the workspace connection with your primary domain name. If you have access to the Azure portal, you can find the primary domain name on the overview page for the Azure Active Directory.

Overview page in the Azure Portal for Azure Active Directory with the Primary Domain circled under basic information.
The tenant UPN, also called the primary domain, can be found in the Azure Portal on the AAD overview page

So if the workspace connection from the Power BI service is:
powerbi://api.powerbi.com/v1.0/myorg/Demo%20Reports

And your primary domain is:
mysupercooldomain.com

Then you would change the workspace connection to:

powerbi://api.powerbi.com/v1.0/mysupercooldomain.com/Demo%20Reports

Once you have populated the server name with the workspace connection string, change the logon credentials to “Use the following user name and password” but leave the credentials blank. When you select the Next button, you will be prompted for your Azure credentials.

Then you will be able to select the desired dataset from the workspace and be on your way to making connected Excel reports.

Data Visualization, Microsoft Technologies, Power BI

CDOT Bar Chart Makeover

As I was browsing Twitter today, I noticed a tweet from the Colorado Department of Transportation about their anti-DUI campaign. Shown below, it contains a bar chart that appears to have been presented in PowerPoint.

Tweet from CDOT containing a poorly designed bar chart

There are some easy opportunities to improve the readability of this chart, so I thought I would use it as an example of how small improvements can have a big impact on a fairly simple chart. I recreated the chart (as best I could) in Power BI and then made two revised versions.

Bar chart with 5 bars representing suspected impaired driving fatalities per year. The top left contains 3 logos. The title is on the top right. Each bar is a different bright color. The axis labels are very small.
My Power BI version of the original chart
Redesigned bar chart that removes the thick orange line, moves the title to the top left, and uses only one color across the bars.
My 10-minute makeover that increased readability
Bar chart with red bars and an annotation noting the upward trend from 2019 to 2020. A gray placeholder with a question mark has been added for 2021 with the note "Don't become part of this statistic."
My additional iteration that applied a clearer message and focus

Especially when making data visualizations for the general public, and especially when you want to get people’s engagement on social media, you need to reduce perceived cognitive load. Otherwise, people won’t even bother to read your chart. If your chart feels too busy or too complicated, many people in your intended audience will decide it isn’t worth the effort to read and will move on down their Twitter feed to the next Anakin and Padme meme.

Watch the video below for all of the details.

21-minute video showing how to improve upon a bar chart found on Twitter using Power BI

DAX, Microsoft Technologies, Power BI, Power Query

Calculating Age in Power BI

In week 26 of Workout Wednesday for Power BI, I asked people to calculate the age of Nobel laureates at the time they received the award. I provided some logic, but I didn’t prescribe how to create the age calculation. This inspired a couple of questions and a round of data validation as calculating age may be trickier than you think. In this post, I’ll explore some of the ways people have calculated age in Power BI and the edge cases where those calculations may not work.

In my solution video for Workout Wednesday, I used Power Query to calculate age. This was inspired by several blog posts and videos I had seen previously. There is an Age menu option in the Power Query editor under Date.

Calculating Age with the Power Query Editor user interface

When you select a date column and use that Age option, it calculates the duration between the selected date and the current date in days. You must then replace the current date with the second date column. Next, you can choose Total Years under Duration, which divides the days by 365. Finally, you must round that number down to a whole number to get the age in years.

If you follow Ruth’s video, you can do all of that in one step that creates a custom column with the final age value.

 Number.RoundDown(Duration.TotalDays([Date2] - [Date1])/365) 

That is the most common option in Power Query as there is no DateDiff function.

There are a few options for calculating age in DAX. Some people use the DATEDIFF function.

Age DateDiff = DATEDIFF([Date1],[Date2],YEAR) 

Another way I have seen is to use the YEARFRAC function.

Age YearFrac = INT ( YEARFRAC ( [Date1], [Date2], 1 ) )

The way Marco Russo suggests is to use QUOTIENT.

Age Quotient (DAX):

Age Quotient =
VAR Birthdate = [Date1]
VAR ThisDay = [Date2]
VAR IntBirthdate = YEAR ( Birthdate ) * 10000 + MONTH ( Birthdate ) * 100 + DAY ( Birthdate )
VAR IntThisDay = YEAR ( ThisDay ) * 10000 + MONTH ( ThisDay ) * 100 + DAY ( ThisDay )
VAR Age = QUOTIENT ( IntThisDay - IntBirthdate, 10000 )
VAR CheckedAge = DIVIDE ( Age, NOT ISBLANK ( Birthdate ) )
RETURN
    CheckedAge

As Marco points out, many people were using YEARFRAC, but there is a bug in the DAX implementation that causes it to occasionally return an incorrect answer for this purpose.

Checking the Numbers

I created a Power BI file to demonstrate the differences in these four calculations. You can download the file here. The image below displays the results in several tests. For each row, I’m using Date1 as the birthdate and Date2 as the “as of” date. You’ll notice that I focused on leap years for a few cases.

Table in Power BI with 10 date ranges showing the results from the four calculations. 6 of the 10 rows have different results across the calculations.
Example date ranges and result of the four age calculations

Six of the ten date ranges have different results across the calculation methods.

In the second row, the Power Query age calculation says that Feb 29 to Feb 28 in the following year is a full year. This may or may not be what you want, depending on your requirements. I’m noting the difference so you can be aware. A similar thing occurs in the fifth row going from Feb 29, 2016 to Feb 28, 2020, and again in the ninth row going from March 1, 2019 to Feb 29, 2020.

On the third row, notice that the DAX DATEDIFF function calculates Feb 29 to Feb 27 of the following year to be a full year, despite it being a day or two short. Depending on what you do with leap years, you might consider Feb 29 to Feb 28 in the following year to be a full year, but that third row result means DATEDIFF is probably not the calculation I want. We see a similar result going from March 1 to Feb 28 of the following year.

YEARFRAC calculates that Feb 29 to Feb 28 in the following year is not a full year, which may be desirable. But it counts Feb 29, 2016 to Feb 29, 2020 as only three years. And we see that March 1, 2000 to March 1, 2021 is only counted as 20 years. So even without starting on a leap year, we get some incorrect results. Smaller age values seem to be correct until the age reaches about 13 years.

Using the QUOTIENT function provides what I consider to be the most correct results. It calculates Feb 29 to Feb 28 of the following year to be less than a year. It calculates Feb 29, 2016 to Feb 28, 2020 to be three years and not four. And it calculates March 1 to Feb 29 of the following year to be less than a year.
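
To see why, plug the fifth row’s dates into Marco’s calculation:

Feb 29, 2016 becomes the integer 20160229
Feb 28, 2020 becomes the integer 20200228
20200228 - 20160229 = 39999
QUOTIENT ( 39999, 10000 ) = 3

One more day (Feb 29, 2020, or 20200229) makes the difference 40000, which divides out to exactly 4 years.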

Which to use?

The QUOTIENT formula produces the most accurate results if you don’t want Feb 29 to Feb 28 the next year to be counted as a year. DATEDIFF and YEARFRAC produce too many incorrect results for me to ever suggest using them. Since there is a DAX option that produces more correct answers, I would just go for QUOTIENT instead of either of these two.

UPDATE: There is a better alternative! Imke Feldmann reminded me that there is a Number.IntegerDivide function in Power Query. So let’s take the logic from Marco’s DAX calculation and move it to Power Query:

(BirthDate as date, EndDate as date) =>
let
    //Represent each date as a yyyymmdd integer
    BirthDateInt = Date.Year(BirthDate) * 10000 + Date.Month(BirthDate) * 100 + Date.Day(BirthDate),
    EndDateInt = Date.Year(EndDate) * 10000 + Date.Month(EndDate) * 100 + Date.Day(EndDate),
    //Integer division by 10000 leaves the number of whole years between the dates
    Age = Number.IntegerDivide(EndDateInt - BirthDateInt, 10000)
in
    Age

The Power Query custom column created by invoking this function should produce better compression than a DAX calculated column. This might not be significant for a small dataset, but we should be efficient when we can.

Accessibility, Data Visualization, Microsoft Technologies, Power BI

Zooming In on a Power BI Report

Have you ever tried to use your browser to zoom in on a visual in a Power BI report? If you simply published your report and then zoomed in, you might have experienced something like the video below.

Trying to zoom in on a report that is set to Fit to page can be confusing for users.

With the default settings of the report, when you zoom in, only the menus around the report change. This is because of report responsiveness and the View setting. By default, reports are set to Fit to page. Power BI is refitting the report to the page every time you zoom.

Why would we need to zoom in?

There might be accessibility or compliance reasons to allow people to zoom in. For instance, WCAG 2.1 Success Criterion 1.4.4 states “Except for captions and images of text, text can be resized without assistive technology up to 200 percent without loss of content or functionality.” People with low vision or other vision impairments might benefit from the ability to zoom within a report page.

Another reason might be that a user simply wants to focus on one chart at a time. Power BI does have a Focus mode. Unfortunately, it currently does a poor job of increasing the font sizes on the visual that is in focus, often rendering it unhelpful.

Column chart shown in Focus Mode in Power BI with large bars and tiny text
Power BI visual shown in Focus Mode

Edit: A helpful commenter pointed out that you can zoom in and out while in Focus mode. This works pretty well on many (but not all) visuals.

What Are Our Other Options?

There are a couple of workarounds for users who need to zoom in on visuals.

  1. We can set the report view — or teach users to set the report view — to Actual size. This then allows the browser zoom to work as anticipated. We probably don’t want to set all our reports to actual size because we would lose valuable screen real estate and diminish the experience for some users who don’t need to zoom. Having the report automatically fit to the user’s screen is usually helpful. But if users can change that setting as they need to, that might be ok. Here’s an example of how that works.
Setting the view on the Power BI report to Actual size allows users to zoom with the browser

2. We can use assistive technology to zoom. Both Windows and macOS have built-in magnifier functionality. The downside is that using it would not satisfy WCAG 2.1 Success Criterion 1.4.4. I think there is still some gray area/lack of expertise as far as how people are making data visualizations WCAG compliant, because a visualization is part text and part image/shape (although it’s not rendered on the page as an image in Power BI). I’m usually more concerned that users get the information they need and have a good experience. But I want to note this in case you are trying to be WCAG compliant and might run into this issue. Here’s an example of using the magnifier in Windows. You can still use the interactivity in the report. And you can change the size of the magnification window and the level of magnification.

The Windows Magnifier allows users to zoom in to part of the report page while retaining interactivity

3. Zooming in on the report page with a touch screen works fine. If users have tablets or laptops with a touch screen, they can use their fingers to zoom and it will behave as expected. Here’s a video that shows that experience.

Those are all the workarounds I’m aware of, but I’m interested to hear how you have worked around this issue. If you have other suggestions please leave them in the comments.

I found an existing idea about increasing the text size within visuals in focus mode on Ideas.PowerBI.com. I’ve added my vote to it, and I hope you’ll do the same.

Azure, Azure Data Lake, Microsoft Technologies, Power BI

Granting ADLS Gen2 Access for Power BI Users via ACLs

It’s common that users only have access to certain folders in an Azure Data Lake Storage container. These permissions are provided not through Azure RBAC (role-based access control) roles but through POSIX-like ACLs (access control lists).

The current Power BI documentation mentions only Azure RBAC roles, but it is possible to connect to a folder with permissions granted through ACLs.

You can manage ACLs through the Azure Storage Explorer application or in the Storage Explorer preview in the Azure Portal. As an example, I have a storage account with the hierarchical namespace enabled. In the container named filesystem1 is a folder called Test. Test contains 3 files, and I want a user to import Categories.csv into Power BI.

Azure Storage Explorer showing the mmldl storage account with filesystem1 selected. The Test folder in filesystem1 is selected and 3 files are shown.
Data lake storage account with files located in a folder called Test

If I select the Test folder and then select Manage Access, I can see that an AAD user named Data Lake User has been granted access and default ACLs. Note that the user needs at least Read and Execute. Write isn’t necessary if they don’t need to change the file.

The Manage Access window in Azure Storage Explorer. The user named Data Lake User is selected. Access and Default permissions are set to give the user Read, Write, and Execute.
Managing access on the Test folder for the Data Lake User

But with those permissions on the Test folder, I’m not able to connect to it from Power BI Desktop. If I try, I’ll get an error that says “Access to the resource is forbidden.”

Power BI error that says "Unable to connect. We encountered an error while trying to connect. Details: Access to the resource is forbidden."
Power BI error encountered when a user doesn’t have sufficient permissions to access a file in the data lake

This is because the user is missing some permissions. We need to grant Execute permissions on all parent folders up to the root (the container).
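
In sketch form, the minimum POSIX-style permissions for importing Categories.csv look like this (based on the folder structure above; the Write permission shown in the screenshot is not required for reading):

filesystem1 (container)     --x (Execute)
  Test (folder)             r-x (Read, Execute)
    Categories.csv (file)   r-- (Read)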

In this case, there is only one level above my Test folder. So I select the filesystem1 container, go to Manage Access, and grant it Execute permissions.

Manage Access window in Azure Storage Explorer showing permissions for Data Lake user on filesystem1. Execute is selected for both Access and Default permissions.
Adding Execute permissions to the parent container

Note that changing the Default ACL on a parent does not affect the access ACL or default ACL of child items that already exist. So if you have existing subfolders and files to which users need access, you will need to grant access at each parent level because the default ACLs won’t apply.

Thanks to Gerhard Brueckl for noting that I needed Execute permissions on parent folders when I got stuck in testing.

If you find yourself hitting that access forbidden message in Power BI when accessing a file in ADLS Gen2, double check the user’s Execute permissions on the parent folders.

Microsoft Technologies, Power BI, Workout Wednesday

Workout Wednesdays for Power BI in 2021

I’m excited to announce that something new is coming to the Power BI community in 2021: Workout Wednesday!

Workout Wednesday logo

Workout Wednesday started in the Tableau community and is expanding to Power BI in the coming year. Workout Wednesdays present challenges to recreate a data-driven visualization as closely as possible. They are designed to help you improve your skills in Power BI and Tableau.

How You Can Participate

Watch for the Power BI challenge to be published on Wednesdays in 2021. The challenge will contain requirements and a dataset. Use the dataset to create the desired end result.

Then share your workout! You can post your workout to the Data Stories Gallery or your blog, or just share a public link. If you aren’t able to share a public link – perhaps because that option is disabled in your Power BI tenant, or you don’t have a Power BI tenant – a gif, a video, or even some screenshots are just fine.

To formally participate: Post to Twitter using both the #WOW2021 and #PowerBI hashtags along with a link/image/video of your workout. Include a link to the challenge on the Workout Wednesday site. And please note the week number in your description, if possible.

Community Growth

I’m looking forward to Workout Wednesdays for a couple of reasons. First, I think Power BI needs more love in the data visualization department. We need to be talking about effective visualization techniques and mature past ugly pie charts and tacky backgrounds. And I think Workout Wednesdays will help us individually grow those skills, but it will also foster more communication and sharing of ideas around data visualization in Power BI. That in turn will lead to more product enhancement ideas and conversations with the Power BI team, resulting in a better product and a stronger community.

Second, I’m also excited to see the cross-pollination and cross-platform learning we will achieve by coming together as a data visualization community that isn’t focused on one single tool. There is a lot that Tableau practitioners and Power BI practitioners can learn from each other.

Join Me In January

Keep an eye out on Twitter and the Workout Wednesday website for the first challenge coming January 6. While it would be great if you did the workout every single week, don’t be concerned if you can’t always participate. A solution will be posted about a week later, but nothing says you can’t go back and do workouts from previous weeks as your schedule allows.

I look forward to seeing all of your lovely Workout Wednesday solutions next year!