Highlighting Below Avg Sales per Hierarchy Level with SWITCH() and ISINSCOPE() DAX Functions in Power BI

I was working on a project a wee while ago where the customer had a conditional formatting requirement on a Column Chart.
They wanted to format the columns in the chart conditionally, based on the average value at the hierarchy level currently in view.
Here is the scenario: I have a Calendar hierarchy as below:

  • Calendar Hierarchy:
    • Year
    • Semester
    • Quarter
    • Month
    • Day

I use the “Adventure Works DW2017, Internet Sales” Excel file as my source in Power BI Desktop. If I visualise “Total Sales” over the above “Calendar Hierarchy”, I get something like this:

Line Chart in Power BI, Total Sales by Year

Now I activate the “Average Line” from the “Analytics” tab of the line chart.

Adding Average Line to Line Chart in Power BI

When I drill down in the line chart, the average line shows the average of the particular hierarchy level I am in. It is quite cool that I get the average based on the level I’m in, code free.

Power BI, Drilling Down in Line Chart

Easy, right?

Now, the requirement is to show the above behaviour in a “Column Chart” (yes! visualising a time series with a column chart, that’s what the customer wants), highlighting the columns with values below the average in orange and leaving the rest in the default theme colour.

So, I need to create measures to conditionally format the column chart. I also need to add a bit of intelligence to the measures to:

  • Detect which hierarchy level I am in
  • Calculate the average of sales for that particular hierarchy level
  • Change the colour of the columns that are below the average amount (a rough sketch of such a measure follows this list)
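
To give you an idea of where we are heading, here is a minimal sketch of what such a colour measure can look like. It is only a sketch, not the final code from this post: [Total Sales] stands for the existing sales measure, the ‘Date’ column names match the Calendar hierarchy above, and the deepest level is tested first because ISINSCOPE() returns True for every level down to the current one.

Sales Colour =
VAR CurrentSales = [Total Sales]
VAR LevelAvg =
    SWITCH (
        TRUE (),
        ISINSCOPE ( 'Date'[Day] ),      AVERAGEX ( ALLSELECTED ( 'Date'[Day] ), [Total Sales] ),
        ISINSCOPE ( 'Date'[Month] ),    AVERAGEX ( ALLSELECTED ( 'Date'[Month] ), [Total Sales] ),
        ISINSCOPE ( 'Date'[Quarter] ),  AVERAGEX ( ALLSELECTED ( 'Date'[Quarter] ), [Total Sales] ),
        ISINSCOPE ( 'Date'[Semester] ), AVERAGEX ( ALLSELECTED ( 'Date'[Semester] ), [Total Sales] ),
        ISINSCOPE ( 'Date'[Year] ),     AVERAGEX ( ALLSELECTED ( 'Date'[Year] ), [Total Sales] )
    )
RETURN
    IF ( CurrentSales < LevelAvg, "Orange" )  // below average returns "Orange"; a blank result keeps the default theme colour

You would then set the column chart’s data colours to “Format by: Field value” and point it at this measure.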

Let’s get it done!

Detecting Hierarchy Level with ISINSCOPE() DAX Function

Microsoft introduced the ISINSCOPE() DAX function in the November 2018 release of Power BI Desktop. Soon after the announcement, Kasper de Jonge wrote a concise blog post about it.

So I’ll try to keep it as simple as possible. Here is how it works: the ISINSCOPE() function returns “True” when the specified column is the current level in a hierarchy. As stated earlier, we have a “Calendar Hierarchy” with the following 5 levels:

  • Year
  • Semester
  • Quarter
  • Month
  • Day

So, to determine whether we are at each of the above hierarchy levels, we just need to create DAX measures like the ones below:

ISINSCOPE Year		=	ISINSCOPE('Date'[Year])
ISINSCOPE Semester	=	ISINSCOPE('Date'[Semester])
ISINSCOPE Quarter	=	ISINSCOPE('Date'[Quarter])
ISINSCOPE Month		=	ISINSCOPE('Date'[Month])
ISINSCOPE Day		=	ISINSCOPE('Date'[Day])

Now let’s do an easy experiment.

  • Put a Matrix on the canvas
  • Put the “Calendar Hierarchy” to “Rows”
  • Put the above measures in “Values”
Detecting Year, Semester, Quarter, Month and Day hierarchy levels with ISINSCOPE in Power BI Desktop

As you see, the “ISINSCOPE Year” measure shows “True” for the “Year” level. Let’s expand to the next level and see how the other measures work:

Continue reading “Highlighting Below Avg Sales per Hierarchy Level with SWITCH() and ISINSCOPE() DAX Functions in Power BI”

Quick Tips: Export Power BI Desktop and Power BI Service Model Data In One Shot with DAX Studio

Exporting Model Data to CSV or SQL Server in One Shot

In some of my older posts, which are among the most popular ones, I explained how to export Power BI Desktop or Power BI Service data to different destinations like CSV, Excel and SQL Server. In this quick tip I explain a very easy way to export the model data as a whole to either CSV or SQL Server with DAX Studio.

Daniil from XXL BI explained this method well, but I’d like to quickly explain how it works and add some more information.

As of release 2.8 of DAX Studio, you can quickly export the whole model to CSV or SQL Server in one shot.
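
For a single table this has always been possible by evaluating the table as a plain DAX query and saving the result; the “Export All Data” feature essentially does the same thing for every table in the model in one go. A minimal example, assuming the Adventure Works “Internet Sales” table used elsewhere on this blog:

// returns every row of the table; DAX Studio can then write the result to a file
EVALUATE 'Internet Sales'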

Enabling Export All Data in DAX Studio

  • Open DAX Studio
  • Click “File”
  • Click “Options”
  • Click “Advanced”
  • Tick “Show Export All Data button”
DAX Studio Export Power BI Model Data Settings

Export Power BI Model Data to CSV

DAX Studio Export Power BI Model Data to CSV
Continue reading “Quick Tips: Export Power BI Desktop and Power BI Service Model Data In One Shot with DAX Studio”

Definitive Guide to Implement On-premises Data Gateway (Enterprise Mode) in Organisations

Definitive Guide to On-premises Data Gateway Implementation
Photo credit: Kayla Duhon

If you are a Business Intelligence consultant working in the Power Platform, Azure Logic Apps and Azure Analysis Services landscape, you have probably felt that the On-premises Data Gateway is one of the essential parts of your engagements with your customers. Installing the On-premises Data Gateway can go smoothly if you already have a well-thought-out implementation plan; otherwise, it can quickly turn into a beast. In this post I do my best to provide some guidelines that can help you with your On-premises Data Gateway implementation planning. Consider the following points before, during and after the engagement:

  • Understanding Usage
  • Culture of Engagement
  • Environments (with all people involved)
  • Communication
  • Security
    • Corporate/environmental firewalls
    • Proxy Servers
    • Identity Access Management
  • People
  • Documentation/Implementation Plan
  • Installation, Configuration and Testing

Here is a diagram of the important points that you should consider:

Definitive Guide to Implement On-premises Data Gateway (Enterprise Mode)

Usage

You need to understand how your customer will use the On-premises Data Gateway: whether they need the gateway for Power Platform, Azure Logic Apps, Azure Analysis Services or all of them. This is important, as you either need access to your customer’s Power BI Service, Azure Portal or both, or you need to assist your customer in configuring the On-premises Data Gateway in Azure or in the Power BI Service. The next points are:

  • Accessing the customer’s Azure Portal and/or Power BI Service: the customer needs to decide whether to create a new account with sufficient rights for you or give you the credentials of an existing account. It is important to make sure you can access all environments and have the necessary rights to install/configure the gateway.
  • Assisting/consulting a person on the customer’s side with the implementation: you need to make sure you communicate with that person and confirm they understand the requirements before the implementation date. Send them a calendar invitation beforehand to make sure they are present on that date. Always ask for a backup person in case something prevents the primary person from attending.
Continue reading “Definitive Guide to Implement On-premises Data Gateway (Enterprise Mode) in Organisations”

Automate Testing SSAS Tabular Models

Automate Testing SSAS Tabular

In real-world SSAS Tabular projects, you need to run many different testing scenarios to prove to your customer that the data in the Tabular model is correct. If you are running a Tabular model on top of a proper data warehouse, your life will be a bit easier than when you build your semantic model on top of an operational database. However, it is still a fairly time-consuming process to run many test cases on the Tabular model, then run similar tests on the data warehouse and compare the results. So your test cases always have two sides: one side is your source database, which can be a data warehouse, and the other side is the Tabular model. There are many ways to test the system: you can browse your Tabular model in Excel, connect to your data warehouse in Excel, create pivot tables and then compare the data coming from the Tabular model with the data coming from the data warehouse. But for how many measures and dimensions can you do the above test in Excel?

The other way is to run DAX queries on the Tabular model side. If your source database is a SQL Server database, you then run T-SQL queries on the database side and match the results of both sides to prove that the data in the Tabular model is correct.

In this post I’d like to share with you a way to automate the DAX queries to be run on a Tabular model.

Straight away, this is going to be a long post, so you can make or take a cup of coffee while enjoying your reading.

While I will not cover the other side, the source or data warehouse side, it is worth automating that part too, as you can save heaps of time. I’m sure a similar process can be developed on the SQL Server side, but I’ll leave that part for now. What I’m going to explain in this post is just one of many possible ways to generate and run DAX queries and store the results in SQL Server. Perhaps it is not perfect, but it is a good starting point. If you have a better idea, it would be great to share it with us in the comments section below this post.

Requirements

  • SQL Server Analysis Services Tabular 2016 and later (Compatibility Level 1200 and higher)
  • An instance of SQL Server
  • SQL Server Management Studio (SSMS)

How does it work

What I’m going to explain is very simple. I want to generate and run DAX queries and capture the results. The first step is to get all measures and their relevant dimensions; then I slice all the measures by all relevant dimensions and capture the results. At the end I store the results in a SQL Server temp table. Let’s think about a simple scenario:

  • You have just one measure, [Internet Total Sales], in the ‘Internet Sales’ table
  • The measure is related to just one dimension, the “Date” dimension
  • The “Date” dimension has only four columns: Year, Month, Year-Month and Date
  • You want to slice [Internet Total Sales] by Year, Month, Year-Month and Date

So you need to write four DAX queries as below:

EVALUATE
SUMMARIZE(
    'Internet Sales'
    , 'Date'[Calendar Year]
    , "Internet Sales", [Internet Total Sales]
)
EVALUATE
SUMMARIZE(
   'Internet Sales'
   , 'Date'[Month Name]
   , "Internet Sales", [Internet Total Sales]
)
EVALUATE
SUMMARIZE(
    'Internet Sales'
   , 'Date'[Year-Month]
   , "Internet Sales", [Internet Total Sales]
)
EVALUATE
SUMMARIZE(
     'Internet Sales'
    , 'Date'[Date]
    , "Internet Sales", [Internet Total Sales]
)

It is easy, isn’t it? But wait. What if you have 10 measures related to 4 dimensions and each dimension has 10 columns? That sounds laborious, doesn’t it? Well, in real-world scenarios you won’t slice all measures by all relevant dimensions, but you still need to do a lot. What we are going to do is generate and run the DAX queries and store the results in a table in SQL Server. How cool is that?

OK, this is how it works…

  • Creating a Linked Server for the SSAS Tabular instance from SQL Server
  • Generating DAX queries using the Tabular DMVs (see the sketch after this list)
  • Running the queries against the Tabular model and getting/storing the results in a SQL Server temp table
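
To give you a taste of the second bullet, the model metadata that the DAX queries are generated from lives in the TMSCHEMA DMVs. Below is a minimal sketch of such a DMV query; it is run against the Tabular instance (for example through the linked server), and from its output the EVALUATE/SUMMARIZE queries shown above can be stitched together. The exact columns you pick are up to you; these three are just an illustration.

-- lists every measure in the model together with its DAX expression and home table ID
SELECT [Name], [Expression], [TableID]
FROM $SYSTEM.TMSCHEMA_MEASURES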

Continue reading “Automate Testing SSAS Tabular Models”