Quick Tips: How to Filter a Column by another Column from a Different Query in Power Query

Filter a Column by a Column from a Different Query in Power Query

A while ago I was visiting a customer who asked whether they could filter one query’s data by a column from another query in Power BI. And I said, of course you can. In this post I explain how that can be achieved in Power Query. The key point is knowing how to reference a query, and how to reference a column of that query, in Power Query. This is useful when you have a lookup table that can be sourced from any supported data source in Power Query, and you want to filter the results of another query by the relevant column in the lookup query. In that case, you get a sort of dynamic filtering: whenever you refresh your model, if records have been changed in or added to the source of the lookup query, your table will automatically include the new values in the filter step in Power Query.

Referencing a Query

It is quite simple: you just need to use the name of the query. If the query name contains special characters like spaces, then you need to wrap it in a number sign and double quotes, like #”QUERY_NAME”. So, if I want to reference another query from a new blank query, the Power Query (M) script would look like this:

let
    Source = Product
in
    Source

Or something like this:

let
    Source = #"Product Category"
in
    Source

Referencing a Column

Referencing a column is also quite simple. When you reference a column, you mention the referenced query’s name, as explained above, followed by the column name in brackets. So the format looks like #”QUERY_NAME”[COLUMN_NAME]. The result is a list of the values of that particular column.

let
    Source = #"Product Category"[Product Category Name]
in
    Source
Referencing a Column from Another Query in Power Query
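Putting the two together, a minimal sketch of the filter step passes that column list to List.Contains inside Table.SelectRows. The table and column names below (Product, ProductCategoryKey and Product Category Key) are assumptions for illustration, not from the original post:

let
    Source = Product,
    // Keep only the rows whose category key appears in the lookup query's column
    #"Filtered Rows" = Table.SelectRows(
        Source
        , each List.Contains(#"Product Category"[Product Category Key], [ProductCategoryKey])
    )
in
    #"Filtered Rows"

Because the lookup column is re-evaluated on every refresh, new values added to its source are picked up by the filter automatically.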

Quick Tips: On-premises Data Gateway, Network Ports Test

Network Ports Test in On-premises Data Gateway

The latest version of On-premises Data Gateway, February 2019 (3000.1.257), was released on 1st March 2019. The release note is quite short and doesn’t really provide a lot of information. At least that’s the case as of today (6 March 2019).

There is one new feature, however, that I found very interesting: the “Network Ports Test” under the “Diagnostics” tab. I believe this is an important feature, because it makes dealing with your network administrators or security officers much easier. You can now run a test directly from the gateway UI and pass the test results to your network admins if any of the test steps fails.

As you can see in the screenshot, you can click “Start new test” and then open the completed test results to send them to your networking team if needed.

That was really quick, wasn’t it?


Check this out if you want to learn about the processes involved in implementing On-premises Data Gateway across organisations and how to come up with an implementation plan.

WARNING: The above post is a really long one.


Quick Tips: Time Dimension with Granularity of Seconds in Power BI and Analysis Services Tabular Models

Photo by Markus Spiske

In an article I posted a while back, I showed different methods of creating a Time dimension in Power BI and Tabular models. The Time dimension I explained was at the Minutes grain. In this post I show you a simple way to create a Time dimension supporting Seconds. As this is a quick tip, I only show you how to get the Time and ID columns of the Time dimension. If you need to add time bands (time buckets), check this out for more details.

Time Dimension in Seconds Grain with Power Query (M):

Copy/paste the code below into the Advanced Editor to generate the Time dimension in Power Query:

let
    Source = Table.FromList({1..86400}, Splitter.SplitByNothing()),
    #"Renamed Columns" = Table.RenameColumns(Source, {{"Column1", "ID"}}),
    #"Time Column Added" = Table.AddColumn(#"Renamed Columns", "Time", each Time.From(#datetime(1970, 1, 1, 0, 0, 0) + #duration(0, 0, 0, [ID]))),
    #"Changed Type" = Table.TransformColumnTypes(#"Time Column Added", {{"ID", Int64.Type}, {"Time", type time}})
in
    #"Changed Type"
Time Dimension in Power Query
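If you want a quick time band without leaving this query, the script above can be extended with extra columns. The two bands below (hour of day and the starting minute of a 5-minute bucket) are illustrative assumptions, not part of the original post:

let
    Source = Table.FromList({1..86400}, Splitter.SplitByNothing()),
    #"Renamed Columns" = Table.RenameColumns(Source, {{"Column1", "ID"}}),
    #"Time Column Added" = Table.AddColumn(#"Renamed Columns", "Time", each Time.From(#datetime(1970, 1, 1, 0, 0, 0) + #duration(0, 0, 0, [ID]))),
    #"Changed Type" = Table.TransformColumnTypes(#"Time Column Added", {{"ID", Int64.Type}, {"Time", type time}}),
    // Hour of the day (0-23)
    #"Hour Added" = Table.AddColumn(#"Changed Type", "Hour", each Time.Hour([Time]), Int64.Type),
    // Starting minute of the 5-minute bucket the time falls into (0, 5, 10, ...)
    #"5 Min Band Added" = Table.AddColumn(#"Hour Added", "5 Min Band", each Time.Minute([Time]) - Number.Mod(Time.Minute([Time]), 5), Int64.Type)
in
    #"5 Min Band Added"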

Time Dimension in Seconds Grain with DAX:

Run the DAX expression below as a new calculated table in Power BI or in an SSAS Tabular model:

Time in DAX = ADDCOLUMNS(
    GENERATESERIES(1, 86400, 1)
    , "Time", TIME(0, 0, 0) + [Value] / 86400
)
Time Dimension in Seconds Level with DAX
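As with the Power Query version, time bands can be built into the same calculated table. Here is a minimal sketch that extends the expression above with an Hour band; the band column is my assumption, not part of the original post:

Time in DAX = ADDCOLUMNS(
    GENERATESERIES(1, 86400, 1)
    , "Time", TIME(0, 0, 0) + [Value] / 86400
    // "Hour" band is an illustrative assumption
    , "Hour", HOUR(TIME(0, 0, 0) + [Value] / 86400)
)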

Definitive Guide to Implement On-premises Data Gateway (Enterprise Mode) in Organisations

Definitive Guide to On-premises Data Gateway Implementation
Photo credit: Kayla Duhon

If you are a Business Intelligence consultant working in the Power Platform, Azure Logic Apps and Azure Analysis Services landscape, you have probably felt that On-premises Data Gateway is one of the essential parts of your engagements with your customers. Installing On-premises Data Gateway can go smoothly if you already have a well-thought-out implementation plan; otherwise, it can quickly turn into a beast. In this post I do my best to provide some guidelines that can help you with your On-premises Data Gateway implementation planning. Consider the following points before, during and after the engagement:

  • Understanding Usage
  • Culture of Engagement
  • Environments (with all people involved)
  • Communication
  • Security
    • Corporate/environmental firewalls
    • Proxy Servers
    • Identity Access Management
  • People
  • Documentation/Implementation Plan
  • Installation, Configuration and Testing

Here is a diagram of the important points that you should consider:

Definitive Guide to Implement On-premises Data Gateway (Enterprise Mode)

Usage

You need to understand the use of On-premises Data Gateway for your customer: whether they need the gateway for Power Platform, Azure Logic Apps, Azure Analysis Services or all of them. This is important, as you either need access to your customer’s Power BI Service, Azure Portal or both, or you need to assist your customer to configure On-premises Data Gateway in Azure or in Power BI Service. The next points are:

  • Accessing the customer’s Azure Portal and/or Power BI Service: The customer needs to decide whether to create a new account with sufficient rights for you or to give you the credentials of an existing account. It is important to make sure you can access all environments and have the necessary rights to install/configure the gateway.
  • You assist/consult a person at the customer’s side with the implementation: You need to make sure you communicate with that person and confirm he/she understands the requirements before the implementation date. Send them a calendar invitation beforehand to make sure he/she is present on that date. Always ask for a backup person in case an emergency keeps the primary person away.

Automate Testing SSAS Tabular Models

Automate Testing SSAS Tabular

In real-world SSAS Tabular projects, you need to run many different testing scenarios to prove to your customer that the data in the Tabular model is correct. If you are running a Tabular model on top of a proper data warehouse, then your life is a bit easier than when you build your semantic model on top of an operational database. However, it is still a fairly time-consuming process to run many test cases on the Tabular model, then run similar tests on the data warehouse and compare the results. So your test cases always have two sides: one side is your source database, which can be a data warehouse, and the other side is the Tabular model. There are many ways to test the system. You can browse your Tabular model in Excel, connect to your data warehouse in Excel and create pivot tables, then compare the data coming from the Tabular model with the data coming from the data warehouse. But for how many measures and dimensions can you do that test in Excel?

The other way is to run DAX queries on the Tabular model side. If your source database is a SQL Server database, then you run T-SQL queries on the database side and match the results of both sides to prove that the data in the Tabular model is correct.

In this post I’d like to share with you a way to automate the DAX queries to be run on a Tabular model.

Straight away, this is going to be a long post, so you can make or take a cup of coffee while enjoying your reading.

While I will not cover the other side, the source or data warehouse side, it is worth automating that part too, as it can save you heaps of time. I’m sure a similar process can be developed on the SQL Server side, but I leave that part for now. What I’m going to explain in this post is just one of many possible ways to generate and run DAX queries and store the results in SQL Server. Perhaps it is not perfect, but it is a good starting point. If you have a better idea, it would be great to share it with us in the comments section below this post.

Requirements

  • SQL Server Analysis Services Tabular 2016 and later (Compatibility Level 1200 and higher)
  • An instance of SQL Server
  • SQL Server Management Studio (SSMS)

How does it work

What I’m going to explain is very simple: I want to generate and run DAX queries and capture the results. The first step is to get all measures and their relevant dimensions; then I slice all the measures by all relevant dimensions and get the results. At the end, I capture and store the results in a SQL Server temp table. Let’s think about a simple scenario:

  • You have just one measure, [Internet Total Sales], in the ‘Internet Sales’ table
  • The measure is related to just one dimension, the “Date” dimension
  • The “Date” dimension has only four columns: Year, Month, Year-Month and Date
  • You want to slice [Internet Total Sales] by Year, Month, Year-Month and Date

So you need to write four DAX queries as below:

EVALUATE
SUMMARIZE(
    'Internet Sales'
    , 'Date'[Calendar Year]
    , "Internet Sales", [Internet Total Sales]
)

EVALUATE
SUMMARIZE(
    'Internet Sales'
    , 'Date'[Month Name]
    , "Internet Sales", [Internet Total Sales]
)

EVALUATE
SUMMARIZE(
    'Internet Sales'
    , 'Date'[Year-Month]
    , "Internet Sales", [Internet Total Sales]
)

EVALUATE
SUMMARIZE(
    'Internet Sales'
    , 'Date'[Date]
    , "Internet Sales", [Internet Total Sales]
)

It is easy, isn’t it? But wait. What if you have 10 measures related to 4 dimensions and each dimension has 10 columns? That sounds laborious, doesn’t it? Well, in real-world scenarios you won’t slice all measures by all relevant dimensions, but you still need to do a lot. What we are going to do is generate and run the DAX queries and store the results in a table in SQL Server. How cool is that?

OK, this is how it works…

  • Creating a linked server for the SSAS Tabular instance from SQL Server
  • Generating DAX queries using Tabular DMVs
  • Running the queries through the Tabular model and getting/storing the results in a SQL Server temp table (a minimal sketch follows below)
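
To make the first and last steps concrete, here is a minimal T-SQL sketch; the linked server name, instance name and catalog below are illustrative assumptions, and the rest of the post walks through the full setup:

-- Create a linked server pointing at the Tabular instance (names are assumptions)
EXEC master.dbo.sp_addlinkedserver
      @server = N'TABULAR_LS'
    , @srvproduct = N''
    , @provider = N'MSOLAP'
    , @datasrc = N'localhost\TABULAR'
    , @catalog = N'Internet Sales Model';

-- List measures from a Tabular DMV (compatibility level 1200+)
SELECT * FROM OPENQUERY(TABULAR_LS, N'SELECT [Name], [TableID] FROM $SYSTEM.TMSCHEMA_MEASURES');

-- Run one of the generated DAX queries and store the results in a temp table
SELECT *
INTO #InternetSalesByYear
FROM OPENQUERY(TABULAR_LS, N'
EVALUATE
SUMMARIZE(
    ''Internet Sales''
    , ''Date''[Calendar Year]
    , "Internet Sales", [Internet Total Sales]
)');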


Quick Tips: How to Enable Dataflows In Power BI Service

Dataflows in Power BI Service

Dataflows (Preview) landed in Power BI Service yesterday (6th November 2018). I had a little bit of difficulty enabling this cool new feature, so I thought it would be good to write a quick tip about it. While Dataflows is in preview at the time of writing, the situation may be totally different in the future.

Straight away: the fully featured Dataflows is available in a Power BI Premium capacity or a Power BI Embedded capacity, but while it is still in preview you can take advantage of a limited set of features under your Power BI Pro license. Features like “Linked entities from other dataflows” or “Computed Entities”, such as merging tables into a new table, are not available under a Power BI Pro license.

Dataflows Computed Entities

Enabling Dataflows

  • After signing in to Power BI Service, click “Settings”
  • Click “Admin Portal”

Power BI Service Admin Portal

  • Select the capacity type you are on, either Premium or Embedded
  • Click the capacity for which you’d like to enable Dataflows

Managing a Premium Capacity in Power BI Admin Portal

  • Scroll down to find and click “Workloads” under “More Options”
  • Enable “Dataflows (Preview)”
  • If you stick to the default “Max Memory (%)” value, which is set to 20, you’ll get an error message saying “There was an issue updating your workload setting. Try again in a little while”. The error message is not helpful at all. The reason you get it is that the “Max Memory (%)” value must be a number between 27 and 100, while the default is 20.

Enabling Dataflows in Power BI Service