Quick Tips: Export Power BI Desktop and Power BI Service Model Data In One Shot with DAX Studio

Exporting Model Data to CSV or SQL Server in One Shot

In some of my older posts, which are among the most popular ones, I explained how to export Power BI Desktop or Power BI Service data to different destinations like CSV, Excel and SQL Server. In this quick tip I explain a very easy way to export the whole model data to either CSV or SQL Server with DAX Studio.

Daniil from XXL BI explained this method well, but I’d like to quickly explain how it works and add some more information.

Since the release of DAX Studio 2.8, you can quickly export the whole model to CSV or SQL Server in one shot.

Enabling Export All Data in DAX Studio

  • Open DAX Studio
  • Click “File”
  • Click “Options”
  • Click “Advanced”
  • Tick “Show Export All Data button”
DAX Studio Export Power BI Model Data Settings

Export Power BI Model Data to CSV

DAX Studio Export Power BI Model Data to CSV
Continue reading “Quick Tips: Export Power BI Desktop and Power BI Service Model Data In One Shot with DAX Studio”

What Does XMLA Endpoints Mean for Power BI and How to Test it for Free?

Test Environment from Power BI XMLA Endpoint

The public preview of XMLA endpoint connectivity was announced in late March 2019. As at today, it is only available to Power BI Premium capacity users. This sounds like a massive restriction to a lot of people who don’t have a Premium capacity but would love to see how it works. In this article I show you an easy way to get your hands on the Power BI XMLA endpoint as quickly as possible. Before I start, I’d like to briefly explain what the XMLA endpoint is and what it really means for Power BI users.

Power BI is Like an Onion! It has layers!

Generally speaking, Power BI has two different layers: the presentation layer and the data model layer. The presentation layer is the visual layer, the one where you make all those compelling reports and visualisations. The data model layer, as the name suggests, is the layer where you build your data model. This is the layer you can access via XMLA connectivity.

In a Power BI Desktop file, you can see both layers:

Different layers of Power BI

How Does XMLA Relate to the Different Layers in Power BI?

As you may have already guessed, XMLA is only related to the data model layer and has nothing to do with the presentation layer. So you can connect to a data model, browse it, import data from the model into other platforms like Excel, and so forth.

XMLA Is Not New!

Seriously? Yes, seriously. It is not new. It’s been around for many years and perhaps you’ve already used it zillions of times. Whenever you connect to an instance of SQL Server Analysis Services, either Multidimensional or Tabular, from tools like SQL Server Management Studio (SSMS), Power BI Report Builder, Excel, Tableau and so on, you’re indeed using XMLA connectivity.

Power BI is an Instance of SSAS Tabular

It is true. Power BI runs a local instance of SSAS Tabular. So, whenever you open a Power BI Desktop file (PBIX), Power BI creates a local instance of SSAS Tabular with a random local port number that can be accessed on your local machine only. When you close the file, the local instance of SSAS Tabular shuts down and its port number is released.

I first revealed the fact that you can connect to the underlying data model in Power BI Desktop from a whole range of different tools like SSMS, SQL Server Profiler, Excel, etc. back in June 2016. So we have indeed been using XMLA to connect to Power BI data models for a long time. We can even take it a step further and import our Power BI data models into an instance of SSAS Tabular. In that sense, we are literally generating XMLA scripts from Power BI to create the same data model in SSAS Tabular. How cool is that?
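
As a simple illustration, once you know the local port number you can even connect to that local instance from a new blank query in Power Query. The snippet below is a minimal sketch: the port number 54321 is made up, and you can find the actual one with tools like DAX Studio or SQL Server Profiler.

let
    // Connect to the local SSAS Tabular instance hosted by Power BI Desktop.
    // 54321 is a hypothetical port number; the real port is random and
    // changes every time the PBIX file is opened.
    Source = AnalysisServices.Databases("localhost:54321")
in
    Source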

Sooo… What is new then?

Continue reading “What Does XMLA Endpoints Mean for Power BI and How to Test it for Free?”

Quick Tips: Conditionally Replace Values Based on Other Values in Power Query

Power Query (M) made a lot of data transformation activities much easier, and value replacement is one of them. You can easily right-click on any desired value in Power Query, either in Excel or Power BI or other components of the Power Platform in general, and simply replace that value with any desired alternative. Replacing values based on certain conditions, however, may not seem that easy at first. I’ve seen a lot of Power Query (M) developers adding new columns to accomplish that. But adding a new column is not always a good idea, especially when you can do it in a simple single step in Power Query. In this post I show you a quick and easy way that can help you handle many different value replacement scenarios.

Imagine you have a table like the one below, and you have a requirement to replace the values of column [B] with the values of column [C] whenever [A] = [B].

Sample Data in Power BI

One way is to add a new conditional column with the following logic:

if [B] = [A] then [C] else [B]
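
In Power Query (M), that conditional column boils down to a single Table.AddColumn step, roughly like the sketch below (#"Sample Data" and the new column name "B New" are just placeholders for your own query):

let
    // "Sample Data" is a placeholder for the query holding the table above
    Source = #"Sample Data",
    // New column: take [C] when [A] equals [B], otherwise keep [B]
    #"Added Conditional Column" = Table.AddColumn(Source, "B New", each if [B] = [A] then [C] else [B])
in
    #"Added Conditional Column"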

Well, it works perfectly fine, but wait, you’re adding a new column, right? Wouldn’t it be better to handle the above simple scenario without adding a new column? If your answer is yes, then continue reading.
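
If you are curious, one possible single-step approach, shown here only as a sketch and not necessarily the exact solution covered in the full post, uses Table.ReplaceValue with row-level old and new values (again, #"Sample Data" is a placeholder):

let
    Source = #"Sample Data",
    // Replace the value in [B] with [C] only on the rows where [B] equals [A];
    // no extra column is added
    #"Replaced Values" = Table.ReplaceValue(Source, each [A], each [C], Replacer.ReplaceValue, {"B"})
in
    #"Replaced Values"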

Continue reading “Quick Tips: Conditionally Replace Values Based on Other Values in Power Query”

Quick Tips: How to Filter a Column by another Column from a Different Query in Power Query

Filter a Column by a Column from a Different Query in Power Query

A while ago I was visiting a customer who asked whether they could filter a query’s data by a column from another query in Power BI. And I said: of course you can. In this post I explain how that can be achieved in Power Query. The key point is to know how to reference a query and how to reference a column of that query in Power Query. This is useful when you have a lookup table that can be sourced from any supported data source in Power Query and you want to filter the results of another query by the relevant column in the lookup query. In that case, you get a sort of dynamic filtering: whenever you refresh your model, if records have been changed in or added to the source of the lookup query, your table will automatically include the new values in the filter step in Power Query.

Referencing a Query

It is quite simple; you just need to use the name of the query. If the query name contains special characters like spaces, then you need to wrap it with a number sign and double quotes, like #”QUERY_NAME”. So, if I want to reference another query from a new blank query, the Power Query (M) script would look like the following:

let
    Source = Product
in
    Source

Or something like

let
    Source = #"Product Category"
in
    Source

Referencing a Column

Referencing a column is also quite simple. When you reference a column, you need to mention the referenced query name, explained above, along with the column name in brackets. So, the format looks like #”QUERY_NAME”[COLUMN_NAME]. The result is a list of the values of that particular column.

let
    Source = #"Product Category"[Product Category Name]
in
    Source
Referencing a Column from Another Query in Power Query
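
Putting the two together, the filter step can use Table.SelectRows with List.Contains over the referenced column. The sketch below assumes a query named Product that has a [Product Category Name] column; the query and column names are just placeholders for your own model:

let
    Source = Product,
    // Keep only the rows whose category name appears in the lookup query's column
    #"Filtered Rows" = Table.SelectRows(Source, each List.Contains(#"Product Category"[Product Category Name], [Product Category Name]))
in
    #"Filtered Rows"
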
Continue reading “Quick Tips: How to Filter a Column by another Column from a Different Query in Power Query”

Quick Tips: On-premises Data Gateway, Network Ports Test

Network Ports Test in On-premises Data Gateway

The latest version of On-premises Data Gateway, February 2019 (3000.1.257), was released on 1st March 2019. The release note is quite short and doesn’t really provide a lot of information, at least as at today (6 March 2019).

There is one new feature, however, that I found very interesting: the “Network Ports Test” under the “Diagnostics” tab. I believe this is an important feature that makes it much easier to deal with your network administrators or security officers. You can now easily run a test directly from the gateway UI and pass the test results to your network admins if any of the test steps fails.

As you can see in the screenshot, you can click “Start new test” and then open the completed test results to send them to your networking team if needed.

That was really quick, wasn’t it?


Check this out if you want to learn about the processes involved in implementing On-premises Data Gateway across organisations and how to come up with an implementation plan.

WARNING: The above post is a really long one.


Quick Tips: How to Enable Dataflows In Power BI Service

Dataflows in Power BI Service

Dataflows (Preview) landed in Power BI Service yesterday (6th November 2018). I had a little bit of difficulty enabling this cool new feature, so I thought it would be good to write a quick tip about it. While Dataflows is under preview at the time of writing, the situation may be totally different in the future.

Straight away: fully featured Dataflows is available in a Power BI Premium capacity or a Power BI Embedded capacity, but while this is still in preview, you can take advantage of a limited set of features with your Power BI Pro license. Features like “Linked entities from other dataflows” or “Computed entities”, such as merging tables into a new table, are not available with a Power BI Pro license.

Dataflows Computed Entities

Enabling Dataflows

  • After signing in to Power BI Service, click “Settings”
  • Click “Admin Portal”

Power BI Service Admin Portal

  • Select the capacity type you are on, either Premium or Embedded
  • Click the capacity for which you’d like to enable Dataflows

Managing a Premium Capacity in Power BI Admin Portal

  • Scroll down to find and click “Workloads” under “More Options”
  • Enable “Dataflows (Preview)”
  • If you stick to the default “Max Memory (%)” value, which is set to 20, you’ll get an error message saying “There was an issue updating your workload setting. Try again in a little while”. The error message is not helpful at all. The reason you get it is that the “Max Memory (%)” value must be a number between 27 and 100, while the default is 20.

Enabling Dataflows in Power BI Service
Continue reading “Quick Tips: How to Enable Dataflows In Power BI Service”

What is Power BI Documenter

Power BI Documenter Logo (Small)

On Saturday, 9th June 2018, we announced the existence of Power BI Documenter. As the name suggests, Power BI Documenter is a tool that helps individuals and businesses document their Power BI Desktop models. Everyone who already has several Power BI Desktop reports has probably realised that documenting the solutions is not as easy as creating a report in Power BI Desktop. The issue is more visible in larger organisations with several Power BI developers who are busy enough with the big list of tasks assigned to them on a day-to-day basis; therefore, there is no time left to take care of the documentation. Every IT expert knows how important it is to have proper documentation. We at Data Vizioner decided to do something tangible about this issue, so we started the project several months ago with the vision of creating a web app to help individuals and businesses keep their Power BI documentation on track. In this post I’m not going to explain how you can easily start documenting your Power BI Desktop reports using Power BI Documenter; you can learn more about Power BI Documenter and how to use it here. Despite the fact that the current version of Power BI Documenter is the very first version of the app, with lots of limitations, it can indeed help users with their Power BI documentation tasks. All you need to do is export your Power BI Desktop files (PBIX) to Power BI Template format (PBIT) and upload them to the Power BI Documenter web app.
Continue reading “What is Power BI Documenter”

Time Dimension in Power BI and SSAS Tabular Model Supporting Minutes Time Bands


The Date dimension has been discussed quite a lot on the Internet, and you can find lots of valuable articles about it here and there. But what if you need to analyse your data at the time level? A customer had a requirement to analyse their data at the minute level, which means the granularity of the fact table would be one minute. So, if they store the data in their transactional database at the second level, we need to aggregate that data to the minute level. I don’t want to go there; just bear in mind that the granularity of your fact table is something you must think about in the very first steps of your design. In most cases, if not all, you’d be better off with a separate Time dimension. You then need a TimeID or Time column in your fact table to be able to create a relationship between the Time dimension and the fact table. In this post I show you two ways to create a Time dimension in Power BI:

  • Creating Time dimension with DAX
  • Creating Time dimension with Power Query (M)

Alternatively, you can take care of the Time dimension in the source system, such as SQL Server. Continue reading and you’ll find complementary T-SQL code as well.

The techniques that I explain here can be used in SSAS Tabular models and Azure Analysis Services as well.
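
To give you a taste of the Power Query (M) approach, the sketch below generates a minute-level Time dimension (1,440 rows, one per minute of the day) with a couple of typical attributes. The column names are placeholders, and the full post covers more complete versions:

let
    // One row per minute of the day: 24 x 60 = 1,440 rows
    MinutesList = List.Times(#time(0, 0, 0), 1440, #duration(0, 0, 1, 0)),
    ToTable = Table.FromList(MinutesList, Splitter.SplitByNothing(), {"Time"}, null, ExtraValues.Error),
    TypedTime = Table.TransformColumnTypes(ToTable, {{"Time", type time}}),
    // Typical attributes for time-band analysis
    AddHour = Table.AddColumn(TypedTime, "Hour", each Time.Hour([Time]), Int64.Type),
    AddMinute = Table.AddColumn(AddHour, "Minute", each Time.Minute([Time]), Int64.Type)
in
    AddMinute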

Requirements:

To follow the steps of building the test model you need to have:

  • Power BI Desktop: Download the latest version from here
  • A sample fact table containing a time or datetime column. I modified FactInternetSales from AdventureWorksDW and made it available for you to download in Excel format (find the download link at the bottom of the post)

Continue reading “Time Dimension in Power BI and SSAS Tabular Model Supporting Minutes Time Bands”