Quick Tips: Registering SQL Server Profiler as an External Tool in Power BI Desktop

Registering SQL Server Profiler as an External Tool

I have been using SQL Server Profiler to diagnose my data models in Power BI Desktop for a long time. I wrote a blog post back in June 2016 about connecting to the underlying Power BI Desktop model from different tools, including SQL Server Management Studio (SSMS), Excel and SQL Server Profiler. In this quick post, I share a pbitool.json file that you can use to register SQL Server Profiler as an external tool. Read more about how to register an external tool here. This is quite handy, as you can then use SQL Server Profiler to diagnose Power BI Desktop without having to find the diagnostic port yourself; as an external tool, SQL Server Profiler automatically connects to the data model via the diagnostic port. You can download the sqlserverprofiler.pbitool.json file from here. After you download the file, you can open it in a text editor to see or modify the JSON code. If you are using SSMS 18, you do not even need to modify the file; if you use a different version, the only thing you have to change is the “path”.

The contents of the sqlserverprofiler.pbitool.json file
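
If you prefer to build the file yourself, the snippet below is a minimal sketch of what such a pbitool.json can look like. The name, description and iconData values are illustrative, the path assumes the default SSMS 18 installation folder, and the arguments value reflects my understanding that Profiler's /A switch points it at an Analysis Services instance while Power BI Desktop substitutes %server% with the local diagnostic address; verify all of these against the downloadable file.

{
    "version": "1.0.0",
    "name": "SQL Server Profiler",
    "description": "Launch SQL Server Profiler connected to the local Power BI Desktop model",
    "path": "C:\\Program Files (x86)\\Microsoft SQL Server Management Studio 18\\Common7\\Profiler.exe",
    "arguments": "/A %server%",
    "iconData": "data:image/png;base64,..."
}

The file itself typically goes into C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools; after a restart of Power BI Desktop, Profiler should then show up on the External Tools ribbon.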
Continue reading “Quick Tips: Registering SQL Server Profiler as an External Tool in Power BI Desktop”

Quick Tips: Export Power BI Desktop and Power BI Service Model Data In One-Shot with DAX Studio

Exporting Model Data to CSV or SQL Server in One Shot

Update, March 2021:

You can now export the data directly from Power BI Desktop using my tool, Power BI Exporter. Read more here.

In some of my old posts, which are among the most popular ones, I explained how to export Power BI Desktop or Power BI Service data to different destinations like CSV, Excel and SQL Server. In this quick tip I explain a very easy way to export the model data as a whole to either CSV or SQL Server with DAX Studio.

Daniil from XXL BI explained this method well, but I’d like to quickly explain how it works and add some more information.

As of DAX Studio 2.8, you can quickly export the whole model to CSV or SQL Server in one shot.

Enabling Export All Data in DAX Studio

  • Open DAX Studio
  • Click “File”
  • Click “Options”
  • Click “Advanced”
  • Tick “Show Export All Data button”
DAX Studio Export Power BI Model Data Settings

Export Power BI Model Data to CSV

DAX Studio Export Power BI Model Data to CSV
Continue reading “Quick Tips: Export Power BI Desktop and Power BI Service Model Data In One-Shot with DAX Studio”

What Does XMLA Endpoints Mean for Power BI and How to Test it for Free?

Test Environment from Power BI XMLA Endpoint

XMLA endpoint connectivity was announced for public preview in late March 2019. As of today, it is only available to Power BI Premium capacity users. This sounds like a massive restriction to a lot of people who don’t have a Premium capacity but would love to see how it works. In this article I show you an easy way to get your hands on the Power BI XMLA endpoint as quickly as possible. Before I start, I’d like to briefly explain what an XMLA endpoint is and what it really means for Power BI users.
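
For context, once the XMLA endpoint is available on a Premium capacity, client tools reach a workspace through a connection string along these lines (the workspace name below is just a placeholder):

powerbi://api.powerbi.com/v1.0/myorg/My Premium Workspace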

Power BI is Like an Onion! It Has Layers!

Generally speaking, Power BI has two different layers: the presentation layer and the data model layer. The presentation layer is the visual layer, the one in which you build all those compelling reports and visualisations. The data model layer, as the name suggests, is the one that holds your data model, and it is the layer you can access via XMLA connectivity.

In a Power BI Desktop file, you can see both layers:

Different layers of Power BI

How Does XMLA Relate to the Different Layers in Power BI?

As you may have already guessed, XMLA relates only to the data model layer; it has nothing to do with the presentation layer. So you can connect to a data model, browse it, import data from the model into other platforms like Excel, and so forth.

XMLA Is Not New!

Seriously? Yes, seriously. It is not new. It’s been around for many years, and perhaps you’ve already used it zillions of times. Whenever you connect to an instance of SQL Server Analysis Services, either Multidimensional or Tabular, from tools like SQL Server Management Studio (SSMS), Power BI Report Builder, Excel, Tableau, etc., you are indeed using XMLA connectivity.

Power BI is an Instance of SSAS Tabular

It is true. Power BI Desktop runs a local instance of an SSAS Tabular model. So, whenever you open a Power BI Desktop file (PBIX), Power BI creates a local instance of SSAS Tabular with a random local port number that can be accessed on your local machine only. When you close the file, the local instance of SSAS Tabular is shut down and its port number is released.

I first revealed the fact that you can connect to the underlying data model in Power BI Desktop from a whole range of tools like SSMS, SQL Server Profiler, Excel, etc. back in June 2016. So we have indeed been using XMLA to connect to Power BI data models for a long time. We can even take it a step further and import our Power BI data models into an instance of SSAS Tabular. In that sense, we are literally generating XMLA scripts from Power BI to create the same data model in SSAS Tabular. How cool is that?

Sooo… What is new then?

Continue reading “What Does XMLA Endpoints Mean for Power BI and How to Test it for Free?”

Automate Testing SSAS Tabular Models

Automate Testing SSAS Tabular

In real-world SSAS Tabular projects, you need to run many different testing scenarios to prove to your customer that the data in the Tabular model is correct. If you are running a Tabular model on top of a proper data warehouse, your life will be a bit easier than when you build your semantic model on top of an operational database. However, it is still a fairly time-consuming process to run many test cases on the Tabular model, then run similar tests on the data warehouse and compare the results. So your test cases always have two sides: one side is your source database, which can be a data warehouse, and the other side is the Tabular model. There are many ways to test the system; you can browse your Tabular model in Excel, connect to your data warehouse in Excel, create pivot tables, and then compare the data coming from the Tabular model with the data coming from the data warehouse. But for how many measures and dimensions can you do the above test in Excel?

The other way is to run DAX queries on the Tabular model side. If your source database is a SQL Server database, you then run the equivalent T-SQL queries on the database side and match the results of the two sides to prove the data in the Tabular model is correct.
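
As a simple illustration of the warehouse side of such a test, here is a sketch in T-SQL; it assumes an AdventureWorksDW-style schema (FactInternetSales, DimDate and the column names are illustrative) and returns the same Internet Sales by calendar year figures that the first DAX query shown further down retrieves from the Tabular model:

-- Warehouse side of the test: the same aggregation computed directly in T-SQL
-- (table and column names assume an AdventureWorksDW-style schema and are illustrative)
SELECT
    d.CalendarYear
    , SUM(f.SalesAmount) AS [Internet Sales]
FROM dbo.FactInternetSales AS f
INNER JOIN dbo.DimDate AS d
    ON d.DateKey = f.OrderDateKey
GROUP BY d.CalendarYear
ORDER BY d.CalendarYear;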

In this post I’d like to share with you a way to automate the DAX queries to be run on a Tabular model.

Straight away: this is going to be a long post, so you can make (or take) a cup of coffee and enjoy the read.

While I will not cover the other side, the source or data warehouse side, it is worth automating that part too, as you can save heaps of time. I’m sure a similar process can be developed on the SQL Server side, but I’ll leave that for now. What I’m going to explain in this post is just one of many possible ways to generate and run DAX queries and store the results in SQL Server. Perhaps it is not perfect, but it is a good starting point. If you have a better idea, it would be great to share it with us in the comments section below this post.

Requirements

  • SQL Server Analysis Services Tabular 2016 and later (Compatibility Level 1200 and higher)
  • An instance of SQL Server
  • SQL Server Management Studio (SSMS)

How does it work

What I’m going to explain is very simple. I want to generate and run DAX queries and capture the results. The first step is to get all measures and their relevant dimensions; then I slice all the measures by all relevant dimensions and get the results. At the end, I capture and store the results in a SQL Server temp table. Let’s think about a simple scenario:

  • You have just one measure, [Internet Total Sales], in the ‘Internet Sales’ table
  • The measure is related to just one dimension, the “Date” dimension
  • The “Date” dimension has only four columns: Year, Month, Year-Month and Date
  • You want to slice [Internet Total Sales] by Year, Month, Year-Month and Date

So you need to write four DAX queries as below:

EVALUATE
SUMMARIZE(
    'Internet Sales'
    , 'Date'[Calendar Year]
    , "Internet Sales", [Internet Total Sales]
)
EVALUATE
SUMMARIZE(
    'Internet Sales'
    , 'Date'[Month Name]
    , "Internet Sales", [Internet Total Sales]
)
EVALUATE
SUMMARIZE(
    'Internet Sales'
    , 'Date'[Year-Month]
    , "Internet Sales", [Internet Total Sales]
)
EVALUATE
SUMMARIZE(
    'Internet Sales'
    , 'Date'[Date]
    , "Internet Sales", [Internet Total Sales]
)

It is easy, isn’t it? But wait: what if you have 10 measures related to 4 dimensions, and each dimension has 10 columns? That is 10 measures sliced by 40 columns, or 400 queries, which sounds laborious, doesn’t it? Well, in real-world scenarios you won’t slice every measure by every relevant dimension, but you still need to write a lot of queries. What we are going to do is generate and run the DAX queries and store the results in a table in SQL Server. How cool is that?

OK, this is how it works…

  • Creating a Linked Server for SSAS Tabular instance from SQL Server
  • Generating DAX queries using Tabular DMVs
  • Running the queries through Tabular model and getting/storing the results in a SQL Server temp table
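
To make these steps a little more concrete before diving in, here is a rough T-SQL sketch rather than the exact scripts used later in the post; the linked server name, SSAS instance name and Tabular database name are placeholders:

-- Step 1: create a linked server pointing at the SSAS Tabular instance (names are placeholders)
EXEC master.dbo.sp_addlinkedserver
      @server     = N'SSAS_TABULAR'
    , @srvproduct = N''
    , @provider   = N'MSOLAP'
    , @datasrc    = N'localhost\TABULAR'       -- SSAS Tabular instance
    , @catalog    = N'AdventureWorks Tabular'; -- Tabular database

-- Step 2: use Tabular DMVs (e.g. $SYSTEM.TMSCHEMA_MEASURES and $SYSTEM.TMSCHEMA_COLUMNS)
-- to get the measures and columns from which the DAX queries are generated
SELECT *
FROM OPENQUERY(SSAS_TABULAR, 'SELECT [Name], [Expression] FROM $SYSTEM.TMSCHEMA_MEASURES');

-- Step 3: run a generated DAX query through the linked server and store the results in a temp table
SELECT *
INTO #TestResults
FROM OPENQUERY(SSAS_TABULAR, '
    EVALUATE
    SUMMARIZE(
        ''Internet Sales''
        , ''Date''[Calendar Year]
        , "Internet Sales", [Internet Total Sales]
    )');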

Continue reading “Automate Testing SSAS Tabular Models”