If you are a Business Intelligence consultant working in the Power Platform, Azure
Logic Apps and Azure Analysis Services landscape, you have probably felt that the
On-premises Data Gateway is one of the essential parts of your engagements with
your customers. Installing the On-premises Data Gateway can go smoothly if you
already have a well-thought-out implementation plan; otherwise, it can quickly turn
into a beast. In this post I do my best to provide some guidelines that can help you
with your On-premises Data Gateway implementation planning. Consider the following
points before, during and after the implementation:
Culture of Engagement
Environments (with all people involved)
Identity Access Management
Installation, Configuration and Testing
Here is a diagram of the important points that you should consider:
You need to understand how your customer will use the On-premises Data Gateway:
whether they need the gateway for Power Platform, Azure Logic Apps, Azure
Analysis Services, or all of them. This is important because you either need
access to your customer’s Power BI Service, Azure Portal, or both, or you need
to assist your customer to configure the On-premises Data Gateway in Azure or in
the Power BI Service. The next points are:
Accessing the customer’s Azure Portal and/or Power BI Service: the customer needs to decide whether to create a new account with sufficient rights for you or to give you the credentials of an existing account. It is important to make sure you can access all environments and that you have the necessary rights to install and configure the gateway.
You assist/consult a person at the customer’s side with the implementation: make sure you communicate with that person and confirm they understand the requirements before the implementation date. Send them a calendar invitation beforehand to make sure they are available on that date. Always ask for a backup person in case an emergency prevents the primary person from attending.
The Date dimension has been discussed quite a lot on the Internet, and you can find plenty of valuable articles about it here and there. But what if you need to analyse your data at the time level? A customer had a requirement to analyse their data at the minute level, which means the granularity of the fact table would be at the minute level. So, if they store the data in their transactional database at the second level, we need to aggregate that data up to the minute level. I don’t want to go there; just bear in mind that the granularity of your fact table is something you must think about at the very first steps. In most cases, if not all, you’d be better off having a separate Time dimension. You then need a TimeID or Time column in your fact table to be able to create a relationship between the Time dimension and the fact table. In this post I show you two ways to create a Time dimension in Power BI:
Creating a Time dimension with DAX
Creating a Time dimension with Power Query (M)
Alternatively, you can take care of the Time dimension in the source system, such as SQL Server. Continue reading and you’ll find complementary T-SQL code as well.
The techniques that I explain here can be used in SSAS Tabular models and Azure Analysis Services as well.
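To give you a feel for the Power Query approach before the step-by-step walkthrough, here is a minimal sketch of a minute-level Time dimension generated in Power Query (M). The column names (Time, Hour, Minute) are illustrative and may differ from the ones used later in the post.

```
let
    // One row per minute of the day: 24 x 60 = 1,440 rows
    MinutesList = List.Times(#time(0, 0, 0), 24 * 60, #duration(0, 0, 1, 0)),
    ToTable     = Table.FromList(MinutesList, Splitter.SplitByNothing(), {"Time"}),
    TypedTime   = Table.TransformColumnTypes(ToTable, {{"Time", type time}}),
    // Hour and Minute columns are handy for hierarchies and slicers
    AddHour     = Table.AddColumn(TypedTime, "Hour", each Time.Hour([Time]), Int64.Type),
    AddMinute   = Table.AddColumn(AddHour, "Minute", each Time.Minute([Time]), Int64.Type)
in
    AddMinute
```

You can paste this into a blank query in Power BI Desktop, load it as a separate table, and then relate it to the Time (or TimeID) column of your fact table.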
To follow the steps of building the test model you need to have:
Power BI Desktop: Download the latest version from here
A sample fact table containing time or datetime. I modified FactInternetSales from AdventureWorksDW and made it available for you to download in Excel format (find the download link at the bottom of the post)
From April 2017 onward we can use the On-premises Data Gateway for Azure Analysis Services. This means you can connect your Tabular models hosted in Azure Analysis Services to your on-premises data sources through the On-premises Data Gateway. In this post you learn how to set up and configure the On-premises Data Gateway, as well as how to configure Azure Analysis Services to connect to the on-premises database(s) via the gateway.
You have an Azure subscription
You already have an up and running instance of Azure Analysis Services (AAS)
Your instance of Azure Analysis Services, your instance of Power BI Service and your On-premises Data Gateway are all in the same “Region” (Check Prerequisites section below for more details)
This post covers the most common scenario which doesn’t involve Kerberos, Proxy Server, OKTA etc…
Reading: If you are not that familiar with the On-premises Data Gateway, I encourage you to read this, and for a more in-depth look, this article. You may also watch this video from Adam Saxton discussing the On-premises Data Gateway.
As stated in the “Requirements” section, it is important to check whether your instances of Azure Analysis Services, Power BI Service and On-premises Data Gateway are located in the same “Region”.
The reasons behind that are:
Your On-premises Data Gateway MUST be installed in the same region that your Power BI Service sits in. Otherwise, Power BI cannot see the gateway and you’re unable to schedule automatic data refresh in the Power BI Service
To get the best performance and better reliability, you should create the On-premises Data Gateway resource on the Azure side in the same region that your Azure Analysis Services instance lives in
However, in some cases people created their Power BI account a long time ago and perhaps didn’t set the right region for their Power BI Service to sit in. Now they need to create an instance of Azure Analysis Services but, due to their organisation’s privacy and security policies, they don’t want to (or perhaps are not allowed to) create the instance of Azure Analysis Services in the region their Power BI tenant sits in, so they would prefer to create the Azure Analysis Services instance in another region. In that case, it is recommended to install a new instance of the On-premises Data Gateway on a separate server and change the “Gateway Region” during the installation.
To check the “Region” of your instances follow the steps below:
Power BI Service Region:
Log in to your Power BI Service
Click the question mark at the top right of the page
I recently spoke at the Tampa Dev Azure Meeting, in webinar form, on 31st Jan 2018. We received interesting questions during the presentation. My aim was to introduce the basic concepts of Big Data, Azure Data Lake, Azure Data Lake Store (ADLS), Azure Data Factory (ADF) and Power BI.
I would like to thank the Tampa Dev organisers and all attendees for giving me the opportunity to speak at this session.
NOTE: This method is no longer available (from 1st March 2019) as Microsoft discontinued supporting the Web Designer in the Azure portal. Microsoft is working on an alternative solution, but there are no timeframes or details to announce yet. Read more here. In the meantime, you can still import your Power BI Desktop models to SSAS Tabular following the method described here.
A while ago I wrote a blog post on how to import your existing Power BI Desktop models to SSAS Tabular 2016. However, the method I explained is NOT supported by official Microsoft BI tools like SSDT, so you may consider it a WORKAROUND only, until Microsoft supports importing Power BI models in SSDT. In this post, I show you how to import a Power BI Desktop model to SSAS Tabular 2017 using Azure Analysis Services. It is easy and hassle free.
Power BI Desktop: Make sure you have the latest version (Current Version: 2.52.4921.581 64-bit (November 2017))
Azure Subscription: You need to install an instance of Azure Analysis Services. Don’t worry about the costs; you can create an Azure free account with $200 credit for 30 days. Learn more here.
In this post I do NOT explain how to install Azure Analysis Services
This method works only for SQL Server Analysis Services 2017 Tabular
How it works
As mentioned earlier, it is really easy compared with the other methods I explained in my previous post. Azure Analysis Services is capable of importing Power BI Desktop files, creating a Tabular model version of your Power BI model in the cloud. Then you can simply download the Visual Studio project file and redeploy it to your on-premises instance of SSAS Tabular 2017. Let’s go through the steps…
Open Power BI Desktop
Import data from WideWorldImportersDW using any desired combination of fact tables and dimensions. I imported