
alexanderg
Advocate II

Approach for building a scaled solution

Hello everybody,

In your experience, which is the best design approach for architecting a scaled solution using Power BI?

Any reply would be helpful, thanks in advance!


14 REPLIES
eskyline
Resolver I

Since it's cloud-based, are you mostly asking about access to on-premises data and the Enterprise Gateway (EG)?

 

From the EG perspective, this is what I heard most recently (in a Microsoft webinar last week):

  • Power BI does not yet support high availability/redundancy for an individual EG gateway. There is a recovery key that can be used to restore an instance of the EG if necessary.
  • There are PerfMon counters for the EG. It's worth tracking those over time, as well as the usual server-level measures (% CPU, memory, etc.), as you ramp up usage.
  • Start with 8 cores and adjust based on the PerfMon counters and usage.
  • Potentially locate your EG geographically based on where your Power BI service lives. You can see that location by logging into the Power BI service and using the ? > About Power BI option.

I believe they are going to be publishing some guidance soon.

 

 

Thanks @eskyline. I'm referring to the different ways to deploy a scaled solution with Power BI and, based on your experience, which could be the best approach. From what I've seen, every form of implementation has its pros and cons. For example:

1. With Power BI Desktop you can create a very complete solution, importing data from multiple sources and modeling it, all running in memory. The con is that when you publish to the service (because there is currently no on-premises option) you are limited to 250 MB per dataset, so it is not scalable (I think).

2. The other alternative is to create a solution using DirectQuery. With this you are not limited by dataset size and you always get the most recent data, but the con is that DirectQuery has many limitations in the Query Editor, modeling, and DAX, and you lose the in-memory technology.

3. We can also work with SSAS, but this way you lose all the ETL capabilities of Power BI, because Power BI acts only as a presentation layer.

@alexanderg My 2 cents: if you have the option to use SSAS, then go that route.

The only downside that I see is that it requires more knowledge of different applications to build the back-end ETL.

 

The benefits:

There are no limitations to size (as you mention)

There are methods for version control (TFS) and backups

You don't have to rebuild everything if the one Desktop file that contains everything becomes corrupt for some unknown reason.

You get the benefits of using an in-memory model - especially in SQL 2016

You get to keep your data on-premises (matters to some, not all)

You can use your model in any visual platform that supports a connection to it

 

There are probably other benefits, but that's what I can think of at the moment.

 


Looking for more Power BI tips, tricks & tools? Check out PowerBI.tips, the site I co-own with Mike Carlo. Also, if you are near SE WI, join our PUG: Milwaukee Brew City PUG.

Thanks @Seth_C_Bauer, I will consider that route and use SSAS Tabular as a back end! So, in your view, what is the best approach to organizing the content of an SSAS-based solution in the Power BI service, taking into consideration aspects of security, governance, access levels, collaboration, etc.?

@alexanderg

This is just one approach, but I like it for security/backup/ease of use. It leverages AD, the Desktop and the Service.

1) Security is managed via AD groups to simplify user management. You have control over which end users can read the model, which users can publish reports via that data source (Enterprise Gateway), and which users have access to the Power BI Groups. This also allows for row-level security at the model level.

2) I use the Desktop to build/update reports using my SSAS connection. This gives me backups/version control because I have the file, which I can re-deploy anywhere, and I can store it in any number of locations, including ones like SharePoint where I can see modified dates, etc.

3) I use Power BI Groups as the only place to pull company-level reports into. This gives me the ability to have a group of people manage and own reports, so everything isn't in one person's workspace.

4) For most end users, a shared dashboard is going to be easier to understand, so that is typically the best way I've found to initially start sharing reports. Content Packs are powerful, and I think they are a great "Step 2" for implementing certain solutions, but only after I'm sure the owners and end users know how to use them.

 

This is a really high-level, broad-strokes bullet-point list, but it's typically my recommended path for larger scalable solutions. At this point in time, building business processes has to take the place of admin controls or features, since those are not in the tool by default. I'm hoping more of those capabilities are added over time.

 



Hi @Seth_C_Bauer,

 

This seems like a really interesting thread for a discussion. We were likely in the same situation when choosing an approach for integrating Power BI into our legacy system (over 5 years old). We are currently using Qlik, and we have now decided to move to Power BI.

 

Eventually, we went ahead with this approach:

 

1. Move our database from on-premises to Azure SQL. (This way we don't need to install the enterprise gateway.)

 

2. Build lots of database views to support Power BI. This way we can do lots of tweaks (limit columns, aggregations, renaming, calculated columns, ...) in SQL instead of in DAX and the data tools in Power BI (SQL is still more powerful and more familiar to us than DAX). Power BI loads data from the views, not the tables directly (a minimal sketch of such a view is shown after this list). The con of this approach is that we lose the relationships between tables, so we have to re-add them manually in the Power BI relationship view.

 

3. We use scheduled refresh, not DirectQuery. So the 250 MB limitation is acceptable for us, since we use the views to do the tweaks and decrease the size.

 

4. The hardest part, we think, is security; for now Power BI only uses AAD for authentication, while our system has its own security mechanism on-premises. As I understand it, there is no way to authenticate from Power BI against an external identity provider, so we have to sync our users from our system to AAD. This is an awkward approach, since the way we manage users is totally different from AAD (multi-tenant, roles, ...). How do we deal with roles? We have to build a separate Power BI file for each role, even though they share the same Power BI UI but show different data. It's still not a good approach, but it does work.
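To make point 2 above a bit more concrete, here is a minimal sketch of the kind of reporting view being described. Every schema, table, and column name (rpt.vwSalesByProductMonth, dbo.FactSales, dbo.DimProduct, ...) is invented for illustration; the idea is simply to limit columns, rename them to report-friendly labels, and pre-aggregate in SQL before Power BI imports the data:

    -- Minimal sketch only: all object names below are hypothetical.
    -- The view limits columns, renames them, and pre-aggregates so the
    -- imported dataset stays small and report-friendly.
    CREATE VIEW rpt.vwSalesByProductMonth
    AS
    SELECT
        p.ProductName                                            AS [Product],
        DATEFROMPARTS(YEAR(f.OrderDate), MONTH(f.OrderDate), 1)  AS [Month],
        SUM(f.SalesAmount)                                       AS [Sales Amount],
        SUM(f.OrderQuantity)                                     AS [Units Sold]
    FROM dbo.FactSales AS f
    JOIN dbo.DimProduct AS p
        ON p.ProductKey = f.ProductKey
    GROUP BY
        p.ProductName,
        DATEFROMPARTS(YEAR(f.OrderDate), MONTH(f.OrderDate), 1);

Power BI would then connect to the view in Import mode, and the relationships between views are re-created manually in the model, as described above.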

 

If you have any ideas, they would be highly appreciated.

 

 

 

 

@cuongle Thanks for your feedback. I agree with you, this is a very interesting discussion. So I have two questions.

 

1. You say you currently use Qlik, but you are going to move to Power BI. I have used QlikView and Qlik Sense, and both are powerful tools. So what are the main reasons for your decision?

2. According to your approach, you develop in Power BI Desktop a model based on views from Azure SQL using Import mode, you design your model, and then you upload it to the service. Because Azure SQL is in the cloud, you don't need to use a gateway. That's what I understood... am I right?

 

Any comment would be helpful. Thanks in advance

@alexanderg Sorry for this late reply.

 

1. It's not my decision, it comes from top management, but the main reason is mostly related to money: Power BI is cheaper and has a better, more modern UI. But we did not get rid of Qlik entirely; we are trying to use Qlik and Power BI in parallel, since all our current clients are using Qlik. We offer Power BI for new clients and even for current clients who want to move from Qlik to Power BI (some clients request to integrate Power BI instead of using Qlik).

 

2. Yes, you are absolutely right 🙂

 

@alexanderg and @Seth_C_Bauer,

 

Power BI Embedded has real potential. We are investigating it in order to move from the Power BI service to Embedded; the benefits are great:

 

1. No Office 365, no Azure AD sync, and security at the report level rather than the user level, so the approach to integrating Power BI into an application is simpler.

 

2. Cost saving.

3. The connection string can be changed programmatically.

 

One limitation I know of so far:

 

1. No support for scheduled refresh in "Import Data" mode.

@cuongle I'm going to take a really broad stroke at this, because there are many different elements to all your questions. But before I dive in, is there a reason you are not using a tabular model? This would basically provide a better solution than what you outline in 1-3 and cover the row-level security needs I think you reference in #4.

Azure AD is required regardless, but there are many areas of it I am not familiar with, in terms of syncing requirements and things you can/can't do with your on-premises AD.



@Seth_C_Bauer Thanks for your quick response.

 

Well, the reason we did not go the SSAS route is that the system was built over 5 years ago and we manage the facts/dimensions of the star schema ourselves. It is really dynamic to align with our business, and it even allows end users to do "tagging" (changing dimension foreign keys on the fact table), which is why we think SSAS is not a fit for us.

 

So that's why we went ahead with database views.
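As a rough illustration of the "tagging" described above, re-tagging amounts to re-pointing a dimension foreign key on the selected fact rows, which the next scheduled refresh then picks up. All object names in this sketch are invented:

    -- Hypothetical sketch: an end user "tags" a set of fact rows by re-pointing
    -- a dimension foreign key; the next scheduled refresh reflects the new grouping.
    -- dbo.FactSales, CategoryKey, and SalesOrderID are invented names.
    DECLARE @NewCategoryKey int = 42;               -- the dimension member chosen by the user

    UPDATE f
    SET    f.CategoryKey = @NewCategoryKey
    FROM   dbo.FactSales AS f
    WHERE  f.SalesOrderID BETWEEN 1000 AND 1010;    -- the rows the user selected to re-tag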

@cuongle The service update today just released the row-level security feature. Check it out to see if it works for you.

https://powerbi.microsoft.com/en-us/blog/power-bi-service-march-update-part-two/#usage



@Seth_C_Bauer Thanks for your response. I don't have any specific question :), I just wanted to share our initial approach for Power BI and would like to hear feedback on whether there is anything not quite right with it.

 

Thank you for the link, highly appreciated. This is what we expected; with this feature we just need one dashboard for each client.

 

Cheers

@cuongle Gotcha. Since you didn't correct me, I'm assuming your question revolves around row-level security. The only ways I am aware of to apply row-level security are in tabular models (which you aren't using) and, sometime within a month, some RLS functionality that will be deployed within Power BI (as announced at Data Insights). I don't know if it will work in all instances (DirectQuery) or just on imported data. Outside of that, SQL Server 2016 will allow RLS on a database, if I'm not mistaken.
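For reference, here is a minimal sketch of the database-level RLS that SQL Server 2016 introduces, assuming a hypothetical dbo.FactSales table with a SalesRep column. All object names are invented for the example and are not part of the solution described in this thread:

    -- Hedged sketch of SQL Server 2016 row-level security; all object names are hypothetical.
    CREATE SCHEMA rls;
    GO

    -- Predicate function: a row is visible only to the database user named in its
    -- SalesRep column (or to dbo).
    CREATE FUNCTION rls.fn_SalesRepFilter(@SalesRep sysname)
    RETURNS TABLE
    WITH SCHEMABINDING
    AS
    RETURN
        SELECT 1 AS fn_result
        WHERE @SalesRep = USER_NAME() OR USER_NAME() = 'dbo';
    GO

    -- Bind the predicate to the fact table so every query against it is filtered automatically.
    CREATE SECURITY POLICY rls.SalesRepPolicy
    ADD FILTER PREDICATE rls.fn_SalesRepFilter(SalesRep) ON dbo.FactSales
    WITH (STATE = ON);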

If this doesn't answer what you are looking for, let me know.


