
PowerBI_Scott
Frequent Visitor

Rough estimate on cost to get help with monitoring Power BI

I'm looking for rough estimates on the cost to get a consultant to help us monitor Power BI.

 

My company is using Power BI, but our main focus has been producing data for people to view. We've reached a point where we need to start monitoring activity. Our IT department stated they are not willing to hand out the level of access needed to get to that information, but we could have Power BI export that data into our data warehouse.

 

We would like to get this set up quickly, so we are looking for help. I'm supposed to provide an estimate today, 4/15/24, so I'd appreciate it if I could get a response today. It may be acceptable to provide an estimate a day or so later, but don't bother responding after 4/17/2024.

 

Some examples of things we would like to monitor at the tenant level (rather than the workspace level) include the following:

  • Whether datasets are refreshing successfully
  • How long datasets take to refresh
  • How many people are viewing reports, which reports they are viewing, and who the viewers are
  • Suggestions for improved setups
  • Who has created or deleted objects
1 ACCEPTED SOLUTION

Ok,

Do you know if you have any API sources staged on the AWS DWH? If yes, it should be pretty easy to stage data from the Power BI REST APIs as well.
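A minimal sketch of the first step in staging from the Power BI REST APIs: acquiring a token for a service principal via the client-credentials flow. The tenant ID, app ID, and secret below are placeholders you would supply from your own Azure AD app registration; only the scope is fixed.

```python
# Sketch: build the client-credentials token request for the Power BI REST
# APIs. IDs and secret are placeholders, not real values.

def build_token_request(tenant_id, client_id, client_secret):
    """Return (url, form_data) for the Azure AD client-credentials token call."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    data = {
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        # Fixed scope for the Power BI REST APIs
        "scope": "https://analysis.windows.net/powerbi/api/.default",
    }
    return url, data

# With `requests` installed you would then do (not executed here):
#   url, data = build_token_request("<tenant-id>", "<app-id>", "<secret>")
#   token = requests.post(url, data=data).json()["access_token"]
```

The resulting bearer token goes into the `Authorization` header of every API call below.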

 

If Fabric is an option, it can also connect to a lot of sources with pipelines and a gateway: https://learn.microsoft.com/en-us/fabric/data-factory/how-to-access-on-premises-data

This list of connectors will keep growing going forward.

 

You can also use Dataflow Gen2 and put the data in OneLake or something similar.

 

I also have experience using Power Automate with a gateway and an on-prem Oracle DWH, and that also works pretty well.

 

The last option is to use a REST API custom connector directly in Power BI and store all the data in a semantic model. You can then also use incremental refresh to retain more historical data from the Activity Events API.
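The Activity Events API pages its results with a continuation URI, so pulling a day of events is a loop like the sketch below. The HTTP getter is injected so the paging logic stands alone; in practice it would be `requests.get` with the bearer token, and the API accepts at most one UTC day per call.

```python
# Sketch: page through the Power BI admin Activity Events API via its
# continuation URI. `get_json` is any callable url -> parsed JSON dict
# (in production: requests.get(url, headers=auth).json()).

BASE = "https://api.powerbi.com/v1.0/myorg/admin/activityevents"

def fetch_activity_events(get_json, start, end):
    """Collect all activity events between two ISO-8601 instants (max one UTC day)."""
    url = f"{BASE}?startDateTime='{start}'&endDateTime='{end}'"
    events = []
    while url:
        page = get_json(url)
        events.extend(page.get("activityEventEntities", []))
        # The API returns continuationUri until the window is exhausted
        url = page.get("continuationUri")
    return events
```

Each returned entity carries the user, activity type, and item touched, which covers the "who viewed what, and who created or deleted objects" questions once the rows are landed in the DWH.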

 

So there are a lot of options, and a lot of ifs and buts, so it's hard to say exactly how long it would take. I think I could do it in 40 hours, approximately $5,000.

 

Br

Marius


BI Fabrikken
www.bifabrikken.no


3 REPLIES
PowerBI_Scott
Frequent Visitor

We only have Power BI Pro licenses with a gateway to our data warehouse currently on AWS.  We don't have Azure.

Thanks!

mariussve1
Impactful Individual

Hi,

 

I have created an admin app in my company that provides a list of all refresh jobs with status, start and end times, average duration, whether the refresh was scheduled or on-demand, and so on.
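As a rough sketch of how such an overview can be computed: the Get Refresh History endpoint (`GET .../groups/{groupId}/datasets/{datasetId}/refreshes`) returns entries with `refreshType`, `status`, `startTime`, and `endTime`, from which durations and an average follow directly. The summary shape below is my own assumption, not the actual admin app.

```python
# Sketch: summarise dataset refresh history from the Power BI
# Get Refresh History response (list of refresh entries).
from datetime import datetime

def _parse(ts):
    # Refresh timestamps look like "2024-04-15T06:00:03.5Z"
    return datetime.fromisoformat(ts.replace("Z", "+00:00"))

def summarize_refreshes(refreshes):
    """Return (rows, average_seconds): per-refresh type/status/duration plus the mean."""
    rows, durations = [], []
    for r in refreshes:
        secs = None
        # In-progress or failed refreshes may lack an endTime
        if r.get("startTime") and r.get("endTime"):
            secs = (_parse(r["endTime"]) - _parse(r["startTime"])).total_seconds()
            durations.append(secs)
        rows.append({"type": r.get("refreshType"),
                     "status": r.get("status"),
                     "seconds": secs})
    avg = sum(durations) / len(durations) if durations else None
    return rows, avg
```

Landing these rows in a DWH table (or OneLake) per the rest of this thread gives the tenant-wide refresh monitoring the question asks for.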

 

I have also created a report showing who is using the apps and when.

 

It's all been created using the REST APIs with an SPN, where I stored the credentials in Azure Key Vault. You can use Azure Data Factory, Fabric pipelines, or Power Automate.
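A sketch of the Key Vault lookup, assuming the `azure-keyvault-secrets` `SecretClient`; the vault URL and the secret name `pbi-spn-secret` are placeholders. The client is injected so the lookup itself is easy to test without Azure.

```python
# Sketch: read the SPN client secret from Azure Key Vault. Any object with
# a get_secret(name) -> secret-with-.value method satisfies the interface.

def read_spn_secret(secret_client, secret_name):
    """Return the secret value via the Key Vault SecretClient interface."""
    return secret_client.get_secret(secret_name).value

# In production (assuming azure-identity + azure-keyvault-secrets installed):
#   from azure.identity import DefaultAzureCredential
#   from azure.keyvault.secrets import SecretClient
#   client = SecretClient(vault_url="https://<your-vault>.vault.azure.net",
#                         credential=DefaultAzureCredential())
#   spn_secret = read_spn_secret(client, "pbi-spn-secret")
```

This keeps the SPN secret out of pipeline definitions; ADF and Fabric pipelines can reference the same vault natively.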

 

Are you using a Premium capacity, or Pro licenses only? Are you using a gateway, or is your DWH in the cloud? Power BI also has a custom API connector you can use and store all the data in a semantic model, but I think the best way is to store the data in a DWH table or in OneLake.

 

If I had all the access needed and could use ADF or Fabric pipelines and store all the data in tables in a DWH, I think I could set it all up in a week or so.

It is then pretty straightforward for you to maintain.

 

Br

Marius
BI Fabrikken
www.bifabrikken.no
