PowerBI_Scott
Frequent Visitor

Rough estimate on cost to get help with monitoring Power BI

I'm looking for rough estimates on the cost to get a consultant to help us monitor Power BI.

 

My company is using Power BI, but our main focus has been producing data for people to view.  We've reached a point where we need to start monitoring activity.  Our IT department stated they are not willing to hand out the level of access needed to get to that information, but we could have Power BI export that data into our data warehouse.

 

We would like to get this set up quickly, so we are looking for help.  I'm supposed to provide an estimate today (4/15/24), so I'd appreciate it if I could get a response today.  It may be acceptable to provide an estimate a day or so later, but don't bother responding if it is after 4/17/2024.

 

Some examples of things we would like to monitor at a tenant level, instead of at a workspace level, include the following:

  • Whether datasets are refreshing successfully
  • How long datasets take to refresh
  • How many people are viewing reports, which reports they are viewing, and who those viewers are
  • Suggestions for an improved setup
  • Who has created or deleted objects
1 ACCEPTED SOLUTION

Ok,

Do you know if you have any API sources staged on the AWS DWH? If so, it should be pretty easy to stage data from the Power BI APIs as well.

 

If Fabric is an option, it can also connect to a lot of sources with pipelines and a gateway: https://learn.microsoft.com/en-us/fabric/data-factory/how-to-access-on-premises-data

This list of connectors will keep growing going forward.

 

You can also use Dataflow Gen2 and land the data in OneLake or somewhere similar.

 

I also have experience using Power Automate with a gateway against an Oracle on-prem DWH, and that works pretty well too.

 

The last option is to use the REST API custom connector directly in Power BI and store all the data in a semantic model. You can then also use incremental refresh to build up more history from the Activity Events API.
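For context on that last option: the Activity Events admin API is paged via a continuation URI and accepts at most one calendar day per call. A minimal stdlib-only sketch of pulling one day (the `token` argument is assumed to be an admin-scoped access token, and error handling is omitted):

```python
import json
import urllib.parse
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg/admin/activityevents"

def activity_events_url(start_iso: str, end_iso: str) -> str:
    """Build the first-page URL; the API expects quoted datetime literals."""
    qs = urllib.parse.urlencode({
        "startDateTime": f"'{start_iso}'",
        "endDateTime": f"'{end_iso}'",
    })
    return f"{API}?{qs}"

def fetch_activity_events(token: str, start_iso: str, end_iso: str) -> list:
    """Follow continuationUri until the API reports the last result set."""
    events, url = [], activity_events_url(start_iso, end_iso)
    while url:
        req = urllib.request.Request(
            url, headers={"Authorization": f"Bearer {token}"}
        )
        with urllib.request.urlopen(req) as resp:
            page = json.load(resp)
        events.extend(page.get("activityEventEntities", []))
        url = None if page.get("lastResultSet") else page.get("continuationUri")
    return events
```

In practice you would loop this over one day at a time and land each batch in a warehouse staging table, which is also how you accumulate history beyond the API's rolling retention window.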

 

So there are a lot of options, and a lot of ifs and buts, before I could say exactly how long it would take. I think I could do it in 40 hours, approximately $5,000.

 

Br

Marius


3 REPLIES
PowerBI_Scott
Frequent Visitor

We only have Power BI Pro licenses with a gateway to our data warehouse currently on AWS.  We don't have Azure.

Thanks!


mariussve1
Solution Supplier

Hi,

 

I have created an admin app in my company that provides a list of all refresh jobs with status, start and end time, average duration, whether each refresh was scheduled or on-demand, and so on.
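The refresh-history part of such an app maps onto the dataset refreshes endpoint (`GET .../datasets/{datasetId}/refreshes`). The field names below (`status`, `refreshType`, `startTime`, `endTime`) follow that API's payload; the summarizer itself is just an illustration of the kind of aggregation the admin app would do, assuming the payload has already been fetched:

```python
from datetime import datetime

def summarize_refreshes(refreshes: list) -> dict:
    """Status counts, scheduled/on-demand split, and average duration in seconds."""
    durations, status, kinds = [], {}, {}
    for r in refreshes:
        status[r["status"]] = status.get(r["status"], 0) + 1
        kinds[r["refreshType"]] = kinds.get(r["refreshType"], 0) + 1
        if r.get("startTime") and r.get("endTime"):
            # API timestamps end in "Z"; swap for "+00:00" so fromisoformat parses them
            start = datetime.fromisoformat(r["startTime"].replace("Z", "+00:00"))
            end = datetime.fromisoformat(r["endTime"].replace("Z", "+00:00"))
            durations.append((end - start).total_seconds())
    return {
        "status_counts": status,
        "refresh_types": kinds,
        "avg_seconds": sum(durations) / len(durations) if durations else None,
    }
```

Writing these summaries to a DWH table on a schedule is what turns the one-off API call into the monitoring report described above.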

 

I have also created a report showing who is using the apps and when.

 

It has all been created using REST APIs with an SPN (service principal), where I stored the credentials in Azure Key Vault. You can use Azure Data Factory, Fabric pipelines, or Power Automate.
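For the SPN part: the token is a standard Entra ID client-credentials request against the tenant's token endpoint, using the Power BI service scope. A stdlib-only sketch (the tenant/client values are placeholders, and in production the secret would come from Key Vault rather than being passed around in code):

```python
import json
import urllib.parse
import urllib.request

# Scope for app-only calls to the Power BI REST APIs
SCOPE = "https://analysis.windows.net/powerbi/api/.default"

def token_request(tenant_id: str, client_id: str, client_secret: str):
    """Build (url, form body) for the OAuth2 client-credentials grant."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": SCOPE,
    }).encode()
    return url, body

def acquire_token(tenant_id: str, client_id: str, client_secret: str) -> str:
    """POST the grant and return the bearer token for Authorization headers."""
    url, body = token_request(tenant_id, client_id, client_secret)
    with urllib.request.urlopen(urllib.request.Request(url, data=body)) as resp:
        return json.load(resp)["access_token"]
```

The returned token goes into the `Authorization: Bearer ...` header of the refresh-history and activity-event calls; libraries like MSAL wrap this same flow if you prefer not to hand-roll it.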

 

Are you using a Premium capacity, or Pro licenses only? Are you using a gateway, or is your DWH in the cloud? Power BI also has a custom API connector you can use to store all the data in a semantic data model, but I think the best way is to store the data in a DWH table or in OneLake.

 

If I had all the access needed, could use ADF or Fabric pipelines, and could store all the data in tables on a DWH, I think I could set it all up in a week or so.

It would then be pretty straightforward for you to maintain.

 

br

Marius
