bdpr_95
Helper I

Extracting Report and Data Source Metadata from Power BI

Hi,

 

I have a request from my manager who wants to know if it's possible to extract from Power BI the information about how many reports we have, what the data sources are (e.g., Snowflake), and, if possible, the schema used (e.g., SAP schema). Can anyone help me with this? Is it possible to do this using notebooks or something similar?

 

Thanks!

13 REPLIES
v-tejrama
Community Support

Hi @bdpr_95 ,

 

You’re on the right path with the Power BI REST APIs, as Greg mentioned. These APIs are useful for getting a list of datasets and their data sources, which can help you identify connections to systems like Snowflake or SAP. However, the APIs don’t provide detailed schema information, such as table or field-level details from the source. To get this level of detail, you’ll need to connect to the dataset using the XMLA endpoint with tools like SSMS, DAX Studio, or Tabular Editor, which allow you to run DMV queries for schema metadata. If you’re working with Microsoft Fabric, you could also use notebooks to automate this process. Some third-party tools like Power BI Helper might offer additional support, though schema-level details may vary.

In short, the REST APIs give you an overview, but for detailed schema information you’ll need to use external tools.
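To make the REST-API half of this concrete, here is a hedged sketch of reducing a "Datasets - Get Datasources" response to a simple source inventory. The sample payload is illustrative only (modeled on the documented response shape, not captured from a live tenant), and a real run would also need an Azure AD access token, which is not shown:

```python
# Sketch: summarize a "Datasets - Get Datasources" REST API response.
# The sample payload below is illustrative, not from a live tenant.

def summarize_datasources(api_response):
    """Reduce a Get Datasources response to (type, server, database) rows."""
    rows = []
    for src in api_response.get("value", []):
        details = src.get("connectionDetails") or {}
        rows.append({
            "type": src.get("datasourceType"),
            "server": details.get("server"),
            "database": details.get("database"),
        })
    return rows

# Illustrative payload shaped like the documented response:
sample = {
    "value": [{
        "datasourceType": "Extension",
        "connectionDetails": {
            "server": "myaccount.snowflakecomputing.com",
            "database": "SALES_DB",
        },
    }]
}
print(summarize_datasources(sample))
```

Note that connectors such as Snowflake often surface with a generic datasource type like "Extension", and the exact connectionDetails keys vary by connector, so inspect a real response before relying on specific fields.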

Thank you.

 

But is it possible to build something automated that analyzes all datasets in a workspace or across multiple workspaces? Doing it report by report makes the process very time-consuming.

Hi @bdpr_95 ,

 

Yes, your follow-up question is absolutely valid.

Doing this analysis report by report can definitely become very time-consuming.

While the Power BI REST APIs are helpful for getting a list of datasets and their data sources, they don’t provide deep schema-level information like table or field names.

 

To build something fully automated that analyzes all datasets across one or more workspaces with detailed schema, you'll need to go beyond REST APIs. You can use tools like SSMS, DAX Studio, or Tabular Editor via the XMLA endpoint to run DMV queries and extract that metadata. If you're working in Microsoft Fabric, you could also automate this through notebooks.

 

So yes, automation is possible, but full schema-level analysis will involve a mix of REST APIs and external tools.
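As a hedged sketch of the REST half of that mix, the loop below walks every workspace, then every dataset, and collects each dataset's data sources. The endpoints are the documented /groups, /datasets, and /datasources routes; `call_api` is a stand-in for an authenticated GET (e.g. `requests.get` with a bearer token), and the fake responses at the bottom exist only so the traversal logic can be exercised without a tenant:

```python
# Sketch: inventory data sources across all workspaces via the Power BI REST API.
# call_api is a stand-in for an authenticated GET returning parsed JSON.
BASE = "https://api.powerbi.com/v1.0/myorg"

def inventory(call_api):
    rows = []
    for ws in call_api(f"{BASE}/groups")["value"]:          # all workspaces
        for ds in call_api(f"{BASE}/groups/{ws['id']}/datasets")["value"]:
            url = f"{BASE}/groups/{ws['id']}/datasets/{ds['id']}/datasources"
            for src in call_api(url)["value"]:
                rows.append({
                    "workspace": ws["name"],
                    "dataset": ds["name"],
                    "source_type": src.get("datasourceType"),
                })
    return rows

# Illustrative fake API so the traversal runs without a live tenant:
fake = {
    f"{BASE}/groups": {"value": [{"id": "w1", "name": "Finance"}]},
    f"{BASE}/groups/w1/datasets": {"value": [{"id": "d1", "name": "Sales"}]},
    f"{BASE}/groups/w1/datasets/d1/datasources": {"value": [{"datasourceType": "Sql"}]},
}
print(inventory(fake.get))
```

In a real script you would replace `fake.get` with a function that calls the API with a service-principal token and handles paging and throttling.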

 

Thank you,

Tejaswi.

Hi @v-tejrama,

 

Can you provide some guidance? Do you have any documentation?

 

thanks a lot!

Hello again @bdpr_95,

Here are the official Microsoft documentation links related to this topic; please have a look for detailed guidance.

 

To achieve this, you’ll need to follow a combination of REST API access and XMLA connectivity. Here’s the official documentation that can help you get started:

Register an Azure App for Power BI REST API access:
https://learn.microsoft.com/en-us/power-bi/developer/register-app

Authenticate using Service Principal (ideal for automation):
https://learn.microsoft.com/en-us/power-bi/developer/embedded/embed-service-principal

Connect to Power BI Datasets using XMLA Endpoint:
You can use tools like SSMS or DAX Studio to query model metadata.
https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-connect-tools

 

Schema Discovery using DMV Queries (TMSCHEMA):

Run queries like "SELECT * FROM $SYSTEM.TMSCHEMA_TABLES" to explore the model structure.

https://learn.microsoft.com/en-us/analysis-services/instances/use-dynamic-management-views-dmvs-to-m...

 

Using the XMLA endpoint, you can connect to your dataset and query internal schema information such as table names, columns, relationships, etc. This is very useful when REST APIs fall short on detailed metadata.
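For reference, the TMSCHEMA DMV queries typically used for this schema discovery are collected below. Only the query text is shown; executing them requires an XMLA-capable client (SSMS, DAX Studio, Tabular Editor, or a library such as pyadomd), which is assumed here:

```python
# Standard TMSCHEMA DMV queries for schema discovery over the XMLA endpoint.
# Run each one through an XMLA client; only the query text is shown here.
DMV_QUERIES = {
    "tables":        "SELECT * FROM $SYSTEM.TMSCHEMA_TABLES",
    "columns":       "SELECT * FROM $SYSTEM.TMSCHEMA_COLUMNS",
    "relationships": "SELECT * FROM $SYSTEM.TMSCHEMA_RELATIONSHIPS",
    "measures":      "SELECT * FROM $SYSTEM.TMSCHEMA_MEASURES",
    # Partitions carry the M source expressions, which is where upstream
    # schema names (e.g. a Snowflake or SAP schema) usually appear.
    "partitions":    "SELECT * FROM $SYSTEM.TMSCHEMA_PARTITIONS",
}

for name, query in DMV_QUERIES.items():
    print(f"{name:14} {query}")
```

The partitions rowset is often the most useful one for the original question, since the M expressions it returns name the source database and schema each table loads from.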

 

Thank you.

 

But like this, I can only do it report by report. Imagine I have 1000 reports; this is going to be a painful task.

 Hi again @bdpr_95 ,

 

Yes, I completely understand your concern. Doing this one by one for each report can get very time-consuming, especially when you're dealing with hundreds or thousands of reports.

 

To handle this more efficiently, you can actually automate the entire process using a combination of Power BI REST APIs and XMLA connectivity. Using REST APIs, you can loop through all the workspaces and fetch the list of reports or datasets programmatically.

Then, for each dataset, you can connect via the XMLA endpoint and run TMSL or DMV queries to get detailed metadata like tables, columns, relationships, etc.

This approach helps you avoid doing it manually for each report and makes it much more scalable. You can use PowerShell or Python to script this out, and even schedule it to run regularly if needed.
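One scripting detail worth spelling out: each dataset is addressed over XMLA by a connection string built from the workspace's powerbi:// address and, for unattended runs, service-principal credentials in the documented app:&lt;client-id&gt;@&lt;tenant-id&gt; form. The helper below only builds that string (all argument values are placeholders); passing it to an ADOMD/AMO client to actually run the DMV queries is assumed, not shown:

```python
# Sketch: build an XMLA connection string for a Premium/Fabric workspace.
# All argument values are placeholders; hand the result to an ADOMD/AMO client.

def xmla_connection_string(workspace, dataset, client_id, client_secret, tenant_id):
    return (
        "Provider=MSOLAP;"
        f"Data Source=powerbi://api.powerbi.com/v1.0/myorg/{workspace};"
        f"Initial Catalog={dataset};"
        f"User ID=app:{client_id}@{tenant_id};"   # service principal identity
        f"Password={client_secret};"
    )

print(xmla_connection_string("Finance", "Sales", "app-id", "secret", "tenant-id"))
```

Combined with the REST API loop over workspaces and datasets, this lets one scheduled script generate connection strings for every dataset and pull its schema metadata in a single pass.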

 

Thank you.

 

Hi @bdpr_95 ,

 

Just checking in: have you been able to resolve this issue? If so, it would be greatly appreciated if you could mark the most helpful reply accordingly. This helps other community members quickly find relevant solutions.

Thank you.

 

Greg_Deckler
Community Champion

@bdpr_95 I would take a look at the Power BI REST API's. You should be able to get all datasets (semantic models) and then get all of the data sources for each dataset:

Datasets - Get Datasources - REST API (Power BI REST APIs) | Microsoft Learn



Follow on LinkedIn
@ me in replies or I'll lose your thread!!!
Instead of a Kudo, please vote for this idea
Become an expert!: Enterprise DNA
External Tools: MSHGQM
YouTube Channel!: Microsoft Hates Greg
Latest book!:
DAX For Humans

DAX is easy, CALCULATE makes DAX hard...

Unfortunately, this only gives me the datasource. I was hoping to also get the schema used.

 

Thanks anyway!

@bdpr_95 Yeah, not sure about the schema. Might have to look for a third party tool that would do that although I don't know of one off the top of my head. Here is the ebook link. https://www.amazon.com/DAX-Humans-CALCULATE-Guide-Makes-ebook/dp/B0FGLRYWW8

I'll also provide you with a PDF if you shoot me a proof of purchase on LinkedIn or gdeckler @ gmail.




If I buy it in Kindle format, will you send me a PDF version?

Hi Greg, how can I get the eBook version of your book?
