Hi! I have a situation I could use some help with.
Because we haven't been able to get either Alation or Purview to work properly end-to-end (complete documentation from source to Power BI dataset), we've been trying to develop stopgap solutions to document our data models. I developed a small solution using this article as guidance. It works very well, but the problem is that it only documents one model at a time.
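For context, the metadata comes from the TMSCHEMA DMVs. Roughly speaking (the exact list follows the article, so treat the names below as illustrative), the queries look like this:

```powershell
# Illustrative only: the TMSCHEMA DMVs that cover the metadata we document.
# Single quotes keep PowerShell from interpolating $SYSTEM.
$dmvQueries = @{
    Tables        = 'SELECT * FROM $SYSTEM.TMSCHEMA_TABLES'
    Columns       = 'SELECT * FROM $SYSTEM.TMSCHEMA_COLUMNS'
    Measures      = 'SELECT * FROM $SYSTEM.TMSCHEMA_MEASURES'
    Relationships = 'SELECT * FROM $SYSTEM.TMSCHEMA_RELATIONSHIPS'
    CalcGroups    = 'SELECT * FROM $SYSTEM.TMSCHEMA_CALCULATION_GROUPS'
    CalcItems     = 'SELECT * FROM $SYSTEM.TMSCHEMA_CALCULATION_ITEMS'
}
```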
We have dozens of "enterprise" datasets, so we'd like to extend this solution to accommodate more than one model.
The idea we're currently running with is to create a database with a list of our targeted data models, along with tables for all the metadata we want to collect. That way we could simply add a slicer to select a dataset, and our users are set. Our intention was to develop an ADF pipeline to iterate through the targeted models, execute a series of DMV queries against each Power BI dataset, write the results to the appropriate tables, and be done. But, as it turns out, ADF does not appear to have a connector for Azure Analysis Services, which complicates things.
I've Googled and found lots of info on how to process AAS/Power BI datasets from ADF, but nothing on how to connect and execute DMV queries. Can anyone provide some ideas on where/how to start with this? @GilbertQ I would greatly appreciate your assistance, if you're willing 🙂
As an alternative, if anyone knows of a way to use API calls to extract the metadata, that would be just as acceptable a solution 🙂

Thank you everyone for any help!
You could look at using the Scanner API, which will have almost all of the information you are looking for.
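As a rough sketch of the flow (assuming you already have a bearer token for an account or app with Power BI admin API permissions, and substituting your own workspace IDs):

```powershell
# Minimal sketch of the Scanner (admin) API flow; $token and the workspace
# ID are placeholders you would supply yourself.
$headers = @{ Authorization = "Bearer $token" }
$base    = "https://api.powerbi.com/v1.0/myorg/admin/workspaces"

# 1. Trigger a scan for the workspaces you care about, including dataset schema
$body = @{ workspaces = @("<workspace-id>") } | ConvertTo-Json
$scan = Invoke-RestMethod -Method Post -Headers $headers `
    -Uri "$base/getInfo?datasetSchema=true&datasetExpressions=true" `
    -ContentType "application/json" -Body $body

# 2. Poll until the scan finishes
do {
    Start-Sleep -Seconds 5
    $status = Invoke-RestMethod -Headers $headers -Uri "$base/scanStatus/$($scan.id)"
} while ($status.status -ne "Succeeded")

# 3. Pull the result: one JSON document covering the scanned datasets
$result = Invoke-RestMethod -Headers $headers -Uri "$base/scanResult/$($scan.id)"
```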
If you want something more specific, it can be done with functions in Power Query that loop through the data and load it into a dataset.
I would typically advise using something like PowerShell to extract the data you need into Azure Blob Storage, so that you always have the source files saved. Then create a dataset from the files in Azure Blob Storage.
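As a very rough sketch of that approach (assuming the SqlServer module for Invoke-ASCmd and the Az.Storage module; all server, database, and storage names are placeholders):

```powershell
# Rough sketch: run DMVs against each targeted AAS model and land the raw
# results in Azure Blob Storage. Names below are placeholders.
Import-Module SqlServer     # provides Invoke-ASCmd
Import-Module Az.Storage

$server = "asazure://<region>.asazure.windows.net/<yourserver>"
$cred   = Get-Credential    # an AAD account with read access to the models
$ctx    = New-AzStorageContext -StorageAccountName "<account>" `
              -UseConnectedAccount   # after Connect-AzAccount

$dmvQueries = @{            # extend with the full DMV list you already use
    Tables        = 'SELECT * FROM $SYSTEM.TMSCHEMA_TABLES'
    Relationships = 'SELECT * FROM $SYSTEM.TMSCHEMA_RELATIONSHIPS'
}

foreach ($db in @("Model1", "Model2")) {    # your list of targeted models
    foreach ($name in $dmvQueries.Keys) {
        # Invoke-ASCmd returns the DMV resultset as an XML string
        $xml  = Invoke-ASCmd -Server $server -Database $db `
                    -Credential $cred -Query $dmvQueries[$name]
        $file = Join-Path $env:TEMP "$db`_$name.xml"
        $xml | Out-File -FilePath $file -Encoding utf8

        # One blob per model/DMV so the raw source files are always preserved
        Set-AzStorageBlobContent -Context $ctx -Container "model-metadata" `
            -File $file -Blob "$db/$name.xml" -Force | Out-Null
    }
}
```

An ADF pipeline (or an Azure Automation schedule) would then only need to trigger this script, which sidesteps the missing AAS connector.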
Hi @GilbertQ Thank you for replying. After I posted the comment about APIs, I did some Googling and found this. Everything is there.

Hi @GilbertQ I wanted to follow up: not everything was there after all. Specifically, relationships and calculation groups/items are not included in the output from the API call. Those are critical for documentation, so it's back to the drawing board. But thank you again!