I'd like to scan all workspaces and find their datasourceInstances details. Currently, when I run the Admin - WorkspaceInfo GetScanResult call, I can get the connection details as:
'connectionDetails': {'server': 'corp\\ser1', 'database': 'database1'},
But I'd also like to know the table or view names in the database that each dataset connects to. Is there any way I can get that information?
Thanks,
Hi @Boji ,
Yes, kind of, but it's not as nicely prepared as the connection details, and it requires Premium workspaces.
Basically, the dataset structure is: a dataset contains tables, each table has one or more partitions, and each partition holds the Power Query (M) source expression.
If you have some kind of Premium workspace, you can get the Power Query of each table from the XMLA endpoint and parse the SQL tables out of the Power Query code. The simpler your Power Queries are, the simpler this implementation can stay. For an unrestricted solution, there is a Power Query parser written in TypeScript available on GitHub that could help you, along with the list of supported data sources that can occur in query code from the Power Query documentation. But with SQL Server queries only, you can probably keep it much simpler; see the example below.
Implementation
Install-Module -Name SqlServer -Scope CurrentUser
$cred = Get-Credential
$XmlaCommand = @"
<Discover xmlns='urn:schemas-microsoft-com:xml-analysis'>
<RequestType>TMSCHEMA_PARTITIONS</RequestType>
<Restrictions>
<RestrictionList>
<DatabaseName>MyDatasetName</DatabaseName>
</RestrictionList>
</Restrictions>
<Properties></Properties>
</Discover>
"@
Invoke-ASCmd -Credential $cred -Server "powerbi://api.powerbi.com/v1.0/myorg/MyWorkspaceName" -Database "MyDatasetName" -Query $XmlaCommand
get-tabular-model-schema-partitions.ps1

Be aware that you need to URL-encode the workspace name.
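For the URL-encoding, a minimal Python sketch (since a Python-based automation is also an option here; the workspace name is a placeholder):

```python
from urllib.parse import quote

def xmla_endpoint(workspace_name: str) -> str:
    """Build the XMLA endpoint URL, URL-encoding the workspace name."""
    return "powerbi://api.powerbi.com/v1.0/myorg/" + quote(workspace_name)

print(xmla_endpoint("My Workspace Name"))
# powerbi://api.powerbi.com/v1.0/myorg/My%20Workspace%20Name
```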
...
<QueryDefinition>let
Source = Sql.Database("mysqlserver.database.windows.net", "AdventureWorksDW2019"),
dbo_FactResellerSales = Source{[Schema="dbo",Item="FactResellerSales"]}[Data]
in
dbo_FactResellerSales</QueryDefinition>
...
The <QueryDefinition> element in the TMSCHEMA_PARTITIONS response (XML).
In this example, the connection is server mysqlserver / database AdventureWorksDW2019, and the table is FactResellerSales. All columns are selected. If you want all the column names for your lineage, you need to query the SQL Server itself if they are not listed in the Power Query code.
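For simple queries like the one above, the parsing can be done with two regular expressions rather than a full Power Query parser. A minimal Python sketch, assuming the M code follows the plain `Sql.Database(...)` plus `Source{[Schema=...,Item=...]}` navigation pattern (anything more complex needs a real parser):

```python
import re

# Matches the simple pattern produced by the SQL Server connector:
#   Source = Sql.Database("server", "database"),
#   x = Source{[Schema="dbo",Item="Table"]}[Data]
SOURCE_RE = re.compile(r'Sql\.Database\s*\(\s*"([^"]+)"\s*,\s*"([^"]+)"')
TABLE_RE = re.compile(r'\{\[\s*Schema\s*=\s*"([^"]+)"\s*,\s*Item\s*=\s*"([^"]+)"\s*\]\}')

def extract_sql_sources(m_code: str):
    """Return (server, database, schema, table) tuples found in simple M code."""
    sources = SOURCE_RE.findall(m_code)
    tables = TABLE_RE.findall(m_code)
    return [(srv, db, schema, item)
            for (srv, db) in sources
            for (schema, item) in tables]

m = '''let
    Source = Sql.Database("mysqlserver.database.windows.net", "AdventureWorksDW2019"),
    dbo_FactResellerSales = Source{[Schema="dbo",Item="FactResellerSales"]}[Data]
in
    dbo_FactResellerSales'''
print(extract_sql_sources(m))
# [('mysqlserver.database.windows.net', 'AdventureWorksDW2019', 'dbo', 'FactResellerSales')]
```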
Thoughts
BR
Martin
Martin,
Your answer is amazingly helpful. There is a lot here that is new to me, and I will need time to digest it and put it into practice. Could you suggest the most feasible approach for the scenario below?
Our organization is moving its databases from an existing on-premises server to the Snowflake cloud. We will need to update the datasets in Power BI to point to the new location. We will have a one-to-one mapping of existing table/view name, database, and server name to the future Snowflake table/view name. Since hundreds of Power BI reports would be impacted, I am working on a script, in either Python or PowerShell, to automate the transformation.
A little more detail: our company is on Premium capacity, and I have the Fabric administrator role. The workspaces whose datasets I need to redirect include both Premium and Pro workspaces. The Power Query is typically simple, without incremental refresh (though some tables are refreshed on a schedule). That is to say, most datasets have only one query (a single query may load multiple SQL tables/views and do ETL in Power Query). I need to write a script that finds all SQL tables/views in the Power Query and redirects them to the future Snowflake storage location. Do you have any suggestions for the best route? I don't mind working on the Power Query parsing as long as I can get the Power Query script and update it with the new table names in Snowflake.
Thanks for your kind help in advance,
Boji
Hi @Boji ,
Since the Power Query function to load from SQL Server is Sql.Database or Sql.Databases, and for Snowflake it's Snowflake.Databases, you need to change the Power Queries anyway. You cannot easily use features like "change data source", which is more useful for something like switching from a dev to a prod database within the same technology.
If you are lucky, you just need to replace the first lines of a Power Query that gets its data from SQL Server with lines that get the same table from Snowflake. Create one Power Query from Snowflake manually and compare the code patterns for the SQL Server and Snowflake sources. Then you can write code to extract the database and table/view names, generate the Snowflake Power Query code, and replace the lines in the existing Power Query code.
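That generation step could look roughly like the Python sketch below. The Snowflake M pattern here (Snowflake.Databases plus Name-based navigation) should be verified against a query you create manually in Power BI Desktop, and the server, warehouse, and mapping entries are placeholder assumptions:

```python
# Sketch: generate the Snowflake variant of the simple navigation pattern,
# driven by a one-to-one mapping table. Verify the exact M pattern against a
# manually created Snowflake query; all names below are placeholders.

MAPPING = {
    # (sql_server, sql_db, schema, table) -> (sf_server, sf_warehouse, sf_db, sf_schema, sf_table)
    ("corp\\ser1", "database1", "dbo", "FactResellerSales"):
        ("myorg.snowflakecomputing.com", "MY_WH", "ANALYTICS", "PUBLIC", "FACT_RESELLER_SALES"),
}

def snowflake_m(sf_server, sf_warehouse, sf_db, sf_schema, sf_table):
    """Emit M code that navigates server -> database -> schema -> table."""
    return (f'let\n'
            f'    Source = Snowflake.Databases("{sf_server}", "{sf_warehouse}"),\n'
            f'    Database = Source{{[Name="{sf_db}"]}}[Data],\n'
            f'    Schema = Database{{[Name="{sf_schema}"]}}[Data],\n'
            f'    Table = Schema{{[Name="{sf_table}"]}}[Data]\n'
            f'in\n'
            f'    Table')

new_code = snowflake_m(*MAPPING[("corp\\ser1", "database1", "dbo", "FactResellerSales")])
print(new_code)
```

The idea is to parse the old Sql.Database lines, look up the tuple in the mapping, and splice the generated block in place of the old source lines.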
Now, the tricky part might be how to access, read, and change the Power Query code from a script. Basically, you have two options:
Kind regards,
Martin