Miruthan
Advocate II

Where can I find the published report in Power BI Report Server?

Hi experts,

Could you please help me find the published report in Power BI Report Server?

 

I am trying to find the published report in the PBIRS folder on the Power BI Report Server machine, but I couldn't. Where does the published report get stored on the report server?

Thanks

12 REPLIES
Anonymous
Not applicable

The metadata about what is stored is in the dbo.Catalog table in the ReportServer database. The actual report definition is in the dbo.CatalogItemExtendedContent table. You join them on ItemID.

 

I don't think you get PBIX files on disk anywhere; I certainly haven't been able to find any.

 

The column you need from dbo.CatalogItemExtendedContent is the Content column, and you select it with something like this:

 

SELECT  CT.[Path]
       ,CT.[Type]
       ,cc.ContentType
       ,CONVERT(varbinary(max), cc.[Content]) AS BinaryContent
FROM dbo.[Catalog] AS CT
    LEFT OUTER JOIN dbo.CatalogItemExtendedContent cc
        ON CT.ItemID = cc.ItemId
WHERE CT.[Type] IN (13) -- this indicates a Power BI report
    AND cc.ContentType = 'CatalogItem'
    AND CT.ItemID = 'AE1BC1C9-BB25-4D4C-AFD5-2A1AC49C281F'

You can use some PowerShell to export the results of this query (much as you could do with the old RDL and RDS data). Essentially, if you save the BinaryContent to a file you should have a working PBIX.
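
For a single report, a minimal sketch of that export could look like the following. It assumes Windows authentication, and the server name, output path and ItemID are placeholders to swap for your own:

# Sketch: export one Power BI report's CatalogItem blob straight to a .pbix file.
# $server, $outFile and the ItemID are placeholders for your environment.
[string] $server  = "YourServerName";
[string] $outFile = "C:\Temp\ExportedReport.pbix";

$sql = "SELECT CONVERT(varbinary(max), cc.[Content]) AS BinaryContent
        FROM dbo.[Catalog] AS ct
            INNER JOIN dbo.CatalogItemExtendedContent cc ON ct.ItemID = cc.ItemId
        WHERE ct.[Type] = 13                        -- 13 = Power BI report
          AND cc.ContentType = 'CatalogItem'        -- the full, working PBIX
          AND ct.ItemID = 'AE1BC1C9-BB25-4D4C-AFD5-2A1AC49C281F'";   # example ItemID from above

$con = New-Object Data.SqlClient.SqlConnection;
$con.ConnectionString = "Data Source=$server;Initial Catalog=ReportServer;Integrated Security=True;";
$con.Open();

$cmd = New-Object Data.SqlClient.SqlCommand $sql, $con;
$bytes = $cmd.ExecuteScalar();                      # single row, single column -> byte[]
[System.IO.File]::WriteAllBytes($outFile, $bytes);

$con.Close();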

 

There are a couple of wrinkles with this that are PBIX-specific.

 

There are multiple rows for each report in dbo.CatalogItemExtendedContent; these are for:

 

CatalogItem

DataModel

PowerBIReportDefinition

 

The CatalogItem seems to be the file you deploy, so this is the one you want to extract. It's the working PBIX file.

 

The PowerBIReportDefinition seems to be the same, but with the DataModel file removed from the zipped PBIX.

 

The DataModel is the file that has been extracted out. I assume this is so you can set up a schedule to refresh the data on its own and not have to roll it back inside the PBIX definition.

 

Of course, for some PBIX files (those using LiveConnections) there is no DataModel entry in dbo.CatalogItemExtendedContent, so the Content of the CatalogItem row and the PowerBIReportDefinition row seem to be the same. Since these kinds of connections cannot be refreshed on a schedule, there's no point in having a DataModel entry. I'm guessing PBI-SSRS just figures this out itself.
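
If you want to see which of those rows each deployed report actually has (the live-connection ones will have no DataModel row), something along these lines should show it; the server name is a placeholder:

# Sketch: list the ContentType rows per Power BI report, so you can spot
# live-connection reports (no DataModel row) at a glance.
[string] $server = "YourServerName";   # placeholder instance name

$sql = "SELECT ct.[Path], cc.ContentType
        FROM dbo.[Catalog] AS ct
            INNER JOIN dbo.CatalogItemExtendedContent cc ON ct.ItemID = cc.ItemId
        WHERE ct.[Type] = 13
        ORDER BY ct.[Path], cc.ContentType";

$con = New-Object Data.SqlClient.SqlConnection;
$con.ConnectionString = "Data Source=$server;Initial Catalog=ReportServer;Integrated Security=True;";
$con.Open();
$cmd = New-Object Data.SqlClient.SqlCommand $sql, $con;
$rd = $cmd.ExecuteReader();
While ($rd.Read())
{
    Write-Output ("{0}  ->  {1}" -f $rd.GetString(0), $rd.GetString(1));
}
$rd.Close();
$con.Close();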

 

If you are stuck tracking down a PowerShell script to extract the report definition then let me know. We have one that we run every hour so we can keep a backup of all the PBIX files as they change. People overwrite them incorrectly surprisingly frequently.

 

Some form of version control in the DB would be useful, but operational stuff is probably low down the priority list on the dev stack.

 

 

 

Anonymous
Not applicable

stpnet, I am interested in that PowerShell script to extract the report definitions. That sounds like a great way to assist with our source control. 🙂
Anonymous
Not applicable

It's going to look something like this:

 

<# .SYNOPSIS
      Export of all SSRS reports, data sources and images
   .DESCRIPTION
      This PowerShell script exports all (or filtered) reports, data sources and images directly from the ReportServer database
      to a specified folder. The complete report path is used for the file name; invalid file name characters are replaced with a -.
      Reports are exported with .rdl as extension, data sources with .rds and resources without any additional extension.
      Please change the "Configuration data" below to your environment.
      Works with SQL Server 2005 and higher versions in all editions.
      Requires SELECT permission on the ReportServer database.
   .NOTES
      Author  : Olaf Helper
      Requires: PowerShell Version 1.0, ADO.NET assembly
   .LINK
      GetSqlBinary: http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqldatareader.getsqlbinary.aspx
#>


# Configuration data
[string] $server   = "YourServerName";        # SQL Server Instance.
[string] $database = "ReportServer";        # ReportServer Database.
[string] $folder   = "C:\WorkRepo\Export_PBI_SSRS\";          # Path to export the reports to.

# Select statement for file name & blob data, with filter.
$sql = "SELECT  CT.[Path]
        ,CT.[Type]
        ,ISNULL(cc.ContentType, 'SSRS') AS ContentType
        ,CONVERT(varbinary(max), cc.[Content]) AS PBI_BinaryContent
        ,CONVERT(varbinary(max), CT.[Content]) AS RDL_BinaryContent
FROM dbo.[Catalog] AS CT
    LEFT OUTER JOIN dbo.CatalogItemExtendedContent cc
        ON CT.ItemID = cc.ItemId
WHERE CT.[Type] IN (2, 5, 8, 13)
    AND ISNULL(cc.ContentType, 'CatalogItem') = 'CatalogItem'";
#   Alternative filter, e.g. shared datasets only: WHERE CT.[Type] IN (8)

# Open ADO.NET Connection with Windows authentication.
$con = New-Object Data.SqlClient.SqlConnection;
$con.ConnectionString = "Data Source=$server;Initial Catalog=$database;Integrated Security=True;";
$con.Open();

Write-Output ((Get-Date -format yyyy-MM-dd-HH:mm:ss) + ": Started ...");

# New command and reader.
$cmd = New-Object Data.SqlClient.SqlCommand $sql, $con;
$rd = $cmd.ExecuteReader();

$invalids = [System.IO.Path]::GetInvalidFileNameChars();
# Looping through all selected datasets.
While ($rd.Read())
{
    Try
    {
        # Get the name and make it valid.
        $name = $rd.GetString(0);
        Write-Output "fetching $name"
        foreach ($invalid in $invalids)
           {    $name = $name.Replace($invalid, "-");    }

        If ($rd.GetInt32(1) -eq 2)
            {    $name = $name + ".rdl";    }
        ElseIf ($rd.GetInt32(1) -eq 5)
            {    $name = $name + ".rds";    }
        ElseIf ($rd.GetInt32(1) -eq 8)
            {    $name = $name + ".rsd";    }
        ElseIf ($rd.GetInt32(1) -eq 11)
            {    $name = $name + ".kpi";    }
        ElseIf ($rd.GetInt32(1) -eq 13)
            {    $name = $name + ".pbix";    }

        Write-Output ((Get-Date -format yyyy-MM-dd-HH:mm:ss) + ": Exporting {0}" -f $name);

        $name = [System.IO.Path]::Combine($folder, $name);

        # New BinaryWriter; existing file will be overwritten.
        $fs = New-Object System.IO.FileStream ($name), Create, Write;
        $bw = New-Object System.IO.BinaryWriter($fs);

        # Read of complete Blob with GetSqlBinary
        if ($rd.GetString(2) -eq "SSRS") {
            $bt = $rd.GetSqlBinary(4).Value;
        } else {
            $bt = $rd.GetSqlBinary(3).Value;
        }

        $bw.Write($bt, 0, $bt.Length);
        $bw.Flush();
        $bw.Close();
        $fs.Close();
    }
    Catch
    {
        Write-Output ($_.Exception.Message)
    }
    Finally
    {
        $fs.Dispose();
    }
}

# Closing & Disposing all objects
$rd.Close();
$cmd.Dispose();
$con.Close();
$con.Dispose();

Write-Output ((Get-Date -format yyyy-MM-dd-HH:mm:ss) + ": Finished");

Hi @Anonymous, thanks for sharing this script, it just saved my day... I was having an issue with my Power BI Report Server and needed to reinstall it, but I was concerned about backing up the reports. This is one of the easiest ways to back them up, and it also makes the restore easier if a user simply deletes or messes up the reports 🙂

 

Thank you

Anonymous
Not applicable

Stpnet,

you saved my life with this script.

I just changed the connection string as follows:

$con.ConnectionString = "Data Source=$server;Initial Catalog=$database;Integrated Security=False; User ID=sa;Password=mypass;";

 

and it worked perfectly.

 

Thanks a lot !

Anonymous
Not applicable

Thanks for this script. It is life-changing
Anonymous
Not applicable

LOL! You need to get out more! Mark it as an answer if it's useful.

Anonymous
Not applicable

Sorry, just realised you were looking for the version control script we use.

 

It's basically the same; we just assemble the SQL with a WHERE clause looking at both the Created and Modified columns in dbo.Catalog, picking up anything that has changed in the last hour and a bit (we schedule the script to run every hour).
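
Roughly, that incremental filter can be bolted onto the same SELECT the script uses; this sketch assumes the dbo.Catalog date columns (CreationDate and ModifiedDate) and a 70-minute lookback so nothing slips between hourly runs:

# Sketch: only pick up items created or modified recently.
# 70 minutes gives a little overlap on an hourly schedule.
[int] $lookbackMinutes = 70;

$sql = "SELECT  CT.[Path]
        ,CT.[Type]
        ,ISNULL(cc.ContentType, 'SSRS') AS ContentType
        ,CONVERT(varbinary(max), cc.[Content]) AS PBI_BinaryContent
        ,CONVERT(varbinary(max), CT.[Content]) AS RDL_BinaryContent
FROM dbo.[Catalog] AS CT
    LEFT OUTER JOIN dbo.CatalogItemExtendedContent cc
        ON CT.ItemID = cc.ItemId
WHERE CT.[Type] IN (2, 5, 8, 13)
    AND ISNULL(cc.ContentType, 'CatalogItem') = 'CatalogItem'
    AND (CT.CreationDate >= DATEADD(minute, -$lookbackMinutes, GETDATE())
      OR CT.ModifiedDate >= DATEADD(minute, -$lookbackMinutes, GETDATE()))";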

 

We also add a timestamp to the output path so we know when we exported it. You have to manage the output location of course, otherwise it just grows for the rest of history, but there are PowerShell scripts out there on the internet to delete stuff older than x days.
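
A rough sketch of the timestamped output folder plus the clean-up of old exports; the base path and the 30-day retention are just assumptions:

# Sketch: write each run into a timestamped subfolder, then prune old ones.
[string] $baseFolder = "C:\WorkRepo\Export_PBI_SSRS";   # placeholder base path
[string] $folder = Join-Path $baseFolder (Get-Date -Format "yyyy-MM-dd_HHmm");
New-Item -ItemType Directory -Path $folder -Force | Out-Null;

# ... run the export loop above, writing into $folder ...

# Delete export folders older than 30 days.
Get-ChildItem $baseFolder -Directory |
    Where-Object { $_.CreationTime -lt (Get-Date).AddDays(-30) } |
    Remove-Item -Recurse -Force;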

 


Hi @Anonymous @Anonymous

 

Is there any way to update the connection string of our Power BI report without opening the file in Power BI Desktop, using a PowerShell script?

Anonymous
Not applicable

I suspect not, though it may depend on what type of connection you have.

 

If you are using Live Query then the connection is stored inside the PBIX file, so you could unzip it, alter the connection details and rezip it. I have no idea if this would work, as I have no practical way of testing it.
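
For what it's worth, here is a sketch of just the inspection half of that idea, opening the PBIX as a zip and reading the Connections entry (the part that usually holds live-connection details). Whether an edited and re-zipped copy would still deploy is untested, as said above; the path is a placeholder:

# Sketch only: open a PBIX as a zip archive and dump its entries.
# Editing and re-zipping is the untested part mentioned above.
Add-Type -AssemblyName System.IO.Compression.FileSystem;

[string] $pbixPath = "C:\Temp\ExportedReport.pbix";   # placeholder path
$zip = [System.IO.Compression.ZipFile]::OpenRead($pbixPath);
try
{
    # List the parts inside the PBIX.
    $zip.Entries | ForEach-Object { Write-Output $_.FullName };

    # Read the Connections entry (JSON), if present, to see the connection details.
    $entry = $zip.Entries | Where-Object { $_.FullName -eq "Connections" };
    if ($entry)
    {
        $reader = New-Object System.IO.StreamReader($entry.Open());
        Write-Output $reader.ReadToEnd();
        $reader.Close();
    }
}
finally
{
    $zip.Dispose();
}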

 

For other data sources the DataMashup part is zipped and has some encodings beyond the end of the archive. So although you can unzip it and access the M scripts, which have the connection settings in them, you can't then zip it back up and produce a usable PBIX file.

 

So overall the answer is probably no. If you have imported data and want to schedule a refresh you can of course change the connection details in the data source of the deployed PBIX...

 

 

@Anonymous ,

 

Thanks for all your sharing.
I do not want to export a PBIX file.
I just need to run a query against the metadata of a particular report, which I understand now to be available within the binary data of the Content field of the CatalogItemExtendedContent table.

Could you help with that?

@vissvess unfortunately Report Server does not provide any API calls that you can use to query either the data or the metadata in a PBIX report.
