Hello all! To start this out, I am a data-modeling novice. If this isn't the best place for this question, I'd appreciate anyone pointing me in the right direction.
I am trying to improve the modeling for a very complex series of reports I have created within my role. The previous modeling was one SUPER report where every single data source and type was imported and modeled. I have tried splitting these data sources up so that formatting/processing is done by several smaller reports, and then that data is fed into the report that is displayed to our stakeholders. Unfortunately, I am receiving the error below when publishing my Power BI reports.
Error: this dataset contains calculated tables or calculated columns that refer to remote tables which will result in refresh failures after publication
These reports include several different data connections.
I have a few calculated tables used to organize the data for ease of export/reference to other reports.
Do you think that if I remove the calculated tables the reports will refresh again? At this point I really don't know how to move forward with these reports, or whether there's a better way to approach the task.
Any guidance or direction is appreciated. Thanks for your time!!
Extra history if anyone is interested:
Initially everything was in Excel files that I updated manually. These all have to live on a SharePoint site because of business needs.
I was able to use Power Automate to export some of our data to a SharePoint folder.
I then used Power Query in Excel to combine the information into a single file. Unfortunately, as far as I can find, auto-refresh of Power Query doesn't work on SharePoint files, and I couldn't find a workaround.
Several tables grew larger than what Excel could handle. A Teradata table was created for one of them.
I then queried the Teradata table, imported 10+ tables from various Excel files, and imported the files in our SharePoint locations into one large report.
There were some refresh inconsistencies, and the report was MASSIVE and took FOREVER to update or refresh. So I started on this data-modeling adventure.
Based on several articles I've read through, I thought that if I created separate Power BI reports to import and edit the data from the larger data sources (one for Teradata, one for the separate SharePoint lists), and then fed all of those into the report for our stakeholders, the report would work better.
Thus far that has not been the case.
Hello @Douttful,
Calculated tables and calculated columns that reference remote tables (tables connected via DirectQuery, such as those from another Power BI dataset) can cause refresh failures after publishing. Test whether removing these calculated tables resolves the issue. If the calculated tables are essential, consider rebuilding them in a way that doesn't rely on direct queries to remote data sources.
If possible, import the data from the other Power BI datasets into your main report rather than using DirectQuery. It's also worthwhile to make sure the flow refreshes the data in SharePoint correctly, and that Power BI is scheduled to refresh after those updates complete. When dealing with complex transformations and multiple data sources, Power BI Dataflows can be a more efficient way to prepare your data before it reaches your report.
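To make the first point concrete, here is a hedged sketch of the pattern that typically triggers this error, using hypothetical table and column names ('Sales', Sales[Region], Sales[Amount] are assumptions, not from the original post). A DAX calculated table is materialized at refresh time, so it cannot be built over a remote DirectQuery table; a measure, by contrast, is evaluated at query time and avoids the problem:

```dax
-- Problematic pattern: a calculated table defined over a remote table.
-- If 'Sales' comes from a DirectQuery connection to another dataset,
-- this definition causes the "refers to remote tables" publish error.
SalesSummary =
SUMMARIZE (
    Sales,
    Sales[Region],
    "Total Amount", SUM ( Sales[Amount] )
)

-- Safer alternative: express the aggregation as a measure instead.
-- Measures are computed at query time, so no table is materialized
-- during refresh and the error does not occur.
Total Amount = SUM ( Sales[Amount] )
```

If you genuinely need the summary as a physical table (for example, for export), doing that shaping upstream in Power Query or a Dataflow, so the table arrives as imported data, avoids the calculated-table restriction entirely.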
Should you require any further assistance, please do not hesitate to reach out to me.