
BugmanJ
Resolver I

How to avoid the problem of not being able to refresh Calculated Tables off Direct Query Tables

Good Afternoon All,

Having run into the same problem I have seen many people suffer from on here, I am asking: what is the best way to avoid it?

Problem
You have a dataset you are accessing via DirectQuery. From this DirectQuery source, you build some summarised tables, which you then build your Power BI dashboards from. You can refresh just fine in the Desktop software, great! But if you upload to the Power BI Service and try to set up a scheduled refresh, it will fail. In short, you cannot have a scheduled refresh in the Power BI Service if you are using calculated tables that reference a DirectQuery table.
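For illustration, the kind of calculated table that triggers this refresh failure looks something like the following DAX sketch; the 'Sales' table and its columns are hypothetical stand-ins for a DirectQuery table:

```dax
-- Hypothetical calculated table summarising a DirectQuery 'Sales' table.
-- Because it is a calculated table referencing a DirectQuery source,
-- scheduled refresh in the Power BI Service will reject the dataset.
SalesSummary =
SUMMARIZECOLUMNS (
    'Sales'[Region],
    "Total Sales", SUM ( 'Sales'[Amount] )
)
```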

Solutions?
So what are people's solutions for this? My thoughts so far:

 

1) Calendars - If you're building a calendar off a date, either hard-code the date or calculate a time reference in DAX. Neither of these is useful if your data is a constantly moving target.
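As a sketch of this option, a calendar that avoids the DirectQuery reference entirely can be built either way; the date bounds here are placeholders:

```dax
-- Hard-coded bounds: refreshable in the Service, but needs manual upkeep
Calendar = CALENDAR ( DATE ( 2020, 1, 1 ), DATE ( 2025, 12, 31 ) )

-- DAX-calculated bound: also refreshable, since TODAY() references no DirectQuery table
Calendar = CALENDAR ( DATE ( 2020, 1, 1 ), TODAY () )
```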

 

2) Use Power Automate to create a CSV which you then reference. It does work, but my current one is timing out because there are just too many rows to process in the time given.

3) Create a Power BI dataset which doesn't use DirectQuery. Doable, but wow, what a file size.

4) Don't use calculated tables. Not really practical for large datasets when you need to minimise the amount of data you are pulling from the cloud.

 

5) Some people have noted that changing the person who owns the dataset will work. I tried; it doesn't work for me, or for a few others I have seen.

 

None of these are great options. Tableau doesn't have this problem, so why is Power BI so behind the times on this? Has anyone come up with a better way, or just a way in general?


Regards
J

8 REPLIES
lbendlin
Super User

Problem
You have a dataset you are accessing via DirectQuery. From this DirectQuery source, you build some summarised tables

That's your problem all right.

 

Have you considered using the Analysis Services connector against the dataset's semantic model? That will allow you to import tables. Not optimal, but a good enough compromise.

How would I do this? From what I have read online, I believe Analysis Services would only connect to a server and not a semantic model?

You can connect to any semantic model as if it were an Analysis Services server, since a semantic model is pretty much an instance of SSAS Tabular.

 

In your workspace, navigate to the semantic model's settings page. Go to the Server settings section and copy the connection string.

 

[Screenshot: semantic model settings page showing the server connection string]

Use that connection string in your new file.  It will then give you the option to connect live or via import mode, and you can specify your own DAX query if you want.

 

Thanks. When I try, I have to cut the connection string down to the part which goes:

 

api.powerbi.com

 

It then gives the error: "We could not connect to the Analysis Services server because the connection timed out or the server name is incorrect."

 

What should the correct name be? Do I need a value for the database? My original dataset is pulling data from an AWS source, if that helps.

 

 

when I try, I have to cut the connection string down to the part which goes

api.powerbi.com

 

 

 

You need to keep that part. Remove the "Data Source=" part, and the "Initial Catalog" part including the semicolon.

 

The server part should be 

 

powerbi://api.powerbi.com/v1.0/myorg/<workspace name>

 

The database part is the name of the semantic model.  
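Putting the pieces together, an import done this way can be sketched in Power Query (M) roughly as below; the workspace name, semantic model name, and the table and column names inside the DAX query are all placeholders, not values from this thread:

```dax
let
    // Server  = the connection string copied from the semantic model's settings page
    // Database = the semantic model's name
    // Query    = optional DAX, so only summarised rows are imported rather than the whole table
    Source = AnalysisServices.Database(
        "powerbi://api.powerbi.com/v1.0/myorg/<workspace name>",
        "<semantic model name>",
        [Query = "EVALUATE SUMMARIZECOLUMNS ( 'Sales'[Region], ""Total"", SUM ( 'Sales'[Amount] ) )"]
    )
in
    Source
```

Supplying a DAX query in the options record is one way to keep the imported data small, which speaks to the concern later in the thread about the connector pulling in all the data.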

Thanks, that's really helpful.
I now run into this problem:
Details: "This server does not support Windows credentials. Please try Microsoft account authentication."

But nowhere does it give me the ability to enter Microsoft account details.

That's because you tried with the wrong authentication, and Power Query is mean and remembers that choice. Go to the Data Source settings and clear the credentials. Then try again with an organizational account.

It must be an organisational setting somewhere, as this is still failing, yet a friend showed me how this would work in a different organisation. However, it appears this will still bring in all the data; the whole point of summarising DirectQuery data is that it doesn't do this.

But thank you for the assistance. I will mark the above as a solution for those whose organisations allow it.
