qmartiny
Helper III

Changing data source from Excel to SQL without losing calculated columns?

Hello all,


I have a query in Power BI that combines multiple Excel files into one.

Recently I got access to the SQL server, so I switched the query source to use SQL instead.

I made sure all the fields were named the same.

 

However, when importing I lost all the calculated columns.

Has anyone experienced this?

 

I have done this before when swapping Excel sources, but never Excel to SQL.

 

I guess it could be because the table name is different from the sheet name in the Excel file, but I don't really know how to get around that.

 

Thank you for your help.

 

Quentin

1 ACCEPTED SOLUTION

Hi @qmartiny,

 

This is hard to say without the actual model to play with, but I doubt it. In my experience this has never happened. Not that I've done it a million times, but quite a number of times.

 

Technically, losing the calculated columns should not occur in the normal course of business if the table name of the SQL version of the query is exactly the same as the old one (the Excel version of the query). Power BI is presumably agnostic of whatever happens on the query side and only deals with the output.

In the worst-case scenario you could lose relationships and/or calculated columns; it should not really freeze the import. Would it be possible to verify your assumption by deleting all the relationships to the table (in a copy of the model) and seeing if it loads OK?

Could you please also check that not just the column names, but also the data types are the same as in the original table?
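
One way to do that check is to compare what Power Query reports for each version of the table. A minimal sketch, assuming hypothetical query names OldExcelQuery and NewSQLQuery for the two versions (paste it into a new blank query):

let
    // OldExcelQuery / NewSQLQuery are placeholder names for the two versions of the query
    // Column name and data type of every column, as Power Query sees them
    OldSchema = Table.SelectColumns(Table.Schema(OldExcelQuery), {"Name", "TypeName"}),
    NewSchema = Table.SelectColumns(Table.Schema(NewSQLQuery), {"Name", "TypeName"}),
    // Any record left in these lists is a column whose name/type pair exists in one version but not the other
    OnlyInOld = List.Difference(Table.ToRecords(OldSchema), Table.ToRecords(NewSchema)),
    OnlyInNew = List.Difference(Table.ToRecords(NewSchema), Table.ToRecords(OldSchema))
in
    [MissingOrRetypedInSQL = OnlyInOld, AddedOrRetypedInSQL = OnlyInNew]

If both lists come back empty, the names and data types line up and the problem is elsewhere.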

 

Good luck, 

John 


6 REPLIES
jbwtp
Memorable Member

Hi @qmartiny,

 

I would suggest building the SQL query as a separate query in Power Query and disabling loading to Power BI for it.

In the original query that feeds your table in the Power BI Designer, remove everything except one step and make it point to your new SQL query. This hides from Power BI that anything has changed in the background, so the calculated columns and measures should not disappear.
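
A minimal sketch of that setup, with placeholder server, database, and table names (only the pattern matters, not the specific values). The separate SQL query, with "Enable load" turned off, could look like:

let
    // Staging query (e.g. "MyNewSQLQuery") -- right-click it in the Queries pane and untick "Enable load"
    Source = Sql.Database("my-server-name", "MyDatabase"),      // placeholder server and database
    MyTable = Source{[Schema = "dbo", Item = "MyTable"]}[Data]  // placeholder table
in
    MyTable

and the original query, keeping its old name, is cut down to a single step that just returns it:

let
    // Same query name as before, so the model's table and its calculated columns are untouched
    Source = MyNewSQLQuery
in
    Source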

 

Kind regards,

John

Hi John,


Thanks a lot for this!

It looked promising.

I did as you said: referenced the query and kept the last step (a simple "remove nulls").

 

However, I'm getting the dreaded infinite 'loading' time. I tried the SQL query by itself and it loaded fine.

I made sure all the columns were exactly the same with no extra columns. 

 

Any ideas why this could be happening still?

 

(screenshot attached)

 

Hi @qmartiny,

 

In this case, your "old" query should look something like this:

let
    Source = MyNewSQLQuery,
    #"Removed Blank Rows" = Table.SelectRows(Source, each not List.IsEmpty(List.RemoveMatchingItems(Record.FieldValues(_), {"", null})))
in
    #"Removed Blank Rows"

 

Can you rewrite it to look like this:

let
    Source = MyNewSQLQuery
in Source

Theoretically this will just pass the SQL query through without any modifications and should be exactly the same as running the SQL query itself.

 

See if this improves performance, then experiment with adding more transformations (e.g. removing nulls).

 

Cheers,

John

 

Hi John,

 

I'm still getting infinite loading. I tried increasing the number of simultaneous evaluations, but no change.
I think the model might still be too complex.

 

I'm guessing it has to do with the existing relationships and calculated tables.

 

Thanks for your help anyway

 

 


Hi @jbwtp 

 

First of all, thank you very much for your help so far.

As mentioned, the new source has different columns and column names.
As per your advice, I made sure they were all transformed to match the exact structure of the original table.

 

Yesterday I tried using an Excel folder again to mimic the old query as closely as possible, but I still got the infinite loading.

 

Today I will attempt to clean up the existing relationships. The issue when I tried this yesterday is that I had some calculated columns using RELATED that effectively froze the model if I removed the relationship without deleting the table.

 

happy days 

 

Thanks again for your help.

 

I will let you know what happens.

 

Quentin
