WalkerGBG
Frequent Visitor

Sharepoint Multiple File Upload - File layout changes

I currently use Power BI as a talented amateur; almost everything I do comes from the help files and community posts, so I'm in no way an expert.

 

Current Situation
Using the Power BI Desktop app, I've created a report (and dataset) that pulls in daily files via the "SharePoint folder" connector, covering the last few years' worth of data. They are simple billing files showing consumption of the services our company sells. Each file is about 18,000 rows and has 200+ columns. All is working fine, and up until recently I'd experienced no issues with the scheduled refreshes I have implemented.

Current Problem
A week or so ago the format of the file changed, with a new column being added to new versions of the file. The column is an important one because it's a new product, so I want to be able to see its usage. It's the first change to the file in quite a while.

What I've noticed is that this new column is not being imported when the daily file is loaded each day. I had assumed that any such new columns would be included in the dataset.

So I've investigated and can see the step below in the query, which defines the column layout:

= Table.TransformColumnTypes(#"Expanded Table Column1",{{"Source.Name", type text}, {"Date", type datetime}, {"OrgID", Int64.Type}, {"Parent", Int64.Type}, {"ProductID", Int64.Type}, {"CustomerName", type text}, COMMENT - etcetcetc, {"", type any}})

So that makes sense: the new column hasn't been defined, so it isn't pulled into the dataset. I've tried adding the new field to this query so it looks like this:

= Table.TransformColumnTypes(#"Expanded Table Column1",{{"Source.Name", type text}, {"Date", type datetime}, {"OrgID", Int64.Type}, {"Parent", Int64.Type}, {"ProductID", Int64.Type}, {"CustomerName", type text}, COMMENT - etcetcetc,{"NewFIELD", Int64.Type}, {"", type any}})

But when I try to commit that, I get an error associated with NewFIELD stating "Expression.SyntaxError: Invalid identifier."
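For context, the {"NewFIELD", Int64.Type} pair is itself valid M syntax, so an "Invalid identifier" error usually points at a stray character nearby (for example curly "smart" quotes pasted in from elsewhere) rather than at the new pair. A minimal sketch of a valid step, with placeholder column names and assuming the previous step is called #"Expanded Table Column1" as in the post:

```
// Minimal sketch with placeholder column names; the previous step name
// must match the one in your own query.
= Table.TransformColumnTypes(
    #"Expanded Table Column1",
    {
        {"Source.Name", type text},
        {"Date", type datetime},
        {"NewFIELD", Int64.Type}   // quotes must be plain " characters
    }
)
```

Note that even with valid syntax, this step can only set the type of a column that the earlier expand step actually produced; if NewFIELD isn't in the expanded column list upstream, the refresh will fail with a "column not found" error instead.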

Potential Solution
I realise that I could move the original 600 or so files into an 'old' folder, have the new-format files arrive in their normal folder, do a SharePoint load from the 'old' folder, do another SharePoint load from the normal folder, and merge the data together (I think I can), but.....

 

... was wondering if there is a way to do this more elegantly so that there could be a single upload process in which I can add additional columns in future.
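One way to do that (a sketch only, not tested against your query; the site URL, file extension, and the #"Transform File" helper name are placeholders for whatever the SharePoint Folder connector generated in your file): instead of expanding a hard-coded column list, read the column names from the files at refresh time, so any newly added column flows through automatically.

```
let
    // Placeholder URL - substitute your own SharePoint site
    Source = SharePoint.Files("https://yourtenant.sharepoint.com/sites/Billing", [ApiVersion = 15]),
    Filtered = Table.SelectRows(Source, each [Extension] = ".csv"),
    // #"Transform File" is the helper query the connector generated when
    // the files were originally combined
    WithData = Table.AddColumn(Filtered, "Data", each #"Transform File"([Content])),
    // Take the column names from the most recently modified file instead
    // of hard-coding them, so new columns are picked up on refresh
    Newest = Table.Sort(WithData, {{"Date modified", Order.Descending}}){0}[Data],
    Expanded = Table.ExpandTableColumn(WithData, "Data", Table.ColumnNames(Newest))
in
    Expanded
```

You would also need to relax or remove the hard-coded Table.TransformColumnTypes step, since a fixed type list will stop matching once the columns change.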

 

Any advice on this would be greatly appreciated

Thanks in advance

3 REPLIES
v-piga-msft
Resident Rockstar

Hi @WalkerGBG ,

I'm a little confused about your scenario.

Do you have the refresh issue in Power BI Desktop?

In addition, what about changing your query like below?

= Table.TransformColumnTypes(#"Expanded Table Column1",{{"Source.Name", type text}, {"Date", type datetime}, {"OrgID", Int64.Type}, {"Parent", Int64.Type}, {"ProductID", Int64.Type}, {"CustomerName", type text},{"NewFIELD", Int64.Type},COMMENT - etcetcetc, {"", type any}})

If it is convenient, could you share your M query in the Advanced Editor?

Best Regards,

Cherry

 

Community Support Team _ Cherry Gao
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.

Thanks for the response.  

 

I don't actually know what you mean when you say M query 😞

 

To better explain with an example 

  • I have a folder in a sharepoint location that contains (say) 10 files
  • Each of the 10 files contains the same 10 columns
  • I currently import all of these using SharePoint Folder, which combines them all into one table (using a process similar to the one at this URL - https://powerbi.microsoft.com/en-us/blog/combining-excel-files-hosted-on-a-sharepoint-folder/)
  • But now there's an 11th file with 11 columns.
  • The import process when it was created didn't know about that column so doesn't pull it in.

So I'm looking for a way for this type of import/transform to cater for changes in the individual files. Happy to hear if it's not possible, but if it might be possible I'd like to try.

In the meantime I'm trying the potential solution mentioned earlier.

 

I'm loading the historic files from before the change using one SharePoint Folder & Combine to create a table.

I'm loading the new files from after the change, from a new location, using another SharePoint Folder & Combine to create a second table.

 

I'm now trying to Append them into a new table, but there are about 12 million rows so it might take a while. Hoping that will fulfil the requirement.
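That approach should work, because Table.Combine (the function Append Queries generates) takes the union of column names, filling in null wherever a table lacks a column. A sketch, assuming the two staging queries are named #"Old Files" and #"New Files" (placeholder names):

```
let
    // Append the two staging queries; rows from the old-format table get
    // null in the column that only exists in the new-format files
    Combined = Table.Combine({#"Old Files", #"New Files"})
in
    Combined
```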
