Hi!
I have (3) queries in my application that represent 2017, 2018, and 2019 sales by period/year.
I appended queries 2 and 3 to query 1 to get a holistic view of all the data.
When I view sales by period/year in a table, all the data from queries 1 and 2 looks good, and a few months from query 3 look good as well, but January 2017 through September 2017 is completely wrong.
I checked sales by period/year just from that query (not the appended query) and all the data is correct, so something is happening specifically when appending. Is there some sort of cap on how much data can be appended that is causing records to fall off? For example, none of the records from company A show up after appending, but they are there in the original query.
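There is no documented row cap on Append in Power Query; a common cause of symptoms like this is that Table.Combine (the function behind Append) matches columns by name, so a column whose name differs even by case or a trailing space ends up as a separate, mostly-null column. A minimal M sketch of that failure mode, with hypothetical stand-in tables:

```m
let
    // Hypothetical stand-ins for two of the yearly queries
    Sales2017 = #table({"Period", "Sales"}, {{"2017-01", 100}}),
    // Note the trailing space in "Sales " — exactly the kind of
    // mismatch that silently breaks an append
    Sales2019 = #table({"Period", "Sales "}, {{"2019-01", 300}}),
    // Table.Combine unions columns by name, so "Sales" and "Sales "
    // become two separate columns, each null where the other has data
    Combined = Table.Combine({Sales2017, Sales2019})
in
    Combined
```

If the appended result shows nulls or wrong values for one source's date range, comparing the column names of the source queries character by character is a quick first check.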
Hello Brysonds,
is there an update on how you fixed this problem?
It seems like I ran into a similar problem as well.
Best regards
I am having an issue appending queries as well. I have 4 queries, each representing a calendar quarter, so Q1, Q2, Q3, Q4 of 2023. Each query contains identical columns: ticket number, open date, location, etc.
I can append 3 queries, no problem, but the 4th query does not bring in the "open date" data. I have tried numerous methods to append the queries.
Any ideas on what's happening?
Do you happen to have the sample data ?
I did not provide sample data, but I provided information on the data, if that might help.
If you use R, there is another way to append tables.
R can append multiple tables, but (as far as I know) only when they have the same column names and the same number of columns.
Power BI's Append can combine tables with a dissimilar number of columns and dissimilar column names.
Example - tbl_A appended with tbl_B

tbl_A

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("VczBDQAhCATAXngTI4voWYuh/zbkDh9ewmOT2WUtqlbiUKUTE5S72Bsaj2HkHIX2L9hz/LBeXJP0EO4leEqyyMeYJX5frMlo5L4B", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [#"Date " = _t, Price1 = _t, Price2 = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Date ", type text}, {"Price1", type number}, {"Price2", type number}})
in
    #"Changed Type"

tbl_B

let
    Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("bcuxDcAwCETRXaiRBUewnVks9l/DJDSxlOqKd38tEjTpDaKdmPRm81yAJwWn6qFFeXnM0vynHFosX3Z+VXlWjHHGl1XsoIgN", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type text) meta [Serialized.Text = true]) in type table [#"Date " = _t, Price1 = _t, Price2 = _t]),
    #"Changed Type" = Table.TransformColumnTypes(Source,{{"Date ", type text}, {"Price1", type number}, {"Price2", type number}}),
    #"Run R script" = R.Execute("# 'dataset' holds the input data for this script#(lf)df_B <- dataset#(lf)df_A <- dataset2#(lf)df_temp <- rbind(df_A, df_B)#(lf)#(lf)output <- df_temp",[dataset=#"Changed Type", dataset2=tbl_A]),
    output = #"Run R script"{[Name="output"]}[Value]
in
    output
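As a complement to the R approach above, Power Query's native append does not require matching column sets. A small sketch (table names are hypothetical) of how Table.Combine handles dissimilar columns:

```m
let
    tblX = #table({"Date", "Price1"}, {{"2023-01-01", 10}}),
    tblY = #table({"Date", "Price2"}, {{"2023-01-02", 20}}),
    // Power Query unions the column sets: rows from tblX get
    // null in Price2, and rows from tblY get null in Price1
    Combined = Table.Combine({tblX, tblY})
in
    Combined
```

This is also why a subtle name mismatch between otherwise identical queries produces null-filled columns rather than an error.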
Have you checked to make sure all columns have the same data types and names for all the queries?
Hi Nick,
Yes, I have checked that. In fact, Queries 2 and 3 are copies of Query 1; the only difference is the parameter value for the data source's date range. Other than that, they are identical in every way.
If I view the data of any single query using that query source, then everything is exactly right. It's only when bringing all 3 together that I see the issue.
I checked the table view row values for each query and saw that hundreds of thousands of records are missing.
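One way to rule out a hidden mismatch before appending is to compare the column names of the source queries in M. This is a sketch, not a definitive fix; Query1 and Query3 are hypothetical names for your own queries:

```m
let
    // Any difference here — case, trailing spaces, extra columns —
    // will cause misaligned, null-filled columns in the appended result
    NamesQ1 = Table.ColumnNames(Query1),
    NamesQ3 = Table.ColumnNames(Query3),
    // M compares lists by value, so this returns true only when
    // both queries expose exactly the same set of column names
    SameNames = List.Sort(NamesQ1) = List.Sort(NamesQ3)
in
    SameNames
```

If SameNames comes back false, renaming the offending columns before the Append step usually resolves the missing-data symptom.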