Let's say I have a dataset A that I am merging with another dataset B. If I do some transformations when preparing A, will all of those steps in A be performed again when it is merged with dataset B? I am asking because I see this happening with my datasets: I call a PowerShell script from R as a step in one of the datasets, I merge that dataset with another dataset later, and I receive two emails each time I refresh the dataset. Is there any way to fix this?
Another problem caused by this: I have one dataset that I export to SQL Server using R as an intermediate step, and I then merge it with another dataset. During the merge I see an error like "Invalid object name <tablename>" in the merge step. Why is this happening? The R transformation step was successful with the original dataset but fails when I merge it with another dataset.
Hi @karun_r,
1. In Query Editor, when we add a new step, the engine re-reads the source and re-applies all intermediate steps. One way to work around this is to use a referenced query.
For example, if the source table is named Data, create a reference to it named DataSteps with the following query:
let
Source = Data
in
Source
When all the steps have been built in the DataSteps table, copy/paste the steps back to the Data table.
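A rough way to picture why the side-effecting step (the PowerShell email) fires twice is below. This is a Python sketch of the behavior, not how the mashup engine is actually implemented: each query is modeled as a function, and loading a downstream query re-runs every upstream step, side effects included.

```python
# Sketch: each query is a function that re-runs all upstream steps on evaluation.
side_effect_count = 0  # stands in for "emails sent" by a side-effecting step

def source_query():
    """Base query A: includes a step with a side effect (e.g. calling a script)."""
    global side_effect_count
    side_effect_count += 1   # the side-effecting step runs on EVERY evaluation
    return [1, 2, 3]         # the data A produces

def merged_query():
    """Merge of A with B: evaluating it re-evaluates A from scratch."""
    a = source_query()       # re-runs all of A's steps, side effect included
    b = [4, 5, 6]
    return a + b

# Loading the model evaluates both A and the merge -> the side effect fires twice.
data_a = source_query()
data_merged = merged_query()
print(side_effect_count)  # 2
```

This is why moving side effects out of steps that other queries reference (or out of the refresh path entirely) is usually the safer design.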
2. It seems the R script runs as a step before the two tables are merged. Can you share the sample tables and the R script so we can test?
Best Regards,
Qiuyun Yu
Where should I be doing that let Source = Data thing?
And regarding the sample data, I will try to create an insert script and share it here. I am just reading data from one SQL Server instance and doing some steps on that dataset to make it ready for the dashboard. Meanwhile, for another purpose, I am storing the entire dataset in one table. Here is the R script that I am using for that.
# 'dataset' holds the input data for this script
library(RODBC)
# Open a trusted connection to the target SQL Server database
serverConn <- odbcDriverConnect(connection = "Driver={SQL Server Native Client 11.0};server=<server>;database=<database>;trusted_connection=yes;")
# Empty the target table, then append the current rows
truncateOutput <- sqlQuery(serverConn, "truncate table <tableName>")
out <- sqlSave(serverConn, dataset, tablename = "<tableName>", rownames = FALSE, append = TRUE, varTypes = NULL)
odbcClose(serverConn)
output <- dataset
I am doing this for three different datasets (let's say A, B and C). During merging, I load A first, merge it with B, and then merge the combined dataset with C. Even if the dataset is loading multiple times, I am not seeing consistent results between the three:
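One hypothesis worth ruling out for the inflated row counts (a Python sketch with a list standing in for the SQL table; the interleaving is an assumption, not something confirmed here): because the query can be evaluated more than once per refresh, two evaluations of the truncate-then-append script may overlap, and truncate-then-append is only safe when the evaluations run strictly one after the other.

```python
# Sketch: truncate-then-append is not safe when two evaluations interleave.
table = []               # stands in for the SQL Server table
dataset = ["r1", "r2"]   # rows the script writes on each evaluation

def truncate():
    table.clear()

def append_rows():
    table.extend(dataset)

# Sequential evaluations: fine -- each truncate wipes the previous append.
truncate(); append_rows()
truncate(); append_rows()
assert len(table) == len(dataset)

# Interleaved evaluations: both truncates run before either append -> doubled rows.
truncate()
truncate()
append_rows()
append_rows()
print(len(table))  # 4: twice the source row count
```

If this is the cause, doing the export outside the refresh path (or making the write idempotent, e.g. keyed upserts) avoids it.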
A had 4170 records in the original source. In my replica, I got 4351
B had 4160 records in the original source. In my replica, I got 12020
C had same records both in source and my replica.
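Another common cause of row counts growing after a merge is a join key that is not unique on both sides: each duplicated key multiplies rows ("fan-out"). A small sketch of the effect in plain Python, with hypothetical data:

```python
# Sketch: a merge (join) on a non-unique key multiplies rows ("fan-out").
left = [   # e.g. dataset A: 3 rows, key 2 duplicated
    {"id": 1, "a": "x"},
    {"id": 2, "a": "y"},
    {"id": 2, "a": "z"},
]
right = [  # e.g. dataset B: key 2 also duplicated
    {"id": 2, "b": "p"},
    {"id": 2, "b": "q"},
]

# Inner join on "id": every matching left/right pair becomes an output row.
merged = [
    {**l, **r}
    for l in left
    for r in right
    if l["id"] == r["id"]
]

print(len(merged))  # 4: two left rows with id=2, each matching two right rows
```

Checking whether the merge keys are unique in both tables before merging would confirm or rule this out.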
I am really confused about why this is happening when I merge the datasets. I am getting frustrated with all of this and am thinking of doing all the dumping from SSIS and just using Power BI for the display.
And what is the point of re-doing all the steps when we merge one dataset with another?