
PBILover
Helper V

Incremental dataset refresh on Cosmos DB

Hi,

I have the following questions about implementing incremental refresh on a Cosmos DB data source in Power BI.

1. I have configured the policy to store rows for the last 5 years and refresh rows from the last 10 days. Currently I have only 10 columns in the dataset (the data source has almost 80 columns). If a user now wants to see one more column (i.e. an 11th column) in a report, will all the historical data get refreshed with this new column? (A query sketch for this setup follows after these questions.)

2. If there are any changes to the dataset, e.g. a new calculated column or measure is added, a column type is changed, or a new step is added in the query editor, how does this work for the historical data?

If I want all of the historical data to be refreshed with these new changes, what are the options?

If the dataset is large, reloading all the data again will hit a timeout error. How can this be handled?
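A minimal sketch of the kind of query the 5-year / 10-day policy in question 1 sits on top of, assuming the classic Azure Cosmos DB connector (DocumentDB.Contents, which returns one record per document in a Document column); the account URL, database, collection, and the OrderDate/Amount column names are placeholders, and RangeStart/RangeEnd are the datetime parameters Power BI Desktop requires before an incremental refresh policy can be defined:

let
    Source = DocumentDB.Contents("https://your-account.documents.azure.com", "YourDatabase", "YourCollection"),
    // expand the document records into columns; only the fields listed here reach the dataset
    Expanded = Table.ExpandRecordColumn(Source, "Document", {"id", "OrderDate", "Amount"}, {"id", "OrderDate", "Amount"}),
    Typed = Table.TransformColumnTypes(Expanded, {{"OrderDate", type datetime}}),
    // the >= / < pattern keeps boundary rows from landing in two partitions
    Filtered = Table.SelectRows(Typed, each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)
in
    Filtered

The policy itself (store 5 years, refresh 10 days) is then configured in the table's Incremental refresh dialog in Power BI Desktop, not in the query.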

 

 

Thanks,

Namita

2 REPLIES
Icey
Community Support

Hi @PBILover,


When you want to make changes to your dataset, you need to make them in Power BI Desktop and then re-publish the report.

The dataset in the Power BI Service will be completely replaced, and incremental refresh will be based on this new dataset; it has nothing to do with the dataset before it was replaced.
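As a hedged illustration of the point above, adding the 11th column from question 1 is simply another query change made in Power BI Desktop before re-publishing. Reusing the placeholder query sketched earlier, with a hypothetical "Region" field standing in for the new column, the only difference is the expand step:

let
    Source = DocumentDB.Contents("https://your-account.documents.azure.com", "YourDatabase", "YourCollection"),
    // "Region" is the newly added field; everything else is unchanged from the earlier sketch
    Expanded = Table.ExpandRecordColumn(Source, "Document", {"id", "OrderDate", "Amount", "Region"}, {"id", "OrderDate", "Amount", "Region"}),
    Typed = Table.TransformColumnTypes(Expanded, {{"OrderDate", type datetime}}),
    Filtered = Table.SelectRows(Typed, each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd)
in
    Filtered

On the first refresh after re-publishing, all partitions, historical and incremental, are rebuilt from this new query, which is why the new column shows up in the historical data as well.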

 

In addition, for the timeout error, please refer to this document: Query timeouts.


Best Regards,

Icey

 

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Thanks, Icey, for your reply.

I will look into the timeout article.

1. I have a dataset that loads all of the data, which took almost 3 hours to complete (without incremental refresh).

Type | Start | End | Status
On demand | 6/22/2020, 5:12:26 PM | 6/22/2020, 8:02:58 PM | Completed

2. I saved the above dataset under a different name and implemented incremental refresh, which failed after running internally for almost 10 hours.

Type | Start | End | Status | Message
Scheduled | 6/23/2020, 7:51:09 AM | | In progress |
On demand | 6/22/2020, 10:30:39 PM | 6/23/2020, 7:51:09 AM | Failed | There was an error when processing the data in the dataset. Data source error: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.

 

I can see that the end time (the error time) is the same as the start time of the next scheduled refresh.

What is happening here when I try to implement incremental refresh?

 
 

My data source has data going back to May 2019.
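For context on why this first run is so slow: the first refresh after publishing a dataset with an incremental refresh policy has to create and fully load every historical partition, so it usually takes much longer than the single 3-hour full load, and a connection held open that long can be dropped by the remote host, as in the error above. One hedged variation of the earlier sketch is to filter on Cosmos DB's built-in _ts property (epoch seconds) instead of a converted datetime column; the connector and names are the same placeholders as before, and whether the connector pushes the filter down to the source is not guaranteed:

let
    Source = DocumentDB.Contents("https://your-account.documents.azure.com", "YourDatabase", "YourCollection"),
    Expanded = Table.ExpandRecordColumn(Source, "Document", {"id", "_ts", "Amount"}, {"id", "_ts", "Amount"}),
    // convert RangeStart/RangeEnd to epoch seconds so the comparison stays on the raw
    // numeric _ts value Cosmos DB maintains on every document (_ts is UTC)
    StartSeconds = Duration.TotalSeconds(RangeStart - #datetime(1970, 1, 1, 0, 0, 0)),
    EndSeconds = Duration.TotalSeconds(RangeEnd - #datetime(1970, 1, 1, 0, 0, 0)),
    Filtered = Table.SelectRows(Expanded, each [_ts] >= StartSeconds and [_ts] < EndSeconds)
in
    Filtered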

Thanks.

 
