PBILover
Helper V

Incremental dataset refresh on Cosmos DB

Hi,

I have the following questions about implementing incremental refresh on a Cosmos DB data source in Power BI.

1. I have set the policy to store rows for the last 5 years and refresh rows from the last 10 days. Currently the dataset has only 10 columns (the data source has almost 80 columns). If a user now wants to see one more column (i.e., an 11th column) in a report, will all the historical data get refreshed with this new column?

2. If there are changes to the dataset, e.g., a new calculated column or measure, a changed data type, or a new step added in the query editor, how will that work for the historical data?

If I want all the historical data to be refreshed with these new changes as well, what are the options?

If the dataset is large, reloading all the data again will hit a timeout error. How should this be handled?

 

 

Thanks,

Namita

2 REPLIES
Icey
Community Support

Hi @PBILover ,

 

 

When you want to make changes to your dataset, you need to make them in Power BI Desktop and then re-publish the report.

The dataset in the Power BI Service will be completely replaced, and incremental refresh will be based on this new dataset; it has nothing to do with the dataset it replaced.
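For reference, incremental refresh is applied against a RangeStart/RangeEnd filter in the Power Query definition of the table. Below is only a rough sketch of what that filter could look like for a Cosmos DB source; the account, database, collection, the "Document" column, and the use of the _ts field are placeholder assumptions, not details taken from this thread.

// Sketch only: the filter step that the "store 5 years / refresh 10 days" policy acts on.
// RangeStart and RangeEnd must be datetime parameters defined in the model;
// the service substitutes the partition boundaries at refresh time.
let
    Source = DocumentDB.Contents("https://<account>.documents.azure.com", "<database>", "<collection>"),
    Documents = Table.ExpandRecordColumn(Source, "Document", {"id", "_ts"}, {"id", "_ts"}),
    // Cosmos DB's _ts system field is a Unix epoch in seconds; convert it to datetime first.
    WithDate = Table.AddColumn(Documents, "LoadDate",
        each #datetime(1970, 1, 1, 0, 0, 0) + #duration(0, 0, 0, [_ts]), type datetime),
    // Incremental refresh convention: >= RangeStart and < RangeEnd.
    Filtered = Table.SelectRows(WithDate,
        each [LoadDate] >= RangeStart and [LoadDate] < RangeEnd)
in
    Filtered

Keep in mind that if this filter does not fold back to the source, each partition refresh still has to read the full collection before filtering, which can make refreshes very slow.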

 

In addition, for the timeout error, please refer to this document: Query timeouts.
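Some connectors also let you raise the query timeout directly in the source call. The sketch below shows that pattern with Sql.Database, which documents a CommandTimeout option; whether the Cosmos DB connector accepts a comparable option is not confirmed here, so treat this purely as an illustration of the pattern, with placeholder server and database names.

// Illustration of the timeout-override pattern referenced in the "Query timeouts" guidance.
let
    Source = Sql.Database(
        "<server>.database.windows.net",
        "<database>",
        [CommandTimeout = #duration(0, 2, 0, 0)]  // allow up to 2 hours per query
    )
in
    Source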

 

 

 

Best Regards,

Icey

 

If this post helps, please consider accepting it as the solution to help other members find it more quickly.

PBILover
Helper V

Thanks, Icey, for your reply.

I will look into the timeout article.

1. I have a dataset that loads the whole data, which took almost 3 hours to complete (without incremental refresh):

Type | Start | End | Status
On demand | 6/22/2020, 5:12:26 PM | 6/22/2020, 8:02:58 PM | Completed

2. I saved the above dataset with a different name and implemented incremental refresh, which failed after running internally for almost 10 hours:

Type | Start | End | Status | Message
Scheduled | 6/23/2020, 7:51:09 AM | | In progress |
On demand | 6/22/2020, 10:30:39 PM | 6/23/2020, 7:51:09 AM | Failed | There was an error when processing the data in the dataset. Data source error: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.

 

I can see the end time (error time), which is the same as the start time of the next scheduled refresh.

What is happening here when I try to implement incremental refresh?

 
 

My data source has data since May 2019.

Thanks.

 
