I'm using a large table as a dataset in Power BI. I applied an incremental refresh policy to this dataset, but after publishing, the initial load scoped by RangeStart and RangeEnd requires a one-time backfill of archived data. I am running into connection and timeout issues with this backfill. Is there a way to run the backfill in smaller chunks?
Use CSV or Parquet files as fake partitions and then append them in Power Query. Yes, you will reload all the data each time, but both CSV and Parquet ingest very quickly.
Use the XMLA endpoint with tools like SSMS and then refresh individual partitions selectively.
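For illustration, here is a minimal sketch of what a selective partition refresh over the XMLA endpoint could look like when scripted with the Tabular Object Model (TOM) through pythonnet. The workspace, dataset, table, and partition names are placeholders, and authentication is left to the interactive prompt; this is a sketch of the technique, not the exact workflow suggested above.

```python
# Minimal sketch: refresh selected partitions of a published dataset over the
# XMLA endpoint using the Tabular Object Model (TOM) via pythonnet.
# Workspace, dataset, table, and partition names are placeholders.
import clr
clr.AddReference("Microsoft.AnalysisServices.Tabular")
from Microsoft.AnalysisServices.Tabular import Server, RefreshType

server = Server()
server.Connect("powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace")  # XMLA endpoint

db = server.Databases.GetByName("MyDataset")
table = db.Model.Tables.Find("FactSales")

# Queue a full refresh for just the partitions of one historical year
for partition in table.Partitions:
    if str(partition.Name).startswith("2019"):
        partition.RequestRefresh(RefreshType.Full)

db.Model.SaveChanges()   # execute the queued refreshes
server.Disconnect()
```

The same partition-level processing can also be run interactively in SSMS through the table's Partitions dialog when connected to the workspace's XMLA endpoint.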
This is a Premium Per User workspace, but we do not have Premium capacity. From what I've read, XMLA requires Premium capacity. Is this true? If so, is there a way to partition the data load outside of Premium capacity?
Sorry if this is a novice question, but how would I create separate CSV files from a SQL table/view? This needs to be an automated, scheduled refresh.
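For illustration, one way such an export could be automated is sketched below, assuming pandas, SQLAlchemy, and pyodbc; the server, the view dbo.SalesHistory, and the date column OrderDate are hypothetical placeholders, and the script would be run on a schedule by Task Scheduler, cron, or a pipeline.

```python
# Minimal sketch: export a SQL view into one CSV file per year so the files
# can be appended in Power Query as "fake partitions".
# dbo.SalesHistory, OrderDate, and the connection string are placeholders.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine(
    "mssql+pyodbc://user:password@myserver/MyDatabase"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)

for year in range(2015, 2025):
    query = (
        "SELECT * FROM dbo.SalesHistory "
        f"WHERE OrderDate >= '{year}-01-01' AND OrderDate < '{year + 1}-01-01'"
    )
    df = pd.read_sql(query, engine)
    df.to_csv(f"SalesHistory_{year}.csv", index=False)
    # df.to_parquet(f"SalesHistory_{year}.parquet", index=False)  # Parquet alternative (needs pyarrow)
    print(f"{year}: {len(df):,} rows exported")
```

Each yearly file then becomes one of the "fake partitions" that gets appended in Power Query.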
There is another option: bootstrapping, which uses XMLA to create empty partitions and then fill them individually (see the sketch after the links below).
https://learn.microsoft.com/en-us/power-bi/connect-data/incremental-refresh-xmla
Troubleshoot incremental refresh and real-time data - Power BI | Microsoft Learn
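To illustrate the fill step of that approach: once the bootstrap has created the policy partitions empty, they can be refreshed one at a time so each commit stays small. Below is a minimal sketch using TOM via pythonnet with placeholder names, as in the earlier example; the bootstrap step itself is described in the linked article.

```python
# Minimal sketch: fill bootstrapped (empty) historical partitions one at a
# time over the XMLA endpoint, committing after each so no single refresh
# has to load the entire archive. Placeholder names throughout.
import clr
clr.AddReference("Microsoft.AnalysisServices.Tabular")
from Microsoft.AnalysisServices.Tabular import Server, RefreshType

server = Server()
server.Connect("powerbi://api.powerbi.com/v1.0/myorg/MyWorkspace")

db = server.Databases.GetByName("MyDataset")
table = db.Model.Tables.Find("FactSales")

historical_years = ("2018", "2019", "2020")            # backfill in yearly chunks
for partition in table.Partitions:
    if str(partition.Name).startswith(historical_years):
        partition.RequestRefresh(RefreshType.Full)
        db.Model.SaveChanges()                          # commit this partition only
        print(f"Refreshed partition {partition.Name}")

server.Disconnect()
```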