We have 2 sources of data for reporting.
The nightly files are the result of the day's data with some processing by the source system. We have built out a warehouse of dimensions and facts from this, using Spark jobs for the ELT process and then transforming the data into dimension/fact tables in the Synapse Data Warehouse. This is all working great.
We are not using our Fabric warehouse in production yet as I need to solve the intraday reporting problem first. There is some data that comes in nightly files we process that does not come in the intraday data, but is needed to fully report the intraday data. Currently, we are just writing everything to a regular Azure SQL database for both historical and intraday.
My question is whether the Data Warehouse is capable of keeping up with intraday changes. Will writing those changes to the Data Warehouse as they come in be feasible? I presume it all just depends on the capacity we choose, but I wanted some input from y'all who are more knowledgeable on it.
Hi @Digidank ,
Thanks for using Fabric Community.
Synapse Data Warehouse (DW) can handle your intraday updates. It's designed for large-scale data ingestion and frequent updates, so capacity shouldn't be a major obstacle, though you should still size your capacity for your expected batch frequency and query concurrency. Here's how we can tackle both nightly and intraday data:
1. Nightly Data:
Keep using your existing Spark jobs to process and load nightly data files into Synapse DW. This is a familiar and efficient approach.
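For illustration, here is a minimal Spark SQL sketch of the kind of nightly upsert such a job might perform; the table and column names (stg_customer_nightly, dim_customer, and so on) are hypothetical, and it assumes Delta tables registered in a lakehouse:

```sql
-- Hypothetical nightly upsert from a staging table into a dimension table.
-- Assumes both are Delta tables registered in the lakehouse catalog.
MERGE INTO dim_customer AS tgt
USING stg_customer_nightly AS src
    ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN
    UPDATE SET
        tgt.customer_name = src.customer_name,
        tgt.last_updated  = src.load_date
WHEN NOT MATCHED THEN
    INSERT (customer_id, customer_name, last_updated)
    VALUES (src.customer_id, src.customer_name, src.load_date);
```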
2. Intraday Updates:
Break down your intraday data into smaller batches and ingest them into Synapse DW throughout the day. This provides near real-time updates for your reports.
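As a rough sketch, each micro-batch could be loaded with the T-SQL COPY INTO statement, which the warehouse supports for Parquet and CSV files staged in Azure storage; the storage URL, table name, and SAS secret below are placeholders:

```sql
-- Hypothetical micro-batch load of intraday files staged in Azure storage.
-- The storage URL and the SAS secret are placeholders.
COPY INTO dbo.fact_sales_intraday
FROM 'https://mystorageaccount.blob.core.windows.net/intraday/batch_*.parquet'
WITH (
    FILE_TYPE = 'PARQUET',
    CREDENTIAL = (IDENTITY = 'Shared Access Signature', SECRET = '<sas-token>')
);
```

Running such a statement on a schedule (for example, from a Data Factory pipeline) gives you the micro-batch cadence described above.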
Important Note:
While Fabric notebooks are a valuable tool for data exploration and analysis, they currently cannot directly load data into the Data Warehouse. We recommend using Data Factory pipelines for this purpose.
Additional Tips:
Partition your tables in Synapse DW to optimize query performance for both historical and intraday data.
Consider using materialized views for frequently used aggregations on intraday data to further improve reporting speed.
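As a sketch of the materialized-view tip, here is what a pre-aggregation could look like; the syntax shown is for an Azure Synapse dedicated SQL pool (check current Fabric Warehouse support before relying on it), and the view, table, and column names are made up:

```sql
-- Hypothetical pre-aggregation of intraday sales for fast reporting.
-- Syntax shown is for an Azure Synapse dedicated SQL pool; COUNT_BIG(*)
-- is required in the SELECT list of a materialized view that aggregates.
CREATE MATERIALIZED VIEW dbo.mv_intraday_sales_by_store
WITH (DISTRIBUTION = HASH(store_id))
AS
SELECT
    store_id,
    COUNT_BIG(*)     AS sale_count,
    SUM(sale_amount) AS total_sales
FROM dbo.fact_sales_intraday
GROUP BY store_id;
```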
Docs to refer to:
Ingesting data into the warehouse - Microsoft Fabric | Microsoft Learn
Hope this is helpful. Please do let me know in case of further queries.
Hi @Digidank ,
Glad to know that we have answered your query. Please continue to use the Fabric Community for any further queries.