0553914
Frequent Visitor

Copy contents of a table to another table (history) in my SSAS model at the end of each day

I have a tabular model in which I capture near real-time data throughout the day. At the end of each day, I want to copy the contents of this table (current day data) to another table in my model (historical data), using a SQL Agent Job and the Refresh: Add/Calculate routine to incrementally load the data into a partition within the historical table. This is how I've envisioned accomplishing it, although someone may have a better suggestion or approach. The goal is to allow my users to view results from previous days in my Power BI report using the data loaded into the historical table.


9 REPLIES
V-yubandi-msft
Community Support

Hi @0553914 ,

We wanted to follow up on our previous conversation regarding the issue you were experiencing. Have you had a chance to try the solutions we discussed?

If you need further assistance or have any updates, please let me know. If the issue is resolved, kindly mark the reply as the Accepted Solution.

 

Best regards, 

Yugandhar.

V-yubandi-msft
Community Support

Hi @0553914 ,

As we haven’t heard back from you, we wanted to follow up and check whether the solution we provided resolved your issue. Please let us know if you need any further assistance.

Your feedback is important to us. Looking forward to your response.


Thank You.

V-yubandi-msft
Community Support

Hi @0553914 ,

 

We noticed we haven't received a response from you yet, so we wanted to follow up and ensure the solution we provided addressed your issue. If you require any further assistance or have additional questions, please let us know.

Your feedback is valuable to us, and we look forward to hearing from you soon.

 

Thank You.

 

V-yubandi-msft
Community Support

Hi @0553914 ,

 

Thanks for contacting the Microsoft Fabric community. Your approach of using a SQL Agent Job to copy daily data into a historical table, together with the Refresh: Add/Calculate routine for partition updates, is a smart way to manage both real-time and historical data in your Power BI reports.

 

To improve performance and usability, consider these tips:

  • Partition your historical table by date for efficient queries (a TMSL sketch follows this list).

  • Keep historical data separate from the real-time table so you avoid full dataset reloads and prevent slowdowns in real-time queries.

  • If you're dealing with large datasets, use aggregations to speed up Power BI queries.
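As a minimal sketch of the partition-by-date tip, the TMSL command below adds one dated partition to a historical table. It is illustrative only: the database, table, partition, data source, and column names (MyTabularModel, HistoricalData, SqlServerSource, SnapshotDate) are placeholders, and the source definition will differ if your model uses Power Query (M) partitions instead of a legacy provider data source.

  {
    "create": {
      "parentObject": { "database": "MyTabularModel", "table": "HistoricalData" },
      "partition": {
        "name": "HistoricalData 2025-01-15",
        "source": {
          "query": "SELECT * FROM dbo.HistoricalData WHERE SnapshotDate = '2025-01-15'",
          "dataSource": "SqlServerSource"
        }
      }
    }
  }

A command like this can be executed from SSMS or from a SQL Agent job step of type SQL Server Analysis Services Command, so partition creation can be automated alongside the daily copy.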

Many thanks to @nilendraFabric  and @collinq  for their valuable insights. Your guidance significantly enhances the discussion.

If any post helps, please consider accepting it as the solution to help other members find it more quickly.

nilendraFabric
Super User

Hello @0553914 

 

You are following the best possible way to achieve this.

 

By scheduling a SQL Agent job , you can efficiently copy your day’s data into a historical table with an additional export date flag. For environments where you wish to keep the operations within SSAS Tabular, dynamically partitioning the data using a Refresh: Add/Calculate routine is a robust way to manage daily snapshots while preserving high performance on your live data. This method ensures that both current and historical data are well segregated and available for reporting.
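As a sketch of what that Refresh: Add/Calculate routine could look like in TMSL (object names such as MyTabularModel, HistoricalData, and the partition name are placeholders), a refresh of type add appends data to the named partition according to that partition's source query and recalculates dependents:

  {
    "refresh": {
      "type": "add",
      "objects": [
        {
          "database": "MyTabularModel",
          "table": "HistoricalData",
          "partition": "HistoricalData 2025-01-15"
        }
      ]
    }
  }

If calculated tables or columns elsewhere in the model also need updating, a follow-up refresh of type calculate at the database level can be scheduled in the same job.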

 

 

I'm looking for more specific direction regarding the type of query and format I should use in my SQL Agent job.  I'm used to running SQL queries against relational databases to do incremental loads, but have not been successful querying an SSAS table.

Hi @0553914 ,

 

Thank you for your follow-up question. Here’s a detailed guide to help you with querying SSAS tables and performing incremental loads using a SQL Agent Job.

 

  1. Querying SSAS Tables: Since SSAS is a multidimensional or tabular model, you can't directly query it using traditional T-SQL. Instead, you’ll need to use DAX (Data Analysis Expressions) for Tabular models or MDX (Multidimensional Expressions) for multidimensional models. For incremental loads, filter the data based on a date or a flag that marks the current day’s data. This way, you pull only the data needed for the historical table (a query sketch follows this list).

  2. Setting Up the SQL Agent Job: To automate the process, set up a SQL Agent Job that runs at the end of each day. Here’s what it will do:

    • Query SSAS: Use an SSIS package or an SSAS connection to extract data from your SSAS Tabular model for the current day.

    • Copy Data to Historical Table: Insert the extracted data into your historical table so the current day's data is copied over for historical reporting.

    • Schedule the job to run daily, say at 11:59 PM, to ensure the data is copied over every day.

  3. Incremental Refresh Using Partitions: For efficient data management, especially with large datasets, use partitions in your SSAS model. This divides the historical table into smaller, more manageable pieces (like one per day). When the data updates, only the relevant partition (such as the current day's data) is refreshed. This method improves performance since it avoids reprocessing the entire dataset.
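As a sketch of the SQL Agent job step itself (not the only way to do this), the T-SQL below uses a linked server to the Tabular instance and OPENQUERY to run a DAX query, then appends the result to a relational history table. Every name here (SSAS_TAB, MySsasServer\TABULAR, MyTabularModel, CurrentDayData, SnapshotDate, OrderID, Amount, HistDB.dbo.HistoricalData) is an assumption to be replaced with your own objects, and the column list and casts must match your model.

  -- One-time setup: linked server pointing at the SSAS Tabular instance (placeholder names).
  EXEC sp_addlinkedserver
       @server     = N'SSAS_TAB',
       @srvproduct = N'',
       @provider   = N'MSOLAP',
       @datasrc    = N'MySsasServer\TABULAR',   -- SSAS instance
       @catalog    = N'MyTabularModel';         -- Tabular database
  GO

  -- Daily job step: pull the current day's rows from the Tabular model with DAX
  -- and append them to the relational history table.
  -- DAX result columns come back named "Table[Column]", hence the double-quoted identifiers.
  SET QUOTED_IDENTIFIER ON;

  INSERT INTO HistDB.dbo.HistoricalData (SnapshotDate, OrderID, Amount)
  SELECT CAST("CurrentDayData[SnapshotDate]" AS date),
         "CurrentDayData[OrderID]",
         "CurrentDayData[Amount]"
  FROM OPENQUERY(SSAS_TAB,
       'EVALUATE
            FILTER ( ''CurrentDayData'', ''CurrentDayData''[SnapshotDate] = TODAY() )');
  -- Assumes SnapshotDate holds dates with no time component; adjust the DAX filter otherwise.

If the same relational source also feeds the current-day table, an alternative is to skip the linked server and copy the rows with plain T-SQL against the source tables, then process the historical partition afterwards.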


Regards,

Yugandhar.

collinq
Super User

Hi @0553914 ,

I think you are correct that the best method is in SQL, if you have the rights and ability. Power BI does not take "snapshots" easily. The only other way around this would be Fabric and Dataflows, dropping the table into a Lakehouse or Warehouse on a regular basis.




Did I answer your question? Mark my post as a solution!

Proud to be a Datanaut!
Private message me for consulting or training needs.




I'm using an on-prem SSAS server, so Fabric is not an option.
