manishbeniwal
Frequent Visitor

How to transfer one table into multiple tables?

I have a big table of 3M+ records and I want to transfer the table data into SQL, so please suggest:

  • How can I transfer my data so that, going forward, the process automatically transfers data from that Power BI table to SQL?
  • I used DAX Studio to transfer the table to SQL, but I guess there is a limit? Also, how can that DAX Studio process be made automatic from next time onwards?
  • I split my 3M+ record table into a few different tables using Reference, an Index column, and a filter of 1M rows per new table. Is that the right approach? Will those referenced tables refresh automatically when the data gets refreshed?

Appreciate your response. Thanks!

3 REPLIES
jennratten
Super User

Hello - here are some suggestions and things to consider...

  • What is the source of your table?  If it is possible to have the data written to SQL from the source instead of Power BI, that is usually the preferred approach.
  • Why do you need to write to SQL from Power BI?  Are you performing transformations on the data and/or combining multiple data sources, and need the resulting table written to SQL? Depending on the scenario, it may still be best to handle this in SQL without Power BI.
  • If writing from Power BI is the best option, consider how the table is structured and how you can use the datapoints to programmatically query the data in batches and write to SQL.  For example, if you have a date column spanning 1/1/2022 - 12/31/2022, you could split the data by month.  You would define the range of months present in the data and query the data using a variable to represent each month; essentially each iteration would be month + 1.  An easy way to handle this would be to use Power Automate, which can query the dataset, create the variable, issue a DAX query against the dataset, write the result to SQL, and run the job in a loop on a schedule.
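The month-by-month batching loop described above could be sketched as follows. This is a minimal Python illustration of the iteration logic only; `month_windows` is a hypothetical helper, and the DAX query / SQL write that each window would drive (e.g. via Power Automate or a script) is represented by a comment, not real API calls.

```python
from datetime import date, timedelta

def month_windows(start: date, end: date):
    """Yield (first_day, last_day) pairs for each calendar month in [start, end]."""
    year, month = start.year, start.month
    while (year, month) <= (end.year, end.month):
        # First day of the following month; the day before it is this month's last day.
        next_first = date(year + 1, 1, 1) if month == 12 else date(year, month + 1, 1)
        yield date(year, month, 1), min(next_first - timedelta(days=1), end)
        year, month = (year + 1, 1) if month == 12 else (year, month + 1)

for first, last in month_windows(date(2022, 1, 1), date(2022, 12, 31)):
    # Each window would drive one DAX query filtered to [first, last],
    # with the result written to SQL -- here we just print the windows.
    print(first, last)
```

The same windows could equally be produced inside a Power Automate loop variable; the point is that each iteration covers exactly one month, so no batch exceeds a manageable row count.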
manishbeniwal
Frequent Visitor

Thanks @Mahesh0016  for your quick response. In my first query I am looking to move my Power BI table of 3M+ records to a SQL table, i.e. from Power BI to SQL, and that should be automatic going forward, with no manual intervention to repeat the task. I was using DAX Studio to do this, but it does not move all the records in one shot, and the DAX Studio activity is manual every time. Happy to hear the best solution from you and the team. Thanks again!

Mahesh0016
Super User

@manishbeniwal  Please follow the information below. Thank you!

It seems like you're looking for a solution to transfer your data from Power BI to SQL automatically and ensure that the process is ongoing. Based on your requirements and the information provided, here's a comprehensive approach to achieve your goals:

1. **Automated Data Transfer from Power BI to SQL**:

To achieve an automated data transfer from Power BI to SQL, follow these steps:

- **Initial Data Transfer**:
1. Use Power Query in Power BI Desktop to load your data from your source (3M+ records) into Power BI.
2. Set up the necessary transformations and data modeling within Power BI.
3. Publish the Power BI report to the Power BI service.

- **Automated Refresh**:
1. In the Power BI service, schedule the data refresh for your dataset. This involves configuring the dataset to pull data from your source on a regular basis (daily, weekly, etc.).
2. Power BI will automatically refresh the dataset based on the schedule you've set. It will pull the latest data from your source and update the dataset.

2. **Using DAX Studio for Data Transfer**:

DAX Studio is mainly used for writing and analyzing DAX queries. While it's possible to use DAX Studio to transfer data to SQL, it's not a common or efficient method for bulk data transfers. It's more suited for analytical queries within Power BI. Instead, consider using the Power Query capabilities in Power BI Desktop for data transformation and loading.

3. **Splitting Data into Smaller Tables**:

Splitting your 3M+ data table into smaller tables using reference, index columns, and filtering is a valid approach, especially for performance optimization. However, this approach alone won't provide automated refresh capabilities.

- **Automated Refresh for Split Tables**:
1. Set up an ETL (Extract, Transform, Load) process to regularly refresh and update your split tables in SQL from your source data in Power BI.
2. This ETL process could be implemented using tools like SSIS (SQL Server Integration Services), Azure Data Factory, or custom scripts that connect to Power BI's data sources and update the split tables.
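The batch-loading part of such an ETL process could be sketched as below. This is a minimal, hedged illustration of the chunking logic only: `execute_insert` is a hypothetical callback standing in for your SQL client's bulk insert (for example, a `cursor.executemany` against the target table), not a real library API.

```python
def chunked(rows, batch_size):
    """Group an iterable of rows into lists of at most batch_size rows."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # flush the final, possibly short, batch
        yield batch

def load_in_batches(rows, batch_size, execute_insert):
    """Feed each batch to execute_insert and return the total row count.

    execute_insert is a placeholder for the real SQL write step,
    e.g. an executemany call in your ETL tool of choice.
    """
    total = 0
    for batch in chunked(rows, batch_size):
        execute_insert(batch)
        total += len(batch)
    return total
```

Loading in bounded batches keeps memory flat and makes the job restartable, which matters at 3M+ rows; SSIS and Azure Data Factory expose equivalent batch-size settings natively.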

In summary:

- Use Power Query in Power BI Desktop to load and transform your data.
- Schedule automated data refresh in the Power BI service for ongoing updates.
- If you've split your data into smaller tables, implement a separate ETL process to refresh those split tables in SQL.

Remember that the choice of tools and technologies (Power BI, SQL, ETL tools) will depend on your existing infrastructure, data volume, and your organization's preferences. Always test your automated processes thoroughly before deploying them in a production environment.
