SarahHope
Helper II

Data Integration Strategy Advice

Hello,

 

I am relatively new to Power BI - about a year in.  I'm working on a dashboard and, before I get too much further in, I want to re-think my data integration strategy.  I don't want to assume I know all my options, so I was hoping someone could point me to the most efficient method.

My SQL database is huge and is updated once per day.  I am using DirectQuery for my source, and I have a few different SQL queries defining my source tables, which are related in a star schema.

This works, except that I'm running into limited functionality with calculated tables/columns and limited transformation capabilities.  For example, I can't use FORMAT or see the tables I'm working on in row-and-column format.  Since I'm not very experienced, it feels like working blind.

Should I keep working around these limitations?  Should I load entire tables even if they are enormous?  Should I load some tables and use SQL queries as my source to pull less data from the huge ones?  Is there something else that I'm missing?

I am eventually going to want to publish and schedule refresh of the data on app.powerbi.com, so that might be a consideration too.

 

I would appreciate any suggestions.  If it would help to know more about the data, or to see it, in order to suggest better options, I can do that - but I think my question is pretty general.

Thanks,
Sarah

 

 

2 ACCEPTED SOLUTIONS
Shravan133
Super User

Since you're working with a large SQL database, it's crucial to optimize performance while maintaining the ability to transform and analyze your data effectively.

Data Integration Options

  1. DirectQuery vs. Import Mode:

    • DirectQuery: Allows real-time data access and queries the database directly, but has limitations on transformations and calculated columns/tables.
    • Import Mode: Imports data into Power BI, enabling full transformation and calculation capabilities but can be slow and resource-intensive for very large datasets.
  2. Hybrid Approach:

    • Use DirectQuery for large, frequently changing tables where real-time data access is essential.
    • Use Import Mode for smaller, less frequently changing tables or for pre-aggregated data that requires complex transformations.

Recommended Strategies

  1. Optimize SQL Queries:

    • Write efficient SQL queries to reduce the amount of data pulled into Power BI.
    • Use views in your SQL database to pre-aggregate or pre-filter data before it reaches Power BI.
  2. Composite Models:

    • Power BI supports composite models, which allow you to use both DirectQuery and Import Mode in the same dataset.
    • Keep large tables in DirectQuery and smaller, more manageable tables in Import Mode.
  3. Dataflows:

    • Use Power BI Dataflows to perform ETL (Extract, Transform, Load) operations before loading data into your Power BI dataset.
    • Dataflows store the data in the Power BI service, allowing you to perform transformations and reduce the load on your desktop model.
  4. Aggregations:

    • Create aggregated tables to reduce the volume of data queried in DirectQuery mode.
    • Use aggregation tables to handle summarized data and direct detailed queries to the original large tables only when necessary.
  5. Partitions and Incremental Refresh:

    • Use partitions to divide large tables into smaller, more manageable chunks.
    • Implement incremental refresh to only refresh data that has changed, reducing the load and time required for data refreshes.
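
As a small illustration of the "views" and "aggregations" points above, a pre-aggregated view on the database side might look like this (the table and column names here are hypothetical; adjust to your own schema):

```sql
-- Hypothetical example: pre-aggregate a large fact table in the database
-- so Power BI only sees one row per day/product instead of raw detail rows.
CREATE VIEW dbo.vw_SalesDailyAgg AS
SELECT
    CAST(OrderDate AS date) AS OrderDate,
    ProductKey,
    SUM(SalesAmount)        AS TotalSales,
    COUNT(*)                AS OrderCount
FROM dbo.FactSales
GROUP BY CAST(OrderDate AS date), ProductKey;
```

Pointing Power BI at a view like this (in Import mode) keeps the model small, while the detailed rows stay in SQL and can still be reached via DirectQuery when needed.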


Anonymous
Not applicable

Hi @SarahHope ,

 

Based on what you've described, your SQL data is large and is refreshed daily, so your best bet would be to use incremental refresh. Instead of doing a full refresh every day, consider refreshing only the incremental partitions. For calculations that need to be done with DAX, you can use Import mode for that data.

Incremental refresh for semantic models and real-time data in Power BI - Power BI | Microsoft Learn
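
To make the idea concrete: an incremental refresh policy filters the source on a date range, so each partition effectively asks the database for something like the query below (a sketch only; RangeStart/RangeEnd stand in for the parameter values Power BI supplies, and the table and column names are placeholders):

```sql
-- Roughly what a single incremental partition requests from the source:
SELECT *
FROM dbo.FactSales
WHERE OrderDate >= @RangeStart   -- partition lower bound (inclusive)
  AND OrderDate <  @RangeEnd;    -- partition upper bound (exclusive)
```

Only the partitions whose date range falls inside the refresh window are re-queried each day; historical partitions are left untouched.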

 

Hope it helps!

 

Best regards,
Community Support Team_ Scott Chang

 

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.


3 REPLIES
SarahHope
Helper II

Thank you for these responses.  They are helpful in getting me to think differently about how I import my data.  I'm still thinking and fiddling with it.  I really appreciate your taking the time to help me out.


