Hello,
I am relatively new to Power BI - about a year in. I'm working on a dashboard and, before I get too much further, I want to rethink my data integration strategy. I don't want to assume I know all of my options, so I was hoping someone could point me to the most efficient method.
My SQL database is huge and is updated once per day. I am using DirectQuery as my source, with a few different SQL queries defining my source tables, which are related in a star schema.
This works, except that I'm running into limited functionality with calculated tables/columns and limited transformation capabilities. For example, I can't use FORMAT or view the tables I'm working on in row-and-column form. Since I'm not very experienced, it feels a bit like working blind.
Should I keep working around these things? Should I load entire tables even if they are enormous? Should I load some tables and use SQL query as my source to gather less data from huge tables? Is there something else that I'm missing?
I am eventually going to want to publish and schedule refreshes of the data in the Power BI service (app.powerbi.com), so that might be a consideration too.
I would appreciate any suggestions. If it would help to know more about the data, or to see it, in order to suggest better options, I can share that - but I think my question is fairly general.
Thanks,
Sarah
A few options to weigh:

- DirectQuery vs. Import Mode: Import loads the data into the model and unlocks the full range of DAX and Power Query features; DirectQuery leaves the data in SQL but restricts calculated tables and many functions (including FORMAT in calculated columns).
- Hybrid Approach: keep the very large fact tables in DirectQuery while importing the smaller dimension tables.
- Optimize SQL Queries: select only the columns and rows you need, and pre-aggregate in SQL or in views so less data reaches Power BI.
- Composite Models: combine Import and DirectQuery tables in a single model.
- Dataflows: stage and reuse transformations in the Power BI service rather than repeating them per report.
- Aggregations: add imported aggregation tables over a DirectQuery fact table so common queries hit the smaller cached table.
- Partitions and Incremental Refresh: refresh only new or changed partitions instead of reloading the whole table every day.
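To illustrate the "Optimize SQL Queries" and "Aggregations" points above, one common pattern is to push the heavy lifting into the database with a native query, so Power BI only imports a pre-aggregated result instead of the full fact table. A rough Power Query (M) sketch - the server, database, table, and column names here are placeholders, not your actual schema:

```m
// Sketch only - "MyServer", "MyDatabase", and dbo.FactSales are placeholders.
let
    Source = Sql.Database("MyServer", "MyDatabase"),
    // Push the aggregation down to SQL Server so only the summarized
    // rows travel to Power BI, not every row of the fact table.
    DailySales = Value.NativeQuery(
        Source,
        "SELECT CAST(OrderDate AS date) AS OrderDate,
                ProductKey,
                SUM(SalesAmount) AS SalesAmount
         FROM dbo.FactSales
         GROUP BY CAST(OrderDate AS date), ProductKey"
    )
in
    DailySales
```

A table shaped like this is small enough to import, which restores the full modeling features DirectQuery restricts.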
Hi @SarahHope ,
Based on what you've described - a large SQL database that only changes once per day - your best bet would be incremental refresh. Instead of doing a full refresh every day, refresh only the incremental partitions. For calculations that need to be done with DAX, use Import mode for the data.
Incremental refresh for semantic models and real-time data in Power BI - Power BI | Microsoft Learn
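Incremental refresh requires filtering each source query by the reserved RangeStart and RangeEnd date/time parameters; the service then substitutes the window for each partition. A minimal sketch in Power Query (M), assuming a hypothetical FactSales table with an OrderDate datetime column:

```m
// Sketch only - table and column names are assumptions.
// RangeStart and RangeEnd must be defined as DateTime parameters in
// Power Query; the Power BI service sets their values per partition.
let
    Source = Sql.Database("MyServer", "MyDatabase"),
    FactSales = Source{[Schema = "dbo", Item = "FactSales"]}[Data],
    // Filter on the datetime column so only rows inside the partition
    // window are loaded; this filter folds back to SQL Server.
    Filtered = Table.SelectRows(
        FactSales,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
    )
in
    Filtered
```

After this, you configure the incremental refresh policy on the table itself (archive period and refresh period) before publishing.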
Hope it helps!
Best regards,
Community Support Team_ Scott Chang
If this post helps, please consider accepting it as the solution to help other members find it more quickly.
Thank you for these responses. They are helpful in getting me to think differently about how I import my data. I'm still thinking and fiddling with it. I really appreciate your taking time to help me out.