Hi all — I’m new in my organization and we’re in the early stages of adopting Power BI for enterprise reporting. I’ll be handling Power BI end-to-end.
Our setup:
- We have a live POS system capturing daily transactions.
- To avoid impacting the live POS, the DBA is creating a centralized Oracle fact table that will be populated from the transactional system.
- Power BI will then connect to that fact table to generate daily sales dashboards for management.

My main concerns:
- Since the data volume will be enormous, what's the best approach for connecting Power BI to this Oracle fact table?
- Should I use DirectQuery, Import, or a hybrid (composite) model?
- What's the recommended way to handle daily refreshes, performance, and long-term scalability?
- How do enterprise setups generally avoid crashes or lag when connecting to large databases?
I’d appreciate any advice on architecture, data modeling, or real-world experience with similar setups. Thanks in advance!
To provide a professional and accurate answer to your questions, a few key clarifications are needed first:
- What exactly do you mean by "enormous data volume"? How many rows and columns are we talking about?
- What are the data types involved?
- How many distinct values are there per column? (This is important because Power BI's VertiPaq engine compresses data more efficiently when values repeat.)
- How many refreshes per day are expected?
- Are real-time or near-real-time insights required?
In general, it's highly recommended to avoid working with a single wide flat table. Instead, use a star schema model (see the sketch after this list), where you have:
- a Fact table with only transactional data (keys and measures, no descriptive columns), and
- related dimension tables (e.g., Customers, Stores, Products).

This structure:
- improves performance by leveraging VertiPaq compression,
- makes DAX logic easier to manage, and
- saves memory and boosts query efficiency.
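In practice, the dimensions can often be derived from the same extract as the fact. Here's a minimal Power Query (M) sketch of splitting out a Product dimension; the query name `FactSales` and the column names are hypothetical placeholders for your actual fact extract:

```
let
    // Reference the existing fact query ("FactSales" is a placeholder)
    // instead of reading the source a second time
    Source = FactSales,
    // Keep only the descriptive product columns...
    ProductColumns = Table.SelectColumns(Source, {"ProductKey", "ProductName", "Category"}),
    // ...then collapse to one row per product
    DimProduct = Table.Distinct(ProductColumns)
in
    DimProduct
```

The fact query then keeps only ProductKey plus the measures, and the two tables are related on ProductKey in the model.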
If the data model is well designed and real-time queries aren't required, Import mode is usually preferred. It gives a better user experience, faster visual responsiveness, and easier development.
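If Import wins out, incremental refresh is the usual way to keep nightly refreshes of a large fact table manageable. Below is a minimal M sketch; the host, schema, table, and date column are hypothetical, and `RangeStart`/`RangeEnd` are the reserved datetime parameters that Power BI's incremental refresh policy requires you to define:

```
let
    // Hypothetical Easy Connect string; replace with your host/service name
    Source = Oracle.Database("oraclehost:1521/ORCLPDB", [HierarchicalNavigation = true]),
    FactSales = Source{[Schema = "SALES", Item = "FACT_SALES"]}[Data],
    // Filter on the reserved RangeStart/RangeEnd parameters. This filter
    // folds back to Oracle, so each refresh reads only the new slice of
    // rows rather than the whole table.
    Filtered = Table.SelectRows(
        FactSales,
        each [SALE_DATE] >= RangeStart and [SALE_DATE] < RangeEnd
    )
in
    Filtered
```

With a policy such as "archive 3 years, refresh the last 2 days", the nightly refresh reloads only the most recent partitions instead of the full history.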
DirectQuery and composite models can become more complex and slower, since each interaction with a visual can fire a live query against the source database. These models require careful tuning and are best reserved for cases where Import is not feasible.
You should also aggregate your data as much as possible before loading into Power BI to reduce the volume and complexity.
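One way to push that aggregation to the source is a native query, so the GROUP BY runs on Oracle and only the summarized rows reach Power BI. A sketch with hypothetical schema, table, and column names:

```
let
    // The Query option sends this SQL to Oracle as-is; only the grouped
    // result set crosses the wire and gets compressed by VertiPaq
    Source = Oracle.Database(
        "oraclehost:1521/ORCLPDB",
        [Query = "
            SELECT store_id,
                   product_id,
                   TRUNC(sale_date) AS sale_day,
                   SUM(amount)      AS sales_amount,
                   COUNT(*)         AS txn_count
            FROM   sales.fact_sales
            GROUP  BY store_id, product_id, TRUNC(sale_date)"]
    )
in
    Source
```

Rolling transactions up to a day/store/product grain often cuts row counts by orders of magnitude while still answering daily sales questions.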
Bottom line: For a truly tailored solution, I recommend consulting with a Power BI expert or Microsoft partner in your region. They can assess your infrastructure, data model, and business needs to help you define the best architecture for scalability and performance.
If this post helps, please consider accepting it as the solution to help other members find it more quickly.
Thank you for reaching out to the Microsoft Fabric Forum Community.
@GanesaMoorthyGM - Since your team is still working on these tasks, it's great to hear that you received the right suggestions from the community. Please reach out to the community if you run into any issues; we are happy to assist you.
Thanks.
Hope everything's going smoothly on your end. I wanted to check whether the information was helpful. If you have any other issues, please reach out to the community.
Hi @GanesaMoorthyGM
If you have any other issues, please reach out to the community; we are happy to assist you.
Thanks for your quick response, @Ritaf1983.
We are currently in the exploration and solution design phase for an enterprise-wide Power BI rollout.
When I mentioned "enormous data," I was being cautious — while the data volume is already considered high, our long-term plan is to connect Power BI with live or near real-time data from global channels (POS and eCommerce). So we’re anticipating significant growth and want to ensure we design a solution that’s scalable and future-proof.
Our primary business requirement is to display day-1 sales reports (i.e., today's sales available on dashboards by the following morning). This has led to internal discussions about whether Import mode or DirectQuery would be more appropriate, especially considering performance, scalability, and the need for accurate, timely data delivery.
Since we’re still early in the process, I currently don’t have full visibility into the raw transactional data or the finalized fact model — the data engineering and DBA teams are still working on that. However, your insights around data modeling, VertiPaq compression, and architecture are extremely helpful and much appreciated as we shape our solution.
Thanks again for your guidance!