Hello Power BI Community,
We have a requirement to visualize transactional data spanning several years at a granular (transactional) level. Currently, for just 3 months, our dataset contains approximately 43 billion records. Since this volume is extremely large, we are unable to use Import mode as it fails due to resource limitations.
Our requirement is to display data over the entire lifetime of these transactions, not just aggregated summaries, within Power BI.
We have considered aggregations, but they do not meet the business need for detailed transactional-level visibility.
Could you please advise on the best practices or solutions to handle and visualize such massive volumes of detailed data in Power BI? Are there recommended architectures, storage modes, or tools (e.g., DirectQuery, composite models, aggregations, dataflows, or Azure services) that we should consider? Any example references or guidance would be highly appreciated.
We tried it with Import mode, but it failed!
Thank you!
Hi @2014aniruddha ,
Thanks for reaching out to Community Forum.
You can achieve ODAG-style filtering in Power BI using DirectQuery and dynamic M query parameters, without moving your data to the cloud. Start by creating a Power Query parameter: go to Home > Manage Parameters > New Parameter, name it p_product, and choose the appropriate type (Whole Number or Text). Then reference the parameter in the M expression behind your Oracle DirectQuery source, so that the query sent to Oracle is, conceptually:
select *
from fact_table
where product = #"p_product"
The value of #"p_product" is set dynamically at runtime from the report.
Then, create a disconnected product table (e.g., SELECT DISTINCT product FROM fact_table) and use it in a slicer for user selection. Bind the slicer column to the parameter in Model view (select the column, then under Advanced choose "Bind to parameter"), so that queries only execute after the user selects a value, preventing full table scans. For very large result sets (over 100,000 rows), consider using Paginated Reports for better performance.
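As a minimal sketch of the steps above (assuming an Oracle source; the server name, fact_table, and the p_product parameter are placeholders, and bind-variable support in Value.NativeQuery can vary by connector version, so verify against your environment):

```
// Sketch only: assumes a Power Query parameter named p_product already exists.
// EnableFolding = true asks the engine to keep the native query foldable so
// the table can stay in DirectQuery mode.
let
    Source = Oracle.Database("myOracleServer", [HierarchicalNavigation = true]),
    Result = Value.NativeQuery(
        Source,
        "select * from fact_table where product = :product",
        [product = p_product],
        [EnableFolding = true]
    )
in
    Result
```

In Model view, select the product column of the disconnected slicer table and bind it to p_product under Advanced; each slicer selection then re-executes the query with the new value.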
Reference: https://learn.microsoft.com/en-us/power-bi/connect-data/desktop-dynamic-m-query-parameters
While Power BI has no direct equivalent of Qlik ODAG, combining a Power Query parameter (#"p_product") with a bound slicer selection replicates similar on-demand behaviour: queries execute only after user input, effectively avoiding full table scans.
The table visual needs to display the complete transactional data (including all transactions), allowing users to filter by user number. However, the main challenge is handling the large volume of data efficiently. Please advise on the best approach to manage this bulk data load.
Hi @tejyam ,
Thanks for reaching out to Microsoft Fabric Community Forum.
Managing 43B+ rows in Import mode is not practical in Power BI, but there are effective strategies you can implement:
DirectQuery / Composite Models: Store detailed transactions in a scalable source such as Synapse, Databricks, or ADX, and connect through DirectQuery. Use an aggregated Import table for summaries (monthly or weekly) to ensure quick visual performance, with DirectQuery used only for detailed analysis as needed.
Aggregations: Set up aggregation tables in Power BI so most queries are resolved quickly, while still enabling access to transaction-level data when necessary.
Mandatory Filters: When displaying transactions in a table, apply required filters (such as date, user number, or region) to limit the data queried and avoid loading billions of rows at once.
Paginated Reports: For scenarios where you need to review or export the full transactional dataset, Paginated Reports are suitable for handling large, filtered exports and complement Power BI dashboards.
Using this hybrid approach will maintain dashboard performance while providing users with access to all necessary transaction details as required.
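As a sketch of the aggregation idea above (Oracle-style SQL; table and column names such as fact_table, txn_date, and amount are placeholders for illustration):

```sql
-- Hypothetical monthly aggregate built in the source system.
-- Import this small table into Power BI and map it to the
-- DirectQuery fact table via Manage aggregations, so summary
-- visuals hit the aggregate and only drill-downs hit the detail.
CREATE TABLE sales_agg_monthly AS
SELECT
    TRUNC(txn_date, 'MM') AS txn_month,
    region,
    product,
    COUNT(*)              AS txn_count,
    SUM(amount)           AS total_amount
FROM fact_table
GROUP BY TRUNC(txn_date, 'MM'), region, product;
```

The aggregate table stays small enough for Import mode, while the billions of detail rows remain in the warehouse and are queried only when a user drills past the aggregation grain.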
Regards,
Sreeteja.
Hi @tejyam ,
I hope the information provided above helps you resolve the issue. If you have any additional questions, please do not hesitate to reach out; we are happy to help further.
Moving data to any cloud-based system is not possible for now; we have to keep Oracle as our data warehouse.
We have Qlik Sense ODAG reporting, which currently accepts inputs from different dimensions, runs the query with those filter values, and returns the result. We are looking for similar functionality in Power BI.
Typically, our requirement is that the big fact table is queried, and data returned, only after the user selects values from report filters such as region, date range, and product. Those selected values are passed to the query (built on a fact table with billions of records), which returns the filtered output, at most a few million rows or a few hundred thousand.
Is this possible to achieve? If anyone can help, please.
Hi @2014aniruddha ,
Use DirectQuery to Oracle so queries run only after filters are applied, enforce mandatory slicers/parameters to avoid full scans, and use Paginated Reports for large filtered extracts. This mimics ODAG-like on-demand querying without moving data to the cloud.
Hi
I have a requirement where I can't load the full fact table due to the huge volume. Hence I want it in DirectQuery mode, where, say, a product filter has a parameter product_parameter that drives the query based on the value selected by the user at the front end.
I have shown the parameter product_parameter as a filter slicer (show as value). Now, whatever value the user selects in that filter, I want it passed to the DirectQuery SQL on the fact table to fetch the data for that filtered value.
I need to know how to place the parameter in the DirectQuery SQL. For example, in
select * from fact_table where product = 4
instead of '4' I want to reference the parameter name, say p_product. How do I write this SQL in DirectQuery?
Any help on this, please.
Many Thanks
Ani
Hi @tejyam ,
Do you need to visualise all transactions, or would selecting a few of them and visualising those be enough?
If that approach fits your need, you could use a Power BI paginated report (you apply the filters before retrieving the dataset).