EduBubicz
Frequent Visitor

Alternatives for transforming data (Power BI Desktop) using a dataflow with Import instead of DirectQuery

Hi,

I searched across the internet and forums and found articles like:

https://community.fabric.microsoft.com/t5/Desktop/Filtering-before-import-of-data/td-p/978014

https://community.fabric.microsoft.com/t5/Desktop/Is-it-Possible-to-Filter-Data-from-a-Dataflow-befo...

However, I still haven't found a good solution. The scenario is as follows:

Suppose I have a dataflow with only one table (for the sake of simplicity) that comes directly from a table of service order items, which has approximately 15 million rows per year, with data going back to 2018 and multiple customer IDs in it.

I have data analysts who are called upon when there is a need to conduct very specific analyses requested by random customers.


We, as a company, want to give them access to a dataflow with these records so they can obtain the information they need and perform the custom analyses required using ETL in Power BI Desktop.


The problem is that, in order to achieve this, they always need to import around 100 million rows before they can filter for a specific client and data period.


I searched online for alternatives but could not find any that satisfy our needs.

I have a few boundary conditions:
- The analysts are not very data-savvy, so I don’t want them writing SQL queries against anything, which excludes using datamarts and querying them with filters, etc.
- RLS does not apply because the analysts take whatever customer is in their queue, so I cannot impose restrictions prior to accessing the data.

In my mind, some simple solutions could be (but I could not make them work or they are not supported):
- A dataflow with parameters that are prompted when the dataflow is being imported in Power Query (but it seems that this is not possible).
- Load only a minimal chunk of the dataflow (let's say, 5-10 rows) for the user to understand what each column is about, allow the user to perform all the ETL, and then load the entire dataset. I tried this but could not figure out if it is possible.


Do you have any suggestions/alternatives for doing this, considering the conditions above?

Thanks in advance!

Ed


5 REPLIES
v-pnaroju-msft
Community Support

Thank you, @Akash_Varuna, for your response.

Hi @EduBubicz,

We appreciate your inquiry in the Microsoft Fabric Community Forum.

Based on my understanding, please find below an approach using Power BI Dataflows with Parameterized Queries in Power BI Desktop. This approach fully meets your requirements, is user-friendly for analysts, and ensures scalability without the need for SQL. It allows filtering before import, supports sample previews, and requires minimal maintenance.

Kindly follow the steps outlined below which may help resolve the issue:

  1. In Power BI Service, create a dataflow for your service order items table (approximately 100 million rows) using the appropriate connector (e.g., SQL Server). Apply initial transformations (such as removing unnecessary columns) in Power Query Online and save the dataflow. If needed, enable Incremental Refresh to partition data by date (for example, yearly) to facilitate faster refreshes (this requires Premium capacity).

  2. Set up parameters in Power BI Desktop. Connect to the dataflow by navigating to Home > Get Data > Power BI Dataflows. In the Power Query Editor, create two parameters: CustomerID (Text) to filter by customer, and DateRange (Date) to filter by date (or alternatively, create StartDate and EndDate parameters). Apply these filters in Power Query (for example, keep rows where CustomerID equals the CustomerID parameter and OrderDate falls within the date parameters) to reduce data volume before import; see the example M query after these steps.

  3. Preview a sample for ETL. Temporarily limit the data to 5–10 rows using “Keep First Rows” in Power Query Editor to define the ETL transformations (such as pivots and merges). Once the transformations are finalised, remove the row limit and apply the parameterized filters to import the filtered dataset.

  4. Analysts can open the .pbix file, update the CustomerID and DateRange parameters in Power Query Editor, and refresh to import only the filtered data. They can then perform custom ETL and build reports using the Power BI Desktop interface.

  5. Publish the .pbix file to Power BI Service and share it with analysts. Set up a refresh schedule for the dataflow to keep the data up to date.
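
For reference, here is a minimal sketch of what the analysts' query could look like in M, assuming the StartDate/EndDate parameter variant and a ServiceOrderItems entity with CustomerID and OrderDate columns (the GUIDs, entity name, and column names are placeholders for your own workspace and table):

    // Parameters created via Home > Manage Parameters (names assumed):
    //   CustomerID : Text, StartDate : Date, EndDate : Date
    let
        // Navigation as generated by Get Data > Power BI Dataflows;
        // replace the placeholder GUIDs with your workspace/dataflow IDs
        Source = PowerPlatform.Dataflows(null),
        Workspaces = Source{[Id = "Workspaces"]}[Data],
        Workspace = Workspaces{[workspaceId = "00000000-0000-0000-0000-000000000001"]}[Data],
        Dataflow = Workspace{[dataflowId = "00000000-0000-0000-0000-000000000002"]}[Data],
        ServiceOrderItems = Dataflow{[entity = "ServiceOrderItems", version = ""]}[Data],
        // Optional while authoring (step 3): work against a small sample,
        // then delete this step before the real refresh
        // Sample = Table.FirstN(ServiceOrderItems, 10),
        // Filter on the parameters before any heavy transformations,
        // so only the requested customer/date slice is imported
        FilteredRows = Table.SelectRows(
            ServiceOrderItems,
            each [CustomerID] = CustomerID
                and [OrderDate] >= StartDate
                and [OrderDate] <= EndDate
        )
    in
        FilteredRows

Because the filters sit in the first steps of the query, analysts only ever change the parameter values, never the query itself; on Premium, the enhanced compute engine may even fold such filters back to the dataflow.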

Additionally, please refer to the following link for more information:
Parameters - Power Query | Microsoft Learn

If you find this response helpful, kindly mark it as the accepted solution and provide kudos. This will assist other community members facing similar queries.

Should you have any further questions, please feel free to contact the Microsoft Fabric community.

Thank you.

Thanks for the response @v-pnaroju-msft.

Just for clarification,

The idea is to create a sort of master .pbix where the person responsible for the data loads everything in Power BI Desktop (in fact, only once), configures the parameters as filters and the row limitation, and shares it for the analysts to open and change the parameters to filter at will, so they don't need to load all the data over and over.

Is that right?
I think it's a very feasible approach indeed. Thanks for the answer.

Hi @EduBubicz,

Thank you for your follow-up. You are absolutely correct; the solution involves using a master .pbix file in Power BI Desktop, which is configured once with parameters such as CustomerID and DateRange, along with the required transformations. Analysts can then open this file, update the parameters to filter the data dynamically, and import only the filtered subset, thereby avoiding the need to reload the entire 100 million rows.

We are pleased to know that you find this approach feasible. It ensures a user-friendly experience for your non-technical analysts, eliminates the need for SQL queries, and reduces maintenance efforts by relying on a single dataflow.

If you find our response helpful, kindly consider marking it as the accepted solution and awarding kudos. This will assist other community members facing similar queries.

Should you have any further questions, please feel free to reach out to the Microsoft Fabric community.

Thank you.

Akash_Varuna
Super User

Hi @EduBubicz, I think you could also try splitting the data into smaller, pre-filtered dataflows based on logical segments (like year, region, or customer groups). Analysts can then select and import only the relevant dataflow, without SQL queries.
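
With that layout, an analyst's query only ever navigates to the one segment they need, along these lines (a sketch; the per-year entity name and GUIDs are hypothetical):

    let
        Source = PowerPlatform.Dataflows(null),
        Workspaces = Source{[Id = "Workspaces"]}[Data],
        Workspace = Workspaces{[workspaceId = "00000000-0000-0000-0000-000000000001"]}[Data],
        Dataflow = Workspace{[dataflowId = "00000000-0000-0000-0000-000000000002"]}[Data],
        // Only the 2024 segment is read; other years' entities are never loaded
        Orders2024 = Dataflow{[entity = "ServiceOrderItems_2024", version = ""]}[Data]
    in
        Orders2024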

Thanks for your suggestions!
