I'm developing a report and have created a connection from Salesforce to Power BI using Azure Data Lake Storage Gen2. I don't normally deal with data this large; there are about 2 million rows on the Salesforce side. During transformation I filter to the last 365 days in Power Query to narrow the data down. Is this the right approach? It appears to still preload all 2 million rows from Salesforce and then filter them. What I need is the ability to query the Salesforce data before loading all those unneeded entries from the Salesforce data lake. Can someone explain how to do this in Power Query?
Instead of loading all the data into Power BI, you can use Salesforce Object Query Language (SOQL) to filter the data at the source. Note that SOQL requires an explicit field list (there is no `SELECT *`) and provides relative date literals, which fit your "last 365 days" filter directly:

SELECT Id, Name, CreatedDate FROM YourObject WHERE CreatedDate = LAST_N_DAYS:365
If you are using ADLS Gen2, ensure that the ingestion process filters the data before it is stored. You can use Azure Data Factory (ADF) or another ETL tool to filter the data before it lands in ADLS.
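As a sketch of the source-side filtering idea outside Power Query, the query above can be built and issued programmatically. The object name `YourObject` and the field list are placeholders, and the `simple_salesforce` call at the end is an assumption about your setup (it needs a live Salesforce org, so it is left commented out):

```python
# Hypothetical object name -- replace with your own Salesforce object.
OBJECT_NAME = "YourObject"

def build_soql_last_n_days(days: int = 365) -> str:
    """Build a SOQL query that filters rows at the source.

    SOQL's LAST_N_DAYS date literal pushes the date filter to Salesforce,
    so only matching rows cross the wire -- the filtering is not done
    client-side after a full extract.
    """
    return (
        f"SELECT Id, Name, CreatedDate "
        f"FROM {OBJECT_NAME} "
        f"WHERE CreatedDate = LAST_N_DAYS:{days}"
    )

query = build_soql_last_n_days(365)
print(query)

# With credentials, the query could be executed via the simple_salesforce
# library (an assumption about your tooling):
# from simple_salesforce import Salesforce
# sf = Salesforce(username="...", password="...", security_token="...")
# rows = sf.query_all(query)["records"]
```

The same principle applies inside Power Query: the earlier in the pipeline the date filter is applied (ideally at the Salesforce/SOQL layer, so it folds to the source), the fewer of the 2 million rows ever get transferred.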