Hi,
I have a Salesforce report that I would like to connect to Power BI, but it is over 25K rows, so the direct Salesforce report API is not an option since it caps results at 2K rows. My idea is to download the current report as a CSV, use that as my data source, and then pull from the Salesforce report API each day to append any rows that were added that day to my original dataset. Is there a clever way to do this within Power BI? Thank you
Hi, I know this reply is very late lol, but were you able to find a solution? As a workaround, you could test your connection with a third-party connector so you won't have to combine two datasets later. I've tried windsor.ai, Supermetrics, and Funnel.io; I stayed with Windsor because it is much cheaper, so just letting you know there are other options. To make the connection, first search for the Salesforce connector in the data sources list.
After that, grant access to your Salesforce account using your credentials; on the preview and destination page you will see a preview of your Salesforce fields.
There, select the fields you need. Custom fields and custom objects are also supported, so you can export them through Windsor as well. Finally, select Power BI as your data destination, copy the generated URL, and paste it into Power BI --> Get Data --> Web.
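If you go the Web-URL route, the query inside Power BI ends up being a plain Web source. Here is a minimal Power Query (M) sketch, assuming the connector hands you a URL that returns a JSON array of records; the URL below is purely a placeholder, not a real endpoint:

```
let
    // Placeholder URL: paste the one the connector's destination page generates for you
    Source = Json.Document(Web.Contents("https://example.com/salesforce-feed?format=json")),
    // Assuming the payload is a list of records, turn it into a table
    AsTable = Table.FromRecords(Source)
in
    AsTable
```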
@amitchandak , @Anonymous - Well, I think the easy way to do this would be to import the CSV file once and turn off its refresh. Then create another table using the API, append the two tables, and remove duplicates. You could run that until all 2,000 API entries come back as duplicates, and at that point you just update your CSV file.
Not pretty, but functional. It could definitely be improved upon, but it would at least be a start.
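For reference, a rough Power Query (M) sketch of that idea is below. The file path, report name, and the "Id" key column are placeholders, the navigation step assumes the Salesforce Reports connector exposes the report by Name, and the CSV column headers are assumed to match the API field names so the append lines up:

```
let
    // One-time snapshot: the CSV export you downloaded manually (path is a placeholder)
    CsvSnapshot = Table.PromoteHeaders(
        Csv.Document(File.Contents("C:\Data\SalesforceReportExport.csv")),
        [PromoteAllScalars = true]
    ),

    // Daily delta: the same report pulled through the Salesforce Reports connector (capped at ~2K rows)
    ApiDelta = Salesforce.Reports(){[Name = "My Report Name"]}[Data],

    // Append the snapshot and the delta, then drop duplicates on the record key
    Combined = Table.Combine({CsvSnapshot, ApiDelta}),
    Deduped = Table.Distinct(Combined, {"Id"})
in
    Deduped
```

With a query like that scheduled for daily refresh, the CSV stays static while the API pull tops it up; once the API rows are all duplicates of what the snapshot already contains, you re-export and replace the CSV.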
Thanks! I'll try this