
Frequent Visitor

How to Bypass Salesforce 2000 Rows Limitation

Hey guys,


A few weeks ago I ran into the 2000-row limitation while importing data from Salesforce to Power BI. I did some research and found that this is a limitation of the Salesforce API: Microsoft can't do anything on the Power BI side to fix it, and a lot of other people were struggling with the same issue.


However, I finally found an easy and quick way to bypass this problem, and since this community helped me a lot when I started working with Power BI, I feel it's my turn to give back. With this workaround, you no longer have to manually download the report from Salesforce and replace an Excel file, or create lots of reports to split the data into chunks of 2000 rows.


The idea: you import data from your Salesforce reports into a Google Sheets spreadsheet, and then import the data from that spreadsheet into your Power BI file.


- Open a sheet in Google Sheets.


- Go to Add-Ons > Get Add-Ons.


- Search for 'Salesforce' and install the 'Data Connector for Salesforce'.


(Screenshot: data connector.PNG)


- Log in to your Salesforce account and authorize the permissions Google Sheets needs to access your data.


- Go to Add-ons > Data Connector for Salesforce > Open.


- You will find this menu:


(Screenshot: open addon.PNG)


- Now you can import data from any of your reports, with no row limitation, by clicking on "Reports".


- After that, you can click on "Refresh" and schedule automatic data refreshes every 4, 8, or 24 hours.


- Open Power BI and go to Get Data > Web.


- Get a shareable link to your Google Sheet (with at least view permission) and paste it.


- Modify your Google Sheet link from:
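The example link above is not shown, so as an assumption: a commonly used transformation is to replace the `/edit?usp=sharing` suffix of the share link with `/export?format=xlsx`, so the Power BI Web connector downloads the sheet as an Excel file instead of loading the interactive page. A minimal sketch of that rewrite:

```python
# Sketch (assumption): turn a Google Sheets share link such as
#   https://docs.google.com/spreadsheets/d/<id>/edit?usp=sharing
# into a direct export link that the Power BI Web connector can read.

def to_export_link(share_link: str, fmt: str = "xlsx") -> str:
    """Rewrite a Google Sheets share link into a direct export link."""
    base, _, _ = share_link.partition("/edit")  # drop "/edit?usp=sharing"
    return f"{base}/export?format={fmt}"

print(to_export_link(
    "https://docs.google.com/spreadsheets/d/ABC123/edit?usp=sharing"
))
# -> https://docs.google.com/spreadsheets/d/ABC123/export?format=xlsx
```

The sheet ID (`ABC123`) is hypothetical; paste the resulting export URL into Power BI's Get Data > Web dialog.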




Now it's done. You can start working with your Salesforce data without the 2000 rows limitation and with auto refreshes.


Hope it helps!

Community Support

Hi @fernandohiltner ,


That's pretty cool! Thanks for sharing that!


Community Support Team _ Frank
If this post helps, then please consider Accepting it as the solution to help others find it more quickly.


New Member

You can upvote the request to increase that 2000-row limitation.  The more votes, the more likely it will be changed.  Here is that link.

Helper IV

I don't think this is working anymore. The connector now also downloads only 2000 records into the Google Sheet.


Then it tells you to re-authorize to get "all data", but that just does not work. You can keep revoking access in Salesforce and re-authorizing in Google Sheets over and over; the message keeps popping up, and the connector still downloads only 2000 records:


(The message itself is outdated too: you now revoke the permissions under your personal settings in Salesforce.)


And of course, there is no way to contact Google (the maker of the add-on) about this.

Helper I

Just want to say thank you for this neat workaround, awesome stuff!👍

Regular Visitor

My Salesforce object has 137 columns; when I try to extract it, I get the following error: "Limit Exceeded: URL Length of URLFetch".

Regular Visitor

Does this method allow authentication using a username/password or another method, or does it only work via an unsecured web link (anyone with the link can access the data)?



Not applicable

Hi Fernando,




Thanks for the solution here, it was fantastic.


But unfortunately I still have an issue to solve. My report is quite long, and the Salesforce connector shows the error "Exceeded maximum execution time", so I can't import the report. From what I've read, the limit is 5 minutes of run time, but I haven't found a solution for it. Could you please help me with this?




Hi Matheus!




Firstly, thank you for your feedback.


Regarding your problem, I have never faced this issue before. I suspect it is a limitation on how long Google Sheets will let a script run. I would split the report into smaller parts (so each request finishes in under 5 minutes) and then aggregate them all inside Power BI. Is that a viable option for you?
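The split-and-aggregate idea above can be sketched outside Power BI as well. In Power BI itself you would use Append Queries; the pandas snippet below (with hypothetical part tables) just illustrates the same append step:

```python
import pandas as pd

# Sketch (assumption): the long report has been split into several smaller
# Salesforce reports so each Google Sheets import finishes within the script
# execution limit. The parts are then appended back into one table — the
# equivalent of "Append Queries" in Power BI.

# Hypothetical split parts; in practice each comes from its own sheet export.
part1 = pd.DataFrame({"Id": ["001", "002"], "Amount": [100, 200]})
part2 = pd.DataFrame({"Id": ["003", "004"], "Amount": [300, 400]})

combined = pd.concat([part1, part2], ignore_index=True)  # all rows, one table
print(len(combined))  # -> 4
```

The key detail is `ignore_index=True`, which renumbers the rows so the combined table behaves like a single report rather than stacked fragments.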


Hope it helps!

