Hello Community!
I have created an API connection to a ServiceNow instance for real-time data from a table. The data connects in the modeler just fine; however, when I went to start exploring the data, I realized I had only 1,000 rows. I didn't use any specific parameters when connecting to ServiceNow; I just chose Get Data > Web, put in the API URL https://(company).service-now.com/api/now/v2/table/{tableName}, and signed in with an organizational account. I was able to connect to the table data using that API and the parameters located in the docs found here. Those docs state that the default pagination limit is 10,000, not 1,000, so I'm a little confused why I only have 1,000 records. I have almost 20,000 records in this table and expect that to grow by about half again in the next year, so I really don't want to deal with regularly exporting data manually to import again. Any thoughts as to why I'm only seeing 1,000 records and what I might do to work with this data better?
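For reference, the ServiceNow Table API pages results with the documented sysparm_limit and sysparm_offset parameters, and a client has to loop over offsets to get everything. A minimal, language-agnostic sketch of that loop (shown in Python purely for illustration, not Power BI M; `get_page` is a hypothetical stand-in for whatever issues the HTTP request):

```python
# Illustrative sketch: paging through the ServiceNow Table API by
# requesting successive offsets until a short (final) page comes back.
# `get_page(offset, limit)` stands in for the actual HTTP call and must
# return the list of rows for that page (e.g., the `result` array from
# the Table API's JSON response).

def fetch_all(get_page, page_size=1000):
    """Collect every row by requesting one page at a time."""
    rows = []
    offset = 0
    while True:
        batch = get_page(offset, page_size)
        rows.extend(batch)
        if len(batch) < page_size:  # short page => no more data
            return rows
        offset += page_size
```

The same pattern can be expressed in M with List.Generate, but the structure is identical: keep requesting until a page comes back smaller than the limit.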
I'm fairly new to working with PowerBI and any suggestions or thoughts are appreciated!
Thank you!
Hello,
In case you're still struggling to export data from ServiceNow, you might want to try our ServiceNow app, Power BI Connector.
We have handy User and Admin Guides for your convenience, but feel free to reach out to our support team at support@alpha-serve.com if you need any assistance.
Thank you.
Regards,
Anton
===
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi @Anonymous ,
Are you in the Power Query Editor? If so, it defaults to profiling only the first 1,000 rows. To change that, click "Column profiling based on top 1000 rows" at the bottom and select "Column profiling based on entire data set".
Thank you. I'm going to try this right now and refresh the data sources.
I did notice that when in the Advanced Editor, there were a whole bunch of parameters added at the end, specifically one set to 1000 records. I removed this and set it to 3000, but after refreshing I noticed I only had the last 17,000 records or so, and my single card, using the record number (first, last) as a field, indicated that I'm only getting a subset of the data. I'm not getting any errors or anything, but this is really strange. I've been reading about pagination, but I have no idea how to go about that in Power BI.
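One thing worth noting here: a single request returns at most sysparm_limit rows, so raising that value only enlarges one page; fetching the whole table still requires multiple requests. A quick ceiling-division sketch (illustrative Python, using the row counts mentioned in this thread):

```python
def pages_needed(total_rows, page_size):
    """Ceiling division: how many requests it takes to page through
    total_rows when each response is capped at page_size rows."""
    return -(-total_rows // page_size)

# e.g. 20,000 rows at sysparm_limit=1000 take 20 requests;
# even at sysparm_limit=3000 they still take 7 requests.
```

This is why bumping the trailing limit parameter alone still leaves you with a subset of the data.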
Let me try this and reply! Thank you again for the expedient response, @Anonymous.
So I did this and nothing more really happened. Maybe I did something wrong?
I went back into the Power Query Editor and refreshed the data source. No change.
I then went into the data source and modified the string to have 20000 instead of 30000.
This time I encountered an error, and on examining it, the message is:
Transaction cancelled: maximum execution time exceeded
Hi @Anonymous ,
If the query is running correctly, you may want to try to increase the timeout period.
Please refer to this article https://community.powerbi.com/t5/Desktop/Query-Timeout-on-update/td-p/53459
You might also want to select only the columns you need so the query runs faster.
I also found the following thread, in which the person is attempting to pull far too many records. Note that they reference another Power BI forum post where some M code is used, but I'm not sure how I would run that when I'm using an organizational account rather than an API key.
Hmm, I must be doing something right or wrong here, because I did get different results. I added that 5-minute timeout to the end of the string as shown in the Advanced Editor, with no syntax issues. I then refreshed the data and received no error. But when examining instances of record numbers from NOW, I was getting random record numbers from start to finish. In fact, only 3-4k records will ever download with the 5-minute timeout.
If I attempt to export these manually from NOW, I get a message that my 17k records will take a long time, asking whether I want them emailed to me or to wait. I waited for a CSV; it took less than 5 minutes and came to around 40 MB of data. I don't think that's excessive, so I'm clearly running into something here.