I'm currently subscribed to a Pro license, and I have a large Databricks Hive table of 900,000,000 rows. When I tried to refresh this table in the Power BI service to get the newest data, I ran into the following error:
Data source error: DataSource.Error: ERROR [HY000] [Microsoft][Hardy] (125) Token expired while fetching results: TEAuthTokenExpired. Please update the token and try again. The exception was raised by the IDataReader interface. Please review the error message and provider documentation for further information and corrective action.
Table: go_order_parts_w_shortage
Cluster URI: WABI-NORTH-EUROPE-F-PRIMARY-redirect.analysis.windows.net
Activity ID: fa100e63-ae15-4073-8930-8c1444fbbcd8
Request ID: e121d8fd-19dc-8b3b-b92f-692986ac1616
Time: 2023-08-09 04:21:51Z
I tried setting up an incremental refresh, but I couldn't even get the table imported into Power BI Desktop to begin with; it fails with a similar token error.
Later on, instead of Import mode, I tried switching to DirectQuery mode, but I guess the table is too large to be queried directly: my visual couldn't load, and it hit a "Visual has exceeded the available resources" error in both the Power BI service and Power BI Desktop.
Is there any solution to this issue? Thanks in advance
Hi @lbendlin @aj1973
Thank you for the responses. I managed to resolve the issue after switching from Azure AD credentials to my Databricks personal access token. I think it's because Azure AD tokens are only valid for 60 minutes by default, and my refresh takes more than an hour to complete, hence the token-expired error.
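In case it helps anyone else: the query side stays the same after the switch, only the credential changes. Below is a minimal M sketch of the connection with the Databricks connector; the workspace host, HTTP path, catalog, and schema names are placeholders, so substitute your own. The PAT is not stored in the query itself: clear the data source permissions in Power BI, reconnect, and choose Personal Access Token when prompted for credentials.

```
let
    // Hypothetical workspace host and SQL warehouse HTTP path, replace with your own
    Source = Databricks.Catalogs(
        "adb-1234567890123456.7.azuredatabricks.net",
        "/sql/1.0/warehouses/abcdef1234567890",
        null
    ),
    // Navigate to the table; catalog and schema names here are placeholders
    Catalog = Source{[Name = "hive_metastore", Kind = "Database"]}[Data],
    Schema = Catalog{[Name = "default", Kind = "Schema"]}[Data],
    Orders = Schema{[Name = "go_order_parts_w_shortage", Kind = "Table"]}[Data]
in
    Orders
```

The PAT's lifetime is chosen when you generate it under your Databricks user settings, so it can be made long enough to cover a multi-hour refresh.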
Indeed, the token is only valid for 60 minutes.
Glad you got it to work.
Regards
Amine Jerbi
The issue here doesn't seem to be the dataset size; rather, it takes too long to fetch the data from the data source.
Incremental refresh will help if you can identify a suitable datetime column for partition management. You will have to experiment with the partition sizes to avoid running into the same token timeouts.
Hi @lbendlin, thank you for the reply. However, I couldn't even get the data imported into Power BI Desktop to configure incremental refresh, because of the table's size.
Is it possible to use only a subset of the data (say, 3 months) to configure the initial incremental refresh? Later on, though, I would need a way to retrieve and refresh the whole dataset, because my report needs it.
Yes, you can limit the development data as you like by providing suitable values for RangeStart and RangeEnd. These values are just for you; the Power BI service overrides them when it applies the refresh policy.
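For reference, a sketch of what the filter step can look like, reusing the placeholder source from earlier in the thread and a hypothetical datetime column `order_date`; RangeStart and RangeEnd must be defined as DateTime parameters with exactly those names before the incremental refresh policy can be configured on the table:

```
let
    // Same placeholder host, path, catalog, and schema as the earlier sketch
    Source = Databricks.Catalogs(
        "adb-1234567890123456.7.azuredatabricks.net",
        "/sql/1.0/warehouses/abcdef1234567890",
        null
    ),
    Catalog = Source{[Name = "hive_metastore", Kind = "Database"]}[Data],
    Schema = Catalog{[Name = "default", Kind = "Schema"]}[Data],
    Orders = Schema{[Name = "go_order_parts_w_shortage", Kind = "Table"]}[Data],
    // The incremental refresh policy rewrites this filter per partition.
    // [order_date] is a hypothetical datetime column; using >= and < means
    // rows on a partition boundary are loaded exactly once.
    Filtered = Table.SelectRows(
        Orders,
        each [order_date] >= RangeStart and [order_date] < RangeEnd
    )
in
    Filtered
```

In Desktop, the parameter values keep the import small (your 3-month subset); after publishing, the service generates the real partition ranges from the refresh policy and performs the full historical load on the first refresh.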
I think @jacque_jk needs to be aware that a dataset cannot exceed 1 GB in size under a Pro license in a shared-capacity workspace.
Regards
Amine Jerbi