JoyceW2020
New Member

Errors while refreshing dataflow

Getting the following error while refreshing dataflows:

Error: The credentials provided for the PowerBI source are invalid. (Source at PowerBIInternal.).. RootActivityId = 5e7f6b7b-9988-48d6-8f69-69491f1f0f88.Param1 = The credentials provided for the PowerBI source are invalid. (Source at PowerBIInternal.) Request ID: c97c28f7-465a-fb40-11d9-9241c1689b64.

 

Nothing was changed in the dataflow, and the error started at midnight PST (9/16/2020).

8 REPLIES
Jameswalter
New Member

Have you checked the data type of the column, and verified that all data comes through as numbers in Power BI Desktop? Please try this once; I'm fairly sure doing so will solve it. Note: from the look of the issue, it's on the data side, so please check your data once.

Regards,
J Wick


InsightBob
Frequent Visitor

I have a Severity A ticket outstanding. We began experiencing these issues on 10/09 with some dataflows, and now the whole lot have fallen over.

I'm not satisfied with the workaround, as it will mean tearing up our architecture and will cause us scheduling issues.

I really think the service status on the support page shouldn't read "No known issues: Power BI is running smoothly" - it's not, really, is it?

Anonymous
Not applicable

Same issue! 4 dataflows pulling from Salesforce and Excel, hourly refresh cadence, began failing yesterday at the 12:30AM scheduled refresh.

 

Spent all of yesterday trying to troubleshoot, and I was finally able to get everything running again. The solution makes no sense, but it's working for now. I hope this helps someone in the same predicament!

 

My environment: 4 dataflows; the 1st pulls all the source data, and the following 3 are linked/computed entities. I am in a Premium workspace. Sources are Salesforce objects & Excel.

 

What I tried: refreshing credentials, changing privacy settings, switching gateways/gateway versions, importing the 1st dataflow as a new dataflow --- and it still failed. Finally, I took that source dataflow and imported it into a separate Premium workspace - then it FINALLY refreshed!!! I was able to bring over the 3 remaining/linked dataflows (importing the JSON) and repoint their linked entities to the source dataflow (all in this separate Premium workspace). The process was finicky and tedious, but it finally worked. One note - I did have to run it from the top (the 1st query kicks off 2, 3, & 4) in order for 2, 3, or 4 to refresh the first time around.
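The export step above can also be scripted rather than done by hand. A minimal sketch, assuming the Power BI REST API's "Get Dataflow" endpoint (which returns the dataflow definition as model.json); the workspace ID, dataflow ID, and AAD access token are placeholders you'd supply yourself:

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def dataflow_export_url(workspace_id: str, dataflow_id: str) -> str:
    # "Dataflows - Get Dataflow" returns the dataflow's model.json definition
    return f"{API_BASE}/groups/{workspace_id}/dataflows/{dataflow_id}"

def export_dataflow(token: str, workspace_id: str, dataflow_id: str) -> dict:
    # Download model.json so it can be re-imported into another workspace
    req = urllib.request.Request(
        dataflow_export_url(workspace_id, dataflow_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)
```

The returned model.json is what you would then bring into the other workspace (e.g. via the dataflow "Import model" option in the service), before repointing the linked entities.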

 

The only thing different between the two workspaces is which capacity each workspace is assigned to (we only have 2). The dataset consuming the dataflows is still in the original workspace and working fine. I asked our administrator to move the second workspace to the same capacity as the original, and the refresh immediately failed. As soon as he switched the second workspace back to the other capacity, refreshes started working fine and ran hourly all night.

 

I cannot figure out what the issue is; my gut says this is related to metadata, or authentication details stored in the metadata for this workspace? Perhaps a new capacity setting enabled by default? It would be great to find a solution - the dataflows still will not work in the original workspace and its corresponding capacity.
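If you want to compare the two capacities without waiting for the hourly schedule, a refresh can be kicked off on demand. A minimal sketch, assuming the Power BI REST API's "Refresh Dataflow" endpoint; again, the IDs and token are placeholders:

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(workspace_id: str, dataflow_id: str) -> str:
    # "Dataflows - Refresh Dataflow" queues a refresh of the dataflow
    return f"{API_BASE}/groups/{workspace_id}/dataflows/{dataflow_id}/refreshes"

def trigger_refresh(token: str, workspace_id: str, dataflow_id: str) -> int:
    body = json.dumps({"notifyOption": "MailOnFailure"}).encode()
    req = urllib.request.Request(
        refresh_url(workspace_id, dataflow_id),
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.status  # success means the refresh was accepted, not completed
```

Triggering the same dataflow in both workspaces back to back would let you confirm that the failure follows the capacity rather than the dataflow definition.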

 

 

 

Hi @Anonymous, I know this reply is very late, but in this case I think there's no need for a dataflow.

As a workaround, maybe you can try to test your connection with a 3rd-party connector. I've tried windsor.ai, supermetrics, and funnel.io. I stayed with Windsor because it is much cheaper, but just so you know, there are other options. In case you're wondering, to make the connection, first search for the Salesforce connector in the data sources list:

 

[Screenshot: SALESFORCE-1.png]

 

After that, just grant access to your Salesforce account using your credentials; then, on the preview and destination page, you will see a preview of your Salesforce fields:

 

[Screenshot: SALESFORCE-2.png]

 

There, just select the fields you need. It is also compatible with custom fields and custom objects, so you'll be able to export them through Windsor. Finally, select Power BI as your data destination, then copy and paste the URL in Power BI --> Get Data --> Web --> paste the URL.

 

[Screenshot: SELECT_DESTINATION_NEW.png]

Same issue. Microsoft support replied with a workaround to disable the load; it did work, but I need the load enabled because my dataset is using it. The issue is not resolved for me with that workaround. I really hope they resolve it ASAP 😞

nmuchoney
Frequent Visitor

Same here!

Me too!

same issue too....

https://community.powerbi.com/t5/Service/dataflows-with-linked-entities-failing-to-refresh/m-p/13766...

 

We have contacted Microsoft and they confirmed the issue. In those conversations, we learned that the Power BI team is aware and working on it. We did reiterate the severity of this issue.
