I am encountering an issue while importing data from Azure Cost Management into a Lakehouse using a Power Query dataflow in Fabric.
The dataflow uses parameters such as Enrollment Number, Number of Months, and Scope. It was running fine until 18 Aug but has been failing continuously since then. Any technical suggestions would be greatly appreciated. Thanks!
Below are the errors observed:
There was a problem refreshing the dataflow: 'Something went wrong, please try again later. If the error persists, please contact support.'. Error code: ActionUserFailure. (Request ID: e25a0238-de0d-49e1-9cdf-ccdf2ba146bf).
Azure Usage details_WriteToDataDestination: There was a problem refreshing the dataflow. Error code: 999999. (Request ID: e25a0238-de0d-49e1-9cdf-ccdf2ba146bf).
TrackingID#2509010010005392
Hi @fkc38 ,
Thanks for sharing the details. From what you’ve described, it looks like the failure happens specifically during ingestion into the Fabric Lakehouse table. The error message:
Azure Usage details_WriteToDataDestination ... Error code: 999999
suggests the problem occurs at the write-to-destination stage rather than during data retrieval. A few things to try:
Re-authenticate the data source: refresh the Azure Cost Management connection in your Dataflow settings, as expired tokens or changed permissions can cause these failures.
Test smaller data loads: reduce the "Number of Months" parameter or narrow the scope; if these succeed, the issue could be related to payload size limits during Lakehouse ingestion.
Isolate the destination: create a new Lakehouse or write to a Warehouse instead, to see if the issue is tied to your current schema or partitions. If the table schema was manually modified, try restoring or recreating the destination table.
Validate the API externally: call Azure Cost Management from Power BI Desktop or Postman with the same parameters (a hedged sketch of an equivalent check follows below).
Check capacity: confirm that your Lakehouse is within capacity and quota limits.
If none of these resolve the problem, submit a Microsoft support ticket with the Request ID or Tracking ID from the failed refresh, as the generic error code may reflect recent backend changes in the service.
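To make the external validation concrete, here is a minimal Python sketch of the same check Postman would perform against the public Cost Management Query endpoint. It assumes an EA billing-account scope built from the Enrollment Number and a locally available Azure credential; the enrollment number, api-version, and query body are illustrative, not your dataflow's actual internals.

```python
# Minimal external check of the Azure Cost Management Query API,
# assuming an EA billing-account scope (hypothetical enrollment number).
import requests
from azure.identity import DefaultAzureCredential  # pip install azure-identity

ENROLLMENT_NUMBER = "12345678"  # hypothetical; use your own enrollment number
SCOPE = f"/providers/Microsoft.Billing/billingAccounts/{ENROLLMENT_NUMBER}"
URL = (
    "https://management.azure.com"
    f"{SCOPE}/providers/Microsoft.CostManagement/query"
    "?api-version=2023-03-01"
)

# Acquire an ARM token from whatever credential is available locally
# (Azure CLI login, environment variables, managed identity, ...).
token = DefaultAzureCredential().get_token("https://management.azure.com/.default")

body = {
    "type": "ActualCost",
    "timeframe": "TheLastMonth",  # start with a small window, then widen
    "dataset": {
        "granularity": "Daily",
        "aggregation": {"totalCost": {"name": "Cost", "function": "Sum"}},
    },
}

resp = requests.post(
    URL,
    json=body,
    headers={"Authorization": f"Bearer {token.token}"},
    timeout=60,
)
resp.raise_for_status()  # a non-2xx here means the source API, not Fabric, is failing
print(len(resp.json()["properties"]["rows"]), "rows returned")
```

If this call succeeds with a small window but the dataflow still fails, the source API is healthy and the problem sits on the Fabric ingestion side, which is consistent with the WriteToDataDestination error.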
Hi @fkc38 ,
I hope the information provided above assists you in resolving the issue. If you have any additional questions or concerns, please do not hesitate to contact us. We are here to support you and will be happy to help with any further assistance you may need.
Hi @fkc38 ,
I hope the above details help you fix the issue. If you still have any questions or need more help, feel free to reach out. We're always here to support you.
Hi @fkc38 ,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions.
Hi @fkc38 ,
Can you try creating a new lakehouse, or writing directly to a warehouse?
This will help us narrow down where exactly the problem is.
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.
Hi @fkc38,
What you’re seeing is a fairly common pattern when a previously stable connector suddenly starts failing: the ActionUserFailure and 999999 error codes are generic indicators that something changed in the service endpoint or authentication layer, not necessarily in your query. Since this began specifically after August 18, it suggests a backend update or policy change in the Azure Cost Management API.
Re-authenticate the data source connection in your dataflow settings. Token expiration or changes in consent scopes often trigger these errors.
Test outside Fabric: Use Power BI Desktop or Postman with the same parameters (Enrollment Number, Scope, Number of Months) to confirm whether the API still returns data.
Check limits: The Azure Cost Management API enforces throttling and row size limits. If your requested time window expanded or data volume grew, try a smaller "Number of Months" value to see if the query succeeds (see the sketch after this list for one way to test this outside Fabric).
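As a rough way to exercise those limits outside Fabric, the hedged Python sketch below queries one month at a time and backs off on HTTP 429, which helps separate throttling or payload-size problems from a genuine ingestion failure. The scope string and the helper function are assumptions for illustration, not part of the dataflow itself.

```python
# Fetch usage month by month, honoring Retry-After on HTTP 429, so that
# throttling or window size can be isolated from a Fabric ingestion failure.
import time
from datetime import date, timedelta

import requests
from azure.identity import DefaultAzureCredential

SCOPE = "/providers/Microsoft.Billing/billingAccounts/12345678"  # hypothetical
URL = (f"https://management.azure.com{SCOPE}"
       "/providers/Microsoft.CostManagement/query?api-version=2023-03-01")
TOKEN = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

def query_month(start: date, end: date) -> int:
    """Return the row count for one window, retrying after a 429 throttle."""
    body = {
        "type": "ActualCost",
        "timeframe": "Custom",
        "timePeriod": {"from": start.isoformat(), "to": end.isoformat()},
        "dataset": {
            "granularity": "Daily",
            "aggregation": {"totalCost": {"name": "Cost", "function": "Sum"}},
        },
    }
    while True:
        resp = requests.post(URL, json=body,
                             headers={"Authorization": f"Bearer {TOKEN}"},
                             timeout=60)
        if resp.status_code == 429:  # throttled: wait as instructed, then retry
            time.sleep(int(resp.headers.get("Retry-After", "30")))
            continue
        resp.raise_for_status()
        return len(resp.json()["properties"]["rows"])

# Walk backwards one month at a time, mirroring a shrinking "Number of Months".
end = date.today()
for _ in range(3):
    start = end - timedelta(days=30)
    print(start, "->", end, query_month(start, end), "rows")
    end = start
```

If single-month windows return data cleanly while the full window fails, the original "Number of Months" value is likely pushing past a payload or throttling limit.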
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.
Thank you for the reply, but I am facing this issue in the Fabric Lakehouse while ingesting the data into the table.