DRSH_9876
Regular Visitor

DataFlow Gen2 - Errors while publishing details to warehouse

There was a problem refreshing the dataflow: 'Something went wrong, please try again later. If the error persists, please contact support.'. Error code: ActionUserFailure. (Request ID: fd78e68b-6f84-484f-b3fe-dff8c16de3f1).

apptiomcloudhub_WriteToDataDestination: There was a problem refreshing the dataflow: 'Couldn't refresh the entity because of an issue with the mashup document MashupException.Error: DataSource.Error: Pipeline execution failed (runId: 534a8769-0465-45b3-a289-3c777a06e78e). Operation on target ca-7f06aab9-23b1-4b02-b698-37ac038fe93f failed: ErrorCode=DelimitedTextBadDataDetected,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Bad data is found at line 2 in source CloudabilityActualCostExport01192023_c06aab61-1b4f-4fb1-8799-fd6401d036c6.csv.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=CsvHelper.BadDataException,Message=You can ignore bad data by setting BadDataFound to null.
IReader state:
ColumnCount: 60
CurrentIndex: 21
HeaderRecord:
["InvoiceSectionName","AccountName","AccountOwnerId","SubscriptionId","SubscriptionName","ResourceGroup","ResourceLocation","Date","ProductName","MeterCategory","MeterSubCategory","MeterId","MeterName","MeterRegion","UnitOfMeasure","Quantity","EffectivePrice","CostInBillingCurrency","CostCenter","ConsumedService","ResourceId","Tags","OfferId","AdditionalInfo","ServiceInfo1","ServiceInfo2","ResourceName","ReservationId","ReservationName","UnitPrice","ProductOrderId","ProductOrderName","Term","PublisherType","PublisherName","ChargeType","Frequency","PricingModel","AvailabilityZone","BillingAccountId","BillingAccountName","BillingCurrencyCode","BillingPeriodStartDate","BillingPeriodEndDate","BillingProfileId","BillingProfileName","InvoiceSectionId","IsAzureCreditEligible","PartNumber","PayGPrice","PlanName","ServiceFamily","CostAllocationRuleName","benefitId","benefitName"]
IParser state:
ByteCount: 0
CharCount: 1705
Row: 2
RawRow: 2
Count: 60
RawRecord:
Hidden because ExceptionMessagesContainRawData is false.
,Source=CsvHelper,' Details: Reason = DataSource.Error;RunId = 534a8769-0465-45b3-a289-3c777a06e78e'. Error code: Fast Copy User Error. (Request ID: fd78e68b-6f84-484f-b3fe-dff8c16de3f1).

1 ACCEPTED SOLUTION

Hi @DRSH_9876 ,

Thanks for getting back and for sharing the detailed screenshots of your Dataflow and the error logs.

Regarding the missing Publish button, in Dataflow Gen2 with CI/CD enabled (Git integration), the Publish option is replaced by Save and Run. It looks like this setting was enabled when you created the Dataflow, which is why you’re seeing the Save and Run button instead.

 

As for the error you’re encountering, it’s challenging to pinpoint the root cause based on the screenshots alone. Since you’ve mentioned facing a similar error during the Copy Data activity in your pipelines, this suggests the issue might be originating from the data source itself - possibly related to the CSV file formatting or structure.

 

Please verify the consistency of the CSV files, especially the second line where the error occurs, to ensure the columns align with the header and data format expectations.
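Not part of the original reply, but a quick local check along those lines: a minimal Python sketch (the sample data is hypothetical) that compares the field count of a given data line against the header, which is exactly the mismatch the parser is reporting at line 2.

```python
import csv
import io

def check_line(csv_text: str, target_line: int):
    """Compare the field count of one data line against the header.

    Returns (expected_fields, actual_fields, matches).
    """
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    for line_no, row in enumerate(reader, start=2):
        if line_no == target_line:
            return len(header), len(row), len(row) == len(header)
    raise ValueError(f"line {target_line} not found")

# Hypothetical 3-column sample: line 2 carries an extra comma in an
# unquoted field, so it splits into 4 fields instead of 3.
sample = "Date,ResourceName,Tags\n2023-01-19,vm-01,env=prod,extra\n"
print(check_line(sample, 2))  # -> (3, 4, False)
```

Running this against the real export (with the real header) would point directly at the offending field.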

 

If the issue persists or requires more detailed troubleshooting, raising a support ticket with Microsoft Fabric support would be the best option to get dedicated help:
https://support.fabric.microsoft.com/support

If you have already raised a support ticket, please feel free to share any insights or solutions provided by the support team here. This would greatly help others in the community who might be facing similar issues.

 

Best Regards,
Vinay,
Fabric Community Support.

View solution in original post

11 REPLIES
v-veshwara-msft
Community Support

Hi @DRSH_9876 ,
Thanks for reaching out to Microsoft Fabric Community and for sharing the detailed error message.

In addition to what @miguel asked earlier, I wanted to share a few more points that might help you troubleshoot this issue.

 

The error DelimitedTextBadDataDetected along with CsvHelper.BadDataException suggests there might be inconsistencies in the source file, such as mismatched column counts, unexpected delimiters, or special characters. The error indicates that the parser encountered an issue at line 2 of the CSV file, so reviewing that line for any anomalies might be helpful.
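To illustrate (this sketch is not from the original reply, and the sample data is made up): a short Python scan that flags every row whose field count differs from the header's, the same class of mismatch that DelimitedTextBadDataDetected and CsvHelper.BadDataException report.

```python
import csv
import io

def find_bad_rows(csv_text: str):
    """Return (line_number, field_count) for each data row whose
    field count differs from the header's."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    return [
        (line_no, len(row))
        for line_no, row in enumerate(reader, start=2)
        if len(row) != len(header)
    ]

# Hypothetical file: line 3 is short one field, line 4 has one extra.
sample = "h1,h2,h3\n1,2,3\n4,5\n6,7,8,9\n"
print(find_bad_rows(sample))  # -> [(3, 2), (4, 4)]
```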

 

Also, just a note: when your Dataflow's destination is set to Warehouse, Fabric uses a staging mechanism during the load, which can sometimes amplify errors related to data format. As a workaround, could you try setting the destination to Lakehouse instead of Warehouse? This can sometimes help bypass staging-related issues and may resolve the error.

 

Additionally, I found a few community threads where users faced similar challenges:

 

Please let us know how it goes if you try switching the destination or reviewing the CSV data.

 

Hope this helps. Please reach out for further assistance.
If this post helps, then please consider giving it kudos and accepting it as the solution to help other members find it more quickly.


Thank you.

Thank you for sharing the details. Let me give you a brief background.

 

I am using a Fabric trial version and want to import the Azure Cost Management and Billing details (CSV) into a Fabric Lakehouse or Warehouse.

 

Step 1: Created the workspace.

Step 2: Selected a data pipeline to organize and move the data; I had to choose the Copy data assistant because the other options were throwing errors.

Step 3: In Copy data, chose Azure Blobs and provided the connection string using an Organizational account with a new connection.

Step 4: It shows me the folder path in Microsoft Azure where the CSV is generated. I chose the latest file, and this throws the error [Bad data is found at line 2 in source CloudabilityAmmortizedCostExport01192023_f4bc323c-cc94-49dc-adf0-c26e2947adbd.csv. You can ignore bad data by setting BadDataFound to null. IReader state: ColumnCount: 77 CurrentIndex: 21 HeaderRecord: ["InvoiceSectionName","AccountName","AccountOwnerId","SubscriptionId","SubscriptionName","ResourceGroup","ResourceLocation","Date","ProductName","MeterCategory","MeterSubCategory","MeterId","MeterName","MeterRegion","UnitOfMeasure","Quantity","EffectivePrice","CostInBillingCurrency","CostCenter","ConsumedService","ResourceId","Tags","OfferId","AdditionalInfo","ServiceInfo1","ServiceInfo2","ResourceName","ReservationId","ReservationName","UnitPrice","ProductOrderId","ProductOrderName","Term","PublisherType","PublisherName","ChargeType","Frequency","PricingModel","AvailabilityZone","BillingAccountId","BillingAccountName","BillingCurrencyCode","BillingPeriodStartDate","BillingPeriodEndDate","BillingProfileId","BillingProfileName","InvoiceSectionId","IsAzureCreditEligible","PartNumber","PayGPrice","PlanName","ServiceFamily","CostAllocationRuleName","benefitId","benefitName"] IParser state: ByteCount: 0 CharCount: 2544 Row: 2 RawRow: 2 Count: 77 RawRecord: Hidden because ExceptionMessagesContainRawData is false. Activity ID: 59d472c2-dbdb-4c89-88b2-2b38d627abdf]

 

In the Preview data pane: if I choose schema agnostic (binary copy), it works but gives the wrong details. If I choose a file format such as comma, tab, etc., none of the options work.
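As an aside (this is not from the thread, and the data is a toy example): switching the delimiter setting cannot repair a structurally bad line. With the wrong delimiter every line collapses into a single field, and with the right one the extra value still breaks the column count, as this Python sketch shows.

```python
import csv
import io

def field_counts(csv_text: str, delimiter: str):
    """Field count of each line when parsed with the given delimiter."""
    return [len(row)
            for row in csv.reader(io.StringIO(csv_text), delimiter=delimiter)]

# Hypothetical malformed file: 4 values on line 2 under a 3-column header.
text = "h1,h2,h3\n1,2,3,4\n"
print(field_counts(text, ","))   # -> [3, 4]  right delimiter, still misaligned
print(field_counts(text, "\t"))  # -> [1, 1]  wrong delimiter, one field per line
```

This matches the symptom described: every file-format option fails because the problem is in the data itself, not the parser settings.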

 

Step 5: I chose Lakehouse as the data destination and gave it a name; it connects to the root Files folder and stores the CSV data there.

 

Step 6: Opened the Lakehouse, chose Files, selected the file, and used Load to Tables.

 

In short, I want to ingest the Cost Management and Billing data from Azure on a daily basis into a Lakehouse, transform the data into a Warehouse, and then publish reports using Power BI.

 

Am I missing any steps? I am using Fabric (trial version) for the first time.

 

Appreciate your help and guidance

This seems a bit different from the issue originally raised. Was the error from the initial message coming from Dataflows or from pipelines? The last comment, the one I'm replying to, only mentions a pipeline with no Dataflow involvement. Could you please confirm?

Hi, I encountered the first error when using the Dataflow Gen2 pipeline. Since it wasn't resolved, I tried using the Data Pipeline with all formats, but it gave the same error except for the binary file selection. Thank you.

For the dataflow error, would you mind going through the previous replies provided?

 

Do feel free to open a new topic in the Data pipelines subforum so the community can better assist with that specific error. We can continue using this topic for the error on the Dataflow Gen2 side.

Sure. For Dataflow Gen2 pipeline the error still persists and I am looking for a solution. Thanks!

Do let us know if you could provide more information about the scenario that you're trying to do and if you could answer the questions from the previous replies. Those will help us try to determine the root cause of the issue. If you could also share some steps to reproduce the error, that will help tremendously.

 

Ultimately, if you prefer to have direct assistance from our support team you can raise a support ticket and get immediate help from our team. Below is the link to reach support:

https://support.fabric.microsoft.com/support

miguel
Community Admin

Hi!

Could you share more information about your Dataflow? How many queries is it trying to load to the Warehouse? Are they using automatic settings? Are you able to reproduce this behavior in a different Dataflow? If yes, could you please share the repro steps so we can test on our side?

While saving and running to the Lakehouse, it creates an error log. The Publish button is not visible; as an alternative, Save and Run is the option given. (Screenshots: Dataflow and error log.)


Hi @DRSH_9876 ,

We’re following up once more regarding your query. Could you please confirm if the issue has been resolved? If you raised a support ticket, we’d appreciate it if you could share any updates or insights from the support team.

 

If the issue has been resolved, we kindly request you to share the resolution or key details here to help others in the community. 

 

For any future questions or assistance, feel free to create a new thread in the Microsoft Fabric Community Forum - we’ll be happy to help.
We may consider closing this thread if there is no response.

Thank you for being part of the Microsoft Fabric Community.
