WrigleyMedia
Frequent Visitor

Unexpected error occurred while creating your data destination. Please try again later.

We are running a Power BI trial and have created a dataflow to pull data from another data source into Power BI. Dataflow publishing fails with the error below when a Lakehouse is set as the data destination. Please advise.

 

Dataflow ID: a233eb75-53ac-4ddd-a00a-2d95ae478804
Details: Query: FileMakerData
Session ID: 871a5012-3474-a64c-d477-ea05f216de02
Root activity ID: 5bbb459b-5aa7-49b1-bac7-8002a0283f9d
Time: 2025-02-22T17:36:47.533Z

1 ACCEPTED SOLUTION

Hi @WrigleyMedia,
Thanks for following up.

Since you have already validated the data types and confirmed that the API connection is working correctly, let’s explore other possible causes for the "Unexpected Error" when publishing to the Lakehouse.

  • Navigate to Dataflow Gen2 → recent runs and check for any warnings or hidden error messages beyond the "Unexpected Error." If you see an error code or additional details, please share them here for further analysis.
  • Create a new test Lakehouse and attempt to publish the Dataflow there. If it works, the issue may be specific to your existing Lakehouse.
  • Check that you have the necessary write permissions for the Lakehouse.
  • Try publishing a small subset of data to see whether the issue is related to specific rows or data volume limits (see the sketch after this list).
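As a rough sketch of that last suggestion, you could temporarily reference your existing query and keep only a small number of rows before publishing. The query name FileMakerData is taken from your error details; the row count of 100 is arbitrary:

    let
        // Reference the existing query named in the error details (FileMakerData)
        Source = FileMakerData,
        // Keep only the first 100 rows for a test publish; adjust the count as needed
        Sample = Table.FirstN(Source, 100)
    in
        Sample

If the sample publishes successfully, gradually add rows or columns back until the failure reappears.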


I trust this information proves useful. If it does, kindly Accept it as a solution and give it a 'Kudos' to help others locate it easily.
Thank you.

View solution in original post

11 REPLIES
WrigleyMedia
Frequent Visitor

The issue still exists. I went through the wizard for publishing data. I made sure to enable only the fields that appear to have no issues with the Lakehouse I created. The fields include number, text, and date data. Publishing is still failing and there is no specific error; I'm only getting the "Unexpected Error" message as always.

 

This dataflow is connecting our data source to the Power BI service via API. The API connection is working perfectly.

Hi @WrigleyMedia,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.

v-ssriganesh
Community Support

Hi @WrigleyMedia,

Could you please confirm whether your query has been resolved? If it has, kindly mark the helpful response and accept it as the solution. This will assist other community members in resolving similar issues more efficiently.

Thank you.

WrigleyMedia
Frequent Visitor

The issue is not yet resolved. We are considering an ODBC connection through a gateway on a Windows device as an alternative. Does that approach work, and would it provide better flexibility?

Hi @WrigleyMedia,

Thanks for your update. Yes, using an ODBC connection through an on-premises data gateway can work as an alternative and may offer more flexibility depending on your requirements.

  • Power BI supports ODBC connections via the On-Premises Data Gateway, allowing you to connect to FileMaker and pull data into Power BI.
  • This approach gives you more control over the connection settings and can be useful if native connectors are unavailable.

If you decide to proceed with this approach, make sure the ODBC driver is set up correctly on the gateway machine and run performance tests to confirm it meets your requirements.
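As a minimal Power Query (M) sketch of what such a query might look like, assuming a system DSN named FileMaker_DSN has been created with the FileMaker ODBC driver on the same Windows machine that hosts the gateway (the DSN and table names below are placeholders):

    let
        // "FileMaker_DSN" is a placeholder for the system DSN configured on the gateway machine;
        // "Invoices" is a placeholder table name
        Source = Odbc.Query("dsn=FileMaker_DSN", "SELECT * FROM Invoices")
    in
        Source

The ODBC driver and DSN must live on the same machine as the gateway so the gateway can resolve the connection.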

I trust this information proves useful. If it does, kindly Accept it as a solution and give it a 'Kudos' to help others locate it easily.
Thank you.

Hi @WrigleyMedia,
I hope this information is helpful. Please let me know if you have any further questions or if you'd like to discuss this further. If this answers your question, please accept it as a solution and give it a 'Kudos' so other community members with similar problems can find a solution faster.
Thank you.

v-ssriganesh
Community Support

Hi @WrigleyMedia,

May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems find a solution faster.

Thank you.

v-ssriganesh
Community Support

Hi @WrigleyMedia,
Thanks for posting your query in the Microsoft Fabric community forum.

I want to acknowledge @GilbertQ's response; it's partially correct, as checking the data destination and data types is an important step. However, I want to provide additional clarity regarding text data support in Fabric Lakehouse.

Text fields are supported in Microsoft Fabric Lakehouse, which is based on Delta Tables. Along with text, it supports numbers, currency, date, and time fields. The confusion might come from how Dataflow Gen2 maps source data types to the Lakehouse, which can sometimes cause issues if not mapped correctly.

Consider the steps below:

  • Open your Dataflow Gen2 in Power Query.
  • Ensure that each column is mapped to the correct type before publishing (a minimal sketch follows this list).
  • Go to the Dataflow settings → Destination (Lakehouse) and check if there are any warnings about unsupported data types.
  • If the issue persists, load the data into a staging table first and then apply transformations before moving it into your final Lakehouse table.
  • Try publishing with just a few rows to see if a specific field is causing the issue.
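To illustrate the column-mapping step above, here is a minimal M sketch that assigns an explicit type to every column before the data reaches the Lakehouse destination; the column names are placeholders for your own fields:

    let
        Source = FileMakerData,
        // Give every column an explicit type so Dataflow Gen2 can map it to a
        // supported Lakehouse (Delta) type; untyped ("any") columns are a common
        // cause of destination errors. Column names here are placeholders.
        Typed = Table.TransformColumnTypes(Source, {
            {"InvoiceNumber", Int64.Type},
            {"Amount", Currency.Type},
            {"InvoiceDate", type date},
            {"Notes", type text}
        })
    in
        Typed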

Here’s a helpful Microsoft Learn article on supported types:
https://learn.microsoft.com/en-us/fabric/data-factory/dataflow-gen2-data-destinations-and-managed-se...


If this helps, please accept it as a solution and drop a "Kudos" so other members can find it more easily.

Thank you.

GilbertQ
Super User

Hi @WrigleyMedia 

 

Can you please go into the dataflow settings, click on the destination, and follow the wizard to make sure that there are no unsupported data types when writing to the Lakehouse?

 

Dataflow Gen2 data destinations and managed settings - Microsoft Fabric | Microsoft Learn





Did I answer your question? Mark my post as a solution!


WrigleyMedia
Frequent Visitor

According to the Microsoft documentation, there are no storage solutions that support text data. Is this correct? I have a Dataflow Gen2 with three queries that ultimately pull data via API. The data consists of fields formatted as numbers, currency, date, time, and text. It doesn't seem there is an option for a data destination that can handle this type of data. Unless I am missing something, which is likely.
