We are running a Power BI trial and have created a dataflow to pull data from another data source into Power BI. Dataflow publishing fails with the error below when a Lakehouse is set as the data destination. Please advise.
Dataflow ID: a233eb75-53ac-4ddd-a00a-2d95ae478804
Details: Query: FileMakerData
Session ID: 871a5012-3474-a64c-d477-ea05f216de02
Root activity ID: 5bbb459b-5aa7-49b1-bac7-8002a0283f9d
Time: 2025-02-22T17:36:47.533Z
Hi @WrigleyMedia,
Thanks for following up.
Since you have already validated the data types and confirmed that the API connection is working correctly, let’s explore other possible causes for the "Unexpected Error" when publishing to the Lakehouse.
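One practical way to narrow it down (not part of the original troubleshooting steps, just a sketch) is to publish a diagnostic copy of the query with every column cast to text; if that publishes cleanly, re-enable the real types one column at a time until the failure reappears. Assuming the source query is named FileMakerData as in the error details, the diagnostic query could look like this in Power Query M:

let
    // Reference the existing query from the error details
    Source = FileMakerData,
    // Cast every column to text so the Lakehouse write cannot fail on a type mapping
    AllText = Table.TransformColumnTypes(
        Source,
        List.Transform(Table.ColumnNames(Source), each {_, type text})
    )
in
    AllText

Once the failing column is identified, its type can be mapped explicitly before the destination step.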
I trust this information proves useful. If it does, kindly Accept it as a solution and give it a 'Kudos' to help others locate it easily.
Thank you.
The issue still exists. I went through the publishing wizard and made sure to enable only the fields that appear to have no issues with the Lakehouse I created. The fields include number, text, and date data. Publishing is still failing, and there is no specific error; I'm only getting the "Unexpected Error" message as always.
This dataflow connects our data source to the Power BI Service via API. The API connection is working perfectly.
Hi @WrigleyMedia,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.
Hi @WrigleyMedia,
Could you please confirm whether your query has been resolved? If it has, kindly mark the helpful response and accept it as the solution. This will assist other community members in resolving similar issues more efficiently.
Thank you.
Issue is not yet resolved. We are considering an ODBC connection through a gateway on a Windows device as an alternative. Does that approach work and provide better flexibility?
Hi @WrigleyMedia,
Thanks for your update. Yes, using an ODBC connection through an on-premises data gateway can work as an alternative and may offer more flexibility depending on your requirements.
If you decide to proceed with this approach, make sure the ODBC driver is set up correctly and run performance tests to confirm it meets your requirements.
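As a rough sketch of what such a query could look like once the gateway and driver are in place (the DSN, table, and column names below are placeholders, not from your environment), a Dataflow Gen2 query can read through the gateway with Odbc.Query in Power Query M:

let
    // "FileMaker_DSN" is a hypothetical system DSN configured on the gateway machine
    Source = Odbc.Query(
        "dsn=FileMaker_DSN",
        "SELECT Id, Name, Amount, CreatedDate FROM Invoices"
    )
in
    Source

The gateway machine needs the ODBC driver for your source installed, and the dataflow connection must be bound to that gateway.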
I trust this information proves useful. If it does, kindly Accept it as a solution and give it a 'Kudos' to help others locate it easily.
Thank you.
Hi @WrigleyMedia,
I hope this information is helpful. Please let me know if you have any further questions or if you'd like to discuss this further. If this answers your question, please accept it as a solution and give it a 'Kudos' so other community members with similar problems can find a solution faster.
Thank you.
Hi @WrigleyMedia,
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems resolve them faster.
Thank you.
Hi @WrigleyMedia,
Thanks for posting your query in the Microsoft Fabric community forum.
I want to acknowledge @GilbertQ's response; it's partially correct, as checking the data destination and data types is an important step. However, I want to provide additional clarity regarding text data support in Fabric Lakehouse.
Text fields are supported in Microsoft Fabric Lakehouse, which is based on Delta Tables. Along with text, it supports numbers, currency, date, and time fields. The confusion might come from how Dataflow Gen2 maps source data types to the Lakehouse, which can sometimes cause issues if not mapped correctly.
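As an illustration of that mapping (just a sketch; the column names are placeholders for the fields in your query), the types can be set explicitly in Power Query M as the last step before the Lakehouse destination:

let
    Source = FileMakerData,
    // Explicitly map each column to a Lakehouse-supported type;
    // replace the placeholder names with the actual fields in the query
    Typed = Table.TransformColumnTypes(
        Source,
        {
            {"InvoiceId", Int64.Type},
            {"CustomerName", type text},
            {"Amount", Currency.Type},
            {"InvoiceDate", type date},
            {"InvoiceTime", type time}
        }
    )
in
    Typed

Columns left untyped (type any) are a frequent cause of destination mapping errors, so it's worth confirming every column shows a concrete type before publishing; if one particular type is rejected, converting that column (for example, time to datetime or text) helps isolate it.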
Consider the steps described in this Microsoft Learn article on supported types:
https://learn.microsoft.com/en-us/fabric/data-factory/dataflow-gen2-data-destinations-and-managed-se...
If this helps, please accept it as a solution and drop a "Kudos" so other members can find it more easily.
Thank you.
Can you please go into the dataflow settings? Click on the destination and follow the wizard to make sure there are no unsupported data types when writing to the Lakehouse.
Dataflow Gen2 data destinations and managed settings - Microsoft Fabric | Microsoft Learn
According to the Microsoft documentation, there are no storage solutions that support text data. Is this correct? I have a Dataflow Gen2 with three queries that ultimately pull data via API. The data consists of fields formatted as numbers, currency, date, time, and text. It doesn't seem there is an option for a data destination that can handle this type of data. Unless I am missing something, which is likely.