Hello!
I'm using Dataflow Gen2 to optimize the refresh of queries that use custom functions.
The function uses a sample file (for transforming other files) that must be a binary, according to https://learn.microsoft.com/en-us/power-query/custom-function.
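My setup follows the standard "combine files" pattern from that article, roughly like this (the folder path and query names here are illustrative, not my real ones):

```powerquery
let
    // Folder containing the files to combine (illustrative path)
    Source = Folder.Files("C:\Data"),
    // [Content] holds each file as a binary; the custom function
    // expects a binary value, not a table
    Invoked = Table.AddColumn(
        Source,
        "Transformed",
        each #"Transform File"([Content])
    )
in
    Invoked
```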
For some reason, when I publish the dataflow, two extra steps (From Value, Remove Columns) are added to the query that uses the sample file:
#"Remove Columns"
Publishing succeeds, but the refresh fails:
Mashup Exception Expression Error: Couldn't refresh the entity because of an issue with the mashup document. MashupException.Error: Expression.Error: We cannot convert a value of type Table to type Binary. Details: Reason = Expression.Error; Value = #table({}, {}); Microsoft.Data.Mashup.Error.Context = User
I have checked several posts about similar problems but haven't managed to solve mine.
I'd appreciate any help with using functions in a dataflow.
Go through the refresh history and see in which table/query the error is raised. That should point you to the culprit: perhaps a query you have set up as a table that should be text, or a function that expects text but is being passed a table value instead.
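One quick way to confirm what a query actually evaluates to is a temporary query that tests it with `Value.Is` (the query name here is illustrative):

```powerquery
let
    // Reference the sample file query; it should evaluate to a binary
    Sample = #"Sample File",
    // true if the query still returns a binary,
    // false if something has wrapped it into a table
    IsBinary = Value.Is(Sample, type binary)
in
    IsBinary
```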
After I disabled staging on the query with the resulting tables, an error occurred in the sample file transform query:
Mashup Exception Data Format Error: Couldn't refresh the entity because of an issue with the mashup document. MashupException.Error: DataFormat.Error: Error when replacing table contents with new data in version: #{0}. InnerException: We couldn't convert to a number. Details: Reason = DataFormat.Error; Message = We couldn't convert to a number.; Detail = Ost.5; Message.Format = We couldn't convert to a number.; ErrorCode = 10041; Microsoft.Data.Mashup.Error.Context = User
This seems to be an error different to the previous one.
Do you think that you could share the M script of the relevant queries so we can better understand how to reproduce the issue or get the same results?
I translated the step names; the other names I left in the original language.
1. Transforming File Sample
Could you also share the M script for #"Function"?
Also, where exactly are you seeing the error? In the Power Query editor it should raise an error at authoring time.
The error is raised during refresh:
You don't see any queries with errors while inside the Power Query editor?
You should be able to see the queries with errors. You could hit the "refresh preview" button in the ribbon to make sure that the queries are evaluated using the latest state and see if they're getting any errors.
If the errors only happen during refresh, then the issue is not with the functions but with your data destination configuration. I'd recommend removing the data destination to rule that out and doing a refresh. If you don't get any issues then, the new error is in the destination configuration and it's a completely different issue from the one originally reported.
No errors occur in the queries. I used the Keep rows option on each query to check, and the result was an empty table.
After I removed the destination configuration, the refresh succeeded.
But when I turned staging back on for the query with the resulting tables, the error occurred again (during refresh, not in the query):
Mashup Exception Expression Error: Couldn't refresh the entity because of an issue with the mashup document. MashupException.Error: Expression.Error: Couldn't insert table. InnerException: We cannot convert a value of type Table to type Text.
Do I understand correctly that staging is not a required option and exists to optimize refreshes?
Can I disable it without major losses for the dataflow?
My other dataflows are connected to Lakehouse destinations (each dataflow to its own Lakehouse).
Could you tell me how to create a Lakehouse and correctly connect a dataflow to it?
In the Lakehouse that I connected the dataflow queries to, I found this error:
The SQL analytics endpoint could not be created.
The conversion error most likely happens because you have a column that contains a table type while the destination expects text, or because the conversion you have for that column couldn't turn a table into text.
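As an illustrative sketch (the query and column names below are made up), you can either expand such a column or force it to text before the destination step:

```powerquery
let
    Source = #"Resulting Query",  // hypothetical query name
    // Replace nested-table values with text so the destination
    // receives only primitive values in the "Details" column
    AsText = Table.TransformColumns(
        Source,
        {{"Details",
          each if _ is table then "[table]" else Text.From(_),
          type text}}
    )
in
    AsText
```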
I thoroughly examined the query I assigned the Lakehouse destination to. It has columns of only three types: text, dates, and numbers, and I don't see any conversion of a table type into a text type. I created a new empty Lakehouse, but the query still refreshes successfully only until I assign the destination; then the error occurs: We cannot convert a value of type Table to type Text.
I'd appreciate your help with this.
I'm unable to repro your scenario. If you have a way in which I could repro it with a given sample and / or public data source, please do share it.
otherwise, I'd recommend raising a support ticket so the support team can help troubleshoot and look at this scenario much deeper. Below is the link to open a support ticket:
The error is raised in the query where the files should be transformed.
This query must be a table, since I call the custom function there and it transforms the files using the sample file parameter.
Can you check whether any of the queries in your custom function logic (sample file, sample file transform, or anything else) have staging enabled? If any of them do, disable staging on them.
As you suggested, I disabled staging on all the queries involved in the function (sample file, sample file transform). Now the From Value and Remove Columns steps are no longer added, but another error occurred during refresh:
Mashup Exception Expression Error: Couldn't refresh the entity because of an issue with the mashup document. MashupException.Error: Expression.Error: Couldn't insert table. InnerException: We cannot convert a value of type Table to type Text.
Is "Transitions" of one of these types?
type table, type record, type list, type nullable binary, type binary, type function
Looks like they only accept tables of basic types for further processing.
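A quick way to spot such columns is `Table.Schema`, which reports each column's kind (the query name here is illustrative):

```powerquery
let
    Source = #"Resulting Query",  // hypothetical query name
    Schema = Table.Schema(Source),
    // Keep only columns whose kind the destination can't store
    Structured = Table.SelectRows(
        Schema,
        each List.Contains(
            {"table", "record", "list", "binary", "function"},
            [Kind]
        )
    )
in
    Structured
```

If this returns any rows, those columns need to be expanded, converted, or removed before the destination step.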
Transitions is the step that holds a binary Excel file.
I have also read that only tables of basic types are allowed, but in several other posts people seem to have found ways of processing binary files.