I'm trying to recreate in dataflows some PQ processes I have in Excel and Power BI, and am surprised that using an M function (e.g. a call to a REST API) brings up a reference to "computed entities", which are limited to Power BI Premium.
The same function can be called and refreshed on a regular Power BI Pro dataset, so why would using M functions in dataflows be limited to Premium?! I'm not trying to calculate aggregations here, I'm just automating API calls.
Also, even though the function (created by pasting M code into a blank query) seems to be recognized as such at creation:
It is then automatically turned into a table. Are functions even supported in Dataflows at the moment? I really want to love dataflows but the beta experience has been pretty rough so far.
Here's my function just for reference:
let ReturnDetails = (placeid as text) => Json.Document(Web.Contents("https://maps.googleapis.com/maps/api/place/details/json?placeid=" & placeid & "&fields=website,formatted_phone_number,url,rating&language=en-AU&region=au&key=xyz")) in ReturnDetails
Hi @otravers
What you need to do is disable loading of your function.
Here is a great blog post explaining how to fix it and why it happens.
https://ssbipolar.com/2018/11/23/power-bi-dataflows-reuse-without-premium/
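To illustrate the pattern from that post: keep the function as its own query with "Enable load" unchecked, and invoke it only from the single entity that is loaded. A minimal sketch, where `PlaceIds` and `GetPlaceDetails` are hypothetical query names standing in for your own staging query and function:

// Query "GetPlaceDetails": the function, with Enable load UNCHECKED
// Query "Places": the only loaded entity; it calls the function per row
let
    Source = PlaceIds,  // staging query with a [placeid] text column
    WithDetails = Table.AddColumn(Source, "Details", each GetPlaceDetails([placeid]))
in
    WithDetails

With load disabled on the function, the service no longer treats the loaded entity as a computed entity requiring Premium.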
The actual issue is that Power BI dataflows cannot handle variables in the URL argument. If you move the variable parts into another argument of the Web.Contents() function, there are no issues:
let ReturnDetails = (placeid as text) => Json.Document(Web.Contents("https://maps.googleapis.com/maps/api/place/details/",[RelativePath="json?placeid=" & placeid & "&fields=website,formatted_phone_number,url,rating&language=en-AU&region=au&key=xyz"])) in ReturnDetails
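For reference, the query-string parameters can also be moved into the Query option of Web.Contents(), which handles URL encoding for you and keeps RelativePath clean. A sketch of the same function written that way (the key value is a placeholder):

let
    ReturnDetails = (placeid as text) =>
        Json.Document(
            Web.Contents(
                "https://maps.googleapis.com/maps/api/place/details/",
                [
                    RelativePath = "json",
                    Query = [
                        placeid  = placeid,
                        fields   = "website,formatted_phone_number,url,rating",
                        language = "en-AU",
                        region   = "au",
                        key      = "xyz"
                    ]
                ]
            )
        )
in
    ReturnDetails

The first argument stays a static string, which is what the dataflow refresh engine needs to validate the data source.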
I would love to use this solution, but I'm using the OData connector and it seems it is not supported. Do you know of any workaround?
Hello,
I am also using OData and facing the same issue. Did you manage to find a solution?
Hello! I did, but I did not use this method. At first I split my queries into several ones and then appended them within the Desktop. Nowadays I do the Append step inside the dataflow, as my workspace was upgraded to Premium.
The anonymous answer above should be marked as the solution.
How do you manage it when you need to call the API multiple times due to pagination? It used to work during the preview, but I can no longer save it.
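One pattern that keeps the base URL static (and so stays dataflow-friendly) is to drive the pagination with List.Generate and pass the page number through the Query option. A hedged sketch, assuming a hypothetical API at `api.example.com` that returns a JSON record with an `items` list per page:

let
    // Hypothetical endpoint; the page number lives in Query, not in the URL string
    GetPage = (page as number) =>
        Json.Document(
            Web.Contents(
                "https://api.example.com",
                [RelativePath = "items", Query = [page = Number.ToText(page)]]
            )
        ),
    // Keep fetching pages until one comes back empty
    Pages = List.Generate(
        () => [page = 1, result = GetPage(1)],
        each List.Count([result][items]) > 0,
        each [page = [page] + 1, result = GetPage([page] + 1)],
        each [result][items]
    ),
    AllItems = List.Combine(Pages)
in
    Table.FromRecords(AllItems)

Your API's paging scheme (cursor, offset, next-link) will dictate the condition and next-state logic, but the key point is that Web.Contents() always receives the same first argument.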
@GilbertQ found a worthwhile improvement/update to this one.
It follows on from, and extends, the hack of disabling loading of a dataflow entity, as per the aforementioned blog posts...
I've recently found that you can amend the M query in a dataflow to reference an entity from another dataflow by entering the workspace directly.
I discovered this by testing whether the copied M query from PBI desktop of an entity could be pasted into the query editor of a different dataflow in power bi service. It seems to work.
Just another deficit and example of the hacky nature of Power BI that I find annoying... they still do not support metadata management and content reuse as a first-class capability...
I think there should be a dedicated option to add a workspace and an entity from another dataflow directly.
So I'm concerned, and wouldn't be surprised, if the dataflow ETL that I built today is invalidated in the future...
So, for example, the code in a dataflow in the Power BI service looks like the below.
Once the dataflow param is set to null and the workspace connects to your workspace ID you can then navigate the various dataflows.
By disabling load, as others have found, you can then start to maintain a core registry of logic in a staging area and reuse it more widely. This gets a lot closer to adequate reuse of code and logic.
let
    Source = PowerBI.Dataflows(null),
    #"1234" = Source{[workspaceId = "1234"]}[Data],
    #"5678" = #"1234"{[dataflowId = "5678"]}[Data],
    CIRAM_STG_tblVehicle = #"5678"{[entity = "CIRAM_STG_tblVehicle"]}[Data],
    #"Renamed columns" = Table.RenameColumns(CIRAM_STG_tblVehicle, {{"fldRegistrationNo", "Rego"}})
in
    #"Renamed columns"
I've previously used this method to get functions working with Dataflows. However, now whenever I try to add a function to an entity I get the following message:
Perhaps similar issue @GilbertQ @otravers
I'm trying to produce a dataflow that gives a single table after scanning over a folder of flat files, using the method here:
https://ssbipolar.com/2018/12/13/power-bi-dataflows-and-additional-data-sources/
The data, function, and sample file have been added as separate "blank" queries and appear to be working, but when the dataflow is loaded in Desktop, it returns a blank table. A review of the dataflow status in the service shows a warning that a computed entity requires Premium capacity.
I checked the link below for the description of a computed entity, and went back to the dataflow to ensure that only the data table is loaded, and that the function and sample file are not loaded (at least the menu option is unchecked for these):
https://ssbipolar.com/2018/10/23/dataflows-in-power-bi-overview-part-6-linked-and-computed-entities/
Now I'm unclear why this would be a "computed entity" if the only table that is loaded is the final table, not the function or the sample file.
Any help appreciated!
The following warning now also comes up "The query contains columns with complex types that cannot be loaded"
Thanks @GilbertQ ,
That prompted a review and comparison of the PQ script running in Desktop vs PQ Online (the dataflow).
It appears that the online version cannot handle some of the text value replace functions?
= Table.ReplaceValue(#"Parsed Date1","-",":",Replacer.ReplaceText,{"Last Trade Time"}) works in PQ in Desktop, but trips up PQ Online when trying to go from a text to a time datatype.
As a test, I removed the step and all following from the dataflow.
The "complex datatype" message then went away and the dataflow then worked fine.
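A workaround that may sidestep this is doing the replace and the conversion in a single Table.TransformColumns step, so no intermediate text-typed column is left behind for the dataflow to trip on. A minimal sketch, assuming the step and column names from the post above:

let
    Source = #"Parsed Date1",
    // Replace "-" with ":" and parse to time in one transform
    FixedTime = Table.TransformColumns(
        Source,
        {{"Last Trade Time", each Time.FromText(Text.Replace(_, "-", ":")), type time}}
    )
in
    FixedTime

This keeps the column's declared type consistent with its values in one step, which Power Query Online tends to handle more predictably than a ReplaceValue followed by a separate type change.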
Yeah, in my experience Power Query Online's handling of datatypes is a bit of a mess.