otravers
Community Champion

Dataflow: use of M function turns workspace into Premium capacity?

I'm trying to recreate in dataflows some PQ processes I have in Excel and Power BI, and am surprised that using an M function (e.g. a call to a REST API) brings up a reference to "computed entities", which are limited to Power BI Premium.

 

The same function can be called and refreshed in a regular Power BI Pro dataset, so why would using M functions in dataflows be limited to Premium?! I'm not trying to calculate aggregations here, I'm just automating API calls.

 

Also, even though the function (created by pasting M code into a blank query) seems to be recognized as such at creation:

[screenshot: pqo-creation.png]

 

It is then automatically turned into a table. Are functions even supported in Dataflows at the moment? I really want to love dataflows but the beta experience has been pretty rough so far.

 

Here's my function just for reference:

 

 

let
    ReturnDetails = (placeid as text) =>
        Json.Document(Web.Contents("https://maps.googleapis.com/maps/api/place/details/json?placeid=" & placeid & "&fields=website,formatted_phone_number,url,rating&language=en-Au&region=au&key=xyz"))
in
    ReturnDetails
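For reference, this is how such a function would typically be invoked from another query in the dataflow (just a sketch; the Places table and its PlaceID column are made-up names, not part of my actual setup):

```
= Table.AddColumn(Places, "Details", each ReturnDetails([PlaceID]))
```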

 

 

------------------------------------------------
1. How to get your question answered quickly - good questions get good answers!
2. Learning how to fish > being spoon-fed without active thinking.
3. Please accept as a solution posts that resolve your questions.
------------------------------------------------
BI Blog: Datamarts | RLS/OLS | Dev Tools | Languages | Aggregations | XMLA/APIs | Field Parameters | Custom Visuals
2 ACCEPTED SOLUTIONS
GilbertQ
Super User

Hi @otravers


What you need to do is disable the loading of your function.

 

Here is a great blog post explaining how to fix it and why it happens.

 

https://ssbipolar.com/2018/11/23/power-bi-dataflows-reuse-without-premium/





Did I answer your question? Mark my post as a solution!

Proud to be a Super User!







Power BI Blog

View solution in original post

Anonymous
Not applicable

The actual issue is that Power BI dataflows cannot handle variables in the URL argument. If you move the variables into the RelativePath option of the Web.Contents() function, there are no issues:

 

let
    ReturnDetails = (placeid as text) =>
        Json.Document(Web.Contents("https://maps.googleapis.com/maps/api/place/details/", [RelativePath = "json?placeid=" & placeid & "&fields=website,formatted_phone_number,url,rating&language=en-Au&region=au&key=xyz"]))
in
    ReturnDetails

View solution in original post

17 REPLIES
Anonymous
Not applicable

The actual issue is that Power BI dataflows cannot handle variables in the URL argument. If you move the variables into the RelativePath option of the Web.Contents() function, there are no issues:

 

let
    ReturnDetails = (placeid as text) =>
        Json.Document(Web.Contents("https://maps.googleapis.com/maps/api/place/details/", [RelativePath = "json?placeid=" & placeid & "&fields=website,formatted_phone_number,url,rating&language=en-Au&region=au&key=xyz"]))
in
    ReturnDetails
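For what it's worth, the same fix can also be written with the Query option of Web.Contents() instead of concatenating the query string into RelativePath (a sketch under the same assumptions; Query takes a record of text values and URL-encodes them for you):

```
let
    ReturnDetails = (placeid as text) =>
        Json.Document(
            Web.Contents(
                "https://maps.googleapis.com/maps/api/place/details/",
                [
                    RelativePath = "json",
                    Query = [
                        placeid = placeid,
                        fields = "website,formatted_phone_number,url,rating",
                        language = "en-Au",
                        region = "au",
                        key = "xyz"
                    ]
                ]
            )
        )
in
    ReturnDetails
```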

I would love to use this solution, but I'm using the OData connector and it seems it's not supported. Do you know if there's any workaround?

Hello,
I am also using OData and facing the same issue. Did you manage to find a solution?

Hello! I did, but did not use this method. At first, I split my queries into several ones and then appended them in Desktop. Nowadays I'm doing the Append step inside the dataflow, as my workspace was upgraded to Premium.

The Anonymous answer above should be marked as the solution.

How do you manage it when you need to call the API multiple times due to pagination? It used to work during the preview, but I can no longer save it.
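One pattern that should still satisfy the static-URL requirement (just a sketch; "https://api.example.com", the "items" path, and the nextPageToken field are hypothetical and need adjusting to your API's paging scheme) is to drive the paging with List.Generate while keeping the page token in the Query record, so the first argument of Web.Contents() stays a constant:

```
let
    GetPage = (token as nullable text) =>
        Json.Document(
            Web.Contents(
                "https://api.example.com",
                [
                    RelativePath = "items",
                    Query = if token = null then [] else [pagetoken = token]
                ]
            )
        ),
    Pages = List.Generate(
        () => GetPage(null),
        each _ <> null,
        each if Record.HasFields(_, "nextPageToken") then GetPage([nextPageToken]) else null
    )
in
    Pages
```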

GilbertQ
Super User

Hi @otravers


What you need to do is disable the loading of your function.

 

Here is a great blog post explaining how to fix it and why it happens.

 

https://ssbipolar.com/2018/11/23/power-bi-dataflows-reuse-without-premium/






Anonymous
Not applicable

@GilbertQ I found a worthwhile improvement/update to this one.

 

It follows on from, and extends, the hack to disable loading of a dataflow entity, as per the aforementioned blog post...

 

I've recently found that you can amend the M query in a dataflow to reference an entity from another dataflow by entering the workspace directly.

 

I discovered this by testing whether the M query of an entity, copied from Power BI Desktop, could be pasted into the query editor of a different dataflow in the Power BI service. It seems to work.

 

Just another deficiency and hacky corner of Power BI that I find annoying... they are still not supporting metadata management and content reuse as a first-class capability...

 

I think there should be a dedicated option to add a workspace and an entity from another dataflow directly.

 

So I'm concerned, and wouldn't be surprised if the dataflow ETL that I built today gets invalidated in the future...

 

So, by way of example, the code in a dataflow in the Power BI service looks like the below.

 

Once the dataflows parameter is set to null and the workspaceId is set to your workspace ID, you can then navigate the various dataflows.

 

By disabling load, as others have found, you can then start to maintain a core registry of logic in a staging area and reuse it more widely. This is getting a lot closer to adequate re-use of code and logic.

 

let
    Source = PowerBI.Dataflows(null),
    #"1234" = Source{[workspaceId = "1234"]}[Data],
    #"5678" = #"1234"{[dataflowId = "5678"]}[Data],
    CIRAM_STG_tblVehicle = #"5678"{[entity = "CIRAM_STG_tblVehicle"]}[Data],
    #"Renamed columns" = Table.RenameColumns(CIRAM_STG_tblVehicle, {{"fldRegistrationNo", "Rego"}})
in
    #"Renamed columns"

Thanks for the updates and hopefully it will work and get better as time goes on.





I've previously used this method to get functions working with Dataflows. However, now whenever I try to add a function to an entity I get the following message:

 

Can't save dataflow

 

Your dataflow can't be saved because one or more entities references a dynamic data source.

Have Microsoft removed this functionality for Pro users?!

Hi there

You have to make sure that your function has been set to "Do not load data".





Perhaps similar issue @GilbertQ @otravers 

I'm trying to produce a dataflow that gives a single table after scanning over a folder of flat files, using the method here:

https://ssbipolar.com/2018/12/13/power-bi-dataflows-and-additional-data-sources/

 

The data, function, and sample file have been added as separate "blank" queries and appear to be working, but when the dataflow is loaded in Desktop, it returns a blank table. A review of the dataflow status in the service shows a warning that a computed entity requires Premium capacity.

 

I checked the link below for the description of a computed entity, and went back to the dataflow to ensure that only the data table is loaded, and the function and sample file are not loaded (at least the menu option is unchecked for these):

https://ssbipolar.com/2018/10/23/dataflows-in-power-bi-overview-part-6-linked-and-computed-entities/

 

Now I'm unclear why this would be a "computed entity" if the only table that is loaded is the final table, not the function or the sample file.

 

Any help appreciated!

The following warning now also comes up: "The query contains columns with complex types that cannot be loaded".

I would suggest working back through the steps to see where it is happening.





Thanks @GilbertQ ,

That prompted a review and comparison of the PQ script running in desktop vs PQ online (the dataflow).

It appears that the online version cannot handle some of the text value replace functions.

= Table.ReplaceValue(#"Parsed Date1", "-", ":", Replacer.ReplaceText, {"Last Trade Time"}) works in PQ in Desktop, but trips up PQ online when trying to go from a text to a time datatype.

As a test, I removed the step and all following from the dataflow.

The "complex datatype" message then went away and the dataflow worked fine.
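For anyone hitting the same wall, a possible workaround (just a sketch, reusing the #"Parsed Date1" step and the "Last Trade Time" column from the post above) is to do the text replacement and the type conversion in a single transform step, so no intermediate mixed text/time column is left for the service to complain about:

```
= Table.TransformColumns(#"Parsed Date1", {{"Last Trade Time", each Time.FromText(Text.Replace(_, "-", ":")), type time}})
```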

 

Yeah, in my experience Power Query Online's handling of datatypes is a bit of a mess.


Hi there, thanks for letting us know.

I do know that currently there is not 100% parity between dataflows and Power BI Desktop, which would explain the above.




