I've created a dataflow that simply pulls in data from selected Salesforce objects (parameterised so that we can swap between Production and Sandbox). I've now set up a child dataflow that performs filtering and transformation, with parameters for the workspace and the source dataflow. To avoid repeating steps when pulling in the tables from the SFData dataflow, I have a base query:

let
    Source = PowerBI.Dataflows(null),
    FilterWorkspace = Table.SelectRows(Source, each [workspaceName] = DataFlowWorkspace),
    ExpandDataflows = Table.ExpandTableColumn(FilterWorkspace, "Data", {"dataflowId", "dataflowName", "Data"}, {"dataflowId", "dataflowName", "Data"}),
    FilterDataFlow = Table.SelectRows(ExpandDataflows, each [dataflowName] = DataFlowName),
    ExpandDataFlowObjects = Table.ExpandTableColumn(FilterDataFlow, "Data", {"entity", "Data"}, {"entity", "Data"}),
    RemoveOtherColumns = Table.SelectColumns(ExpandDataFlowObjects, {"entity", "Data"})
in
    RemoveOtherColumns

This leaves me with a list of table objects:

entity               Data
Account              [Table]
Buyer Group Member   [Table]
etc. etc.

I then reference this query to import the SFData tables, e.g.:

let
Source = #"Dataflow Base",
Navigation = Source{[entity = "Account"]}[Data]
in
    Navigation

The issue I have is that Power BI requires every column to have a data type, otherwise the table can't be loaded. If I don't do anything, the Data column is converted to text, which generates an error (which then gets converted to a null). There doesn't appear to be a data type I can set the Data column to that prevents this. This means that for every table I'm importing from the dataflow (and any subsequent dataflows), I need to repeat the full navigation steps to get to the entity list and then navigate to the object I want. Is there a way around this?
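For reference, this is roughly what I mean by repeating the full navigation for each table: it can be wrapped in a parameterised function so each table query only passes an entity name. This is just a sketch of that pattern; the function name fnGetEntity is my own, and it still relies on the same DataFlowWorkspace / DataFlowName parameters as the base query:

```
// Hypothetical helper (fnGetEntity): repeats the full dataflow navigation
// and returns the table for a single entity by name.
(entityName as text) as table =>
let
    Source = PowerBI.Dataflows(null),
    FilterWorkspace = Table.SelectRows(Source, each [workspaceName] = DataFlowWorkspace),
    ExpandDataflows = Table.ExpandTableColumn(FilterWorkspace, "Data", {"dataflowId", "dataflowName", "Data"}, {"dataflowId", "dataflowName", "Data"}),
    FilterDataFlow = Table.SelectRows(ExpandDataflows, each [dataflowName] = DataFlowName),
    ExpandObjects = Table.ExpandTableColumn(FilterDataFlow, "Data", {"entity", "Data"}, {"entity", "Data"}),
    // Navigate straight to the requested entity's table
    EntityTable = ExpandObjects{[entity = entityName]}[Data]
in
    EntityTable
```

Each table query then becomes a one-liner such as fnGetEntity("Account"), but the navigation logic is still duplicated at refresh time for every call, which is what I'm hoping to avoid.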