Hi there,
For some reason I am not able to refresh my published report, which uses a Data Lake Store as its data source.
In Desktop I have set up a simple report using a Data Lake Store that reads multiple CSVs from a folder and combines them into a single table. I am able to refresh the data in the Desktop version.
But when I publish this to the service, I see this message in the dataset settings:
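For reference, a combine-from-folder query of the kind described usually looks something like this. This is only a minimal sketch: the account URL, folder path, and CSV options are placeholders, not the poster's actual source.

```m
let
    // Placeholder ADL account and folder path; replace with your own
    Source = DataLake.Contents("adl://youraccount.azuredatalakestore.net/csvfolder"),
    // Parse each file's binary content as a CSV table, promoting each file's own header row
    Parsed = Table.AddColumn(Source, "Data",
        each Table.PromoteHeaders(
            Csv.Document([Content], [Delimiter=";", Encoding=1252]),
            [PromoteAllScalars=true])),
    // Append all per-file tables into one combined table
    Combined = Table.Combine(Parsed[Data])
in
    Combined
```

Promoting headers per file before `Table.Combine` avoids stray header rows from the second file onward.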
You can't schedule refresh for this dataset because one or more sources currently don't support refresh.
Query contains unknown or unsupported data sources.
I used my personal AAD account (with 2FA) to log in to the Data Lake Store.
Any ideas how I can get this to work?
Thanks in advance.
I am facing the same problem. Please keep us updated if you find a solution or workaround.
Hi @Anonymous,
I think this issue may be due to the combine operation. If you import a single CSV file without any operations (combine, expand, reference), the refresh should work smoothly.
In addition, MFA verification can also cause a refresh to fail; you could turn off MFA and test the combined source again.
Regards,
Xiaoxin Sheng
@Anonymous
I disabled 2FA and tried reading only a single file, but it is still not working...
Refresh in the service should work with a Data Lake Store as the data source, right?
This is my query by the way:
let
    Source = DataLake.Contents("adl://datalakerbot.azuredatalakestore.net", null),
    #"Raw Data" = Source{[Name="Raw Data"]}[Content],
    #"2017" = #"Raw Data"{[Name="2017"]}[Content],
    #"Afvaldata log 01-2017 txt" = #"2017"{[Name="Afvaldata log 01-2017.txt"]}[Content],
    #"Imported CSV" = Csv.Document(#"Afvaldata log 01-2017 txt", [Delimiter=";", Columns=7, Encoding=1252, QuoteStyle=QuoteStyle.None]),
    #"Promoted Headers" = Table.PromoteHeaders(#"Imported CSV", [PromoteAllScalars=true]),
    #"Changed Type" = Table.TransformColumnTypes(#"Promoted Headers", {{"timestamp", type text}, {"BA [ton]", type number}, {"HHAWEG [ton]", type number}, {"GA [ton]", type number}, {"IMP [ton]", type number}, {"HHABoot [ton]", type number}, {"HHAtotaal [ton]", type number}})
in
    #"Changed Type"
Just found out this is a known Power BI issue:
http://community.powerbi.com/t5/Issues/Azure-Data-Lake-refresh-issue/idc-p/438555