March 31 - April 2, 2025, in Las Vegas, Nevada. Use code MSCUST for a $150 discount! Early Bird pricing ends December 9th.
Hi all,
Situation: I want to create a dashboard and afterwards publish it to the public web.
The data source of the report is a lakehouse. The lakehouse (and its semantic model) gets filled by a data pipeline.
Source: Geoserver, which provides a (geo-)json file.
Destination: A table in the semantic model.
My problem: I'm not able to publish the report, because of the semantic model used. (see also here: https://learn.microsoft.com/en-us/power-bi/collaborate-share/service-publish-to-web#considerations-a...)
Does any of you see a solution for that?
Consume a GeoJSON file (on a schedule) --> write it to a data table in Power BI --> create a report with the table as data source --> publish to the public web.
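For the first step, the flattening from GeoJSON features to table rows could be sketched like this, e.g. in a Fabric notebook (the sample feature and field names are made up for illustration):

```python
import json

def geojson_to_rows(geojson_text):
    """Flatten a GeoJSON FeatureCollection into one flat row per feature."""
    doc = json.loads(geojson_text)
    rows = []
    for feature in doc.get("features", []):
        # Each feature's properties become ordinary table columns.
        row = dict(feature.get("properties", {}))
        geom = feature.get("geometry") or {}
        # Store the geometry as JSON text so it fits into a single column.
        row["geometry_type"] = geom.get("type")
        row["geometry_json"] = json.dumps(geom)
        rows.append(row)
    return rows

# Made-up sample feature, just to show the shape of the output.
sample = '''{
  "type": "FeatureCollection",
  "features": [
    {"type": "Feature",
     "geometry": {"type": "Point", "coordinates": [13.4, 52.5]},
     "properties": {"name": "Station A", "value": 42}}
  ]
}'''
rows = geojson_to_rows(sample)
```

From there the rows can be written to a lakehouse table, which the semantic model picks up.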
Thanks in advance for your support
Ronny
Solved! Go to Solution.
Hi Tutu_in_YYC, thanks for the quick reply, I'll add more.
Hi @Ronny_Dittmann ,
Regarding your question, my idea is to use a Dataflow. You can set the run time of the data pipeline through the "Schedule" button; this way the Dataflow will get the latest data.
1. Create the Dataflow and set its schedule (screenshot not shown).
2. Use the Dataflow as a data source in Power BI Desktop and publish to the Power BI Service.
3. Configure scheduled refresh of the semantic model.
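As an aside on step 3: besides the scheduled refresh, a semantic model refresh can also be triggered on demand through the Power BI REST API ("Refresh Dataset In Group", `POST .../datasets/{datasetId}/refreshes`). A minimal sketch of building that call; the group ID, dataset ID, and access token are placeholders you would supply yourself:

```python
def build_refresh_request(group_id, dataset_id, token):
    """Build the URL and headers for the Power BI REST API call that
    triggers an on-demand semantic model refresh. Sending the request
    (e.g. with an HTTP client of your choice) is left out here."""
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
           f"/datasets/{dataset_id}/refreshes")
    headers = {
        "Authorization": f"Bearer {token}",  # AAD token with Dataset.ReadWrite.All
        "Content-Type": "application/json",
    }
    return url, headers

# Placeholder IDs and token, for illustration only.
url, headers = build_refresh_request("<group-id>", "<dataset-id>", "<access-token>")
```

For the scenario in this thread, the built-in scheduled refresh is usually enough; the API route is only useful if you want the pipeline itself to kick off the refresh.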
According to my test, automatic refresh should be possible.
Best Regards,
Wenbin Zhou
Thanks, but how to refresh the data automatically? Is there a way?
Hello @v-zhouwen-msft,
one issue remains with your solution: I added the data pipeline as suggested, with a "Copy data" activity that copies my JSON data into a data lake, plus the Dataflow item.
The Dataflow looks like this (screenshot not shown):
I also created the report in Power BI Desktop and published it to the web.
My problem is that the data in my report doesn't refresh automatically. I think the problem is that publishing the report creates a new semantic model, and the Dataflow doesn't update it.
Did I do anything wrong?
Hi @Ronny_Dittmann ,
Is a scheduled refresh set for the semantic model? I tested it by refreshing on demand, and it displayed the new data (remember to refresh the browser and the visuals), which suggests that scheduled refresh should also work. Is there a record of the scheduled refresh in your refresh history? Is the time zone selected for the scheduled refresh correct?
Are the credentials correct? When I opened it for the first time, it gave an error message, and I updated the credentials.
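If you want to check the refresh history programmatically rather than in the portal, the payload returned by the REST API's "Get Refresh History" endpoint can be filtered for failures such as credential errors. A hedged sketch, assuming the documented JSON shape (a `value` array of refresh entries with a `status` field); the sample payload is made up:

```python
def failed_refreshes(history):
    """Given the JSON payload from the Power BI 'Get Refresh History'
    endpoint, return only the entries whose status is 'Failed'
    (e.g. refreshes that failed due to missing credentials)."""
    return [entry for entry in history.get("value", [])
            if entry.get("status") == "Failed"]

# Made-up sample payload mimicking the documented response shape.
sample_history = {"value": [
    {"refreshType": "Scheduled", "status": "Completed"},
    {"refreshType": "Scheduled", "status": "Failed",
     "serviceExceptionJson": '{"errorCode": "CredentialsNotSpecified"}'},
]}
failures = failed_refreshes(sample_history)
```

A `serviceExceptionJson` mentioning credentials is the symptom described in this thread: fixing the data source credentials and re-running the refresh resolves it.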
Best Regards,
Wenbin Zhou
Hello @v-zhouwen-msft,
The same happened to me: the refresh of the report's semantic model failed because of missing credentials. I added them and will check whether the refresh works.
Thanks again!
Hello @v-zhouwen-msft, I would have never thought of this solution, but it works nicely!
Thanks a lot for your support
Ronny
Sounds like you are using Direct Lake mode. An option would be to convert the report to import mode, i.e., rebuild the semantic model in Power BI Desktop.