Hello everyone,
I'm new to Fabric Data Factory and Power BI, so I'm struggling a bit here.
What I want to do is:
I think there are many ways to do this, but I'd like to know which scenario would be preferred in this case.
For now, what I've been doing is:
But I'm not sure it's the best way to do it (in terms of efficiency, ease of use, and cost), so I'd welcome any recommendations for a better approach.
For example, I've been told to use Azure Data Lake Storage Gen2 (ADLS Gen2), but it looks like I can't use DirectQuery with it. Does that mean I have to manually refresh the dashboard daily in Power BI Desktop for the Powerbi.com users to see up-to-date data, or am I missing something? 😫
Thanks for your help! 😊
Assuming you already have the data in Azure SQL:
Use Power BI Desktop to connect to the data in Azure SQL ("Get Data").
Import mode is generally the recommended storage mode in Power BI; DirectQuery is usually not recommended.
Perform transformations in Power Query as required ("Transform data").
Do the data modelling in Power BI Desktop.
Apply Row Level Security (RLS) as required. (Use Tabular Editor if you need OLS).
Create the visuals and report pages.
From Power BI Desktop, publish your report to a workspace in Powerbi.com (aka the Power BI service).
In Powerbi.com, select the workspace, find the semantic model you published, and enable scheduled refresh for it. That way, the data in the report will be refreshed automatically according to a schedule.
If you have a Fabric license, you can consider using a Fabric Lakehouse instead of an Azure SQL database.
"so it means I need to update manually the dashboard daily on PBI desktop in order for the PBI.com user to view the up to date data, or I'm missing something" You don't need to refresh the data in Power BI Desktop; you can enable scheduled refresh in Powerbi.com (the Power BI service). See e.g. here: https://learn.microsoft.com/en-us/training/modules/manage-datasets-power-bi/5-dataset-refresh
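As a side note, besides the scheduled refresh in the service, a semantic model refresh can also be triggered programmatically through the Power BI REST API. Here is a minimal Python sketch, assuming you already have an Azure AD access token; the workspace ID and dataset ID are placeholders you would supply yourself:

```python
import json
import urllib.request

def refresh_dataset_url(group_id: str, dataset_id: str) -> str:
    # Power BI REST API endpoint for triggering a semantic model refresh
    # ("Datasets - Refresh Dataset In Group").
    return (f"https://api.powerbi.com/v1.0/myorg/groups/{group_id}"
            f"/datasets/{dataset_id}/refreshes")

def trigger_refresh(group_id: str, dataset_id: str, access_token: str) -> int:
    """POST a refresh request; the service answers 202 Accepted on success."""
    req = urllib.request.Request(
        refresh_dataset_url(group_id, dataset_id),
        data=json.dumps({"notifyOption": "MailOnFailure"}).encode(),
        headers={"Authorization": f"Bearer {access_token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

For a daily refresh, the scheduled refresh built into the service is simpler; the API route is mainly useful if you want a pipeline (e.g. the Data Factory copy) to kick off the refresh as soon as new data lands.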
Here is another way you can do it:
Create a copy pipeline in Data Factory to copy the data from the JSON file to SQL Server, and create a trigger so it runs automatically every day.
Once the data is in the SQL Server database, connect to it from Power BI using an Import connection; you can then perform all the necessary transformations and create whatever measures the report needs.
Publish this Power BI file as a semantic model in the workspace.
Create a new report and connect it to the semantic model through a live connection. Any changes to the semantic model will be reflected in your report.
You can set up a daily automatic refresh for the semantic model in the Power BI service.
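To make the first step concrete, the daily copy (JSON file into a SQL table) boils down to something like the Python sketch below. This is only an illustration of the data movement the Data Factory copy activity handles for you; it uses sqlite3 as a stand-in for SQL Server, and the `sales` table with `id`/`amount` columns is a made-up example schema:

```python
import sqlite3

def copy_json_records_to_sql(records: list[dict], conn: sqlite3.Connection) -> int:
    """Load parsed JSON records into a SQL table and return the row count."""
    # Example schema -- a real table would mirror the fields of your JSON file.
    conn.execute("CREATE TABLE IF NOT EXISTS sales (id INTEGER, amount REAL)")
    conn.executemany(
        "INSERT INTO sales (id, amount) VALUES (:id, :amount)", records
    )
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
```

In practice the copy pipeline with its daily trigger replaces all of this code; the sketch only shows what the pipeline does under the hood.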
If my post helps, please consider accepting it as the solution to help other members find it more quickly.