We have multiple data sources: Salesforce, Excel, and other databases. So far we have created semantic models based on whatever report was needed at the time. This has caused many reports to connect to the same data sources and/or tables multiple times.
What is the best practice to pull data from the source once a day and create multiple BI reports?
We have tried Lakehouse, but we are on a trial version and do not want to pay for another subscription.
Hi @tomperro ,
Thank you for reaching out to the Microsoft Fabric Community, and thank you @MasonMA, @burakkaragoz, and @lbendlin for your insights.
Just to clarify, Power BI Dataflows are mainly used for getting and preparing data, but they don’t support creating DAX measures. This is because dataflows only handle the data part and don’t have a modeling layer where you can define calculations like measures. If you want to create shared measures that others can use across different reports, the best approach is to first connect your dataflows to a Power BI Desktop file (PBIX), create your DAX measures there, and then publish that file to the Power BI Service. Once published, others can connect to this dataset using a live connection, and they’ll be able to use all the measures you’ve already built without having to create them again. This helps keep everything consistent and easier to manage.
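For example, a shared measure in that published PBIX could look like the sketch below (the Sales and Date tables and the Amount column are just placeholders, not tables from this thread):

```
-- Defined once in the published semantic model; every report that
-- live-connects to it can use these without redefining them.
Total Amount = SUM ( Sales[Amount] )

-- Derived measures can build on the shared base measure.
Amount YTD = TOTALYTD ( [Total Amount], 'Date'[Date] )
```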
Hope this helps. Please reach out if you need further assistance.
If this post helps, please consider accepting it as the solution so that other members can find it more quickly; a kudos would also be appreciated.
Thank you.
As @v-tsaipranay mentioned: 'the best approach is to first connect your dataflows to a Power BI Desktop file (PBIX), create your DAX measures there'?
This does not sound like the situation @tomperro has.
Hi @tomperro ,
Yeah, this is a super common situation — especially when reports are built ad-hoc and semantic models grow organically over time.
Here are a few best practices you might want to consider:
Instead of building a new model for each report, try to create shared, reusable semantic models (aka “golden datasets”) in the Power BI Service. Multiple reports can then connect to the same published model via a live connection.
If you’re not using a Lakehouse and want to avoid extra cost, dataflows can handle the extract and transform work on a scheduled refresh.
If multiple reports are hitting Salesforce or other APIs directly, it can cause throttling or performance issues. Use a single dataflow or shared model as the one place that queries the source, so each source is pulled only once per refresh.
Start documenting which reports use which sources and models. This helps you consolidate and reduce redundancy over time.
Let me know if you want help setting up a shared model or dataflow strategy — happy to walk through it.
If my response resolved your query, kindly mark it as the Accepted Solution to assist others. Additionally, I would be grateful for a 'Kudos' if you found my response helpful.
This response was supported by AI for translation and text editing.
@burakkaragoz Thank you for this information.
Is it possible to create shared measures in the dataflows?
For example:
If I create a dataflow to pull data from the Contact object and another dataflow to pull data from the Position object, I know that I can create measures after connecting Power BI Desktop to these two dataflows. But how can I create the measure upstream, so that other users will have access to it when they connect to the dataflows?
Same challenges where I work. Here is what we did in my department:
1. Create one centralized ETL using Power BI Dataflows (Gen1). (Gen2 becomes available later along with Fabric licences.)
Use Power BI Dataflows Gen1 to extract and transform your data only once: extract from SharePoint, SQL databases, Microsoft Lists, etc., transform using Power Query, and store the result in the cloud, not in each .pbix file. Then schedule the refresh once daily.
2. Build a Centralized Semantic Model
Build a centralized semantic model that connects to your dataflows, and publish it to a certified workspace so it can be reused across reports via Live Connection.
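For example (a rough sketch, with table and column names made up for illustration), measures defined once in that certified model are picked up automatically by every live-connected report:

```
-- Lives in the centralized semantic model, not in any individual report.
Total Rows = COUNTROWS ( FactSales )

-- Also defined once; reports connected via Live Connection inherit it.
Distinct Customers = DISTINCTCOUNT ( FactSales[CustomerId] )
```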
Thanks
@MasonMA Thank you for this information. Same question as I asked @burakkaragoz above: is it possible to create shared measures in the dataflows, so that other users can use them when they connect?
No, you cannot. Measures are not available in dataflows :(
A dataflow is the ETL layer, not the semantic model layer.
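For your Contact/Position example, the measures would live in the shared semantic model built on top of both dataflows. A minimal sketch, assuming the model relates Position to Contact and has a Position[PositionId] key column (names are guesses based on your objects):

```
-- Created in the shared model (the PBIX published to the Service),
-- not in either dataflow; users who live-connect inherit it.
Total Contacts = COUNTROWS ( Contact )

-- Average number of contacts per position, iterating the position keys.
Avg Contacts per Position =
    AVERAGEX ( VALUES ( Position[PositionId] ), [Total Contacts] )
```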
Remember that Dataflows are glorified CSV files. Reusability is certainly a valid use case but often not having Dataflows is the better option.
Do you have a SharePoint or OneDrive storage option?
Yes, I do have a SharePoint or OneDrive storage option.
Do you know if SQL Database (preview) can be used?