Hi Team,
We are developing a number of Power BI reports using Snowflake as the data source. This is a migration project: we are moving from OLAP cubes with Excel reporting to Snowflake with Power BI reporting. In Snowflake we store the data in a star schema with facts and dimensions. The business users are accustomed to working with measures for reporting, as they did with the cubes, and we have multiple complex reports to create. We would like to understand which would be the right approach for report development:
1. Connect directly to Snowflake as the data source and create the required reports (DirectQuery mode, due to security and real-time data requirements).
2. Create a Power BI dataset with the required measures and publish it, then create all the reports using this dataset as the data source.
3. Is there a template-driven report development approach?
Would there be any limitations in these approaches (for example, when publishing the reports to the service)?
Hi @Anonymous,
For points 1 and 2:
When you use Power BI Desktop to connect to Snowflake as the data source, you can choose either "Import" or "DirectQuery" mode.
For the advantages and disadvantages of "Import" versus "DirectQuery", please refer to this document:
https://docs.microsoft.com/en-us/power-bi/desktop-directquery-about
Whichever mode you choose, you can transform the data and build the data model, including measures and columns (a short example follows below).
For "Import", the dataset cannot be larger than 1 GB for Free and Pro users.
For "DirectQuery", there are some data transformation and modeling limitations, but because the data is not stored locally there is no dataset size limitation; there is, however, a limit of one million rows returned by any single query.
For point 3:
Because each report has a different data structure (for example, different table names, column names, and measure names), there is no template-driven report development approach.
Best Regards,
Lin
Thank you, Lin. I would like to understand which is the better option for creating a number of complex reports:
1. Connecting directly to the Snowflake star schema and creating the reports.
2. Publishing a Power BI dataset with pre-created measures and creating all the reports using a live connection to this dataset.
Are there any limitations in each approach, and which would be the recommended way?
Hi @Anonymous,
If the dataset is not very big (up to 1 GB, or up to a maximum of 10 GB per .pbix file if you have Premium capacity), I would suggest you use "Import" mode.
If the data is very large and changes frequently, and near real-time reporting is needed, I would suggest "DirectQuery".
Best Regards,
Lin