I have a dashboard built from a single CSV file, and since I have to do a lot of filtering and manipulation, I built a data model using only the Power Query tool.
So I have done a lot of merges in order to create a small snowflake schema.
I will shortly add more complexity to the model, and that makes me worry about whether this approach will still be efficient.
I'm also having a problem with a DAX measure that returns a blank value for a small portion of the data, even though it works with the other inputs (I have checked the relationships, but I can't find the problem).
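To illustrate what I'm checking, here is a minimal DAX sketch; the table names 'Sales' and 'Product', the column Product[ProductName], and the measure [My Measure] are placeholders rather than my real model. It counts fact rows whose key has no matching row in the related dimension, since unmatched keys are a common cause of a measure going blank for part of the data:

-- Placeholder names: 'Sales' (fact) and 'Product' (dimension) stand in for the real tables
Unmatched Sales Rows =
COUNTROWS (
    FILTER (
        Sales,
        -- RELATED returns BLANK when the fact row's key has no match in the dimension
        ISBLANK ( RELATED ( Product[ProductName] ) )
    )
)

-- Once the cause is understood, a blank result can be replaced with 0 if that is the desired display
My Measure No Blanks = COALESCE ( [My Measure], 0 )

If "Unmatched Sales Rows" returns anything above zero, the blanks likely come from key values that exist in the fact table but not in the dimension.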
I'm asking whether there is a way to separate the modeling from the report, especially for a model built from CSV files, and which tools you would suggest in my case.
thanks in advance
@latrous98 , you can consider Azure Analysis Services or SQL Server Analysis Services for that.
Or do the transformation in one pbix, the modeling in one pbix, and the report creation in another:
https://docs.microsoft.com/en-us/power-bi/connect-data/desktop-report-lifecycle-datasets
I guess the more scalable and more flexible options are the first ones (and, if I'm not mistaken, SQL Server is the free one if I want to set up a local database).
I actually thought of ETL solutions like Talend, but it seems there is a more straightforward way to import the data.
I don't know if you have already encountered that?
thanks