I'm migrating from QlikView to Power BI and have run into several difficulties.
I am finding it very hard to do a single full load of my DW. I have 7 fact tables and more than 10 dimension tables, and so far I have not managed to perform a full load in one go; there is always some error during loading, and the only log message displayed is "unexpected error occurred". If I load it piece by piece I can get all the data I need, but loading that way is unfeasible, because in the future I will rely on the scheduled refresh feature.
I think my biggest problem is that I concatenate all the fact tables so that I end up with a single fact table at the end of the load, but I have not found another way to do this. I have been looking for documentation on best practices for loading large volumes of data into Power BI, but have found nothing so far.
If anyone can help, I would be thankful, because I am considering no longer using Power BI.
As people are suggesting DirectQuery, I just want to understand: does it currently have the restriction of supporting only two or three data sources through the gateway? So if the source is anything other than those three, DirectQuery can't be implemented?
Data Source Types:
SQL Server, SAP HANA, and Analysis Services
Hi @virgo
I have the same question, but I imagine that navigating the application will be slow when we have more than one different database in real-time read mode, and I suspect it will not be worth it. I will run more tests and report back here.
@virgo - yes, those are the current supported data sources for direct query/live connection (Azure SQL Database and Azure SQL Data Warehouse included)
@Anonymous Many others have already chimed in about using DirectQuery or a live connection because of the 250 MB size limit when importing your data into the Desktop. I would look at the level of effort it would take to get your data into a SQL DB and connect directly, or skip that and import your data into a Tabular model (or multidimensional) and manage your business layer there. I've had great success, and good performance, in large-scale implementations using Tabular models with Power BI, and it is usually the route I recommend if you have the SQL licensing.
A couple of things to note with these approaches: with DirectQuery, your measures and calculated columns will live in the PBI Desktop files. With a live connection to SSAS, your model can be fully built and managed in one location, and all you need to do in Power BI is connect to it.
Hi @Anonymous ,
I have gone through your post, and it seems you are using the "Import" option in the connection. This means you are asking Power BI to do the following:
1) Bring the data into Power BI from Sybase, i.e. the data gets copied.
In my opinion, the issue can occur because of the data limits in Power BI: a single Power BI file cannot hold more than 250 MB of data.
Please do the following:
1) Check the total size of your tables. Is it more than 250 MB? See whether it is possible to minimize the data.
2) For testing, you can also make mirror copies of these tables and transfer only a few rows per month from the original tables.
Then build the reports from these mirrored tables. This checks both the size limits of Power BI and the relationships.
However, I don't think the relationships are the issue.
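A minimal sketch of that test in Power Query M, assuming a hypothetical ODBC DSN ("SybaseDW"), table (FactSales), and date column (OrderDate) — adjust to your own schema:

```m
// Push a date filter into the source query so only one month of rows
// is transferred, keeping the test file well under the size limit.
let
    Source = Odbc.Query(
        "dsn=SybaseDW",
        "SELECT * FROM FactSales
         WHERE OrderDate >= '2016-01-01' AND OrderDate < '2016-02-01'")
in
    Source
```

Because the filter runs on the Sybase side, the mirror table never has to exist physically; the relationships can then be tested against this small subset.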
BR,
Achin
Hi @achinm45
My application is 98 MB, but now that you mention the 250 MB limit I am worried about the near future, when I will have to build a BI whose fact table will be 2 GB, so I have to think about what to do. The load performance of my application may improve after upgrading the VPN and the Sybase driver. I also have a ticket open with Power BI support about this issue; as soon as I find out something I'll post it here. I'll leave this topic open for now.
@Anonymous Are you trying to load your data mart into Power BI using the Import feature? If so, the error could be because you have limited resources on that machine, e.g. memory.
Try loading it using DirectQuery if you are reading directly from an RDBMS, or use a live connection if you are using OLAP cubes.
Secondly, as a best practice in a star schema, you should not combine facts that are at different grain levels into one fact table; this violates basic star schema rules. If your design is correct, the star schema should not be an issue when loading the data.
We can help you more if you elaborate your question with examples and screenshots of your model.
Regards
Harris
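To illustrate the grain rule above with a sketch in Power Query M (the table names are hypothetical, not from this thread):

```m
// FactSales2015 and FactSales2016 share the same grain (one row per
// order line), so appending them into one fact table is safe:
let
    Sales = Table.Combine({FactSales2015, FactSales2016})
in
    Sales
// FactBudget, at month grain, should NOT be appended here; keep it as a
// separate fact table related to the shared (conformed) Date dimension.
```

Only facts at the same grain are candidates for appending; facts at different grains stay in separate tables that share conformed dimensions.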
Sorry, I forgot to mention: here is a picture of my Power BI load options. I have disabled virtually all load features so that there is not much processing during the load, in order to optimize it. The screenshot is in Portuguese, but the options appear in the same order as in any other language.
Thanks again!
@HarrisMalik and @ankitpatira, thank you for your help.
My machine has 8 GB of RAM and a 3.6 GHz i7 processor, so the slow load is certainly not caused by my computer; other tools such as QlikView and Tableau can load the same DW quickly and without problems.
Here is a picture of my star schema; sorry about the image resolution, but it was the best I could do.
As you can also see, there are several tables without relationships. I had to do this because, as I mentioned, I concatenated all my fact tables to form a single fact table.
In short, I have 11 fact tables and more than 10 dimension tables.
I get this data from a DW in a Sybase database (it is not a cube), and I load it via SQL queries using an ODBC driver; here is an image to illustrate.
@Anonymous If you have such a large number of tables, each with PK/FK relations, then DirectQuery is the best option. When Power BI Desktop imports tables it does not look at your PK/FK constraints but tries to determine relationships on its own, based on a few rules such as matching field names. For example, P_ID in one table will be linked with P_ID in another table. It determines as many relationships as it can and leaves the others blank or inactive (inactive due to cardinality conflicts).
If you absolutely need to use Import, then I would suggest creating the relationships between the tables manually via the relationship editor.
As you can see in the last picture, I disabled the automatic creation of relationships between tables at load time; the relationships from the dimensions to the fact table were created manually after loading the data.
@Anonymous Also ensure the cross-filter direction is set to Single for all relationships; then you will also be able to add relationships for the tables that are currently missing them. If the cross-filter direction is set to Both, unfortunately you won't be able to create relationships for all the tables.
If you're moving from a working QlikView to Power BI, I see no reason at all to modify the table structures. It's a reporting system (read-only) serving the same purpose as Power BI, and if it has been designed according to good practices and performs all right, there is no reason to adjust the model.
From my personal experience I can say that if people don't really understand data modelling (including a basic understanding of the business processes involved), there is a very high chance that they will only make things worse once they start to modify it.
My recommendation: build a proper foundation of data-modelling understanding before you venture into this terrain. I wouldn't argue against jumping head-first into many IT areas, but this is one of the few where I strongly recommend against it!
Imke Feldmann (The BIccountant)
If you liked my solution, please give it a thumbs up. And if I did answer your question, please mark this post as a solution. Thanks!
How to integrate M-code into your solution -- How to get your questions answered quickly -- How to provide sample data -- Check out more PBI- learning resources here -- Performance Tipps for M-queries
@Anonymous Are you saying you are having problems loading the data and maintaining the relationships between the fact and dimension tables? I have faced a similar issue before and can help you. Please provide some more detail on what exactly the problem is.