We have hundreds of .txt files, which end up being about 16M records. Scheduled refresh fails due to an XML timeout (even on Premium capacity), which makes sense given it's loading that many .txt files from a SharePoint data source in an online refresh. My thought is to load all of those records once into a semantic model of its own and then use that model to replace the loading of all those .txt files.
Am I on the right track here?
My weakness would be in reconfiguring the existing model to use the new model as a data source instead of all those .txt files.
Any ideas or help would be appreciated.
Hi @TomRobbins ,
Did the above suggestions help with your scenario? If so, please consider giving Kudos to or Accepting the helpful suggestions so others with similar requirements can find them.
If they don't help, please share a more detailed description of your scenario so we can clarify and test it.
How to Get Your Question Answered Quickly
Regards,
Xiaoxin Sheng
If you have a created date for the text files, you could use incremental refresh. Create a policy and publish the model to the service, connect with Tabular Editor to apply the policy and create the partitions, then connect with SQL Server Management Studio (SSMS) and manually process each partition (a sketch of the TMSL command is below). On subsequent refreshes, only new files would be loaded into the active partitions.
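For reference, here is a minimal sketch of the TMSL refresh command you could run from an XMLA query window in SSMS against the workspace's XMLA endpoint to process one partition at a time. The database, table, and partition names are placeholders; substitute your own.

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "TextFilesModel",
        "table": "Records",
        "partition": "Records-2024"
      }
    ]
  }
}
```

Processing partitions one by one like this keeps each refresh well under the timeout, since each partition only touches a slice of the files.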
The best option would most likely be to load these files into a warehouse or lakehouse once and point the model at that, for example via a short notebook like the sketch below.
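As a rough sketch, a Fabric notebook could ingest the files into a Lakehouse Delta table in one pass. This assumes the .txt files are delimited, share the same schema, and have been copied (or shortcut) into the Lakehouse Files area; the path, delimiter, and table name here are placeholders.

```python
# Fabric notebook (PySpark): one-time ingest of the .txt files into a Delta table.
# `spark` is the session Fabric notebooks provide; adjust options to match your files.

raw_path = "Files/txt_landing/*.txt"  # placeholder path in the Lakehouse Files area

df = (
    spark.read
    .option("header", "true")
    .option("delimiter", "\t")  # adjust if the files are comma- or pipe-delimited
    .csv(raw_path)
)

# Write once as a managed Delta table; the semantic model then reads this single
# table instead of hundreds of SharePoint files on every refresh.
df.write.mode("overwrite").format("delta").saveAsTable("txt_records")
```

After the initial load, the semantic model connects to the Lakehouse table, so scheduled refresh no longer has to enumerate and download hundreds of files from SharePoint.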
@TomRobbins If your new model is exactly the same as your old model, then what I would do is create a brand new Power BI Desktop file and connect it to your new semantic model. Then go to your old report and use Ctrl+A, Ctrl+C to copy all the visuals on a page, and paste them onto a page in the new report. Repeat for each page. If everything is the same, it should all just work.