Hi,
I receive the following error message, Dataset_Import_FailedToImportDataset, when syncing new changes on the semantic model to the Git workspace in Fabric.
I've read previous posts in this forum about workarounds - does Microsoft have a solution to this yet, or are they working on one?
We first encountered this issue when we made changes to an existing semantic model, where we simply changed the schema and view name in Power Query for the data source. My initial workaround was to manually publish the report and semantic model to the workspace and sync back into Git with Keep Incoming Changes, which resolved the issue.
Hi @Barre ,
We have reported this issue and submitted it to the product team. They are aware of it and have fixed it. Please check again later.
Best Regards,
Ada Wang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi @Barre ,
I am glad to hear that your manual workaround temporarily resolved the issue.
I know you are looking for a more permanent solution. We have reported this issue and submitted it to the product team; they are aware of it, and the engineers will do their best to resolve it.
I will update here if there is any progress, so please be patient.
Best Regards,
Ada Wang
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Hi @Anonymous ,
Thank you for the response.
Glad that the issue has already been reported to the product team - hopefully they will have a solution in the near future.