Hi,
I created this post before, but for some reason it disappeared and I can't see it in the forum, hence posting again 🙂
I have migrated from Synapse to Fabric. In Synapse I had a Copy activity which read data through a stored procedure from an Azure SQL Database. I would write this file as Parquet and later use Spark to save it as Delta.
Coming to Fabric, I realised I can pick a table as the destination in the Lakehouse instead of files, so I thought I could use this to partition the data already during ingestion.
This works perfectly fine when a file is selected as the destination, but when a table is selected it fails.
My stored procedure has a couple of interim temp tables that build the final result in a final temp table, and I just select from it.
A simplified snippet of the SP is below; in the last step a simple SELECT is done from the final temp table.
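This sketch only illustrates the pattern; every object and column name in it is a placeholder, not from the real SP:

```sql
-- Illustrative reconstruction of the SP pattern (all names are placeholders)
CREATE OR ALTER PROCEDURE analytics.usp_GetConfComparison
AS
BEGIN
    SET NOCOUNT ON;

    -- interim temp tables build up the logic
    SELECT c.ConfigId, c.ConfigName, c.Value
    INTO #Conf_Current
    FROM analytics.Config AS c;

    SELECT h.ConfigId, h.Value AS OldValue
    INTO #Conf_History
    FROM analytics.ConfigHistory AS h;

    -- the final result lands in a final temp table
    SELECT cur.ConfigId, cur.ConfigName, hist.OldValue, cur.Value AS NewValue
    INTO #Conf_Comparison
    FROM #Conf_Current AS cur
    JOIN #Conf_History AS hist ON hist.ConfigId = cur.ConfigId;

    -- in the last step, a simple SELECT from the final temp table
    SELECT * FROM #Conf_Comparison;
END;
```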
I face the following error:
Failure happened on 'Source' side. ErrorCode=SqlOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=A database operation failed with the following error: 'Invalid object name '#Conf_Comparison'.',Source=,''Type=System.Data.SqlClient.SqlException,Message=Invalid object name '#Conf_Comparison'.,Source=.Net SqlClient Data Provider,SqlErrorNumber=208,Class=16,ErrorCode=-2146232060,State=0,Errors=[{Class=16,Number=208,State=0,Message=Invalid object name '#Conf_Comparison'.,},],'
It is hinting that the final table used in the SELECT is invalid (analytics.[#conf_Comparison]).
Any insights are appreciated.
Thanks
The problem was reading from temp tables. When I switched to a table variable in SQL, it worked fine.
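For illustration, the change looks roughly like this, reusing the placeholder names from the sketch above; presumably it works because the Copy activity's schema discovery can resolve a table variable where it cannot resolve a session temp table:

```sql
-- Sketch of the fix: the final result goes into a table variable instead
-- of a temp table (names are illustrative placeholders)
CREATE OR ALTER PROCEDURE analytics.usp_GetConfComparison
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @Conf_Comparison TABLE (
        ConfigId   INT,
        ConfigName NVARCHAR(100),
        OldValue   NVARCHAR(255),
        NewValue   NVARCHAR(255)
    );

    INSERT INTO @Conf_Comparison (ConfigId, ConfigName, OldValue, NewValue)
    SELECT cur.ConfigId, cur.ConfigName, hist.Value, cur.Value
    FROM analytics.Config AS cur
    JOIN analytics.ConfigHistory AS hist ON hist.ConfigId = cur.ConfigId;

    -- the final SELECT now reads from the table variable, which the
    -- Copy activity can handle
    SELECT * FROM @Conf_Comparison;
END;
```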
However, I have another SP which uses a temp table that will hold much more data, so a table variable is not really practical in that scenario.
For now, I'm pushing some of this transformation to be done by Spark instead.
P.S. This looks to have been a similar issue in ADF in the past as well:
https://learn.microsoft.com/en-us/answers/questions/129102/temporary-table-query-error-in-copy-data-...
Hi @Anonymous,
AFAIK, the Lakehouse table level does not support adding subfolders. BTW, what type of data are you trying to import?
Currently the table level seems to support only Delta tables, so it is better to import data as Parquet or CSV files, which can be recognized and transformed to Delta.
Lakehouse Load to Delta Lake tables - Microsoft Fabric | Microsoft Learn
If your operation requires some temporary cache or uses an unsupported data format, you can store the data at the file level. Then you can use a notebook to read/convert it and save it to the table level.
Load data into your lakehouse with a notebook - Microsoft Fabric | Microsoft Learn
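For example, a Spark SQL cell in a Fabric notebook along these lines could convert staged Parquet files into a Delta table (the path and table name are placeholders):

```sql
-- Spark SQL in a Fabric notebook: read staged Parquet files from the
-- Files area and save them as a Delta table (path/name are placeholders)
CREATE TABLE IF NOT EXISTS conf_comparison
USING DELTA
AS SELECT * FROM parquet.`Files/staging/conf_comparison`;
```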
Regards,
Xiaoxin Sheng
Hi @Anonymous,
I'm glad to hear you found the root cause of the issue.
Regards,
Xiaoxin Sheng