I created a Notebook and ran it manually without issues. The Notebook executes an AI function (ai.classify), accepts input parameters, and then inserts the results into the Lakehouse.
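For context, the flow inside the notebook is roughly the sketch below. All table and parameter names are placeholders, and the AI-function call is reduced to a comment since only its output feeds the insert; the final saveAsTable step is the one that goes through the metastore.

```python
# Minimal sketch of the notebook flow; table/parameter names are illustrative.

# Notebook parameters (supplied through the REST API call described below).
source_table = "raw_feedback"
target_table = "classified_feedback"

# Read input rows from the default Lakehouse (spark is the ambient
# session inside a Fabric notebook).
df = spark.read.table(source_table)

# ... apply the AI function (ai.classify) here to add a label column ...
# (call omitted; see the Fabric AI functions documentation for the signature)

# Insert the results into the Lakehouse. This write goes through the
# metastore, which is where the run fails under the service principal.
df.write.mode("append").saveAsTable(target_table)
```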
However, I encounter a problem when using the REST API. I created a Service Principal and added the scopes as instructed on the website (Job Scheduler - Run On Demand Item Job - REST API (Core) | Microsoft Learn).
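For reference, the call I'm making looks roughly like this. The endpoint and body shape follow the Run On Demand Item Job documentation linked above; the tenant, workspace, and notebook IDs and the parameter name are placeholders.

```python
import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<sp-client-id>"
CLIENT_SECRET = "<sp-client-secret>"
WORKSPACE_ID = "<workspace-id>"
NOTEBOOK_ID = "<notebook-item-id>"

# Client-credentials token for the service principal, scoped to the Fabric API.
token_resp = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "scope": "https://api.fabric.microsoft.com/.default",
    },
)
token = token_resp.json()["access_token"]

# Run On Demand Item Job: jobType=RunNotebook for a notebook item.
run_resp = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
    f"/items/{NOTEBOOK_ID}/jobs/instances?jobType=RunNotebook",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "executionData": {
            "parameters": {
                # Illustrative notebook parameter; real names/types will differ.
                "source_table": {"value": "raw_feedback", "type": "string"}
            }
        }
    },
)
run_resp.raise_for_status()  # 202 Accepted; Location header points at the run
```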
In View Recent Runs, the status shows Success.
But when I check the run details, I see the following error:

Spark_System_MetaStore_UnableToFetchMWCToken (Error inserting data using the Metastore. The Metastore keeps track of tables and views used by Spark. This error typically means that your code referred to a table or view that does not exist, or a column within a table or view that does not exist.
1. Ensure the table or view you are referring to exists.
2. Ensure all columns that are being referenced in the insert exist in the target tables.
3. Check the logs for this Spark application. Inspect the logs for a clearer indication of which table relation is causing this issue.)
If I use a bearer token from my own account (not the service principal), the run succeeds.
I’m wondering which scope or additional step I need to configure to ensure the run completes successfully.
I added the service principal as a Contributor under "Manage access", and it works now.
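For anyone who wants to script this instead of using the UI: the same grant can be made with the workspace role-assignment REST API. A sketch with placeholder IDs; note that the principal id here should be the service principal's object ID, not the app (client) ID.

```python
import requests

WORKSPACE_ID = "<workspace-id>"
SP_OBJECT_ID = "<service-principal-object-id>"
token = "<token-for-a-user-with-workspace-admin-rights>"

# Grant the service principal the Contributor role on the workspace,
# equivalent to the "Manage access" step done in the UI.
resp = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}/roleAssignments",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "principal": {"id": SP_OBJECT_ID, "type": "ServicePrincipal"},
        "role": "Contributor",
    },
)
resp.raise_for_status()
```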
Hi @qnguyen12,
It looks like you still need to add permissions to the service principal on the ADLS Gen 2 storage in Azure.
Are you using an external ADLS Gen 2 storage area outside of Fabric?
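(In case it helps anyone who is using external storage: granting the service principal access to an ADLS Gen 2 account can be scripted along these lines. This is a sketch assuming the azure-mgmt-authorization SDK; the GUID is the built-in Storage Blob Data Contributor role definition, and all other IDs are placeholders.)

```python
import uuid
from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

SUBSCRIPTION_ID = "<subscription-id>"
SP_OBJECT_ID = "<service-principal-object-id>"
STORAGE_SCOPE = (
    f"/subscriptions/{SUBSCRIPTION_ID}/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)
# Well-known role definition ID for "Storage Blob Data Contributor".
ROLE_DEF_ID = (
    f"/subscriptions/{SUBSCRIPTION_ID}/providers/Microsoft.Authorization"
    "/roleDefinitions/ba92f5b4-2d11-453d-a403-e96b0029c9fe"
)

# Assign the data-plane role on the storage account to the service principal.
client = AuthorizationManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
client.role_assignments.create(
    scope=STORAGE_SCOPE,
    role_assignment_name=str(uuid.uuid4()),
    parameters=RoleAssignmentCreateParameters(
        role_definition_id=ROLE_DEF_ID,
        principal_id=SP_OBJECT_ID,
        principal_type="ServicePrincipal",
    ),
)
```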
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.
Hi @tayloramy, I don't use an ADLS Gen 2 storage area outside of Fabric.
I tried to add Azure Data Lake permissions, but that failed, presumably because I don't use that service in Azure.
I added permissions on the Lakehouse for the service principal, but got new errors.