I have a pipeline with a Copy activity followed by a notebook that loads the copied files into Delta tables. A service account has the Contributor role on the workspace. When I log in manually as the service account, it can run the entire pipeline successfully. However, a scheduled overnight run fails with the error "User is not authorised for this artifact xxxx-xxxx...." - the artifact ID is that of the notebook.
The Copy activity successfully copies into the Files section of the Lakehouse, and the Load_to_Tables notebook is (meant to be) loading those files into tables in the same Lakehouse (in a specific schema).
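For context, the Load_to_Tables step does something along these lines (a minimal sketch only - the file format, Files path, schema and table names below are placeholders, not the real ones):

```python
# Minimal sketch of Load_to_Tables, run in a Fabric notebook attached to the same
# Lakehouse the Copy activity writes to. All names are placeholders.

# Read the files the Copy activity landed in the Files section of the Lakehouse
df = spark.read.format("parquet").load("Files/landing/orders/")

# Write them to a Delta table in a specific schema
# (assumes the Lakehouse has schema support enabled)
(
    df.write
      .format("delta")
      .mode("overwrite")
      .saveAsTable("staging.orders")
)
```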
Direct access permissions on the notebook are Read, Write, Execute (from the Contributor role).
The service account is used to set the pipeline schedule.
The notebook run log shows no instance of the notebook starting at the scheduled time - presumably because the account doesn't have the access needed to run it.
What else can I check/do to get the schedule running?
Hi @Kesahli ,
I have the following suggestions:
Make sure the service account has permissions everywhere they are needed - not just on the workspace, but also on the specific Lakehouse and Delta tables. The Storage Blob Data Contributor role may also be required. A quick way to confirm the account can actually execute the notebook is to trigger it on demand under that account's identity; see the sketch after these suggestions.
More detail on this point can be found at the link below:
Issue when running notebook from pipeline - Microsoft Q&A
Try reconfiguring the pipeline or re-adding the notebook activity; this sometimes resolves the problem.
If that doesn't work, try creating a new notebook and re-adding it to the pipeline.
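To confirm whether the service account is actually authorised to execute the notebook outside of the scheduler, you could trigger the notebook on demand through the Fabric REST API with a token acquired for that account. This is only a sketch under assumptions: the workspace and notebook IDs and the token are placeholders, and the jobType value is the one documented for notebooks - double-check it against the Job Scheduler API reference for your environment.

```python
import requests

# Placeholders - fill in your own values and a bearer token acquired for the service account
WORKSPACE_ID = "<workspace-guid>"
NOTEBOOK_ID = "<notebook-guid>"   # the artifact ID from the error message
TOKEN = "<access token for the service account>"

# Fabric Job Scheduler: run an item job on demand (jobType "RunNotebook" is assumed for notebooks)
url = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
    f"/items/{NOTEBOOK_ID}/jobs/instances?jobType=RunNotebook"
)

resp = requests.post(url, headers={"Authorization": f"Bearer {TOKEN}"})

# 202 Accepted means the account is authorised to run this artifact;
# a 401/403 response points at a permission gap on the notebook itself.
print(resp.status_code, resp.headers.get("Location"))
```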
Best Regards,
Yang
Community Support Team
If any post helps, please consider accepting it as the solution so that other members can find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!