Hi,
I set up (manually :( ) rules for deploying notebooks to another workspace using deployment pipelines. I have a few observations:
1. It is a UI-only interface; I cannot define the rules in code in my repo files 😞
2. It is really not UX friendly, as it asks for lakehouse IDs (everywhere else you can choose the workspace and lakehouse by name).
3. It works only for the default lakehouse. I often have two lakehouses connected, one for reading and one for writing. It should support both, right? (As a workaround I address the second lakehouse by explicit OneLake paths, see the sketch after this list.)
4. The rule disappears after deployment and I have to set it up 😞 again? Really? Are you sure? Is it a bug or a feature?
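To illustrate point 3, this is roughly how I work around the single-default-lakehouse limitation today: the notebook reads from one lakehouse and writes to another through explicit OneLake (ABFS) paths instead of relying on the default-lakehouse binding. A minimal sketch; the workspace, lakehouse, and table names are placeholders for my setup:

```python
# PySpark in a Fabric notebook: read from one lakehouse and write to another
# via explicit OneLake (ABFS) paths, so the code does not depend on which
# lakehouse happens to be attached as the default. All names are placeholders.
source_path = "abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/BronzeLakehouse.Lakehouse/Tables/sales"
target_path = "abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/SilverLakehouse.Lakehouse/Tables/sales_clean"

df = spark.read.format("delta").load(source_path)         # lakehouse used for reading
df_clean = df.dropDuplicates()                             # placeholder transformation
df_clean.write.format("delta").mode("overwrite").save(target_path)  # lakehouse used for writing
```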
Thanks
Hi, it's very simple...
Right now we have to keep the development workspace connected to main. As you know, developers cannot all work in the same workspace, each on their own branch; a workspace is always connected 1:1 to a Git branch 😄
So you have to create your own workspace to work on a new feature in your own branch. You can of course develop locally, but you still need a separate workspace with a lakehouse to write to. (Of course I could prepare a local environment for each developer, with Delta Lake and a populated catalog, but that is not what you expect from a SaaS data platform.)
OK, so I prepared code in my branch, with my workspace and lakehouse.
The next step is to merge it to main, which happens via a PR. But surprise...
my code overwrote all the lakehouse-to-notebook connections, because the binding is always part of the notebook file itself and not tied to the environment/workspace. 😞
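For anyone who has not looked at a notebook synced to Git: the lakehouse binding lives in the metadata header of the notebook source file itself, so the dev IDs travel with the code into main. Roughly what that header looks like in my repo (file name and keys as I see them; the GUIDs are placeholders):

```python
# Fabric notebook source (notebook-content.py as synced to Git) -- GUIDs are placeholders

# METADATA ********************

# META {
# META   "dependencies": {
# META     "lakehouse": {
# META       "default_lakehouse": "11111111-2222-3333-4444-555555555555",
# META       "default_lakehouse_name": "DevLakehouse",
# META       "default_lakehouse_workspace_id": "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"
# META     }
# META   }
# META }
```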
To be honest, I am skeptical about writing to Ideas and Feedback. Why? This is not a new idea; it is something that works fine in Azure Data Factory and works great in Azure DevOps Pipelines.
It is not a starship I am asking for; everything already exists in the earlier Azure services.
I really do not understand why the team decided to throw almost everything away and start developing from scratch. They do not need my ideas; they need to implement what already existed there.
Right now a Data pipeline in Fabric's Data Factory does not even have an editable JSON definition in the backend 😄 (a long-standing feature of Azure Data Factory). How are we supposed to implement DevOps here?
Maybe I will find time to copy this into Ideas. But right now I have to go and manually switch the notebooks in Dev back to the Dev lakehouses.
The same thread could be written for:
- built-in resources
- Spark environments
@Anonymous @LeeBenjamin sorry guys, this was a reaction to your posts.
And I have to add that I was wrong: I cannot overwrite the lakehouse connections in the DEV workspace 😄 because it is connected to the main branch, which is protected, and the lakehouse links are part of the notebook code. So I have to create a hotfix branch and overwrite the IDs there (see the sketch below).
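For what it is worth, the swap itself can be scripted rather than clicked through. A minimal sketch of what I run in the hotfix branch, assuming the notebook sources are the notebook-content.py files in the repo and that the dev and prod lakehouse GUIDs are known (both GUIDs here are placeholders):

```python
# Minimal sketch: in a hotfix branch, replace the dev lakehouse GUID with the
# prod one in every notebook-content.py file in the repo. GUIDs are placeholders.
from pathlib import Path

DEV_LAKEHOUSE_ID = "11111111-2222-3333-4444-555555555555"
PROD_LAKEHOUSE_ID = "99999999-8888-7777-6666-555555555555"

for nb in Path(".").rglob("notebook-content.py"):
    text = nb.read_text(encoding="utf-8")
    if DEV_LAKEHOUSE_ID in text:
        nb.write_text(text.replace(DEV_LAKEHOUSE_ID, PROD_LAKEHOUSE_ID), encoding="utf-8")
        print(f"updated {nb}")
```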
I am writing here on the community forum because I still hope I am doing something wrong and missing some bigger idea from the authors about how the dev cycle is supposed to look in Fabric.
Is it any better with Spark jobs?
Hi @evrise ,
Thanks for using Fabric Community.
1. Currently this is a UI interface and it cannot be done through code.
2. At present there is a limitation: to add a lakehouse, we need to provide its IDs. You can share feedback so we can improve this and make it more user friendly.
3. At present this is not supported.
4. Please share this via feedback (when you observe it) and the internal team will definitely look into the issue.
As @LeeBenjamin said,
we would appreciate it if you could share the feedback on our feedback channel, which is open for the user community to upvote and comment on. This allows our product teams to effectively prioritize your request against the existing feature backlog and gives insight into the potential impact of implementing the suggested feature.
Hope this helps. Please let me know if you have any further queries.
@evrise ,
Could you please elaborate on the disappearing part? This shouldn't happen.
As for the default lakehouse: it is true that you can manually switch which lakehouse is the default, based on your specific logic, but deployment rules do not support switching the default lakehouse based on defined logic. I suggest you open an idea on our community so the Notebook product team can examine the request for this feature (see the example below).
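If it helps in the meantime, one way to switch the default lakehouse at run time is the Spark session configuration magic at the top of the notebook. Something like the following, if I recall the syntax correctly (the name and IDs are placeholders):

```python
# %%configure must run at the start of the session (or with -f to restart it);
# it overrides the default lakehouse for the current session only.
%%configure
{
    "defaultLakehouse": {
        "name": "SilverLakehouse",
        "id": "11111111-2222-3333-4444-555555555555",
        "workspaceId": "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"
    }
}
```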
Thanks,
Lee