I am doing a direct ingestion from Event Hub to Eventhouse. When I apply a restricted view policy on the table that is used for ingestion, the ingestion stops. That completely makes sense. But even though I give the ingestion principal unrestrictedviewer access and database admin, the records are still not being ingested, which is weird. Has anyone faced this situation, and do you have suggestions or alternatives that I can leverage?
Hi @haku99,
Thank you for reaching out to the Microsoft Fabric Community Forum. Also, thanks to @datacoffee for the inputs on this thread. I understand how confusing it can be when ingestion suddenly stops after applying a restricted view policy.
What is happening here is that the restricted view policy does not just apply to end users, it also applies to the ingestion identity. Even if you have granted unrestrictedviewer or admin at the database level, the ingestion process usually runs under a managed identity/service principal, and that identity is still being blocked by the policy. To move forward, I’d suggest:
First, check which identity is being used for the ingestion from Event Hub into Eventhouse. Then, add that identity as an exception to the restricted view policy so it can continue writing. Along with that, make sure it has the ingestor role; unrestrictedviewer on its own will not allow writes.
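As a rough sketch, assuming the ingestion runs under an app principal, the two grants could look like this (the app ID, tenant, table, and database names below are placeholders, not values from this thread):

```kusto
// Placeholder principal, table, and database names - replace with your own.
// Grant the ingestion identity the right to write into the table.
.add table RawEvents ingestors ('aadapp=00000000-0000-0000-0000-000000000000;contoso.com')

// Exempt the same identity from the restricted view policy by adding it
// to the database-level unrestrictedviewers role.
.add database MyDatabase unrestrictedviewers ('aadapp=00000000-0000-0000-0000-000000000000;contoso.com')
```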
If you want to avoid conflicts entirely, a common pattern is to ingest data into a staging/raw table without restrictions and then apply restricted view policies only on the curated/output tables that end users will query.
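A minimal sketch of that staging pattern (the table names, schema, and transform function below are hypothetical, chosen only for illustration):

```kusto
// Hypothetical raw landing table - ingestion targets this; no restricted view policy on it.
.create table RawEvents (Timestamp: datetime, Payload: dynamic)

// Curated table that end users query.
.create table CuratedEvents (Timestamp: datetime, DeviceId: string, Value: real)

// Transformation function used by the update policy.
.create function TransformRaw() {
    RawEvents
    | project Timestamp,
              DeviceId = tostring(Payload.deviceId),
              Value = toreal(Payload.value)
}

// Wire the update policy so each ingestion into RawEvents also populates CuratedEvents.
.alter table CuratedEvents policy update
@'[{"IsEnabled": true, "Source": "RawEvents", "Query": "TransformRaw()", "IsTransactional": false}]'
```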
Refer these links for more information:
1. https://learn.microsoft.com/en-us/kusto/management/restricted-view-access-policy?view=microsoft-fabr...
2. https://learn.microsoft.com/en-us/kusto/management/update-policy?view=microsoft-fabric
3. https://learn.microsoft.com/en-us/fabric/real-time-intelligence/eventhouse
4. https://learn.microsoft.com/en-us/fabric/real-time-intelligence/manage-monitor-eventhouse
Hope this clears it up. Let us know if you have any doubts regarding this. We will be happy to help.
Thank you for using the Microsoft Fabric Community Forum.
Thanks for this great response @v-kpoloju-msft!!
Appreciate your thoughts. Yes, I tried with ingestors as well, but I am still not able to ingest data into it. Maybe I am doing something wrong in the process.
Currently I am finding the identity using:
.show ingestion failures
| top 100 by FailedOn desc
From the output I am finding the principal and providing access.
And to provide more context, my source is an Azure Event Hub, and my connection uses SAS, not managed identity.
My goal with the restricted view policy in this scenario: I am dumping all the data from Event Hub into this table and transferring it to other tables after transformations using update policies. I don't want users to access this data dump table, and I am exploring ways to achieve this.
Hi @haku99,
Thank you for the detailed follow-up, that helps a lot in understanding your setup. I see now that your ingestion is happening via Azure Event Hub with SAS authentication (instead of Managed Identity).
Since SAS tokens do not carry an identity that restricted view policies can exempt, ingestion will always fail in that setup. The cleanest solution is one of the following: remove the restricted view policy from your raw dump table and secure it with access control instead, or switch ingestion to a Managed Identity/Service Principal so you can explicitly exempt that identity from the restricted policy.
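For the first option, disabling the policy is a one-liner (the table name below is a placeholder); table visibility is then governed purely by role-based access control:

```kusto
// Disable the restricted view policy on the raw dump table (placeholder name),
// so SAS-based ingestion is no longer blocked by it.
.alter table RawDump policy restricted_view_access false
```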
This should help unblock you while keeping your staging table hidden from end users.
Hope this clears it up. Let us know if you have any doubts regarding this. We will be happy to help.
Thank you for using the Microsoft Fabric Community Forum.
Hi @haku99,
Just checking in to see if the issue has been resolved on your end. If the earlier suggestions helped, that's great to hear! And if you're still facing challenges, feel free to share more details; happy to assist further.
Thank you.
Hi @haku99,
Hope you had a chance to try out the solution shared earlier. Let us know if anything needs further clarification or if there's an update from your side; always here to help.
Thank you.
Apologies for my delayed response. I haven't tried that, as our org tends to use SAS over managed identities. So we have decided to move forward without the restricted view policy. I deeply appreciate your insights and assistance in this matter!
Hi @haku99,
Thank you for the follow-up question.
That sounds like a practical approach given your organization’s preference for SAS authentication. I’m glad the explanation helped you make an informed decision.
If you ever decide to explore Managed Identity in the future, it can offer smoother integration with Fabric's security layers, but your current setup should work just fine without the restricted view policy. Really appreciate you taking the time to share your outcome; it'll surely help others facing a similar situation.
Wishing you all the best with your Eventhouse setup, and don't hesitate to reach out again if any new questions come up.
Thank you for being part of the Microsoft Fabric Community Forum.
Hi @haku99,
Just wanted to follow up one last time. If the shared guidance worked for you, that's wonderful; hopefully it also helps others looking for similar answers. If there's anything else you'd like to explore or clarify, don't hesitate to reach out.
Thank you.
Thanks for this suggestion @v-kpoloju-msft. I will try to test it out and update how it went. Appreciate your assistance!
Hello
This is as expected. The database does not support security functions on a streaming-enabled table.
From the documentation we get this paragraph:
Row level security policy - Kusto | Microsoft Learn
To work around this, you need to remove the security function on the table and, if needed for RLS or other purposes, create a medallion architecture, as I try to show in this blog post:
Worlds Fastest Medallion Load
Appreciate your response @datacoffee .
I agree there are limitations, but I was trying to see whether this is a known limitation or something Microsoft is not aware of. My goal with the restricted view policy in this scenario: I am dumping all the data from Event Hub into this table and transferring it to other tables after transformations using update policies. I don't want users to access this data dump table, and I am exploring ways to achieve this.
Got it!
Then I would look into splitting the data: one database for ingestion and one for consumption. You can then define access at the table and database level on the consumption database and keep the streaming data in the ingestion database.
There are a lot of options to work around this; the split-database option is the easy fix here 😊
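A rough sketch of that split, with placeholder database and principal names: users are granted read access only on the consumption database and get nothing on the ingestion database:

```kusto
// Grant end users viewer rights only on the consumption database;
// the ingestion database stays invisible to them.
.add database ConsumptionDB viewers ('aadgroup=analysts@contoso.com')
```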
That's a great idea @datacoffee. But update policies don't work cross-database, I presume. And I have a constraint of not deduping data while maintaining real-time latency. What would be an optimal way to achieve that?
A materialized view is doable across databases.
Even though you have the requirement to add a summarize to the view, you can always group by all columns and do the aggregation on a dummy column.
It sounds "hacky", but it works like a charm.
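A same-database sketch of that dummy-aggregation trick (the view, table, and column names are placeholders): grouping by every real column while aggregating a constant keeps each distinct row intact:

```kusto
// Materialized view over the ingestion table; summarize is mandatory in a
// materialized view, so aggregate a dummy constant and group by all real columns.
.create materialized-view CuratedView on table RawEvents
{
    RawEvents
    | extend Dummy = 1
    | summarize take_any(Dummy) by Timestamp, DeviceId, Value
}
```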
That's an interesting option @datacoffee. I will try that out. Appreciate your insights!