Currently, OneLake Security Roles must be configured individually within each Lakehouse. For organizations managing multiple Lakehouses based on data domains, this results in significant manual effort and duplication. Additionally, maintaining both Active Directory (AD) groups and separate OneLake Security Roles adds complexity and increases the risk of misalignment.
Submitted by Muhiddin, 2 hours ago
We are aware that the Microsoft 365 Usage Analytics app is currently available only in English. It would be highly beneficial if this app could be localized into additional languages, such as Japanese, to better support users across different regions. Providing multilingual support would greatly improve usability and accessibility for non-English-speaking users.
Submitted by YevgenyM, 3 hours ago
Ensure that for each source table and target schema (preferably per database) there is at most one Lakehouse shortcut. This prevents duplication of shortcuts pointing to the same source, simplifies data governance, and reduces confusion in downstream processes.
Problem Statement: Currently, multiple shortcuts can be created for the same source table within the same schema, leading to redundancy and potential inconsistencies in data access and lineage tracking.
Proposed Solution: Implement a validation mechanism or enforcement rule that checks for existing shortcuts before creating a new one. If a shortcut already exists for the given source table and target schema (or database), the system should either block the creation of a duplicate shortcut or prompt the user to reuse the existing one (see the sketch below).
Benefits: Improves data consistency and governance. Reduces storage and maintenance overhead. Enhances clarity for developers and analysts working with Lakehouse shortcuts.
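A minimal sketch of the kind of pre-flight check this idea describes, in Python against the OneLake Shortcuts REST API; the endpoint paths and response shape are assumptions based on the public API, and token acquisition is left to the caller:

import requests

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def shortcut_exists(token, workspace_id, item_id, target):
    # List existing shortcuts on the Lakehouse and compare their OneLake
    # targets against the proposed source (itemId + path).
    url = f"{FABRIC_API}/workspaces/{workspace_id}/items/{item_id}/shortcuts"
    resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    for shortcut in resp.json().get("value", []):
        one_lake = shortcut.get("target", {}).get("oneLake", {})
        if one_lake.get("itemId") == target["itemId"] and one_lake.get("path") == target["path"]:
            return True
    return False

def create_shortcut_if_absent(token, workspace_id, item_id, name, path, target):
    # Block (or here, simply skip) the creation of a duplicate shortcut
    # and point the user at the existing one instead.
    if shortcut_exists(token, workspace_id, item_id, target):
        print(f"A shortcut for {target['path']} already exists; reuse it instead.")
        return
    url = f"{FABRIC_API}/workspaces/{workspace_id}/items/{item_id}/shortcuts"
    body = {"name": name, "path": path, "target": {"oneLake": target}}
    requests.post(url, json=body, headers={"Authorization": f"Bearer {token}"}).raise_for_status()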
Submitted by NandhaKumarPBI, 5 hours ago
When we publish a semantic model from Power BI Desktop to the service, it just says "Publishing" and we have to keep an eye on whether it has been published or not. Instead, a progress bar showing how far the publish has progressed, based on the size of the file (MBs uploaded out of the total MBs of the semantic model), would help users understand how much time the upload to the server will take. Thanks.
This is driving me absolutely crazy and should be the simplest fix, surely! I'm replicating a lot of what used to be very manual PowerPoint slides into a Power BI format, and the last piece of the puzzle that I need for this to be totally seamless for the rest of my team is simply just for the text boxes to function like you would expect a Microsoft text box to function in 2025 so they can write commentary as needed. It doesn't matter if I write the text first and then change the size, or change the size first, or start with bullet points selected or change to bullets after, no matter what I do, the bullet points remain a size 10. Same with numbers. How on earth was this shipped as a functional text box if it can't do the most basic thing that I'm pretty sure has been standard in Microsoft programs for like 30 years at this point? And please don't come at me for Power BI's function not needing to focus on text boxes because "the data should speak for itself" (as I've seen in other threads that have brought this up in the past) - I don't disagree, but I also don't think it's out of line to expect that if a text box is given as an option, it should work as intended.
Today, only OAuth 2.0 (user account) is supported as an authentication option when setting up an Azure Key Vault reference.
Please add support for:
Workspace Identity
Service principal
User assigned managed identity
It would be great if Power BI could automatically generate a color theme for reports based on an uploaded image, website URL, or logo. For example:
Upload a company logo → Power BI generates a matching palette for all visuals
Input a website URL → Extracts main colors and applies a cohesive theme
Benefits:
- Saves time manually choosing colors
- Ensures brand consistency in reports
- Helps users create visually appealing reports effortlessly
This feature could be combined with existing themes and allow minor adjustments for personalization.
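A minimal sketch of the logo-to-theme idea in Python, assuming Pillow is available and that a report theme JSON with just "name" and "dataColors" is enough to illustrate the point; the palette extraction here is a naive most-frequent-colors pass, not a full clustering approach, and "company_logo.png" is a hypothetical input file:

import json
from collections import Counter
from PIL import Image

def dominant_hex_colors(image_path, n=8):
    # Downsample the logo and return its n most frequent colors as hex strings.
    img = Image.open(image_path).convert("RGB")
    img.thumbnail((128, 128))
    counts = Counter(img.getdata())
    return ["#{:02X}{:02X}{:02X}".format(*rgb) for rgb, _ in counts.most_common(n)]

def build_theme(image_path, theme_name="Brand theme"):
    # Emit a basic Power BI report theme JSON built from the extracted palette.
    theme = {"name": theme_name, "dataColors": dominant_hex_colors(image_path)}
    return json.dumps(theme, indent=2)

print(build_theme("company_logo.png"))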
The Reactor DataDays webpage https://learn.microsoft.com/en-us/collections/2g42i3mww7z36g?wt.mc_id=1reg_26407_webpage_reactor&source=docs has a bad link to https://aka.ms/fabricdataydays when it should be https://aka.ms/fabricdatadays
I am unable to use field parameters in Power BI Report Server. They work normally in Power BI Desktop, but after publishing the report to the report server, the field parameter slicer has no effect.
One of my clients is using Key Vault to retrieve secrets, which is a good thing. They also want to move to the integrated options, removing the need for a private link that takes time to become available and slows down development and data processing. The main reason they can't do this is that the Key Vault integration doesn't support RBAC yet; it only supports access-policy (ACL) authorization, which the good people of Azure advise against. Please enable RBAC support for this feature, using the Fabric workspace identity 🙂
Hi Team, Currently, Power BI allows assigning a single bookmark to a button, but there is no option to link a button directly to an entire bookmark group. I have created multiple bookmark groups, each containing several related bookmarks. It would be extremely helpful if a single button could trigger all bookmarks within a group, either by cycling through them or by executing the entire group sequence automatically.
Why This Feature Is Needed:
Reduces the need to create multiple buttons for related views
Makes report navigation smoother and more user-friendly
Simplifies UX when handling complex bookmark-driven interactions
Helps developers maintain cleaner and more manageable report layouts
Suggested Feature Enhancement - please consider adding:
a. "Assign Bookmark Group" - an option to choose a bookmark group instead of a single bookmark
b. "Trigger Mode" - the ability to run all bookmarks sequentially or cycle through them
This enhancement would be highly valuable for Power BI developers who rely on bookmarks for storytelling, navigation, and advanced UI customization. Thanks and Regards, Kanchan Giri
Could there be a predesigned naming template definition which can be attached to a capacity or a workspace? A configurable template at the capacity or workspace level would ensure better adherence to naming standards for all Fabric items that get created or authored. Capacity admins or workspace admins should have the authority to create, modify, and attach naming templates.
Currently, Microsoft Fabric does not have a window to monitor key metrics from Microsoft Purview. You need to access the Microsoft Purview hub, which is also very basic. Could the Purview hub be made more functional and more integrated with Fabric, at least for workspace admins? The idea is not to get into dedicated Purview activities via the Purview portal, but to be able to see some integrated Purview monitoring right on my Fabric page.
Fabric home page - could this be designed with "my favorites", "my last 10 Fabric items", "my last 5 Fabric workspaces", "pipeline status", "report refresh status", "failures", etc.? The home page should not be a page to skip - it should be highly relevant for a developer, designer, or workspace admin, with the sections and functions they need to do their work. Quick access is available at the moment, but even that requires scrolling if you are using a small screen.
The admin REST API for List Item Access Details is mis-scoped to Tenant.Read.All. This prevents a capacity administrator or workspace administrator who is not a tenant-level administrator from using this API to govern access to items under their area of control, and there is no alternative API scoped to the workspace level. This API requires you to pass in a workspace ID, so it should simply fail if you attempt to read data from a workspace you don't have access to. On the whole, Fabric needs management APIs that are scoped to the workspace and that can also work at the capacity and/or tenant level if appropriate rights are granted.
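For illustration, a rough sketch of the call in question in Python; the endpoint path and the Tenant.Read.All scope requirement are stated here as assumptions based on the public admin API documentation, and the IDs are hypothetical:

import requests

# Hypothetical IDs; the admin path for "List Item Access Details" is assumed here.
WORKSPACE_ID = "00000000-0000-0000-0000-000000000001"
ITEM_ID = "00000000-0000-0000-0000-000000000002"
token = "<access token obtained WITHOUT Tenant.Read.All>"

url = f"https://api.fabric.microsoft.com/v1/admin/workspaces/{WORKSPACE_ID}/items/{ITEM_ID}/users"
resp = requests.get(url, headers={"Authorization": f"Bearer {token}"})

# Per the idea above, this fails today for a workspace or capacity admin who is
# not a tenant admin, even though the workspace ID identifies a workspace they govern.
print(resp.status_code)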
When publishing an Import-mode semantic model built on the Lakehouse SQL analytics endpoint, the dataset fails to refresh with the error: "This semantic model uses a default data connection without explicit credentials." The Data source credentials pane in the Service is greyed out, the Service provides no way to enter credentials, and the UI gives no instructions for fixing this.
After hours of debugging, the only working solution is:
1. Go to Manage connections and gateways
2. Create a Cloud connection
3. Set Connection type = SQL Server
4. Manually enter the Lakehouse SQL endpoint server + database
5. Authenticate with OAuth
6. Rebind the semantic model to this connection
This is: completely undocumented, counterintuitive (why SQL Server instead of Lakehouse or Fabric SQL?), not discoverable through the UI, impossible for a normal user to figure out, and a huge blocker for Import mode adoption.
What's worse: users naturally try Fabric SQL database, Lakehouse, or Azure SQL - but these do not work, and they never show the required server/database fields. Only SQL Server works, even though the endpoint is not SQL Server.
Why this matters: the entire reason I used Import mode was because Direct Lake cannot be shared in a Power BI App unless every user is added to the workspace. Import mode should be the "safe/compatible" option - but in this case it simply breaks unless you know a hidden workaround.
Ask / Proposal: Fix the refresh pipeline so Import models using Lakehouse SQL endpoints do not depend on a hidden SQL Server cloud connection, OR update the UX so that creating a Lakehouse SQL endpoint connection is intuitive: expose server/database fields for Lakehouse connections, provide a visible "Fix this connection" button, remove or explain the "default data connection" state, and add proper documentation explaining how to configure cloud connections for Lakehouse SQL endpoints.
This is a major UX gap that makes Fabric far more confusing and error-prone than it needs to be.
Direct Lake mode currently forces every report consumer to be added directly to the workspace in order to view reports that use Lakehouse tables. This defeats the entire purpose of Power BI Apps, audience views, role-based access, and the separation of development from consumption.
Because of this limitation:
I cannot publish a Power BI App containing Direct Lake reports unless every single user is added to the workspace.
If I use App Audiences, users who are not workspace members get errors and cannot see the Direct Lake content.
This forces me to either give dozens or hundreds of users workspace-level access (which is inappropriate and messy), or avoid Direct Lake entirely even though it is the recommended, optimized Fabric experience.
This effectively blocks enterprise adoption of Direct Lake for any scenario where not all users belong in the dev workspace, audience segmentation is required, RLS or scoped access is required, or workspaces are used as development/staging areas.
Ask / Proposal: Allow Direct Lake models and reports to be consumed through Power BI Apps without requiring workspace membership - the same way Import and DirectQuery models work today.
This is absolutely essential for data governance, security boundaries, app-based distribution, enterprise scalability, and real-world deployment patterns. Direct Lake becomes unusable for most organizations without this.
As part of enabling Change Data Feed (CDF) on Fabric mirror databases, please consider exposing the Delta Lake table_changes table-valued function so that it can be invoked directly through the Lakehouse T-SQL endpoint. This functionality would allow users to query incremental changes using familiar T-SQL syntax, improving integration with downstream systems and simplifying change tracking workflows.
Example Usage:
-- Retrieve changes since version 2
SELECT * FROM table_changes('myschema.mytable', 2);
-- Retrieve changes since a specific timestamp
SELECT * FROM table_changes('myschema.mytable', '2025-11-14T12:00:00.000+0000')
ORDER BY _commit_version;
Why this would be helpful: Enables consistent access to CDF data without requiring Spark or Python. Supports real-time analytics and ETL scenarios using standard SQL tools. Aligns with Fabric's goal of providing unified query experiences across engines.
-> Fabric-Ideas/Enable-Change-Data-Feed-CDF-on-a-Mirror-Database/idi-p/4500759
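For comparison, a minimal sketch of how these same changes are read today, which requires a Spark session (for example in a Fabric notebook) rather than the T-SQL endpoint; the table name, version, and timestamp mirror the illustrative values above and assume CDF is already enabled on the table:

from pyspark.sql import SparkSession

# In a Fabric notebook a `spark` session is already provided; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

# Changes since version 2, via the Delta change data feed reader options.
changes_by_version = (
    spark.read.format("delta")
    .option("readChangeFeed", "true")
    .option("startingVersion", 2)
    .table("myschema.mytable")
)

# Changes since a specific timestamp.
changes_by_timestamp = (
    spark.read.format("delta")
    .option("readChangeFeed", "true")
    .option("startingTimestamp", "2025-11-14T12:00:00.000+0000")
    .table("myschema.mytable")
)

changes_by_version.orderBy("_commit_version").show()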
When drilling through, I want the new page to open in a new tab in my web browser. In the current solution the user has to keep pressing "back" to go back to the previous page. This is bad for UX.
Hi Community, When creating a new workspace item (report, dataset, dataflow, etc.), users often don’t know whether it requires Pro or Premium/Fabric. The service only informs you after you attempt to create the item, prompting you to upgrade. This wastes time and causes confusion. I suggest adding a license type label (similar to connector types in Power Apps) next to each item in the “New” menu, as illustrated in the capture below. It would also be helpful to include a drop‑down list to make searching easier. 💡Do you find it helpful? 👍 A vote would be appreciated 🟩 Follow me on LinkedIn