Hi Team,

Currently, Power BI allows assigning a single bookmark to a button, but there is no option to link a button directly to an entire bookmark group. I have created multiple bookmark groups, each containing several related bookmarks. It would be extremely helpful if a single button could trigger all bookmarks within a group — either by cycling through them or by executing the entire group sequence automatically.

Why This Feature Is Needed
- Reduces the need to create multiple buttons for related views
- Makes report navigation smoother and more user-friendly
- Simplifies UX when handling complex bookmark-driven interactions
- Helps developers maintain cleaner and more manageable report layouts

Suggested Feature Enhancement
Please consider adding:
a. “Assign Bookmark Group” — An option to choose a bookmark group instead of a single bookmark
b. “Trigger Mode” — Ability to run all bookmarks sequentially or cycle through them

This enhancement would be highly valuable for Power BI developers who rely on bookmarks for storytelling, navigation, and advanced UI customization.

Thanks and Regards,
Kanchan Giri
Submitted
Submitted by AnshumanB
5 hours ago
Could there be a predefined naming template definition that can be attached to a capacity or a workspace? A configurable template at the capacity or workspace level would ensure better adherence to naming standards for all Fabric items that get created or authored there. Capacity admins or workspace admins should have the authority to create, modify, and attach naming templates.
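Purely as an illustration of what such a template definition might look like (the pattern, environment prefixes, and item names below are invented for this sketch; nothing like this exists in Fabric today):

import re

# Invented example of a workspace-level naming template: <env>_<itemtype>_<name>.
NAMING_TEMPLATE = r"^(dev|tst|prd)_(lh|wh|nb|pl|sm|rpt)_[a-z0-9_]+$"

def complies(item_name: str) -> bool:
    # True if the Fabric item name matches the attached template.
    return re.fullmatch(NAMING_TEMPLATE, item_name) is not None

# Names an admin might want accepted or rejected at item-creation time.
for name in ["prd_lh_sales", "Sales Lakehouse (final v2)"]:
    print(name, "->", "ok" if complies(name) else "violates template")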
Submitted
Submitted by AnshumanB
5 hours ago
Currently, Microsoft Fabric does not have a window to monitor key metrics from Microsoft Purview. You need to access the Microsoft Purview hub, which is also very basic. Could the Purview hub be made more functional and more integrated with Fabric, at least for workspace admins? The idea is not to get into dedicated Purview activities via the Purview portal; rather, I should be able to see an integrated Purview monitor right on my Fabric page.
Submitted
Submitted by AnshumanB
6 hours ago
Fabric home page - could this be designed with sections such as "my favorites", "my last 10 Fabric items", "my last 5 Fabric workspaces", "pipeline status", "report refresh status", "failures", etc.? The home page should not be a page you skip - it should be genuinely relevant for a developer, designer, or workspace admin, with the sections and functions each of them needs. Quick access is available at the moment, but even then you need to scroll if you are using a small screen.
Submitted
Submitted by timothyeharris
10 hours ago
The admin REST API for List Item Access Details is mis-scoped to Tenant.Read.All. This prevents a capacity administrator or workspace administrator who is not a tenant-level administrator from using this API to govern access to items under their area of control, and there is no alternative API scoped to the workspace level. This API requires you to pass in a workspace ID, so it could simply fail if you attempt to read data from a workspace you don't have access to. On the whole, Fabric needs management APIs that are scoped to the workspace and that also work at the capacity and/or tenant level when the appropriate rights are granted.
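For context, a rough sketch of how this API is called today; the endpoint path and the "accessDetails" response property follow the Fabric admin REST pattern but should be treated as assumptions to verify against the documentation, and the IDs and token are placeholders:

import requests

# Hypothetical placeholders: a tenant-admin token (Tenant.Read.All) is required today,
# which is exactly the limitation described above.
WORKSPACE_ID = "<workspace-guid>"
ITEM_ID = "<item-guid>"
TOKEN = "<aad-access-token>"

# Admin "List Item Access Details" call; the route is assumed, not confirmed here.
url = (
    "https://api.fabric.microsoft.com/v1/admin/workspaces/"
    f"{WORKSPACE_ID}/items/{ITEM_ID}/users"
)

resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()

# Each entry pairs a principal with its permissions on the item (assumed shape).
for detail in resp.json().get("accessDetails", []):
    print(detail)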
Submitted
Submitted by barztown
12 hours ago
When publishing an Import-mode semantic model built on the Lakehouse SQL Analytics endpoint, the dataset fails to refresh with the error: “This semantic model uses a default data connection without explicit credentials.” The Data source credentials pane in the Service is greyed out. The Service provides no way to enter credentials. The UI gives no instructions for fixing this.

After hours of debugging, the only working solution is:
1. Go to Manage connections and gateways
2. Create a Cloud connection
3. Set Connection type = SQL Server
4. Manually enter the Lakehouse SQL endpoint server + database
5. Authenticate with OAuth
6. Rebind the semantic model to this connection

This is:
- Completely undocumented
- Counterintuitive (why SQL Server instead of Lakehouse or Fabric SQL?)
- Not discoverable through the UI
- Impossible for a normal user to figure out
- A huge blocker for Import mode adoption

What’s worse: users naturally try Fabric SQL database, Lakehouse, or Azure SQL — but these do not work, and they never show the required server/database fields. Only SQL Server works, even though the endpoint is not SQL Server.

Why this matters: the entire reason I used Import mode was because Direct Lake cannot be shared in a Power BI App unless every user is added to the workspace. Import mode should be the “safe/compatible” option — but in this case it simply breaks unless you know a hidden workaround.

Ask / Proposal:
- Fix the refresh pipeline so Import models using Lakehouse SQL endpoints do not depend on a hidden SQL Server cloud connection, OR
- Update the UX so that creating a Lakehouse SQL endpoint connection is intuitive:
  - Expose server/database fields for Lakehouse connections
  - Provide a visible “Fix this connection” button
  - Remove or explain the “default data connection” state
  - Add proper documentation explaining how to configure cloud connections for Lakehouse SQL endpoints

This is a major UX gap that makes Fabric far more confusing and error-prone than it needs to be.
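As a scripted illustration of the final rebinding step only (not an official fix), here is a rough sketch using the Power BI REST "Bind To Gateway" call, which is the API-level way a dataset gets pointed at a shareable connection; all IDs are placeholders, and whether your cloud connection surfaces gateway/datasource object IDs this way is an assumption to verify in Manage connections and gateways:

import requests

# Hypothetical placeholders.
GROUP_ID = "<workspace-guid>"
DATASET_ID = "<semantic-model-guid>"
GATEWAY_OBJECT_ID = "<cloud-connection-gateway-guid>"
DATASOURCE_OBJECT_ID = "<cloud-connection-datasource-guid>"
TOKEN = "<aad-access-token>"

# Power BI REST API: Datasets - Bind To Gateway. Rebinds the semantic model to the
# manually created SQL Server-type cloud connection described in the steps above.
url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/Default.BindToGateway"
)
body = {
    "gatewayObjectId": GATEWAY_OBJECT_ID,
    "datasourceObjectIds": [DATASOURCE_OBJECT_ID],
}

resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
print("Rebind returned HTTP", resp.status_code)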
Submitted
Submitted by barztown
12 hours ago
Direct Lake mode currently forces every report consumer to be added directly to the workspace in order to view reports that use Lakehouse tables. This defeats the entire purpose of:
- Power BI Apps
- Audience views
- Role-based access
- Separation of development from consumption

Because of this limitation:
- I cannot publish a Power BI App containing Direct Lake reports unless every single user is added to the workspace.
- If I use App Audiences, users who are not workspace members get errors and cannot see the Direct Lake content.

This forces me to either:
- Give dozens/hundreds of users workspace-level access (which is inappropriate and messy), OR
- Avoid Direct Lake entirely even though it is the recommended, optimized Fabric experience.

This effectively blocks enterprise adoption of Direct Lake for any scenario where:
- Not all users belong in the dev workspace
- Audience segmentation is required
- RLS or scoped access is required
- Workspaces are used as development/staging areas

Ask / Proposal: Allow Direct Lake models and reports to be consumed through Power BI Apps without requiring workspace membership — the same way Import and DirectQuery models work today.

This is absolutely essential for:
- Data governance
- Security boundaries
- App-based distribution
- Enterprise scalability
- Real-world deployment patterns

Direct Lake becomes unusable for most organizations without this.
Submitted
Submitted by dxbennett
12 hours ago
As part of enabling Change Data Feed (CDF) on Fabric mirror databases, please consider exposing the Delta Lake table_changes table-valued function so that it can be invoked directly through the Lakehouse T-SQL endpoint. This functionality would allow users to query incremental changes using familiar T-SQL syntax, improving integration with downstream systems and simplifying change tracking workflows.

Example Usage:

-- Retrieve changes since version 2
SELECT * FROM table_changes('myschema.mytable', 2);

-- Retrieve changes since a specific timestamp
SELECT * FROM table_changes('myschema.mytable', '2025-11-14T12:00:00.000+0000')
ORDER BY _commit_version;

Why this would be helpful:
- Enables consistent access to CDF data without requiring Spark or Python.
- Supports real-time analytics and ETL scenarios using standard SQL tools.
- Aligns with Fabric’s goal of providing unified query experiences across engines.

-> Fabric-Ideas/Enable-Change-Data-Feed-CDF-on-a-Mirror-Database/idi-p/4500759
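For comparison, reading the same change feed today requires a Spark session; a minimal PySpark sketch, assuming CDF is already enabled on the table and that the schema/table name is just a placeholder:

# Spark-based path the idea above wants to make unnecessary for T-SQL users.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

changes = (
    spark.read.format("delta")
    .option("readChangeFeed", "true")
    .option("startingVersion", 2)   # or .option("startingTimestamp", "2025-11-14T12:00:00.000Z")
    .table("myschema.mytable")
)

# _change_type, _commit_version and _commit_timestamp are the CDF metadata columns.
changes.orderBy("_commit_version").show()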
Submitted
Submitted by tanjil
12 hours ago
When drilling through, I want the new page to open in a new tab in my web browser. With the current behavior, the user has to keep pressing "Back" to return to the previous page, which is bad UX.
Submitted
Submitted by DataVitalizer
14 hours ago
Hi Community,

When creating a new workspace item (report, dataset, dataflow, etc.), users often don’t know whether it requires Pro or Premium/Fabric. The service only informs you after you attempt to create the item, prompting you to upgrade. This wastes time and causes confusion. I suggest adding a license type label (similar to connector types in Power Apps) next to each item in the “New” menu, as illustrated in the capture below. It would also be helpful to include a drop‑down list to make searching easier.

💡 Do you find it helpful? 👍 A vote would be appreciated
🟩 Follow me on LinkedIn
Submitted
Submitted by jmoko
15 hours ago
Please allow us to modify the email body that is sent from Power BI. Our company would like to be able to add our own logo instead of Microsoft's and remove the Microsoft content ("Your opinion matters", etc.).
Submitted
Submitted by pmscorca
15 hours ago
Hi, I'm trying to use a copy job to get data from a Dynamics 365 Business Central solution via the REST API connector. As the Base URL I can indicate the OData link of the BC API service, but for the Relative URL it isn't possible to specify a table name parameter as dynamic content, so I have to create one copy job for each BC table to read instead of a single copy job. I think it would be very important to be able to specify dynamic content for the Relative URL. In general, it would be very useful to have a fully parameterizable copy job. Thanks
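To illustrate the kind of parameterization being requested, here is a rough Python sketch of the same pattern done by hand against a generic OData service; the base URL, entity names, and auth token are hypothetical placeholders, and a parameterizable copy job would express this loop declaratively instead:

import requests

# Hypothetical placeholders for a Business Central-style OData service.
BASE_URL = "https://api.example.com/odata/v4"   # the copy job's Base URL
TOKEN = "<oauth-access-token>"

# Today each of these needs its own copy job; the ask is to drive the
# Relative URL from a parameter such as this list.
TABLES = ["customers", "salesOrders", "items"]

for table in TABLES:
    relative_url = table                        # would be the dynamic Relative URL
    resp = requests.get(
        f"{BASE_URL}/{relative_url}",
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    resp.raise_for_status()
    rows = resp.json().get("value", [])         # OData wraps result sets in "value"
    print(f"{table}: {len(rows)} rows")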
Submitted
Submitted by BITomS
16 hours ago
I have a scenario where I have a lot of categories for a particular dimension in my model, and the breakdown of all of these needs to go in a donut/pie chart. I can't find a simple way of setting a threshold (say 3%) so that anything making up a smaller proportion than this gets grouped into a single 'Other' category. This would improve end-user visibility: we still want to include what this other % is, but we don't want all the segments, as the chart gets very busy!
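To make the requested behavior concrete, a small pandas sketch of the grouping logic (the column names, values, and 3% threshold are illustrative; inside Power BI this currently needs a DAX or Power Query equivalent, which is the manual work the idea wants to avoid):

import pandas as pd

# Illustrative data: a few dominant categories plus several small ones.
df = pd.DataFrame({
    "category": ["A", "B", "C", "D", "E", "F"],
    "sales": [500, 300, 120, 30, 25, 25],
})

threshold = 0.03  # group anything under 3% of the total into 'Other'

share = df["sales"] / df["sales"].sum()
df["plot_category"] = df["category"].where(share >= threshold, "Other")

# This is the series a pie/donut visual would consume: large slices kept,
# small ones collapsed into a single 'Other' slice.
plot_data = df.groupby("plot_category", as_index=False)["sales"].sum()
print(plot_data)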
Submitted
Submitted by Nat-J
16 hours ago
I would like to request adding pattern-based formatting to Power BI visuals to improve accessibility for users with color-vision deficiencies. Many visuals and conditional formatting options rely mainly on color, which can be difficult for people who cannot distinguish certain color differences. I suggest adding an option to use non-color indicators such as simple patterns or textures. This would offer an additional cue that does not depend on color and would make visuals more inclusive.
Submitted
Submitted by BrettWist
17 hours ago
It would be helpful, when creating a schedule for a pipeline, to be able to provide parameter values for the pipeline execution under that schedule. Being able to have multiple schedules for the same pipeline with different parameter values, depending on the purpose of each schedule, would make pipelines more functional and versatile.
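For reference, the workaround available today is to trigger the run from an external scheduler and pass parameters through the Fabric on-demand job API; a rough sketch below (the endpoint shape and the executionData payload follow the documented job-scheduler pattern but should be treated as assumptions to verify; IDs and parameter names are placeholders):

import requests

# Hypothetical placeholders.
WORKSPACE_ID = "<workspace-guid>"
PIPELINE_ITEM_ID = "<pipeline-item-guid>"
TOKEN = "<aad-access-token>"

# Fabric on-demand item job run; parameter values ride along in executionData.
# A native "schedule with parameters" would remove the need for this.
url = (
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
    f"/items/{PIPELINE_ITEM_ID}/jobs/instances?jobType=Pipeline"
)
body = {
    "executionData": {
        "parameters": {
            "loadDate": "2025-11-14",   # example pipeline parameters
            "fullReload": False,
        }
    }
}

resp = requests.post(url, json=body, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()
# The service responds 202 Accepted; the job instance URL is in the Location header.
print(resp.status_code, resp.headers.get("Location"))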
Submitted
Submitted by BrettWist
17 hours ago
The Deployment Pipeline Compare function is very useful for highlighting differences between objects in the various deployment stages. But it would be useful to have an option not to flag objects as different when the only difference is a schedule. It is common for an object to have a schedule in the Production stage but not in others, or to have different schedules in different stages. If those are the only differences, an option to not mark those objects as different across stages would be nice.
Submitted
Submitted by Damyana92
17 hours ago
The data type detection in the Power Query Editor for dataflows, and the data type transformation in Power BI Desktop, don't account for all the data even when column profiling is set to the entire data set. This is problematic: it introduces refresh errors because the detection doesn't use the distinct column values to determine what the data type should be.
Submitted
Submitted by ForeverAlone
18 hours ago
I can't be the only one who can't stand using this thing with all the panes on the right. Panes are always on the left: Visual Studio, SQL, File Explorer, Windows Settings, just to name a few I use each day. Let us rearrange this stuff; it's insanely annoying. Consider my OCD triggered.
Submitted
Submitted by Muhiddin
18 hours ago
When a user accesses a report via a shared URL (using the Share button), they are unable to create a usage metrics report, even if they hold an Admin, Member, or Contributor role in the workspace. Please check whether the "Usage metrics report" option is available in this case, and make sure that users with an Admin, Member, or Contributor role can create a usage metrics report regardless of how they open the report.
Submitted
Submitted by bogdanfontana
20 hours ago
I am using multiple Excel and CSV files as data sources, and when I try to refresh my report, I get the following error message, which doesn't specify which file it refers to:

Data source error: DataFormat.Error: We were unable to load this Excel file because we couldn't understand its format. File contains corrupted data.. Microsoft.Data.Mashup.ErrorCode = 10942. ;We were unable to load this Excel file because we couldn't understand its format. File contains corrupted data.. The exception was raised by the IDbCommand interface.

After checking all the files, I found that they all have regular extensions (.csv or .xlsx), which doesn't match the error description. It would be helpful if the error message could specify which file is causing the problem.
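Until the error message names the file, a quick local check can narrow it down; a minimal sketch, assuming the same workbooks and CSVs sit in one local folder (the path is a placeholder, and reading .xlsx files requires the openpyxl package):

from pathlib import Path
import pandas as pd

# Placeholder folder containing the same .xlsx/.csv files the report loads.
folder = Path("C:/data/report_sources")

# Try to open each file; whichever raises an error is the likely culprit
# hidden behind the generic refresh message.
for path in sorted(folder.glob("*")):
    if path.suffix.lower() not in {".xlsx", ".csv"}:
        continue
    try:
        if path.suffix.lower() == ".xlsx":
            pd.read_excel(path, nrows=5)
        else:
            pd.read_csv(path, nrows=5)
    except Exception as exc:
        print(f"FAILED: {path.name} -> {exc}")
    else:
        print(f"ok: {path.name}")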