Share your ideas and vote for future features
Submitted by litan1106 · 4 hours ago · 0 votes
Please integrate an Apache Iceberg Catalog into Microsoft Fabric. Major platforms are adopting the open Iceberg Catalog as the open lakehouse standard. It is future-proof when you can perform cross-engine read and write interoperability via the Iceberg Catalog format and REST API. https://www.snowflake.com/en/blog/introducing-polaris-catalog/
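To make the request concrete, here is a minimal PySpark sketch of what cross-engine access through an Iceberg REST catalog looks like on platforms that already expose one. The catalog name `lake`, the endpoint URL, and the table identifier are placeholders; no such Fabric endpoint exists today, which is exactly the ask.

```python
# Hypothetical sketch: reading a table through an Iceberg REST catalog from
# Spark. Requires the iceberg-spark-runtime package on the cluster.
# Endpoint and table names are placeholders, not an existing Fabric API.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-rest-read")
    # Iceberg's standard Spark catalog plugin, pointed at a REST catalog.
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.type", "rest")
    .config("spark.sql.catalog.lake.uri", "https://<catalog-endpoint>/api/catalog")  # placeholder
    .getOrCreate()
)

# Any REST-catalog-aware engine (Spark, Trino, Snowflake, ...) can read and
# write the same table -- the interoperability this idea requests for Fabric.
df = spark.table("lake.sales.orders")  # placeholder namespace.table
df.show()
```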
Submitted by jovanpop-msft · 5 hours ago · 0 votes
The BULK INSERT statement in SQL Server/Azure SQL has a DATA_SOURCE option that enables importing files from paths relative to the DATA_SOURCE location, and also adds custom authentication to the data source. Without the DATA_SOURCE option, BULK INSERT is limited to Entra ID authentication only, so SQL Server/Azure SQL BULK INSERT statements that use SAS, SPN, or Managed Identity for authentication cannot be migrated to Fabric DW. We need a DATA_SOURCE option to migrate these kinds of statements. See: BULK INSERT (Transact-SQL) - SQL Server | Microsoft Learn
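For context, a sketch of the Azure SQL pattern that currently has no Fabric DW equivalent, run here through pyodbc. The connection string, the data source name `MyAzureBlobStorage`, the container URL, and the credential name are illustrative placeholders; the T-SQL itself follows the documented BULK INSERT + EXTERNAL DATA SOURCE pattern.

```python
# Illustrative only: the SQL Server/Azure SQL pattern the idea wants to
# migrate. Connection details and object names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=<server>;DATABASE=<db>;"
    "Authentication=ActiveDirectoryInteractive;"
)
cur = conn.cursor()

# An EXTERNAL DATA SOURCE bound to a credential (here MySasCredential, a
# DATABASE SCOPED CREDENTIAL created beforehand with a SAS token) lets
# BULK INSERT authenticate without the caller's Entra ID identity.
cur.execute("""
CREATE EXTERNAL DATA SOURCE MyAzureBlobStorage
WITH (TYPE = BLOB_STORAGE,
      LOCATION = 'https://<account>.blob.core.windows.net/<container>',
      CREDENTIAL = MySasCredential);
""")

# File paths are then resolved relative to the data source's LOCATION.
cur.execute("""
BULK INSERT dbo.Sales
FROM 'data/sales.csv'
WITH (DATA_SOURCE = 'MyAzureBlobStorage', FORMAT = 'CSV', FIRSTROW = 2);
""")
conn.commit()
```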
Submitted by mnaud991 · 6 hours ago · 0 votes
It would be useful to be able to parameterize the "Connection type" dropdown in the Copy Activity. This would avoid duplicating a distinct copy job for each source or destination data platform when using metadata to populate the pipeline.
Submitted by v-beeras · 7 hours ago · 0 votes
The customer wants to create a Dataflow Gen2 directly in a folder, without having to move it afterwards, as described in the document below: Create folders in workspaces - Microsoft Fabric | Microsoft Learn
Submitted by Anna_Kostenko · 7 hours ago · 0 votes
I'd like to propose supporting Service Principal authentication for publishing reports. This would help avoid individual ownership of semantic models and eliminate the need for constant takeovers when collaborating on the same asset.
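A hedged sketch of what service-principal-based publishing could look like, using the existing Power BI REST Imports API with an app-only token. The tenant ID, app ID, secret, workspace GUID, and file name are placeholders, and whether a tenant permits service principal access to the API is an admin setting.

```python
# Sketch: publishing a .pbix with an app-only (service principal) token via
# the Power BI REST Imports API. All IDs and the file path are placeholders.
import msal
import requests

app = msal.ConfidentialClientApplication(
    client_id="<app-id>",
    authority="https://login.microsoftonline.com/<tenant-id>",
    client_credential="<client-secret>",
)
token = app.acquire_token_for_client(
    scopes=["https://analysis.windows.net/powerbi/api/.default"]
)["access_token"]

# POST the report into a workspace; no interactive user owns the upload.
url = (
    "https://api.powerbi.com/v1.0/myorg/groups/<workspace-id>/imports"
    "?datasetDisplayName=SalesReport"
)
with open("SalesReport.pbix", "rb") as f:
    resp = requests.post(
        url,
        headers={"Authorization": f"Bearer {token}"},
        files={"file": f},
    )
resp.raise_for_status()
```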
Submitted by psanchezalvial · 8 hours ago · 0 votes
I like the High Concurrency (HC) feature and use it a lot, but when I use the VS Code extension I can't connect to an existing HC session, and being able to would be very useful. Please make it happen! 🙏
Submitted by wcurry · 12 hours ago
In SSIS, when more than one connector is added to an activity from preceding activities, the connector logic can be an 'and' condition or an 'or' condition. This allows better process flow between activities and also avoids the need to duplicate activities when preceding activities are a one-or-the-other situation. Please add this to Fabric data pipelines.
Submitted by JohnAG · 13 hours ago · 0 votes
I have been asked for this feature many times: users want to be able to enter month-year in the date filter instead of entering the full date as shown below. They don't want to select from a dropdown or enter the long date. Thanks
Submitted by Reitse · 16 hours ago · 0 votes
I've been digging into the Fabric Monitor hub, and one of the things that I'm missing in the first overview you get is the runtime. One of the issues we regularly encounter with customers is that the ETL process finishes late. To quickly determine which process took more time than expected, an overview of the runtimes in the first view would really help!
Submitted by Omar_alammar · 16 hours ago · 0 votes
It would be extremely helpful if Power BI provided an option to automatically maintain consistent spacing (e.g., 5–10 pixels) between visuals when stacking or aligning them vertically or horizontally. This would streamline dashboard design, improve consistency, and reduce manual adjustments when resizing or rearranging visuals. Suggested features:
- A setting to define default spacing between visuals
- Smart guides or auto-snap to spacing
- Support for equal spacing in "Align" and "Distribute" functions
This small UX feature would save time and improve layout precision, especially for reports with many visuals.
Submitted by UriBarash · yesterday · 0 votes
Have one Eventhouse for Fabric monitoring per capacity rather than per workspace. That would dramatically reduce my costs and make it easier to correlate the different telemetry streams.
Submitted by jasonhorner · Saturday · 0 votes
There is currently no reliable way to view the size of a data warehouse in Microsoft Fabric using T-SQL or the Fabric Explorer. This makes it difficult to manage storage, troubleshoot performance issues, and understand data usage. Recommended improvements:
- Fix existing DMVs so they return accurate storage usage and metadata for Fabric Data Warehouses.
- Update the Fabric Explorer Table Properties UI (specifically the Properties tab) to show:
  - Current Size: the size of active data (such as optimized Delta files)
  - Total Size: the size of all files, including older versions and unoptimized data
  - V-Order Status: to provide insight into data layout optimization
  - Last Compaction Timestamp
  - Per-file Metrics: such as file sizes, modification times, and row counts
- Enhance the Fabric Explorer column list to indicate whether a column has a primary key, foreign key, or unique index defined.
These changes would significantly improve visibility into storage and performance, making it easier to manage and optimize Fabric Data Warehouses. Note: while some of this information can be obtained via OneLake Explorer or a Spark notebook (see the sketch below), the experience is currently cumbersome and not user-friendly. 🍍🍕
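For reference, the cumbersome Spark-notebook workaround the idea mentions, sketched minimally. It assumes the tables are reachable as Delta tables from a Fabric notebook in the current lakehouse/schema context; Delta's `DESCRIBE DETAIL` only counts active files, so it approximates "Current Size", not the requested "Total Size".

```python
# Sketch of the Spark-notebook workaround: DESCRIBE DETAIL exposes per-table
# active size and file counts that the Fabric UI and DMVs do not surface.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

total = 0
for t in spark.catalog.listTables():  # tables in the current schema
    if t.isTemporary or t.tableType == "VIEW":
        continue  # DESCRIBE DETAIL only applies to real (Delta) tables
    detail = spark.sql(f"DESCRIBE DETAIL `{t.name}`").first()
    size_mb = detail["sizeInBytes"] / 1024 / 1024
    total += detail["sizeInBytes"]
    print(f"{t.name}: {size_mb:,.1f} MB in {detail['numFiles']} files")

print(f"Active data total: {total / 1024**3:,.2f} GB")
# Older file versions retained for time travel (the requested 'Total Size')
# are not included in these numbers.
```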
Submitted by DAXian · Saturday · 0 votes
Background: I created a KPI Card displaying data up to the quarter level in the date hierarchy. A label correctly reflects the applied filter/expansion level (e.g., "Current Quarter") in the tooltip. However, when using this label in conditional formatting for the title, it returns "Current Period" instead of "Current Quarter", indicating it defaults to the unfiltered state.
Issue: The title does not inherit the expected "Current Quarter" value from the line chart visual's filter context, making dynamic title updates challenging.
Desired improvement: I want the title to dynamically reflect the filter level (e.g., "Current Quarter") based on the visual's date hierarchy, similar to the tooltip behavior. This would streamline the creation of dynamic visuals.
Additional request: I'd also like a function to retrieve a visual's width and height to optimize title text calculations.
Current formula for the title:

```dax
Label =
SWITCH(
    TRUE(),
    ISFILTERED('Calendar'[Week Ending]), "Week Ending",
    ISFILTERED('Calendar'[Day]), "Day",
    ISFILTERED('Calendar'[Month]), "Current Month",
    ISFILTERED('Calendar'[Quarter]), "Current Quarter",
    ISFILTERED('Calendar'[Year]), "Current Year",
    "Current Period"
)
```
Submitted by krutikapai · Friday · 0 votes
Hi everyone, I am using the Advanced Gantt Chart by Definitive Logic in Power BI. When a Phase and Milestone Gantt chart is created in that view, the data labels end up on the left for some items, on the right for others, and in the center of the Phase bars and Milestones for the rest. Currently, the formatting does not offer an option to place data labels on a specific side; placement is handled automatically by the visual depending on space and layout. Can we have an option to place all labels for Phases and Milestones on a specific side, so that all data labels sit either on the left or on the right of the Phases and Milestones? Attached is a screenshot with dummy data and the data labels (highlighted) that are placed by default to the left, right, and center.
Submitted by Element115 · Friday
ISSUE:
In a Fabric pipeline, after creating a new table for the first time or writing to a lakehouse table with a Copy Data activity, pipeline execution moves on to the next activity faster than the lakehouse can update and show that new data is available to downstream activities. When those activities then try to read from the lakehouse table, this naturally causes an error. If it happens right after the table was created for the first time, the pipeline environment does not yet see the table as existing, since the execution thread moves forward faster than the lakehouse can update and tell the pipeline that it now has a new table or new data.
POSSIBLE SOLUTIONS:
1. Add an option in the Settings tab that, when checked, causes the Copy Data activity to wait for a response from the lakehouse. Once the lakehouse confirms that all the data or the table has been committed, the wait flag is cleared, the Copy Data activity completes, and pipeline execution resumes with the next activity.
OR
2. Provide a new activity called "Wait for I/O commit", which would pause pipeline execution and wait for the lakehouse specified in its settings to signal that the data is available for read operations. Upon receipt of that signal, the activity would allow pipeline execution to resume.
Two simple, elegant, and user-friendly ways to get rid of this synchronization headache once and for all.
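Until something like solution 2 exists, a common stopgap is a notebook activity placed between the Copy Data activity and its consumers that polls until the table is actually queryable. A minimal sketch, assuming a Fabric notebook with Spark; the table name and timeout are placeholders.

```python
# Stopgap sketch for the race described above: poll until the lakehouse
# reports the freshly written table before letting downstream work proceed.
import time

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

def wait_for_table(name: str, timeout_s: int = 300, poll_s: int = 5) -> None:
    """Block until `name` is queryable, or raise after `timeout_s` seconds."""
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        try:
            # Cheap probe: succeeds only once the table exists and the
            # written data has been committed and is visible.
            spark.sql(f"SELECT 1 FROM `{name}` LIMIT 1").collect()
            return
        except Exception:
            time.sleep(poll_s)
    raise TimeoutError(f"Table {name} not visible after {timeout_s}s")

wait_for_table("new_sales_table")  # placeholder table name
```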
Submitted by kimco · Friday · 0 votes
When a report uses RLS and a user with access to the app attempts to open it before being added to the Security screen, a message directing them to contact the "dataset owner" pops up. The problem is that the "dataset owner" is just the last person who took ownership of a given dataset. We have several people on our team who manage this, and we take over reports for various reasons throughout the day and week. We would much prefer to designate the contact information listed in that popup, the way we can for the "request access" popup shown when a user doesn't have access to the app.
Submitted by lixinche2025 · Friday
Most of our data processing code was implemented in Scala. This choice was driven by Scala's native integration with Spark, superior performance, full access to Spark features, strong compile-time safety, and more mature tooling and ecosystem support at the time. While PySpark has significantly improved, offering near API parity, better performance, and support for the Pandas API, migrating all our existing Scala code to Python might not provide the best user experience and could introduce unnecessary complexity. Could you provide Scala support in the AI features?
Submitted by lcordovab · Friday
Problem: Reordering the 'Explain By' dimension list in the Decomposition Tree currently requires navigating away to the Visualizations pane, interrupting the analysis workflow.
Suggestion: Allow users to drag and drop the dimension options to reorder them directly within the open columns.
Benefit: This provides a much faster, more intuitive way to explore different analysis paths on the fly, improving user experience and analytical speed, especially with complex financial datasets.
Submitted by AndyWald · Friday · 0 votes
Org Apps look promising, but the inability to disable the 'Share' permission for app users at the Tenant level keeps us from being able to use it. Please add the option to enable/disable this feature in the Admin Portal.
Submitted by slittle4782 · Friday · 0 votes
With Gen2 dataflows you can set a data destination (e.g., a DWH table). However, if you need to modify the dataflow to add, remove, rename, or retype fields, there is no way to change the schema of the data destination. It's a horrible user experience: you need to use SSMS or some other tool to drop or modify the table (sketched below), then go back into the dataflow and add the table again as new. Please make this simpler for developers and allow them to modify the destination schema from within the dataflow UI. Or better yet, suggest the changes to the user and allow them to approve.
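For reference, the out-of-band workaround being complained about, sketched with pyodbc against the warehouse SQL endpoint. The server, database, table definition, and authentication mode are placeholders; dropping and recreating the table is the path the author describes.

```python
# The workaround the idea wants to eliminate: changing the destination
# table's schema outside the dataflow, then re-binding it in the dataflow UI.
# Connection details and the table definition are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<warehouse-sql-endpoint>;DATABASE=<warehouse>;"
    "Authentication=ActiveDirectoryInteractive;"
)
cur = conn.cursor()

# Manually mirror the field changes made in the dataflow's query...
cur.execute("DROP TABLE IF EXISTS dbo.Customers;")
cur.execute("""
CREATE TABLE dbo.Customers (
    CustomerId   INT,
    Name         VARCHAR(100),
    LoyaltyTier  VARCHAR(20)   -- the newly added field
);
""")
conn.commit()
# ...then reopen the dataflow and add the table again as the destination.
```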