This specific function is disabled for security reasons.
Would you mind sharing a bit more about the scenarios where you use this function? There might be other options or other ways you could tackle those scenarios today.
Thanks @ITmerrr for submitting this idea.
As we are working on enhancements to the Admin Portal, I'd be happy to better understand:
- Tenant settings - would you expect non-admins to be able to access that page? Would those non-admins be able to change settings? I'd be happy to learn more about this use case
- Usage metrics - that's a deprecated tab; we removed it, and the change is rolling out to production
- Premium Per User setup - would you expect non-admins to be able to change it?
- Audit logs and Users redirect to an external system (M365) - same question: what would you expect non-admins to be able to see here?
You can use a DirectQuery (DQ) connection instead.
Can you please clarify if this is something you want from DW or from the SQL Server capability? It should be available in SQL Server as it is supported in other versions of SQL.
Hi @Muhiddin, thanks for submitting the idea.
Can you please clarify whether you are looking for a view of unused items for every user / data owner / developer, or from the Fabric tenant admin's perspective?
Also, I wanted to check whether you are familiar with the Govern tab in the OneLake Catalog, which currently targets data owners and shows information about their data.
Power Query and Dataflows do not emit data formats, but rather data types. The formatting options that you see in Power BI Desktop are specific to the reporting layer of the semantic model; those formats are not emitted by Power Query, and Power Query is not aware of them either.
Would you mind telling us a bit more about the scenarios where you need specific formats today for Dataflow Gen2, or where Dataflow Gen2 may fall short in the data types it supports?
Hi!
After adding a new column to your query, what message do you receive when you go through the column mapping of your destination? You should see a message in this dialog that tells you about the schema difference and how you can only map to the existing columns in your destination.
The column mapping information is separate from the query itself. The query contains the logic for the output, but the column mapping in the destination dialog contains the logic for how your query should load data to the destination. More information about this process can be found in the article below:
Dataflow Gen2 data destinations and managed settings - Microsoft Fabric | Microsoft Learn
Have you tried using BYOLA for Analysis Services? This is the recommended way of obtaining memory data for AS.
@Anonymous could you share more about where you saw this error message? Was it in a Dataflow Gen2 without CI/CD in the Power Query editor, in the "Refresh history" / "Recent runs" dialog, or perhaps in a different integration like the Dataflow activity in pipelines or the REST API?
You can do this today by adding a custom column that leverages the function below:
DateTime.LocalNow - PowerQuery M | Microsoft Learn
You should be able to leverage this even when Fast Copy is at play. Would you mind giving this a try and sharing your findings with us?
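As a minimal sketch in Power Query M (the table and column names here are illustrative, not from your dataflow):

```
let
    // Sample input; in a real dataflow this would be your source query
    Source = Table.FromRecords({[Id = 1], [Id = 2]}),
    // Custom column stamping each row with the local date/time at refresh
    WithTimestamp = Table.AddColumn(Source, "LoadTimestamp", each DateTime.LocalNow(), type datetime)
in
    WithTimestamp
```

Note that DateTime.LocalNow() is evaluated at refresh time, so every row in a given refresh gets the same timestamp.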
Can your requirement be understood as adding an EN-GB option (for example) to the language setting in Fabric, so that the Power BI report can follow that setting and adjust the display format of dates and times?
Alternatively, could you make the date/time format fixed when creating the report? For example:
In Power BI Desktop, convert the date field to a text format using the FORMAT() DAX function and specify a format such as dd/mm/yyyy.
This way, the format is fixed and will not change based on the language settings.
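A minimal example as a DAX calculated column (the table and column names are illustrative):

```
// Calculated column: renders the date as fixed dd/mm/yyyy text,
// independent of the viewer's language settings
OrderDateText = FORMAT ( Sales[OrderDate], "dd/mm/yyyy" )
```

Keep in mind the result is text, so it will sort alphabetically rather than chronologically unless you sort it by the original date column.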
We currently support Mirroring with Databricks: Microsoft Fabric Mirrored Catalog From Azure Databricks (Preview) - Microsoft Fabric | Microsoft Lea...
Is this what you are after? Or is it something else?
We have SAP Mirroring provided by 3rd-party vendors; they are listed on this page: Open Mirroring Partner Ecosystem - Microsoft Fabric | Microsoft Learn
SAP is very big and very complex, so was there a specific SAP system (BW / HANA / ...) you were interested in?
Can you clarify this concern? The endpoints you are listing are children of the SQL endpoint, yes?
We support monitoring all F, P, A, EM, and Trial SKUs, though you do need to be a capacity admin. If you're still hitting issues, please let us know and open a support ticket too so we can help you out. Thanks, and sorry for any friction you're encountering.
I believe this is already fixed.
Could you further explain your scenario and what you're trying to accomplish?
A Dataflow Gen2 is not really a storage item, but rather a data processing one. It connects to data, processes it, and enables you to load it to a destination, but by itself a Dataflow Gen2 doesn't truly store data.
If this is more about adding multiple queries from the same source (SQL Server), you can do so today in several ways, but it also depends on what your needs are.
Looking forward to hearing from you!
Thanks for the input.
Trying to understand the scenario better - what would be the reason for doing this?
Eventhouse data and its OneLake availability are considered one logical copy, with the same access level and retention. You can query the data in OneLake without using the Eventhouse compute for historical analysis, while querying the Eventhouse directly for time-series/real-time analytics.
If you would like to manage the data in the lake separately, then copying it outside the Eventhouse might be a better way to think about it.