PeterGoertz
Frequent Visitor

Getting "Failed to execute 'ariaNotify' on 'Document': Failed to read the 'priority' property from '

When assigning a target (this is a table in a lakehouse) in a Dataflow, I get this error message:

---------- Message ----------
Failed to execute 'ariaNotify' on 'Document': Failed to read the 'priority' property from 'AriaNotificationOptions': The provided value 'none' is not a valid enum value of type AriaNotifyPriority.

---------- Stack ----------
React.useCallback(()@webpack://node_modules/@fluentui/react-aria/lib/AriaLiveAnnouncer/useAriaNotifyAnnounce.js:21:23
announce()@webpack://@powerquery/monolith-ui/ClientShared/Scripts/Components/Dialog/DialogRenderer.tsx:33:45

---------- Session ID ----------
17474dea-b5ca-45ca-aadf-eed9b914487d

---------- Mashup script ----------
[StagingDefinition = [Kind = "FastCopy"]]
section Section1;
shared HlpFeierTage = let
Source = Lakehouse.Contents(null),
Navigation = Source{[workspaceId = "cf11dd2e-5e10-4d8e-bd55-6ac46aa3a6bf"]}[Data],
#"Navigation 1" = Navigation{[lakehouseId = "61e44876-d4e2-45bc-b5a7-164b20ccc87a"]}[Data],
#"Navigation 2" = #"Navigation 1"{[Id = "HlpFeierTage", ItemKind = "Table"]}[Data]
in
#"Navigation 2";
[DataDestinations = {[Definition = [Kind = "Reference", QueryName = "Datum_DataDestination", IsNewTarget = true], Settings = [Kind = "Automatic", TypeSettings = [Kind = "Table"]]]}]
shared Datum = let
Today = Date.From(DateTime.LocalNow()),
ToYear = Date.Year(DateTime.LocalNow()),
FromYear = ToYear - 5,
firstDayofWeek = Day.Monday,
FromDate = #date(FromYear,1,1),
ToDate = #date(ToYear,12,31),
Source = List.Dates(
FromDate,
Duration.Days(ToDate-FromDate)+1,
#duration(1,0,0,0)
),
#"Converted to Table" = Table.FromList(Source, Splitter.SplitByNothing(), null, null, ExtraValues.Error),
#"Renamed Columns" = Table.RenameColumns(#"Converted to Table", {{"Column1", "Date"}}),
#"Changed Type" = Table.TransformColumnTypes(#"Renamed Columns", {{"Date", type date}}),
#"Insert DateId" = Table.TransformColumnTypes(Table.AddColumn(#"Changed Type", "DateId", each Date.Year([Date]) * 10000 + Date.Month([Date]) * 100 + Date.Day([Date])), {{"DateId", Int64.Type}}),
#"Inserted Year" = Table.AddColumn(#"Insert DateId", "Year", each Date.Year([Date]), Int64.Type),
#"Inserted Start of Year" = Table.AddColumn(#"Inserted Year", "StartOfYear", each Date.StartOfYear([Date]), type date),
#"Inserted End of Year" = Table.AddColumn(#"Inserted Start of Year", "EndOfYear", each Date.EndOfYear([Date]), type date),
#"Inserted Month" = Table.AddColumn(#"Inserted End of Year", "Month", each Date.Month([Date]), Int64.Type),
#"Inserted Start of Month" = Table.AddColumn(#"Inserted Month", "StartOfMonth", each Date.StartOfMonth([Date]), type date),
#"Inserted End of Month" = Table.AddColumn(#"Inserted Start of Month", "EndOfMonth", each Date.EndOfMonth([Date]), type date),
#"Inserted Days in Month" = Table.AddColumn(#"Inserted End of Month", "DaysInMonth", each Date.DaysInMonth([Date]), Int64.Type),
#"Inserted Day" = Table.AddColumn(#"Inserted Days in Month", "Day", each Date.Day([Date]), Int64.Type),
#"Inserted Day Name" = Table.AddColumn(#"Inserted Day", "DayName", each Date.DayOfWeekName([Date]), type text),
#"Inserted Day of Week" = Table.AddColumn(#"Inserted Day Name", "DayOfWeek", each Date.DayOfWeek([Date], firstDayofWeek), Int64.Type),
#"Inserted Day of Year" = Table.AddColumn(#"Inserted Day of Week", "DayOfYear", each Date.DayOfYear([Date]), Int64.Type),
#"Inserted Month Name" = Table.AddColumn(#"Inserted Day of Year", "MonthName", each Date.MonthName([Date]), type text),
#"Inserted Quarter" = Table.AddColumn(#"Inserted Month Name", "Quarter", each Date.QuarterOfYear([Date]), Int64.Type),
#"Inserted Start of Quarter" = Table.AddColumn(#"Inserted Quarter", "StartOfQuarter", each Date.StartOfQuarter([Date]), type date),
#"Inserted End of Quarter" = Table.AddColumn(#"Inserted Start of Quarter", "EndOfQuarter", each Date.EndOfQuarter([Date]), type date),
#"Inserted Week of Month" = Table.AddColumn(#"Inserted End of Quarter", "WeekOfMonth", each Date.WeekOfMonth([Date], firstDayofWeek), Int64.Type),
#"Inserted Start of Week" = Table.AddColumn(#"Inserted Week of Month", "StartOfWeek", each Date.StartOfWeek([Date], firstDayofWeek), type date),
#"Inserted End of Week" = Table.AddColumn(#"Inserted Start of Week", "EndOfWeek", each Date.EndOfWeek([Date], firstDayofWeek), type date),
#"Year Month" = Table.TransformColumnTypes(Table.AddColumn(#"Inserted End of Week", "Year-Month", each Date.ToText([Date], "MMM yyyy")), {{"Year-Month", type text}}),
#"Year Month Code" = Table.TransformColumnTypes(Table.AddColumn(#"Year Month", "Year-MonthCode", each Date.ToText([Date], "yyyyMM")), {{"Year-MonthCode", Int64.Type}}),
WeekOfYear = Table.AddColumn(#"Year Month Code", "WeekOfYear", each if Number.RoundDown((Date.DayOfYear([Date])-(Date.DayOfWeek([Date],Day.Monday)+1)+10)/7) = 0 then
Number.RoundDown((Date.DayOfYear(#date(Date.Year([Date])-1,12,31))-(Date.DayOfWeek(#date(Date.Year([Date])-1,12,31), Day.Monday)+1)+10)/7)
else if Number.RoundDown((Date.DayOfYear([Date])-(Date.DayOfWeek([Date], Day.Monday)+1)+10)/7)=53 and Date.DayOfWeek(#date(Date.Year([Date]),12,31), Day.Monday)+1<4
then 1
else
Number.RoundDown((Date.DayOfYear([Date])-(Date.DayOfWeek([Date], Day.Monday)+1)+10)/7)),
#"Changed column type" = Table.TransformColumnTypes(WeekOfYear, {{"WeekOfYear", Int64.Type}}),
#"Merged queries" = Table.NestedJoin(#"Changed column type", {"DateId"}, HlpFeierTage, {"DateId"}, "HlpFeierTage", JoinKind.LeftOuter),
#"Expanded HlpFeierTage" = Table.ExpandTableColumn(#"Merged queries", "HlpFeierTage", {"FeiertagsName"}, {"FeiertagsName"}),
#"Inserted conditional column" = Table.AddColumn(#"Expanded HlpFeierTage", "IsArbeitstag", each if [DayOfWeek] = 5 then 0 else if [DayOfWeek] = 6 then 0 else if [FeiertagsName] <> null then 0 else 1),
#"Changed column type 1" = Table.TransformColumnTypes(#"Inserted conditional column", {{"IsArbeitstag", Int64.Type}}),
#"Added Today" = Table.TransformColumnTypes(Table.AddColumn(#"Changed column type 1", "Today", each DateTime.LocalNow()), {{"Today", type date}}),
#"Added TimeRange" = Table.TransformColumnTypes(Table.AddColumn(#"Added Today", "TimeRange", each if Duration.Days(Duration.From([Today]-[Date])) < 0 then "Future" else
if Duration.Days(Duration.From([Today]-[Date])) = 0 then "Present" else "Past"), {{"TimeRange", type text}}),
#"Added isToday" = Table.TransformColumnTypes(Table.AddColumn(#"Added TimeRange", "IsToday", each if [Date] = [Today] then true else false), {{"IsToday", type logical}}),
#"Added IsCurrentMonth" = Table.TransformColumnTypes(Table.AddColumn(#"Added isToday", "isCurrentMonth", each if [#"Year-MonthCode"] = Date.Year([Today]) * 100 + Date.Month([Today]) then true else false), {{"isCurrentMonth", type logical}}),
#"Added same Day last month" = Table.AddColumn(#"Added IsCurrentMonth", "TodayLastMonth", each Date.AddMonths([Today],-1)),
AddedisPrevMonth = Table.TransformColumnTypes(Table.AddColumn(#"Added same Day last month", "isPreviousMonth", each if [#"Year-MonthCode"] = Date.Year([TodayLastMonth]) * 100 + Date.Month([TodayLastMonth]) then true else false), {{"isPreviousMonth", type logical}}),
ReportMonat = Table.AddColumn(AddedisPrevMonth, "ReportMonat", each if [isCurrentMonth] = true then "Reportrelevant" else if [isPreviousMonth] = true then "Reportrelevant" else "alle Monate"),
#"Changed column type 2" = Table.TransformColumnTypes(ReportMonat, {{"ReportMonat", type text}})
in
#"Changed column type 2";
shared Datum_DataDestination = let
Pattern = Lakehouse.Contents([HierarchicalNavigation = null, CreateNavigationProperties = false, EnableFolding = false]),
Navigation_1 = Pattern{[workspaceId = "cf11dd2e-5e10-4d8e-bd55-6ac46aa3a6bf"]}[Data],
Navigation_2 = Navigation_1{[lakehouseId = "61e44876-d4e2-45bc-b5a7-164b20ccc87a"]}[Data],
TableNavigation = Navigation_2{[Id = "DimDatum", ItemKind = "Table"]}?[Data]?
in
TableNavigation;

The entire flow worked pretty well on Monday and Tuesday, and I didn't change a column type or name or anything like that.

I also reduced the target to a single record and to just the first column. Same result, so it doesn't seem to depend on the target structure.

I'm getting a similar error when I try to create a data agent. 

 

We are currently using an F8 capacity in North Europe.

 

Thanks a lot for helping.

1 ACCEPTED SOLUTION

Hi @PeterGoertz ,


Thanks for confirming. Since the flow runs fine after ignoring the design-time error, it’s indeed pointing more towards a front-end regression than a data issue. To help Microsoft prioritize a fix, I’d still recommend raising a support ticket and attaching your session ID. That way, the product team can trace the exact failure and speed up resolution.

Meanwhile, glad to hear your workflow is executing successfully in spite of the popup. Hopefully, the engineering team rolls out a quick patch soon.

Thanks,
Akhil.


14 REPLIES
keith-oak
Advocate I

 

Solution: AriaNotify Error in Fabric Dataflows

The Error

If you're encountering this error in Fabric Dataflows:

Failed to execute 'ariaNotify' on 'Document': Failed to read the 'priority' property from 'AriaNotificationOptions': The provided value 'none' is not a valid enum value of type AriaNotifyPriority.

This is caused by how Power Query handles streaming data transformations and triggers UI progress notifications.


Root Cause

The error occurs when performing column transformations on streaming data from API sources (like Dynamics 365 Business Central, Dataverse, REST APIs, etc.).

When you transform columns on streaming data:

  1. Data flows incrementally from the API
  2. Each chunk triggers UI progress updates
  3. The UI's accessibility notification system gets called repeatedly
  4. A bug in the FluentUI component tries to set notification priority to "none" (invalid value)
  5. The error is thrown and can cause the dataflow to fail

The Solution: Buffer Before Transform

The fix is simple: buffer your data before performing transformations.

Problem Pattern (Causes Error)

let
    Source = Dynamics365BusinessCentral.ApiContentsWithOptions(...),
    Navigation_1 = Source{[Name = "Environment"]}[Data],
    Navigation_2 = Navigation_1{[Name = "Advanced"]}[Data],
    Navigation_3 = Navigation_2{[Name = "api/v2.0"]}[Data],
    Navigation_4 = Navigation_3{[Name = "entities", Signature = "table"]}[Data],

    //  Transforming streaming data - triggers the bug
    #"Changed column type" = Table.TransformColumnTypes(
        Navigation_4,
        {
            {"modifiedAt", type datetime},
            {"createdAt", type datetime},
            {"status", type text}
        }
    ),
    #"Replaced values" = Table.ReplaceValue(
        #"Changed column type",
        "_x0020_",
        " ",
        Replacer.ReplaceText,
        {"status"}
    )
in
    #"Replaced values"

Solution Pattern (Prevents Error)

let
    Source = Dynamics365BusinessCentral.ApiContentsWithOptions(...),
    Navigation_1 = Source{[Name = "Environment"]}[Data],
    Navigation_2 = Navigation_1{[Name = "Advanced"]}[Data],
    Navigation_3 = Navigation_2{[Name = "api/v2.0"]}[Data],
    Navigation_4 = Navigation_3{[Name = "entities", Signature = "table"]}[Data],

    //  Buffer the data FIRST - load completely into memory
    Buffer_Data = Table.Buffer(Navigation_4),

    //  Now transform the buffered data - no streaming, no UI bug
    #"Changed column type" = Table.TransformColumnTypes(
        Buffer_Data,
        {
            {"modifiedAt", type datetime},
            {"createdAt", type datetime},
            {"status", type text}
        }
    ),
    #"Replaced values" = Table.ReplaceValue(
        #"Changed column type",
        "_x0020_",
        " ",
        Replacer.ReplaceText,
        {"status"}
    )
in
    #"Replaced values"

Why This Works

Streaming vs Buffered Execution

Without Buffer (Streaming):

  • API data flows incrementally (chunk by chunk)
  • Each transformation evaluates lazily
  • Multiple UI progress notifications fire
  • Buggy aria notification code gets triggered
  • Error occurs

With Buffer (Eager Loading):

  • Table.Buffer() forces complete data load into memory
  • Subsequent transformations work on in-memory table
  • Single evaluation, minimal UI updates
  • Buggy code path is avoided
  • No error

Why You Should Buffer Anyway (Beyond Bug Fix)

Even without this bug, buffering before transformations is a best practice for several reasons:

1. Performance Optimization

  • Prevents repeated API calls: Without buffering, each downstream operation can trigger re-evaluation and re-fetch from the API (see the short sketch after this list)
  • Faster transformations: Column operations on in-memory data are much faster than streaming operations
  • Reduced API throttling: Single API call instead of potential multiple evaluations

2. Predictable Refresh Behavior

  • Consistent data snapshots: All transformations work on the same dataset
  • Avoid partial refresh issues: No risk of data changing mid-transformation
  • Better error handling: Failures are easier to diagnose

3. Lower API Costs

  • Some APIs charge per call or have rate limits
  • Buffering ensures you only call the API once per refresh
  • Prevents accidental duplicate calls from query folding failures

4. Improved Dataflow Stability

  • Reduces memory pressure from streaming operations
  • More predictable resource usage
  • Fewer timeout issues on large datasets
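
A quick illustration of the first point: in the sketch below, one buffered step feeds two downstream operations, so the source only needs to be evaluated once. The connector, URL and column names are made-up placeholders, not taken from any query in this thread.

let
    // Hypothetical API source (placeholder URL)
    Source = OData.Feed("https://example.com/api/v1/orders"),

    // Buffer once - later steps reuse this in-memory copy instead of
    // re-evaluating (and re-calling) the source
    Buffered = Table.Buffer(Source),

    // Both steps below read Buffered; without Table.Buffer each could
    // trigger its own evaluation of Source
    Headers = Table.SelectColumns(Buffered, {"id", "status"}),
    Counts = Table.Group(Buffered, {"status"}, {{"Rows", each Table.RowCount(_), Int64.Type}}),

    // Combine the two results
    Result = Table.NestedJoin(Headers, {"status"}, Counts, {"status"}, "Counts", JoinKind.LeftOuter)
in
    Result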

When to Buffer

Buffer These Sources

  • Dynamics 365 Business Central API
  • Dataverse / Power Platform APIs
  • REST API calls
  • OData feeds
  • Any API-based connector
  • Web.Contents() calls
  • SharePoint lists (when doing complex transformations)

⚠️ Consider Carefully

  • Large datasets (>1GB): Buffering loads everything into memory - ensure sufficient capacity
  • Direct Lake/Warehouse queries: May prevent query folding optimization
  • Incremental refresh sources: Buffer after filtering to refresh window

Don't Buffer

  • File sources already in Lakehouse/OneLake (already optimized)
  • When query folding to SQL databases is critical
  • Very large datasets where memory is limited

Best Practice Pattern

Here's a complete best-practice pattern for API-based dataflows:

let
    // 1. Connect to API
    Source = Dynamics365BusinessCentral.ApiContentsWithOptions(
        "PROD",
        null,
        null,
        [UseReadOnlyReplica = true]
    ),

    // 2. Navigate to your table
    Navigation_1 = Source{[Name = "PROD-ENV"]}[Data],
    Navigation_2 = Navigation_1{[Name = "Advanced"]}[Data],
    Navigation_3 = Navigation_2{[Name = "company/v2.0"]}[Data],
    Navigation_4 = Navigation_3{[Name = "glEntries", Signature = "table"]}[Data],

    // 3. Remove unnecessary columns BEFORE buffering (reduces memory)
    Remove_Columns = Table.RemoveColumns(
        Navigation_4,
        {"dimensionValue", "glAccount"},
        MissingField.Ignore
    ),

    // 4. BUFFER - this is the key step
    Buffer_Data = Table.Buffer(Remove_Columns),

    // 5. Now safely perform all transformations
    Format_Columns = Table.TransformColumnTypes(
        Buffer_Data,
        {
            {"documentType", type text},
            {"postingDate", type datetime},
            {"amount", type number}
        },
        "en-US"  // Use locale for date parsing
    ),

    // 6. Additional transformations (replace, merge, etc.)
    Decode_Text = Table.TransformColumns(
        Format_Columns,
        {
            {"documentType", each Text.Replace(_, "_x0020_", " "), type text},
            {"description", each Text.Replace(_, "_x0020_", " "), type text}
        }
    )
in
    Decode_Text

Additional Tips

Optimize Buffer Placement

  1. Remove columns first: Buffer after removing unnecessary columns to reduce memory usage
  2. Filter early: If possible, filter data before buffering (for incremental refresh, filter to the refresh window first, then buffer)
  3. Buffer once: Don't buffer multiple times in the same query

Example: Incremental Refresh with Buffer

let
    Source = API_Call,
    Navigation = Navigate_To_Table,

    // Filter to incremental window FIRST (query folding)
    Filtered = Table.SelectRows(
        Navigation,
        each [modifiedAt] >= RangeStart and [modifiedAt] < RangeEnd
    ),

    // NOW buffer the filtered data
    Buffer_Data = Table.Buffer(Filtered),

    // Then transform
    Transformed = Table.TransformColumnTypes(Buffer_Data, ...)
in
    Transformed

Summary

The Fix:

  • Add Buffer_Data = Table.Buffer(YourTable) before any column transformations
  • This prevents the AriaNotify error by changing execution from streaming to eager

The Benefits:

  • Avoids the UI bug
  • Improves performance
  • Reduces API calls
  • More stable dataflow refreshes
  • Better resource utilization

When to Use:

  • All API-based data sources
  • Before Table.TransformColumnTypes()
  • Before Table.TransformColumns()
  • Before Table.ReplaceValue()
  • Before complex transformations

This is a best practice that solves the immediate bug AND improves your dataflow quality overall.




Tested and verified with Dynamics 365 Business Central API, Dataverse, and REST API sources in Microsoft Fabric.

Alven
Microsoft Employee

Hi @keith-oak ,

please note that the root cause you described above is not correct given the issue was not linked to the stream of data.

As I previously mentioned, the error reported by @PeterGoertz was due to a UI regression that happened only when the user added a default destination or a destination for a query: this caused an aria notification to be sent using a priority property which was not supported anymore by recent browsers, and caused the error to be thrown. The issue is now fixed in all regions.

 

Regards,

Alessandro

Hey Keith-Oak, thank you for that very detailed tip. I'll try that. But as far as I remember, I fetched data from a lakehouse in another workspace and the error occurred. As far as I understood, I don't need to buffer in the case of a lakehouse?

I'll try and come back.
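
If I understand the suggestion correctly, the change on my side would just be to wrap the last navigation step in Table.Buffer - roughly like this, based on the HlpFeierTage query from my opening post (not tested yet):

shared HlpFeierTage = let
    Source = Lakehouse.Contents(null),
    Navigation = Source{[workspaceId = "cf11dd2e-5e10-4d8e-bd55-6ac46aa3a6bf"]}[Data],
    #"Navigation 1" = Navigation{[lakehouseId = "61e44876-d4e2-45bc-b5a7-164b20ccc87a"]}[Data],
    #"Navigation 2" = #"Navigation 1"{[Id = "HlpFeierTage", ItemKind = "Table"]}[Data],
    // buffer the holiday table before it gets merged into the Datum query
    Buffered = Table.Buffer(#"Navigation 2")
in
    Buffered;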

IMiles
New Member

Same issue.

Alven
Microsoft Employee

Hi @IMiles, @PeterGoertz and @Edkokp24,

 

this is a UI regression (see screenshot below) that happens when you add a default destination or a destination for a query, and it seems to impact mainly Chrome or Edge since version 141.

The exception is not blocking: you should be able to close the dialog and continue doing your usual actions in the Query Editor. Worth also noting that it does not have an impact on dataflow refresh.

We are actively working to fix this, and we should be able to address it by end of week, sorry for the inconvenience.

 

Alven_1-1759934853551.png

 

Alven
Microsoft Employee

Hello,

FYI, the fix for the issue has been deployed to the majority of production regions, and the deployment should complete in the next few hours. Based on this, you should no longer encounter the issue.

 

Regards,

Alessandro

Edkokp24
New Member

I am getting the same issue, as detailed below. Could you help please!

---------- Message ----------
Failed to execute 'ariaNotify' on 'Document': Failed to read the 'priority' property from 'AriaNotificationOptions': The provided value 'none' is not a valid enum value of type AriaNotifyPriority.

---------- Stack ----------
React.useCallback(()@webpack://node_modules/@fluentui/react-aria/lib/AriaLiveAnnouncer/useAriaNotifyAnnounce.js:21:23
announce()@webpack://@powerquery/monolith-ui/ClientShared/Scripts/Components/Dialog/DialogRenderer.tsx:33:45

---------- Session ID ----------
7a064a6a-bdb6-4cf5-b342-f2be20ffb917

---------- Mashup script ----------
[StagingDefinition = [Kind = "FastCopy"]]
section Section1;
shared SalesOrderDetail = let
Source = Parquet.Document(AzureStorage.DataLakeContents("https://datalake0610.dfs.core.windows.net/data/parquet/SalesOrderDetail.parquet")),
#"Removed other columns" = Table.SelectColumns(Source, {"OrderQty", "ProductID", "SalesOrderID", "SalesOrderDetailID"})
in
#"Removed other columns";
shared SalesOrderHeader = let
Source = Parquet.Document(AzureStorage.DataLakeContents("https://datalake0610.dfs.core.windows.net/data/parquet/SalesOrderHeader.parquet")),
#"Removed columns" = Table.RemoveColumns(Source, {"Status", "OnlineOrderFlag", "SalesOrderNumber"}),
#"Changed column type" = Table.TransformColumnTypes(#"Removed columns", {{"OrderDate", type date}}),
#"Removed columns 1" = Table.RemoveColumns(#"Changed column type", {"TotalDue"}),
#"Added custom" = Table.AddColumn(#"Removed columns 1", "TotalAmt", each [SubTotal]+[TaxAmt]+[Freight]),
#"Changed column type 1" = Table.TransformColumnTypes(#"Added custom", {{"TotalAmt", type number}})
in
#"Changed column type 1";
[DataDestinations = {[Definition = [Kind = "Reference", QueryName = "FactSales_DataDestination", IsNewTarget = false], Settings = [Kind = "Manual", AllowCreation = false, ColumnSettings = [Mappings = {[SourceColumnName = "SalesOrderID", DestinationColumnName = "SalesOrderID"], [SourceColumnName = "OrderDate", DestinationColumnName = "OrderDate"], [SourceColumnName = "CustomerID", DestinationColumnName = "CustomerID"], [SourceColumnName = "SalesPersonID", DestinationColumnName = "SalesPersonID"], [SourceColumnName = "TerritoryID", DestinationColumnName = "TerritoryID"], [SourceColumnName = "SubTotal", DestinationColumnName = "SubTotal"], [SourceColumnName = "TaxAmt", DestinationColumnName = "TaxAmt"], [SourceColumnName = "Freight", DestinationColumnName = "Freight"], [SourceColumnName = "TotalAmt", DestinationColumnName = "TotalAmt"], [SourceColumnName = "SalesOrderDetailID", DestinationColumnName = "SalesOrderDetailID"], [SourceColumnName = "OrderQty", DestinationColumnName = "OrderQty"], [SourceColumnName = "ProductID", DestinationColumnName = "ProductID"]}], DynamicSchema = false, UpdateMethod = [Kind = "Replace"], TypeSettings = [Kind = "Table"]]]}]
shared FactSales = let
Source = Table.NestedJoin(SalesOrderHeader, {"SalesOrderID"}, SalesOrderDetail, {"SalesOrderID"}, "SalesOrderDetail", JoinKind.LeftOuter),
#"Expanded SalesOrderDetail" = Table.ExpandTableColumn(Source, "SalesOrderDetail", {"OrderQty", "ProductID", "SalesOrderDetailID"}, {"OrderQty", "ProductID", "SalesOrderDetailID"})
in
#"Expanded SalesOrderDetail";
shared FactSales_DataDestination = let
Pattern = Fabric.Warehouse([HierarchicalNavigation = null, CreateNavigationProperties = false]),
Navigation_1 = Pattern{[workspaceId = "41518f36-d390-4d15-81f6-f17c3efa77d3"]}[Data],
Navigation_2 = Navigation_1{[warehouseId = "6a38ab53-f2e6-47b5-b1d7-d0b239f9e9db"]}[Data],
TableNavigation = Navigation_2{[Schema = "dbo", Item = "FactSales"]}[Data]
in
TableNavigation;

Good morning Edkokp24, as of today I'm still getting the error. But clicking cancel seems to do nothing, so for me this message is just an annoying dialog. I will raise this as a ticket with Microsoft; I just haven't found the time to do it yet. Sorry to all.

Thanks @PeterGoertz

I can still reproduce the issue on my side. The pop-up error appears after clicking the "Save Settings" button when choosing the destination settings in Power Query.

Session ID: 892fa393-a6b1-48f7-9bcc-4102a2eeda3c
Environment: app.fabric.microsoft.com

Steps to reproduce:

Open Power Query in Microsoft Fabric

Choose Destination Settings

Click Save Settings

Observe the error pop-up

v-agajavelly
Community Support

Hi @PeterGoertz ,

Thanks for your input. I did validate the column types already, and even reduced the target to a single column/row with standard types, but the issue still persists. The error message itself

Failed to execute 'ariaNotify' on 'Document'. The provided value 'none' is not a valid enum value of type AriaNotifyPriority

points to a Fabric Dataflows Gen2 front-end bug introduced with the latest release. The mashup script runs fine, but the web client fails when it tries to announce a status update with priority "none", which isn't allowed in the FluentUI control.

So while checking types is always a good practice (especially for parquet vs. SQL/Python agent compatibility), in this case the root cause is the UI regression.

Go ahead and raise a support ticket with Microsoft and include the session ID so the product team can trace it. Raising a ticket also ensures the issue is officially logged and can help influence future improvements:
Create a Microsoft Support Ticket.

Thanks,
Akhil.

Hi v-agajavelly, thank you for your comment. I found that ignoring the error and using Save & Run leads to the desired result. The workflow is running, although in design mode the error keeps popping up. Let's hope there's a quick solution to this bug.

 

Kind regards and again thanks,

 

Peter

Hi @PeterGoertz ,


Thanks for confirming. Since the flow runs fine after ignoring the design-time error, it’s indeed pointing more towards a front-end regression than a data issue. To help Microsoft prioritize a fix, I’d still recommend raising a support ticket and attaching your session ID. That way, the product team can trace the exact failure and speed up resolution.

Meanwhile, glad to hear your workflow is executing successfully in spite of the popup. Hopefully, the engineering team rolls out a quick patch soon.

Thanks,
Akhil.

BhaveshPatel
Community Champion

Hi @PeterGoertz 

 

Always set the data type of columns first when making sure that the data agent is working. The problem here is that Python doesn't support every data type, and parquet with _delta_log needs specific data types (STRING, etc.). Also, Azure SQL uses different data types (for example, VARCHAR in Azure SQL).

 

So, make sure you have the correct data types, and then use either the Python data agent or Dataflow Gen2.
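
A minimal sketch of such an explicit typing step in a Dataflow Gen2 query, applied before the data destination (the table and column names below are purely illustrative, not taken from Peter's queries):

let
    // Illustrative source - in practice this is your existing query steps
    Source = #table(
        type table [CustomerName = text, OrderDate = text, Amount = text],
        {{"Contoso", "2025-01-15", "199.90"}}
    ),
    // Set explicit, destination-friendly types before the destination step
    Typed = Table.TransformColumnTypes(
        Source,
        {
            {"CustomerName", type text},
            {"OrderDate", type date},
            {"Amount", type number}
        }
    )
in
    Typed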

Thanks & Regards,
Bhavesh

Love the Self Service BI.
Please use the 'Mark as answer' link to mark a post that answers your question. If you find a reply helpful, please remember to give Kudos.

Hi BhaveshPatel, miraculously the agent works again as of Monday. But I'll try your suggestion for the dataflow, which is still throwing that error. Thanks a lot.
