drgenius
New Member

Premium_aswl_error (400 Bad Request) after migrating from Snowflake tables to Views

Hi everyone,

 

I've been working on optimizing a Power BI Desktop report that connects to Snowflake as its data source, and after a full migration I'm hitting a publish error I can't get past.

 

I created views in Snowflake to replace the raw tables the report was originally connected to, then reconnected the Power BI data source and swapped every table for its corresponding view. Calculations that were previously handled inside Power BI are now pushed down into the Snowflake views, so the views return pre-calculated data ready for consumption. All visuals and report pages have been updated to reference the new view-based tables. Finally, I configured incremental refresh with a 12-month archive window and a 2-month refresh window.

 

Everything works perfectly fine in Power BI Desktop. The error only appears when I try to publish to the Fabric service:
Error Code: Premium_aswl_error. The remote server returned an error: (400) Bad Request.

The remote server is up and running; I verified this by republishing the previous version of the report, which went through without errors.

Could this be related to incremental refresh being incompatible with how the views are defined in Snowflake, for example a lack of query folding support for the RangeStart/RangeEnd parameters? Or is this a workspace capacity/Premium licensing issue? Is there a known conflict between incremental refresh policies and Snowflake views specifically?
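For context on the folding question: for incremental refresh to fold, the filter on RangeStart/RangeEnd (both DateTime parameters) must translate into a WHERE clause that Snowflake can push into the view. A minimal sketch of what the query should look like, with the server, database, schema, view, and column names all as illustrative placeholders:

```
let
    Source = Snowflake.Databases("account.snowflakecomputing.com", "MY_WAREHOUSE"),
    Db = Source{[Name = "MY_DB", Kind = "Database"]}[Data],
    Sch = Db{[Name = "PUBLIC", Kind = "Schema"]}[Data],
    SalesView = Sch{[Name = "SALES_V", Kind = "View"]}[Data],
    // This step must fold to a WHERE clause on the Snowflake side.
    // Use >= RangeStart and < RangeEnd so partition boundaries don't overlap.
    Filtered = Table.SelectRows(SalesView, each [ORDER_DATE] >= RangeStart and [ORDER_DATE] < RangeEnd)
in
    Filtered
```

One quick check in Desktop: if "View Native Query" is greyed out on the Filtered step, the filter is not folding and the service will scan the full view for every partition, although that usually surfaces as slow refreshes rather than a 400 on publish.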

 

Any help or pointers would be greatly appreciated.

5 REPLIES 5
v-echaithra
Community Support

Hi @drgenius ,

May I ask if you have resolved this issue? Please let us know if you have any further questions; we are happy to help.

Thank you.

v-echaithra
Community Support

Hi @drgenius ,

Thank you @Natarajan_M  for your inputs.
We’d like to follow up on this concern. Kindly confirm whether the issue has been resolved or whether further assistance is still required; we are happy to keep working with you toward a resolution.

Thank you.

Natarajan_M
Continued Contributor

Hi @drgenius , to debug this, could you try creating a new PBIX file with only one fact table? Set up incremental refresh for that single object, deploy it to the service, and check the results. This will clarify whether tables refresh successfully while views do not, or whether there is a permission issue.

 

Thanks

Natarajan_M
Continued Contributor

Hi @drgenius , could you please check what type of license is associated with the workspace? After you publish the PBIX file, the first load performs a complete refresh, meaning all partitions are processed. Please verify whether the size of the semantic model exceeds the capacity limits.


To work around the size issue you can use the following approach:

Define a Boolean parameter named LoadAllData.

[Screenshot: LoadAllData parameter defined in Power Query]
In the Power Query steps:

let
    ...
    SampleData = Table.FirstN(Source, 10),
    Check = if LoadAllData then Source else SampleData
in
    Check




Keep the default value as False, publish the model to the service, and run the first refresh.

The first refresh will create all the partitions in the service. Then set LoadAllData to True and process the partitions one by one using SSMS or a Fabric notebook.
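For the SSMS route, connecting to the workspace's XMLA endpoint lets you process each partition with a TMSL refresh command along these lines (the database, table, and partition names below are placeholders):

```json
{
  "refresh": {
    "type": "full",
    "objects": [
      {
        "database": "MySemanticModel",
        "table": "SalesView",
        "partition": "2025Q4"
      }
    ]
  }
}
```

Running one small partition at a time keeps each refresh within capacity limits; once all partitions are populated, the normal scheduled incremental refresh only touches the refresh window.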

 

 

Thanks

 

If this response was helpful in any way, I’d gladly accept a kudo.
Please mark it as the correct solution; it helps other community members find their way faster.

drgenius
New Member

Hi @Natarajan_M , thank you for the suggestion. I tried the notebook approach to manually refresh the partitions, but unfortunately that does not work for my use case, as I need the refresh to be fully automated.
I also want to add some context that might help narrow down the issue. The semantic model is approximately 280 MB, which should be well within the capacity limits, so I don't think size is the problem here.
What is interesting is that when I use the original Snowflake tables instead of the views, the publish works perfectly fine with no errors. The issue only appears after switching to views. This makes me think the problem may not be the model size or the incremental refresh policy itself, but something in the report metadata. Could there be leftover references or conflicts in the .pbix file from the original tables that are causing the 400 error?
I also tried doing a Save As to create a fresh copy of the report, thinking it might clear any stale metadata, but I got the same error on publish.
