bdpr_95
Helper II

Best Approach for User Access Table with Different Refresh Frequencies

Hi everyone,

I'm working on a use case involving 20 Power BI reports. For 10 of these reports, the fact tables are updated daily, while for the other 10, the fact tables are only updated monthly.

My questions are specifically about the monthly reports:

  • I have a user access table, which contains user permissions. This table can change daily, unlike the fact data, which is only refreshed monthly.
  • Given this scenario, what is the best approach?
    • Should I create a composite model, where the user access table is in Direct Lake mode (to always have the latest access data), and the fact table is in Import mode (since it only updates monthly)? Will this approach have any negative impact on report consumers, such as data inconsistencies or interaction issues? Or will it work smoothly?
    • If this is not the best option, what alternatives do you recommend? For example, should I keep both tables in Import mode and create a notebook that only refreshes the user access tables for my 10 monthly reports? If so, what is the best way to build this notebook? Should I create a table with the report names and then have the notebook update the user access tables for each report in the list, pulling the data from my Data Lake?
  • Another topic: refreshing these monthly reports. As you know, Power BI only allows daily or weekly refresh schedules. How can I set up a monthly refresh? Currently, monthly refresh is in preview in Data Pipelines. Should I also consider using a notebook to refresh the monthly reports?

Thank you very much for your help!

1 ACCEPTED SOLUTION

@v-echaithra, I plan to use import mode for all tables and configure pipelines to refresh the monthly reports and update the daily user access table.


9 REPLIES
v-echaithra
Community Support

Hi @bdpr_95 ,

We'd like to follow up on your question. Could you confirm whether the issue has been resolved, or if further assistance is still required? We are happy to support you and help you reach a resolution.

Best Regards,
Chaithra E.

@v-echaithra, I plan to use import mode for all tables and configure pipelines to refresh the monthly reports and update the daily user access table.

v-echaithra
Community Support

Hi @bdpr_95 ,

Thank you @tayloramy, @ribisht17, @pallavi_r, @Msingh16 for your inputs.

I hope the information provided is helpful. I wanted to check whether you were able to resolve the issue with the provided solutions. Please let us know if you need any further assistance.

Thank you.

ribisht17
Super User

Hi @bdpr_95,

Using a composite model with:

  • User Access Table in Direct Lake
  • Fact Table in Import mode

Benefits

  • Fresh access control: always reflects the latest permissions without needing a full model refresh.
  • Performance: fact data is cached in memory (VertiPaq), ensuring fast visuals.
  • Separation of concerns: each table uses the most appropriate storage mode.

Limitations

  • Model complexity: relationships between Direct Lake and Import tables can introduce latency or query inconsistencies if not modeled carefully.
  • Unsupported features (as of Sept 2025): drillthrough and Analyze in Excel may not work with Direct Lake; calculated columns and Power Query transformations are limited in Direct Lake.
  • Security modeling: row-level security (RLS) across mixed storage modes requires careful testing.

Recommendation

Use a composite model only if:

  • You need real-time access control
  • Your user access table is small and stable
  • You can test interactions thoroughly

Otherwise, consider the alternative below.

Alternative: Full Import Mode + Notebook Refresh

Keep both tables in Import mode and use a notebook to refresh only the user access table daily (a sketch follows below).

Benefits

  • Simpler model: no mixed storage complexity.
  • Full feature support: drillthrough, Excel integration, calculated columns, etc.
  • Controlled refresh: you decide when and what to refresh.
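For illustration, a minimal sketch of that daily notebook, assuming a Fabric notebook calling the Power BI enhanced refresh REST API with a token from mssparkutils; the workspace ID, dataset IDs, and the "UserAccess" table name are placeholders:

```python
# Minimal sketch: daily refresh of ONLY the user access table in each monthly
# semantic model, via the Power BI enhanced refresh REST API.
# Assumptions: runs in a Fabric notebook, the executing identity is allowed to
# refresh the datasets, and all IDs/names below are placeholders.
import requests
from notebookutils import mssparkutils  # available in Fabric notebooks

WORKSPACE_ID = "<workspace-guid>"                                   # placeholder
MONTHLY_DATASET_IDS = ["<dataset-guid-1>", "<dataset-guid-2>"]      # your 10 monthly models
ACCESS_TABLE = "UserAccess"                                         # placeholder table name

# Token for the Power BI REST API ("pbi" audience).
token = mssparkutils.credentials.getToken("pbi")
headers = {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

for dataset_id in MONTHLY_DATASET_IDS:
    url = (f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
           f"/datasets/{dataset_id}/refreshes")
    # Enhanced refresh: reprocess only the user access table; the monthly
    # fact tables are left untouched.
    body = {"type": "full", "objects": [{"table": ACCESS_TABLE}]}
    response = requests.post(url, headers=headers, json=body)
    response.raise_for_status()  # 202 Accepted means the refresh was queued
```

The hard-coded list could instead be driven by a small control table of report/dataset names, as described in the original question.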

 

Regards,

Ritesh

Community Champion

Please mark the answer if helpful so that it can help others

Dance-Sing with Data -BI & Analytics

 

pallavi_r
Super User

Hi @bdpr_95 

 

I would go with Direct Lake for both: keep the daily detailed table separate and the monthly aggregated detail in a materialized view, with both accessible in Direct Lake mode.

 

Direct Lake does not need a scheduled refresh.

 

Regarding your second question on the monthly refresh of the Power BI report, please check this link and the snapshot below.

https://learn.microsoft.com/en-us/power-bi/connect-data/refresh-scheduled-refresh

(screenshot attached in the original post)

In your pipeline, you can also refresh the dataset from a notebook or leverage the semantic model refresh activity:

https://learn.microsoft.com/en-us/fabric/data-factory/semantic-model-refresh-activity
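For example, a minimal sketch of the notebook option, assuming the semantic-link (sempy) package that ships with Fabric notebooks; the workspace and dataset names are placeholders:

```python
# Minimal sketch: trigger a full refresh of each monthly semantic model from a
# notebook (e.g. called by a pipeline notebook activity).
# Assumes the semantic-link (sempy) package available in Fabric notebooks;
# workspace and dataset names are placeholders.
import sempy.fabric as fabric

WORKSPACE = "My Workspace"                                      # placeholder
MONTHLY_DATASETS = ["Monthly Report 01", "Monthly Report 02"]   # your 10 monthly models

for dataset in MONTHLY_DATASETS:
    # Starts an enhanced refresh of the whole semantic model.
    fabric.refresh_dataset(dataset=dataset, workspace=WORKSPACE)
```

The semantic model refresh activity achieves the same result without code if you prefer to keep the logic in the pipeline itself.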

 

If this post helps, appreciate your kudos.

 

Thanks,

Pallavi

 

tayloramy
Community Champion

Hi @bdpr_95

 

Why not just use Direct Lake for both the user access table and the fact tables? That is the approach I would use.

This avoids the complexity of maintaining a composite model, and if the business later decides these fact tables need to be refreshed more often, it allows for that without any additional changes.

 

This also solves problem 2 as the semantic model will always be up to date with the tables. 

 

If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution. 

@tayloramy , if I follow your approach, another question I have is about costs. My data source is Snowflake, and I’m concerned about potentially higher costs if I use mirroring, since this could require keeping a Snowflake warehouse running more often. Will I end up paying more because the warehouse needs to be online for data to be mirrored? The way I plan to get the data is through mirroring.

Hi @bdpr_95

 

Personally, I would just move the data into Fabric on a monthly schedule and have the reports sit on the Fabric-native data.

If you want to do this without a staging approach, and without keeping the Snowflake warehouse running for cost reasons, then I would recommend a more complex composite model.

 

Mirroring does incur costs: https://learn.microsoft.com/en-us/fabric/mirroring/snowflake#mirrored-snowflake-cost-considerations

 

Use Import mode for the Snowflake data and DirectQuery for the access control data, and then schedule the refresh monthly using a pipeline:

https://learn.microsoft.com/en-us/fabric/data-factory/semantic-model-refresh-activity

 

 

Now as @ribisht17 pointed out, there are limitations to this approach: 

 


@ribisht17 wrote:

Limitations

  • Model complexity: relationships between Direct Lake and Import tables can introduce latency or query inconsistencies if not modeled carefully.
  • Unsupported features (as of Sept 2025): drillthrough and Analyze in Excel may not work with Direct Lake; calculated columns and Power Query transformations are limited in Direct Lake.
  • Security modeling: row-level security (RLS) across mixed storage modes requires careful testing.

These are the reasons I would recommend staging the data in Fabric first and then using a full Direct Lake model.

 

If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution. 

Msingh16
Regular Visitor

For the monthly reports:

 

  • Keep the fact table in Import mode (since it only updates monthly).
  • Keep the user access table in Direct Query/Direct Lake mode (so permissions are always up-to-date).
  • This composite model approach works well, and report interactions will work smoothly if relationships are set correctly.

For Monthly refresh: Use the monthly refresh feature in Data Pipelines (preview) or automate the refresh via a notebook/Power Automate.
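If you would rather not depend on the preview scheduler, a simple workaround is to schedule the job daily and gate it on the date. A minimal sketch, with refresh_monthly_models() standing in for whichever refresh mechanism you choose:

```python
# Minimal sketch: turn a daily schedule into an effective monthly refresh by
# acting only on the first day of the month. refresh_monthly_models() is a
# placeholder for whichever refresh call you use (REST API, semantic-link,
# or the semantic model refresh activity).
from datetime import date

def refresh_monthly_models() -> None:
    ...  # placeholder: trigger the refresh of the 10 monthly semantic models

if date.today().day == 1:
    refresh_monthly_models()
else:
    print("Not the first of the month - skipping the monthly refresh.")
```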

 
