db042190
Post Prodigy

Using datasets instead of ETL

Hi, my user wants history on computer profiles that would be taken once a month for each computer. The records would have computer name, date recorded, and things like free space, BitLocker protection, OS, location, owner, etc. The purpose of this is to support a compliance report.

 

We can easily set up SSIS to do this. It's small data.

 

The question is: with no special licensing (we all have Pro) and no workspace upsells (we have what comes out of the box), can we record history in/from Power BI's semantic model instead of using ETL? Perhaps by scheduling the refresh on the 1st of every month and somehow recording the date along with all attributes for that day's profile of that computer? And having a way of keeping it in a growing repository of history without resorting to saving it in Excel manually?
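For illustration, a minimal Power Query sketch of the "record the date along with all attributes" part; the server, database, and table names are placeholders:

let
    // Placeholder source: wherever the computer profiles currently live
    Source = Sql.Database("inventory-server", "ComputerInventory"),
    Profiles = Source{[Schema = "dbo", Item = "ComputerProfiles"]}[Data],
    // Stamp every row with the date of this refresh
    Stamped = Table.AddColumn(Profiles, "SnapshotDate", each Date.From(DateTime.LocalNow()), type date)
in
    Stamped

On its own, though, a scheduled import refresh replaces the table's contents rather than appending to them, which is why the replies below look at dataflows with ADLS Gen 2 or SharePoint to persist each month's snapshot.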

1 ACCEPTED SOLUTION
R1k91
Super User

You could do this with Fabric, reading the data with Dataflows Gen 2 and storing it in a lakehouse/warehouse.
Even an F2 could work (€300/month).

 

If you can't use Fabric capacity, there's a simpler approach: mount a custom ADLS Gen 2 account to your workspace. If you create a Dataflow Gen 1 that reads the data from your sources, it will store it in CDM format in ADLS Gen 2 and you'll be able to read it back with Power Query.

 

Basically, you can schedule the dataflow to run every month; it will create a snapshot file in ADLS, and you'll be able to combine all the snapshots in a semantic model by leveraging the ADLS connector.

 

I wrote about it 4 years ago: https://medium.com/riccardo-perico/pbi-dataflows-organizational-adls-put-your-pbi-into-data-pipes-e6...

official docs: Configuring dataflow storage to use Azure Data Lake Gen 2 - Power BI | Microsoft Learn
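As a rough sketch of that read-back step in Power Query, the following stacks the snapshot CSVs the dataflow writes to ADLS Gen 2 into one history table. The storage account URL and the "ComputerProfiles.csv.snapshots" folder filter are assumptions; check the CDM folder the dataflow actually creates for the real names:

let
    // Placeholder ADLS Gen 2 account and filesystem attached to the workspace
    Source = AzureStorage.DataLake("https://mystorageacct.dfs.core.windows.net/powerbi"),
    // Keep only the CSV snapshot files written for the dataflow entity
    SnapshotFiles = Table.SelectRows(
        Source,
        each [Extension] = ".csv" and Text.Contains([Folder Path], "ComputerProfiles.csv.snapshots")
    ),
    // Parse each snapshot file
    Parsed = Table.AddColumn(SnapshotFiles, "Data", each Table.PromoteHeaders(Csv.Document([Content]))),
    // Stack every monthly snapshot into one history table
    // (the rows already carry a "date recorded" column per the requirement)
    History = Table.Combine(Parsed[Data])
in
    History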

 


--
Riccardo Perico
BI Architect @ Lucient Italia | Microsoft MVP

Blog | GitHub

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.


12 REPLIES
db042190
Post Prodigy


If you can't use Fabric capacity, there's a simpler approach: mount a custom ADLS Gen 2 account to your workspace. If you create a Dataflow Gen 1 that reads the data from your sources, it will store it in CDM format in ADLS Gen 2 and you'll be able to read it back with Power Query.

Thx super user. Before I look at what you wrote 4 yrs ago, can I do this with only a Pro license, no special workspaces, and no additional product purchases?

 

Gen 1 and Gen 2 sound like things we would have to pay for that we haven't thus far. For instance, we don't pay for data lakes.

Sure, Dataflows Gen 1 are a Power BI Pro feature.


--
Riccardo Perico
BI Architect @ Lucient Italia | Microsoft MVP

Blog | GitHub

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

OK, but we don't have data lake(s). Are we stuck?

Hello @db042190,


No, you're not entirely stuck, but your options are limited without a data lake. Power BI doesn't natively support long-term snapshot storage. A more scalable approach would be to export snapshots to a SharePoint list for better retention.
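A minimal Power Query sketch of reading such a list back into the model (the site URL and list name are placeholders, and the navigation step can differ slightly by connector version):

let
    // Placeholder SharePoint site and list that the monthly snapshot rows are appended to
    Source = SharePoint.Tables("https://contoso.sharepoint.com/sites/Compliance"),
    ProfileHistory = Source{[Title = "ComputerProfileHistory"]}[Items]
in
    ProfileHistory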

 

Please mark the helpful reply as the solution to assist others with similar queries. If you have further issues, don’t hesitate to reach out!

Thank you for using the Microsoft Fabric Community Forum.

Thx v-sgandrathi, can such exports be automated without all the things I said we don't have? Or are you saying to do those exports to SharePoint manually? The spirit of the original post was more about automating.

 

Thx R1k91. I think the original post says that of course we can produce incrementals through SSIS etc., but the ask was about saving them (snapshots) from Power BI's refreshes without 1) introducing ETL, 2) manual effort, or 3) much in the way of licensing and workspace horsepower.

Hi @db042190,

 

In such cases, Power Automate is the best approach: automating exports to SharePoint is entirely achievable with it.

 

Thank you for using the Microsoft Community Forum.

Hi @db042190,

  

We wanted to follow up since we haven't heard back from you regarding our last response. We hope your issue has been resolved.

If my answer resolved your query, please mark it as "Accept Answer" and select "Yes" if it was helpful.

If you need any further assistance, feel free to reach out.

 

Thank you for being a valued member of the Microsoft Fabric Community Forum!

Hi @db042190,

 

May I ask if you have gotten this issue resolved?

If it is solved, please mark the reply as the accepted solution; it will help other community members with similar problems solve them faster.

 

Thank you for being a part of the Microsoft Fabric Community.

Hi @db042190,

 

As we did not get a response, may I know if the above reply clarified your issue, or could you please confirm whether we can help you with anything else?

 

And if the provided information meets your requirements, please consider accepting the solution. It helps other users who are searching for the same information to find it.

 

Your understanding and patience will be appreciated.

To be honest, I have no further ideas without Fabric or another ETL system that produces the delta files by connecting to the sources.

 


--
Riccardo Perico
BI Architect @ Lucient Italia | Microsoft MVP

Blog | GitHub

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

OK, so no Fabric and no ADLS to attach.
What is the source of your data?

Are you able to produce incremental files (each containing only the delta data)?
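If those incremental files can be dropped somewhere Power Query can reach (a file share through a gateway, or a SharePoint document library), combining them is straightforward. A sketch, with the folder path as a placeholder:

let
    // Placeholder UNC path to the folder that receives the monthly delta files
    Source = Folder.Files("\\fileserver\ComputerProfiles\Monthly"),
    CsvFiles = Table.SelectRows(Source, each [Extension] = ".csv"),
    Parsed = Table.AddColumn(CsvFiles, "Data", each Table.PromoteHeaders(Csv.Document([Content]))),
    // Stack the delta files into one growing history table
    History = Table.Combine(Parsed[Data])
in
    History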


--
Riccardo Perico
BI Architect @ Lucient Italia | Microsoft MVP

Blog | GitHub

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.