Good day,
I was tasked with a new role of providing insights to clients. Unfortunately it's a small company with no funding, but I am looking to build AI Copilot + Power BI + Python in Power BI.
There's no data storage; the only data is from the production product on AWS (RDS + S3 + MongoDB), and I have no layer to ingest it. We are in discussion to possibly access it only via the APIs the product itself uses. So I thought that if I managed to get an F2 license for Copilot, I could also benefit from some storage and ETL usage of Fabric, whether in Dataverse, Fabric, or Power BI OneLake, all on the service. The previous person went to the product, exported data manually per client to CSV, and built reports.
So I want to build dashboards where I sit with clients and use the auto AI narratives + Python forecasting.
What options do I have with the F2 license, or with freeware tools, to extract data via the API that runs on the product? The plan, in agreement with the product engineers, would be to loop the API per client weekly and write the data to the service, OneDrive, or Azure.
What would you recommend I do? If I use an Azure DB, that means moving off AWS, which is also going to be a problem with the architecture. So I'm thinking I'll ask about Dataverse or Azure, or, if possible, whether Power BI could store aggregated data, but I don't think Power BI can schedule the ETL to load these tables. Basically I'm looking for freeware that I can schedule to do ETL, or for Power BI to be my ETL and storage that loops the API.
Please help, I have little to work with.
Regards
Hi @icassiem
Your situation: no data infrastructure, data in AWS (RDS/S3/MongoDB), API-only access, manual CSV exports today. Here's the short answer.
F2 (~$262/month) is your entire stack in one purchase:
Fabric Lakehouse — your storage (replaces needing Azure SQL or Dataverse)
Fabric Notebooks — your ETL (Python script that loops the API per client weekly, writes to Lakehouse)
Fabric Pipelines — your scheduler (automates the notebook runs)
Direct Lake — Power BI reads from Lakehouse with no import refresh needed
Copilot — AI narratives, natural language Q&A, DAX generation
You don't need Dataverse, Azure SQL, or to migrate off AWS. Just read from the product API and store results in Fabric Lakehouse.
Architecture:
Product API → Fabric Notebook (loops per client) → Lakehouse → Power BI + Copilot
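The notebook step above can be sketched in plain Python. This is a hypothetical outline, not the actual product integration: the client list, the `fetch` callable, and the landing path are all assumptions (in a Fabric notebook the Lakehouse Files area is mounted, commonly under `/lakehouse/default/Files`, and the weekly trigger would come from a Pipeline schedule):

```python
import json
from pathlib import Path
from typing import Callable

def ingest_weekly(
    clients: list[str],
    fetch: Callable[[str], list[dict]],
    lakehouse_root: Path,
) -> list[Path]:
    """Loop the product API once per client and land each payload
    under <lakehouse_root>/raw/<client_id>/metrics.json."""
    written = []
    for client_id in clients:
        rows = fetch(client_id)  # one API call per client
        out_dir = lakehouse_root / "raw" / client_id
        out_dir.mkdir(parents=True, exist_ok=True)
        out_file = out_dir / "metrics.json"
        out_file.write_text(json.dumps(rows))
        written.append(out_file)
    return written

# Usage with a stand-in fetch; a real notebook would call the product
# API (e.g. with requests) instead of this lambda:
if __name__ == "__main__":
    import tempfile

    root = Path(tempfile.mkdtemp())
    files = ingest_weekly(
        ["client-a", "client-b"],
        lambda cid: [{"client": cid, "revenue": 100}],
        root,
    )
    print([f.parent.name for f in files])
```

Keeping the API call behind a plain callable also makes the loop easy to test before you have real API access.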
If F2 is denied:
Use Power Automate (free with M365) to call the API weekly → write to SharePoint → Power BI reads from SharePoint on scheduled refresh. Less elegant but zero cost.
Pitch F2 as: "One subscription replaces separate ETL tools, a database, a scheduling platform, and gives us AI — 5 tools for $262/month."
Start with one client's API as a proof of concept.
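For the proof of concept, the transform step might look like the sketch below: stack each client's landed JSON into one tidy table that Power BI can read. This is a local pandas stand-in with assumed folder layout and column names; in a real Fabric notebook you would typically write the result out as a Delta table with Spark rather than keep it in memory:

```python
import json
from pathlib import Path

import pandas as pd

def build_client_table(raw_root: Path) -> pd.DataFrame:
    """Combine every <raw_root>/<client>/metrics.json into one
    DataFrame, tagging each row with the client it came from."""
    frames = []
    for client_dir in sorted(p for p in raw_root.iterdir() if p.is_dir()):
        rows = json.loads((client_dir / "metrics.json").read_text())
        df = pd.DataFrame(rows)
        df["client_id"] = client_dir.name  # one column identifies the client
        frames.append(df)
    return pd.concat(frames, ignore_index=True)
```

With a `client_id` column on the final table, a single Power BI slicer can drive per-client views instead of one report per client.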
https://learn.microsoft.com/en-us/fabric/data-engineering/lakehouse-overview
https://learn.microsoft.com/en-us/fabric/data-engineering/how-to-use-notebook
https://learn.microsoft.com/en-us/power-bi/create-reports/copilot-introduction
https://learn.microsoft.com/en-us/fabric/get-started/direct-lake-overview
Thanks
If you found this helpful, please consider giving it a kudo and marking it as the accepted solution — it goes a long way in helping others facing the same issue.
For more Power BI tips and discussions, let’s connect on LinkedIn:
https://www.linkedin.com/in/natarajan-manivasagan
Cheers!
@Natarajan_M wow, thank you!
1. In the Lakehouse, what scheduler is available for pipelines, and is there a data size or pipeline limit?
2. So my ETL is not Power BI, but Python on a scheduler that ingests and transforms data into the Lakehouse, which Power BI then consumes. Will this help with my forecasting issues too?
3. But how is Copilot used for my essay-type client quarterly/6-month feedback insights? Does it have to be plugged into Power BI, where I ask "client performance the past 6 months" with the client selected in a slicer, repeating that for every client? Or can I have it done by Python off the final data set, etc.?