I have a database in SQL Server with a large data set that I am importing into Power BI.
The fact table has over 13 million transactions; another table, TransactionEntry, has about 25 million rows.
I connect them on TrxID with a 1-to-many relationship.
In addition, I have the following smaller tables: ITEM, STORES, DEPARTMENT, DESCRIPTION, and CATEGORY.
Sales data is imported from 2016 to date.
I built my report in Power BI Desktop and PUBLISHED the file to the Power BI Service so I can share it with the team.
My question is:
Since we work in retail, the sales data updates every day. My current practice is to REFRESH the data every day in the Power BI Desktop file and then PUBLISH the file to the Power BI Service; this takes an hour on average.
Is there a way to automate this refresh-and-publish process?
Thank you for your support and help
@Anonymous , based on your description,
I think you need to install a gateway and schedule a refresh in the Power BI Service as well.
https://docs.microsoft.com/en-us/power-bi/connect-data/refresh-data#data-refresh
https://radacad.com/the-power-bi-gateway-all-you-need-to-know
Thanks Amit,
I have the gateway in place, but the real problem is that I have to REFRESH the data in the Power BI Desktop file before publishing.
How can I automate that first step, i.e. refreshing the data in the Power BI Desktop file?
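One point worth noting: once the dataset is published and the gateway is in place, the Service can refresh the published dataset directly, so re-refreshing in Desktop and republishing is not required. Beyond the scheduled refresh UI, a refresh can also be triggered programmatically via the documented "Refresh Dataset In Group" REST endpoint. Below is a minimal sketch; the workspace (group) ID, dataset ID, and Azure AD access token are placeholders you would supply from your own tenant.

```python
import json
import urllib.request

API_BASE = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(group_id: str, dataset_id: str) -> str:
    """Build the 'Refresh Dataset In Group' endpoint URL."""
    return f"{API_BASE}/groups/{group_id}/datasets/{dataset_id}/refreshes"

def build_refresh_request(group_id: str, dataset_id: str,
                          access_token: str) -> urllib.request.Request:
    """Prepare the POST that asks the Service to refresh the dataset.

    group_id / dataset_id / access_token are placeholders; the token
    must be acquired separately (e.g. via Azure AD / MSAL).
    """
    return urllib.request.Request(
        refresh_url(group_id, dataset_id),
        data=json.dumps({"notifyOption": "MailOnFailure"}).encode(),
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# To actually fire the refresh (a 202 response means it was accepted):
# urllib.request.urlopen(build_refresh_request(GROUP_ID, DATASET_ID, TOKEN))
```

A script like this can be run from a nightly scheduled task, replacing the manual Desktop refresh entirely; the shared report picks up the refreshed data automatically.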