LB-Tech
Helper I

Automation for backup restoration

I have automated backups that are stored automatically in a local folder every day. I want these backups to be restored automatically to SQL Server (SSMS) so that I can connect the database to Power BI for analysis. What is a cost-efficient way to automate this?
If any scripts need to be written, please provide them. Other implementation ideas are also welcome.

1 ACCEPTED SOLUTION

Hi LB-Tech,

As the backups are in SQL text file format, they cannot be directly restored as binary .bak files. The SQL text files likely contain SQL scripts for recreating database objects and data. Below is a detailed step-by-step solution for automating the process using Azure services while ensuring cost efficiency:

 

  1. Set Up Azure Blob Storage:

    • Create a storage account and container (e.g., sqltextfiles).
    • Generate a Shared Access Signature (SAS) token for secure access.
  2. Upload SQL Text Files to Azure Blob Storage:

    • Use PowerShell and the AzCopy tool to upload the SQL text files from the client machine to Azure Blob Storage on a daily basis.
    • Below is a PowerShell script for automating the file upload process. Schedule this script using Windows Task Scheduler to execute daily.

    $sourcePath = "C:\SQLTextFiles\*.sql" # Local folder containing SQL text files
    $destinationURL = "https://<StorageAccountName>.blob.core.windows.net/sqltextfiles" # Replace with your Blob URL
    $sasToken = "<Your_SAS_Token>" # Use SAS token for secure access

    # Upload files using AzCopy
    Start-Process -NoNewWindow -FilePath "C:\PathToAzCopy\azcopy.exe" `
    -ArgumentList "copy `"$sourcePath`" `"$destinationURL`" --recursive --sas-token `"$sasToken`""
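
The scheduling step can itself be scripted instead of configured by hand. A minimal sketch using schtasks.exe, assuming the upload script above is saved as C:\Scripts\UploadSqlFiles.ps1 (the task name and path are illustrative placeholders):

```powershell
# Register a daily 2:00 AM task that runs the upload script
# (task name and script path are placeholders - adjust for your environment)
schtasks /Create /TN "UploadSqlTextFiles" /SC DAILY /ST 02:00 `
    /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\UploadSqlFiles.ps1"
```

Running this once from an elevated prompt registers the recurring task; `schtasks /Query /TN "UploadSqlTextFiles"` can be used to confirm it was created.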

  3. Set Up Azure SQL Database:

    • Create an Azure SQL Database. Opt for cost-efficient options such as DTU-based pricing or serverless tiers.
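
As a rough sketch, a serverless database can also be created from the command line with the Azure CLI; the resource group, server, and database names below are placeholders:

```powershell
# Create a serverless Azure SQL database that auto-pauses after 60 minutes of
# inactivity, so compute is only billed while the database is in use
# (resource group, server, and database names are placeholders)
az sql db create --resource-group MyResourceGroup --server mysqlserver --name MyDatabase `
    --edition GeneralPurpose --compute-model Serverless --family Gen5 --capacity 1 `
    --auto-pause-delay 60
```

The auto-pause delay is what makes the serverless tier cost-efficient for a once-daily load: the database pauses between refreshes and resumes automatically on the next connection.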
  4. Automate SQL File Execution:

    • Use a PowerShell script to fetch the SQL text files from Azure Blob Storage and execute them on the Azure SQL Database.
    • Below is a PowerShell script for automating the script execution process. Schedule this script using Windows Task Scheduler.

    $storageAccount = "<StorageAccountName>"
    $containerName = "sqltextfiles"
    $sasToken = "<Your_SAS_Token>"
    $sqlServer = "<AzureSQLServerName>.database.windows.net" # Replace with Azure SQL Server
    $databaseName = "<DatabaseName>"
    $username = "<SQLUsername>"
    $password = "<SQLPassword>"
    $destinationPath = "C:\DownloadedSQLFiles"

    # Ensure the local download folder exists
    New-Item -ItemType Directory -Force -Path $destinationPath | Out-Null

    # List files in Blob Storage (JSON output so blob names can be parsed)
    $blobs = az storage blob list --account-name $storageAccount --container-name $containerName `
    --sas-token $sasToken --output json | ConvertFrom-Json

    foreach ($blob in $blobs) {
    $fileName = $blob.name
    $localFilePath = Join-Path $destinationPath $fileName

    # Download SQL file
    az storage blob download --account-name $storageAccount --container-name $containerName `
    --name $fileName --file $localFilePath --sas-token $sasToken

    # Execute SQL file on Azure SQL Database (-InputFile handles GO batch separators)
    Invoke-Sqlcmd -ServerInstance $sqlServer -Database $databaseName `
    -Username $username -Password $password -InputFile $localFilePath
    }

  5. Create and Publish Reports in Power BI:

    • Connect Power BI to the Azure SQL Database, create the required reports, and publish them to the Power BI Service.
    • Set up a scheduled refresh to ensure the reports reflect the latest data.

Cost-Efficiency Recommendations:

  1. Opt for serverless or basic tiers for Azure SQL Database to minimize costs.
  2. Use hot or cool access tiers in Azure Blob Storage depending on the frequency of file retrieval.
  3. Utilize free tools such as PowerShell and AzCopy to reduce additional software costs.
  4. Leverage Windows Task Scheduler as a free automation solution.

If you find the response helpful, kindly mark it as the accepted solution and provide kudos, as this will assist other members with similar queries.

 

Best regards,
Pavan


6 REPLIES
LB-Tech
Helper I

The catch here is that the backups are stored on the client machine while Power BI runs on my machine (different locations), so how do we handle this?

 

Also, I need this process to be fully automated, possibly using Azure.

Hi LB-Tech,

 

Please find below the detailed steps to ensure that the backup files are efficiently transferred, restored, and made available for Power BI analysis using Azure services:

  1. Set up Azure Blob Storage:

    • Create a storage container to store the daily .bak backup files.
  2. Automate Backup File Upload:

    • Use PowerShell and the AzCopy tool to upload .bak files from the client machine to Azure Blob Storage.
    • Install AzCopy on the client machine. The PowerShell script below uploads the backups to Azure Blob Storage; schedule it to run daily using Windows Task Scheduler.

    # PowerShell Script to Upload Backups to Azure Blob Storage
    $sourcePath = "C:\SQLBackups\*.bak" # Path to local backup folder
    $destinationURL = "https://<YourStorageAccountName>.blob.core.windows.net/backups" # Replace with your Blob URL
    $sasToken = "<Your_SAS_Token>" # Use SAS token for secure access

    # Use AzCopy to upload files to Azure Blob Storage
    Start-Process -NoNewWindow -FilePath "C:\PathToAzCopy\azcopy.exe" `
    -ArgumentList "copy `"$sourcePath`" `"$destinationURL`" --recursive --sas-token `"$sasToken`""

  3. Restore the .bak File Automatically to SQL Server on an Azure Virtual Machine (VM):
    a) Provision an Azure Virtual Machine with SQL Server pre-installed (e.g., SQL Server Developer Edition or Standard Edition).
    b) Use a PowerShell script to fetch the latest .bak file from Azure Blob Storage and restore it to the SQL Server instance running on the Azure VM. The PowerShell script below downloads the latest backup and restores it; schedule it to run daily on the Azure VM using Windows Task Scheduler.

    # PowerShell Script to Download Latest Backup and Restore to SQL Server
    $storageAccount = "<YourStorageAccountName>"
    $containerName = "backups"
    $destinationPath = "C:\SQLRestores\"
    $sasToken = "<Your_SAS_Token>"
    $sqlInstance = "localhost" # Replace with SQL Server instance name
    $databaseName = "YourDatabaseName"

    # Find the latest backup file (sort by last-modified timestamp, newest first)
    $latestBackup = az storage blob list --account-name $storageAccount `
    --container-name $containerName --sas-token $sasToken --output json |
    ConvertFrom-Json | Sort-Object { [datetime]$_.properties.lastModified } -Descending |
    Select-Object -First 1 -ExpandProperty name

    # Download the latest backup
    $localBackupPath = Join-Path $destinationPath $latestBackup
    az storage blob download --account-name $storageAccount `
    --container-name $containerName --name $latestBackup `
    --file $localBackupPath --sas-token $sasToken

    # Restore the database over any existing copy
    $restoreCommand = "
    RESTORE DATABASE [$databaseName]
    FROM DISK = N'$localBackupPath'
    WITH REPLACE, RECOVERY
    "
    Invoke-Sqlcmd -ServerInstance $sqlInstance -Query $restoreCommand

  4. Connect Restored Database to Power BI:

    • Once the database is restored on the SQL Server in Azure VM, connect to it using Power BI Desktop.
    • Create the required reports and publish them to Power BI Service.
    • Set up a scheduled refresh to ensure the reports reflect the latest data.

If you find this response helpful, please mark it as the accepted solution and provide kudos, as it will assist other community members with similar queries.

 

Best Regards,
Pavan

My backup files are in SQL text file format. Please suggest some cost-efficient methods.


xy665423
Community Support

Hi @LB-Tech,

Thank you for reaching out through the Microsoft Fabric Community Forum.

 

To automate the restoration process of daily backups to SQL Server, please follow the steps mentioned below:

  1. Save the daily backups in a specific folder (e.g., C:\SQLBackups\). Ensure that you have the necessary access permissions for SQL Server (e.g., sa credentials or Windows authentication).
  2. Use a SQL Server Agent Job or a PowerShell script to restore the latest backup file. The attached PowerShell script identifies the most recent .bak file in the backup directory and restores it to the SQL Server instance. Save the script as RestoreLatestBackup.ps1.
  3. Automate the script execution using Windows Task Scheduler or SQL Server Agent. Open Task Scheduler, provide a name for the task, select the trigger frequency, and fill in the Action section (e.g., Program/Script: powershell.exe, Add Arguments: -ExecutionPolicy Bypass -File "C:\Scripts\RestoreLatestBackup.ps1").
  4. Use the restored database as a data source in Power BI for analysis.
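
The restore script referenced in step 2 is not shown in this thread. A minimal sketch of what RestoreLatestBackup.ps1 might contain, assuming the folder, instance, and database names used earlier in the thread (all placeholders):

```powershell
# RestoreLatestBackup.ps1 - illustrative sketch; paths and names are placeholders
# Requires the SqlServer PowerShell module for Invoke-Sqlcmd
$backupFolder = "C:\SQLBackups"
$sqlInstance  = "localhost"
$databaseName = "YourDatabaseName"

# Pick the most recent .bak file by last-write time
$latest = Get-ChildItem -Path $backupFolder -Filter *.bak |
    Sort-Object LastWriteTime -Descending | Select-Object -First 1

# Restore over any existing copy of the database
$restoreSql = @"
RESTORE DATABASE [$databaseName]
FROM DISK = N'$($latest.FullName)'
WITH REPLACE, RECOVERY
"@
Invoke-Sqlcmd -ServerInstance $sqlInstance -Query $restoreSql
```

WITH REPLACE overwrites the existing database each day, which matches the daily-refresh workflow described above; drop it if you instead want the restore to fail when the database already exists.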

 

By following the above steps, the backup will be restored automatically without any manual intervention, ensuring that the data is ready for Power BI analysis on a daily basis.

Tools such as PowerShell and SQL Server Agent are available for free with SQL Server. Similarly, Windows Task Scheduler is free and comes pre-installed in all Windows environments. Power BI Desktop is also free, making this solution highly cost-effective.

 

If you find the response helpful, please mark it as the accepted solution and provide kudos. This will help other members with similar queries.

 

Best regards,

Pavan
