Hi,
I have backups stored on my local drive in SQL text file format. I want these converted into Parquet files, and this should be an automated task that does not involve SSMS.
How can this task be automated?
Hi @LB-Tech
What is in the SQL text file? The SQL statements for querying the desired data, or the query result data itself?
You could consider using the Pandas library in a Python script.
Reference: hdfs - Python: save pandas data frame to parquet file - Stack Overflow
Please note that you need to install the necessary libraries in your system.
To automate the task, you can use a task scheduler to run this script at regular intervals. For example, on Windows you can use Task Scheduler, and on Linux you can use cron jobs. For a Fabric notebook, you can schedule its runs.
Hope this would be helpful.
Best Regards,
Jing
If this post helps, please Accept it as Solution to help other members find it. Appreciate your Kudos!
The file contains the CREATE statements for the tables and the data exported from the software.
It is actually a backup file that I took from a software product. It has 200 to 300 tables, of which we want only 5, and the largest of the selected tables has over 300,000 (3 lakh+) rows.
Hi @LB-Tech
You can use a combination of Python and Apache Spark. My idea is as follows:
Handling a table with over 300,000 rows might be challenging for Pandas, especially if you have limited memory. Given your requirements, you may consider using PySpark to handle the data extraction and conversion to Parquet files. This approach will ensure better performance and scalability for your task.
I'm not sure what format the data in your backup files is, so it's difficult to provide specific code examples. You can try asking Copilot or ChatGPT to help you organize some sample code.
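To illustrate one possible direction, here is a hedged sketch that pulls the rows for a handful of wanted tables out of a SQL text backup. Everything here is an assumption about a "typical" dump: it expects simple single-row `INSERT INTO table VALUES (...);` statements, and the naive value splitting would need a real parser if your data contains quoted commas. Adjust the regex to match the actual file before relying on it:

```python
# Hedged sketch: extract rows for selected tables from a SQL text backup
# (CREATE + INSERT statements) without SSMS. The INSERT layout and the
# regex are assumptions about a typical dump -- verify against your file.
import re
from collections import defaultdict

INSERT_RE = re.compile(
    r"INSERT\s+INTO\s+\[?(?P<table>\w+)\]?\s*(?:\([^)]*\))?"
    r"\s*VALUES\s*\((?P<values>.*?)\);",
    re.IGNORECASE | re.DOTALL,
)

def extract_rows(sql_text: str, wanted_tables: set) -> dict:
    """Collect raw VALUES tuples per wanted table from a SQL dump string."""
    rows = defaultdict(list)
    for m in INSERT_RE.finditer(sql_text):
        table = m.group("table")
        if table in wanted_tables:
            # Naive split; quoted values containing commas need a real parser.
            rows[table].append(
                [v.strip().strip("'") for v in m.group("values").split(",")]
            )
    return dict(rows)
```

The per-table row lists could then be handed to PySpark (`spark.createDataFrame(...).write.parquet(...)`) or Pandas to produce one Parquet file per table, which keeps the 300,000+ row table out of a single in-memory text scan in SSMS.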
Best Regards,
Jing
If this post helps, please Accept it as Solution to help other members find it. Appreciate your Kudos!