Hi Team,
Fabric does not have a native connector for Quickbase, and in our scenario we need to ingest historical data that exists in Quickbase into a lakehouse.
Can you share some insights on how this can be achieved? CData Sync is an option we are looking at, but there is an additional cost involved. Is there an alternate way the data can be ingested into Fabric?
Thanks!
Hi @BalajiL,
Thank you for reaching out to Microsoft Fabric Community.
Thank you @tayloramy and @Ugk161610 for the prompt response.
As we haven’t heard back from you, we wanted to kindly follow up and check whether the solutions provided by the community resolved your issue. Please let us know if you need any further assistance.
Thanks and regards,
Anjan Kumar Chippa
Hi @BalajiL,
We wanted to kindly follow up and check whether the solutions provided by the community resolved your issue. Please let us know if you need any further assistance.
Thanks and regards,
Anjan Kumar Chippa
Hi @BalajiL ,
You’re right: there’s no native Fabric connector for Quickbase today, so you won’t get a “click-next-next-finish” style integration like you do for SQL or SaaS sources. What you can still do, though, is bring the data in using Fabric plus either Quickbase’s APIs or exports, without paying for CData if you don’t want to.
In practice, people usually do one of these:
For a one-time or rare historical load, the simplest path is: export data from Quickbase as CSV (from the UI or a basic script), drop those files into a Storage account or straight into Lakehouse Files, and then use a Fabric Data Pipeline or notebook to load them into Delta tables. For a big historical dump this is often enough, and you don’t need any extra tools.
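If you go the CSV route, the Lakehouse side can be just a couple of notebook lines. A minimal sketch, assuming the exports were uploaded under Files/quickbase/ in the default Lakehouse (the path and table name are illustrative):

```python
# Fabric PySpark notebook: load the exported Quickbase CSVs into a Delta table.
# "spark" is predefined in Fabric notebooks; the path assumes the files sit
# under Files/quickbase/ in the attached Lakehouse.
df = (
    spark.read
    .option("header", "true")        # Quickbase exports include a header row
    .option("inferSchema", "true")   # acceptable for a one-time historical load
    .csv("Files/quickbase/*.csv")
)

# Land the data as a Delta table in the Lakehouse
df.write.mode("overwrite").format("delta").saveAsTable("quickbase_history")
```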
For ongoing loads without CData, the realistic option is to sit in the middle with something that can call Quickbase’s REST APIs and then write to storage/Lakehouse. That can be, for example:
- a Fabric notebook (Python) that calls the Quickbase API on a schedule and writes the results to Lakehouse tables (see the sketch below);
- an Azure Function, Logic App, or similar lightweight service that pulls from the API and lands files in a Storage account; or
- any scheduled script running outside Fabric that exports via the API and drops files into storage for a pipeline to pick up.
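To make the notebook option concrete, here is a minimal sketch of paging through Quickbase’s records-query endpoint. The realm hostname, user token, and table ID are placeholders, and the exact request/response shape should be double-checked against Quickbase’s API documentation:

```python
import requests

# Placeholders: your Quickbase realm, a user token, and the table ID
HEADERS = {
    "QB-Realm-Hostname": "yourrealm.quickbase.com",
    "Authorization": "QB-USER-TOKEN b1234_your_token",
}

def fetch_all_records(table_id: str, page_size: int = 1000) -> list[dict]:
    """Page through Quickbase's records-query endpoint until all rows are fetched."""
    records, skip = [], 0
    while True:
        resp = requests.post(
            "https://api.quickbase.com/v1/records/query",
            headers=HEADERS,
            json={"from": table_id, "options": {"skip": skip, "top": page_size}},
            timeout=60,
        )
        resp.raise_for_status()
        body = resp.json()
        records.extend(body["data"])
        fetched = body["metadata"]["numRecords"]
        skip += fetched
        # Stop when the API reports we've seen everything (or returns nothing)
        if fetched == 0 or skip >= body["metadata"]["totalRecords"]:
            return records
```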
In all of these cases, Fabric itself is only handling the “from storage into Lakehouse” part – the hop from Quickbase into storage is done via API calls or manual exports, because there isn’t a built-in connector yet.
So short answer:
Without CData or another paid connector, you’ll need a bit of custom glue (API + script/flow) to land Quickbase data into storage, and then use Fabric pipelines/notebooks to bring it into the Lakehouse. There’s no fully native “Quickbase → Fabric” connector right now.
– Gopi Krishna
Hi @BalajiL,
Does Quickbase have any public APIs? If so, you can probably write some Python code in a notebook to populate a lakehouse.
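For reference, Quickbase does expose a public REST API (api.quickbase.com), so that notebook could look roughly like this. The realm, token, and table ID are placeholders, it fetches a single page for brevity, and the flattening step assumes the field-ID-keyed shape the query endpoint returns:

```python
import pandas as pd
import requests

# Placeholders: your Quickbase realm, a user token, and the table ID
resp = requests.post(
    "https://api.quickbase.com/v1/records/query",
    headers={
        "QB-Realm-Hostname": "yourrealm.quickbase.com",
        "Authorization": "QB-USER-TOKEN b1234_your_token",
    },
    json={"from": "bq1234abc"},
    timeout=60,
)
resp.raise_for_status()
body = resp.json()

# Records come back keyed by field ID, e.g. {"6": {"value": "Acme"}};
# use the response's "fields" metadata to turn IDs into readable column names
field_names = {str(f["id"]): f["label"] for f in body["fields"]}
rows = [
    {field_names[fid]: cell["value"] for fid, cell in rec.items()}
    for rec in body["data"]
]

# "spark" is predefined in a Fabric notebook
spark.createDataFrame(pd.DataFrame(rows)).write.mode("overwrite").saveAsTable("quickbase_raw")
```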
If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.