
BalajiL
Helper III

Quickbase integration to Fabric

Hi Team,

 

Fabric does not have a native connector for Quickbase. In our scenario, we need to ingest historical data that exists in Quickbase into a lakehouse.

Can you share some insights on how this can be achieved? CData Sync is an option we are looking at, but it would incur an additional cost. Is there any alternate way the data can be ingested into Fabric?

 

Thanks!

4 REPLIES
v-achippa
Community Support

Hi @BalajiL,

 

Thank you for reaching out to Microsoft Fabric Community.

 

Thank you @tayloramy and @Ugk161610 for the prompt response. 

 

As we haven’t heard back from you, we wanted to kindly follow up to check whether the solution provided by the other users worked for your issue. Please let us know if you need any further assistance.

 

Thanks and regards,

Anjan Kumar Chippa

Hi @BalajiL,

 

We wanted to kindly follow up to check whether the solution provided by the other users worked for your issue. Please let us know if you need any further assistance.

 

Thanks and regards,

Anjan Kumar Chippa

Ugk161610
Continued Contributor

Hi @BalajiL ,

 

You’re right – there’s no native Fabric connector for Quickbase today, so you won’t get a “click-next-next-finish” style integration like you do for SQL or SaaS sources. What you can do, though, is still bring the data in using Fabric plus either Quickbase’s APIs or exports, without paying for CData if you don’t want to.

 

In practice, people usually do one of these:

 

For a one-time or rare historical load, the simplest path is: export data from Quickbase as CSV (from the UI or a basic script), drop those files into a Storage account or straight into Lakehouse Files, and then use a Fabric Data Pipeline or notebook to load them into Delta tables. For a big historical dump this is often enough, and you don’t need any extra tools.
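Once the exported CSVs are sitting in Lakehouse Files, the load step can be a short notebook cell. A minimal sketch, assuming PySpark in a Fabric notebook where `spark` is predefined — the folder path, table name, and the header-normalization helper are all illustrative, not a fixed convention:

```python
import re

def snake_case(name: str) -> str:
    """Normalize a CSV header like 'Date Created' to 'date_created'
    so it is a valid, consistent Delta column name."""
    return re.sub(r"[^0-9a-zA-Z]+", "_", name).strip("_").lower()

def load_quickbase_csvs(spark, src="Files/quickbase/historical/", table="quickbase_history"):
    """Read the exported CSV files and write them to a Delta table."""
    df = spark.read.option("header", True).csv(src)
    for col in df.columns:
        df = df.withColumnRenamed(col, snake_case(col))
    df.write.mode("overwrite").format("delta").saveAsTable(table)

# In a Fabric notebook the Spark session already exists:
# load_quickbase_csvs(spark)
```

For a one-off historical load, `overwrite` mode keeps reruns idempotent; switch to `append` only if you are landing the export in batches.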

 

For ongoing loads without CData, the realistic option is to sit in the middle with something that can call Quickbase’s REST APIs and then write to storage/Lakehouse. That can be:

 

  • A small Python notebook in Fabric that calls Quickbase’s API, pulls the data, and writes it into Lakehouse tables.
  • Or a Power Automate / Logic App flow using HTTP actions to call Quickbase and then write the response into a Storage container, from which Fabric pipelines pick it up.

In all of these cases, Fabric itself is only handling the “from storage into Lakehouse” part – the hop from Quickbase into storage is done via API calls or manual exports, because there isn’t a built-in connector yet.
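The notebook approach above could look roughly like this. A sketch only: the realm hostname, table id, and user token are placeholders, the endpoint and headers follow Quickbase's documented JSON RESTful API, and the flattening helper assumes the response's `data`/`fields` shape, where each cell arrives as `{"<field id>": {"value": ...}}`:

```python
import json
import urllib.request

def fetch_records(realm: str, table_id: str, token: str, skip: int = 0, top: int = 1000) -> dict:
    """Query one page of records from Quickbase's records/query endpoint."""
    req = urllib.request.Request(
        "https://api.quickbase.com/v1/records/query",
        data=json.dumps({"from": table_id, "options": {"skip": skip, "top": top}}).encode(),
        headers={
            "QB-Realm-Hostname": realm,            # e.g. "yourrealm.quickbase.com"
            "Authorization": f"QB-USER-TOKEN {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.load(resp)

def flatten(payload: dict) -> list:
    """Turn Quickbase's {'6': {'value': ...}} record shape into plain rows
    keyed by field label, using the 'fields' metadata in the response."""
    labels = {str(f["id"]): f["label"] for f in payload.get("fields", [])}
    return [
        {labels.get(fid, fid): cell["value"] for fid, cell in record.items()}
        for record in payload.get("data", [])
    ]

# rows = flatten(fetch_records("yourrealm.quickbase.com", "bq1234567", "<token>"))
# spark.createDataFrame(rows).write.format("delta").saveAsTable("quickbase_raw")
```

Paginate with `skip`/`top` until the page comes back short, then land the flattened rows into a Delta table as in the commented lines.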

 

So short answer:
Without CData or another paid connector, you’ll need a bit of custom glue (API + script/flow) to land Quickbase data into storage, and then use Fabric pipelines/notebooks to bring it into the Lakehouse. There’s no fully native “Quickbase → Fabric” connector right now.

 

– Gopi Krishna

 

 

tayloramy
Community Champion

Hi @BalajiL

 

Does Quickbase have any public APIs? If so, you can probably write some Python code in a notebook to populate a lakehouse.

 

If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.  
