Hey guys,
I have a Databricks source with 2 tables of more than 62,312,943 rows each. Which storage mode would be preferable for me in this case?
Also, in the same source there are 10 other small tables with 1-2M rows each.
Regards
@NimaiAhluwalia , I vote for the composite model. 😀
If that 62M-row table is very thin (not many columns), or you are on Premium, you can use Import mode.
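Whether Import mode is viable mostly comes down to the table's in-memory footprint, which is why column count matters more than row count alone. A rough back-of-envelope sketch, with hypothetical assumptions (about 8 bytes per value before compression, and a 5-10x VertiPaq compression range; real compression varies a lot with cardinality):

```python
def estimate_import_size_gb(rows, columns, bytes_per_value=8,
                            compression_low=5, compression_high=10):
    """Return (pessimistic_gb, optimistic_gb) rough in-memory estimates.

    These are illustrative numbers only: VertiPaq compression depends heavily
    on column cardinality and data types, so treat the output as an order of
    magnitude, not a measurement.
    """
    raw_gb = rows * columns * bytes_per_value / 1024**3
    return raw_gb / compression_low, raw_gb / compression_high


# The 62M-row table from the question, assuming a "thin" table of ~10 columns.
worst, best = estimate_import_size_gb(62_312_943, 10)
print(f"Estimated in-memory size: {best:.2f} - {worst:.2f} GB")
```

Under these assumptions a thin 62M-row table lands well under a gigabyte compressed, which is comfortable for Import mode; a wide table or a non-Premium capacity limit is what would push the big tables to DirectQuery while the 1-2M-row tables stay imported (the composite-model split suggested above).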