@bcdobbs It depends on what you want to do with the data. If your source data is already structured/curated, you generally don't need to copy it again into the lakehouse/warehouse; shortcuts will work.
@GeethaT-MSFT Will the connectors for the Fabric copy data activity be expanded to match what is available in Azure Data Factory? For instance, I have on-premises Oracle data that I would like to land in OneLake.
I fully intend to once it supports what I need! Currently the Fabric copy data activity in pipelines only has very basic API support; I need to be able to pass a bearer key in the auth header (ideally retrieved from Key Vault or the new Fabric equivalent), or, for internal Microsoft services, use a managed identity/service principal.
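For reference, something like this rough sketch is what I'd want the copy activity to handle natively (the vault URL, secret name, and source endpoint are just placeholders, using the azure-identity and azure-keyvault-secrets packages):

```python
import requests
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Authenticate as the managed identity / service principal (or local dev credentials)
credential = DefaultAzureCredential()

# Pull the bearer key for the external source from Key Vault (placeholder names)
kv = SecretClient(vault_url="https://my-keyvault.vault.azure.net", credential=credential)
bearer_key = kv.get_secret("source-api-bearer-key").value

# Call the external REST source with the key passed in the auth header
resp = requests.get(
    "https://example-source-api.com/v1/records",
    headers={"Authorization": f"Bearer {bearer_key}"},
)
resp.raise_for_status()
records = resp.json()

# For internal Microsoft endpoints, an AAD token from the same credential
# (managed identity / service principal) would be used instead of a stored key
aad_token = credential.get_token("https://storage.azure.com/.default").token
```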
I was just thinking that until that's available (I understand it's coming) I could use ADF in Azure to land the data while I experiment with Fabric.
That said, having now played with it more, I think the way to go is simply to create a shortcut to my existing ADLS storage and get the data in that way.
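Once the shortcut exists, reading the data from a Fabric notebook is straightforward. A minimal sketch, assuming a Spark notebook attached to the lakehouse and a shortcut (names are placeholders) created under the Files area:

```python
# "spark" is the pre-initialized SparkSession in a Fabric notebook.
# Assumes a shortcut named "adls_shortcut" under the lakehouse Files area,
# pointing at the existing ADLS Gen2 container (all names are placeholders).
df = spark.read.format("parquet").load("Files/adls_shortcut/sales/2023/")

# The data stays in ADLS; the shortcut just exposes it through OneLake paths,
# so nothing gets copied into the lakehouse.
df.show(5)
```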