Is there a reason why I am missing the Source option?
What I see
Hi @smpa01,
Thank you for reaching out to the Microsoft Fabric Community Forum, and thanks to @datacoffee for his inputs.
You're already looking at the correct documentation, and I can confirm that the Source step is missing from your New Table wizard. This typically happens when the table is created outside of a Real-Time Intelligence (RTI) enabled workspace, or not through an Event stream-connected flow. To resolve this, please try the following:
Verify Your Workspace Type: Ensure the workspace you're working in is a Real-Time Intelligence workspace. You can confirm this in the workspace settings. If not, you'll need to create a new workspace and select Real-Time Intelligence as the type.
Create the Table from the Event stream Interface: Instead of creating the table directly from the KQL database view, go to Data Factory > Event stream. From there, add a destination and choose New Table. This will open the full table creation wizard with the “Source” step included.
Ensure Event stream is Connected to the KQL Database: The KQL database must be registered as a destination in an Event stream to support streaming ingestion.
Note: The Source tab only appears if a source is being used, such as an Event stream or another streaming ingestion method. If you're manually creating a table without using a streaming source, the Source step is intentionally skipped, as it’s not required.
If this post helps, then please give us ‘Kudos’ and consider accepting it as a solution to help other members find it more quickly.
Thank you for using Microsoft Community Forum.
"Verify Your Workspace Type: Ensure the workspace you're working in is a Real-Time Intelligence workspace. You can confirm this in the workspace settings." - what am I looking for?
What am I currently doing? Creating Eventhouse -> creating KQL DB -> was expecting the same as 27:49
Instead I see this after - Creating Eventhouse -> creating KQL DB
But I don't get the same option if I do this
I need to work with what I have, so circling back to this: if I stick with this approach, where I am connecting to a OneLake file from this UI to build a DB table, can I expect it to be auto-refreshed when the same OneLake file is updated? I will be updating the files in OneLake with a notebook.
Hi @smpa01,
Thank you for reaching out to the Microsoft Fabric Community Forum. The solutions provided by @datacoffee in this thread are correct.
I have also identified a few alternative workarounds that may help resolve the issue. Please follow these steps:
You are correct. When you create a KQL database directly from the Eventhouse or Fabric UI, the "Source" step does not appear in the "New Table" wizard. This behaviour is intentional.
Navigate to the workspace settings by clicking the workspace name and selecting "Settings." Under "Workspace type," you should see “Real-Time Intelligence” listed. If this is not visible, you are not in an RTI workspace. Only RTI workspaces support streaming scenarios with the full table wizard, including the Source step.
Tables created manually from OneLake files will not automatically refresh when the files are updated. Keep this in mind when setting expectations; for automatic refresh, use one of the appropriate methods below, such as Event Streams or Notebooks.
Option A: Use Event Streams
Option B: Use a Notebook to Manage Refresh
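For Option B, a rough sketch of what such a refresh cell might look like is below. This assumes a Fabric PySpark notebook using the built-in Kusto Spark connector; the query URI, database, table, and file path are placeholders you would replace with your own values:

# Rough sketch (assumptions above): re-load a OneLake file and append it to a KQL table.
kusto_uri = "https://<your-eventhouse>.kusto.fabric.microsoft.com"  # placeholder query URI
kusto_db = "<YourKqlDatabase>"   # placeholder database name
kusto_table = "<YourTable>"      # placeholder table name

# Read the updated file from the lakehouse (placeholder path; multiLine for a JSON array file).
df = spark.read.option("multiLine", "true").json("Files/output.json")

# Get a token for the Eventhouse and append the rows to the KQL table.
access_token = mssparkutils.credentials.getToken(kusto_uri)
(df.write
    .format("com.microsoft.kusto.spark.synapse.datasource")
    .option("kustoCluster", kusto_uri)
    .option("kustoDatabase", kusto_db)
    .option("kustoTable", kusto_table)
    .option("accessToken", access_token)
    .option("tableCreateOptions", "CreateIfNotExist")
    .mode("Append")
    .save())

Note that this appends rows on every run; for a full refresh you would first clear or replace the existing data in the KQL table, or deduplicate on the query side.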
Kindly refer to the below documentation links:
Get data from Event stream - Microsoft Fabric | Microsoft Learn
Create an event stream in Microsoft Fabric - Microsoft Fabric | Microsoft Learn
Real-Time Intelligence in Microsoft Fabric documentation - Microsoft Fabric | Microsoft Learn
If this post helps, then please give us ‘Kudos’ and consider accepting it as a solution to help other members find it more quickly.
Thank you for using Microsoft Community Forum.
Hi @smpa01,
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems solve them faster.
Thank you.
Hi @smpa01,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.
Hi @smpa01,
I hope this information is helpful. Please let me know if you have any further questions or if you'd like to discuss this further. If this answers your question, please Accept it as a solution and give it a 'Kudos' so others can find it easily.
Thank you.
Hi
the easiest way to build a KQL table from a OneLake file is to create a shortcut.
select new in the menu and then find the shortcut option
here you can choose to have an accelerated shortcut or a "normal" shortcut. The accelerated option gives you KQL-engine query speed on the data, whereas the second option relies on the speed of the storage read in OneLake...
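If it helps, once the shortcut is created it surfaces in the KQL database as an external table and, as far as I know, is queried with the external_table() function regardless of whether acceleration is enabled (acceleration only affects how fast it responds). A rough sketch of reading it back from a notebook, under the same Kusto Spark connector assumption as the earlier sketch and with placeholder names, could look like this:

# Rough sketch (placeholder names): read a KQL database shortcut from a Fabric notebook.
kusto_uri = "https://<your-eventhouse>.kusto.fabric.microsoft.com"  # placeholder query URI
access_token = mssparkutils.credentials.getToken(kusto_uri)

df = (spark.read
    .format("com.microsoft.kusto.spark.synapse.datasource")
    .option("kustoCluster", kusto_uri)
    .option("kustoDatabase", "<YourKqlDatabase>")
    # Shortcuts are exposed as external tables, so the query uses external_table().
    .option("kustoQuery", 'external_table("MyShortcutTable") | take 10')
    .option("accessToken", access_token)
    .load())
df.show()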
If I do what you are suggesting, will the KQL table always reflect up-to-date data?
Yes - the KQL engine will "listen" to the underlying parquet files and update accordingly 😊
Thanks for the response. That's great and sorry to be a pain.
I need to clarify a few more aspects before I can put this in PROD.
I am not able to create a shortcut in the KQL database for a file residing in the lakehouse that contains an output.json like this
[
{
"dev": "dev_1",
"id": 1,
"comment": "some_comment"
},
{
"dev": "dev_2",
"id": 2,
"comment": "some_other_comment"
}
]
I am getting
Failed to create external table "st_2" with error: Action: command cannot be executed due to an invalid argument: argument: 'compressed'; reason: "Server is not configured for compression. Compression or compression properties aren't allowed for this schema table. Either drop the compression properties or use one of the following formats: csv, tsv, json."
Oh - I had the idea that it was a parquet file.
then you can't use a shortcut.
let me try to understand the situation.
you have a JSON file from which you want to get the data into a KQL database.
how is that file generated? Can it be sent to an event hub from the source instead?
if so, then you have a perfect option to use the Eventstream to get the data into the KQL database and from there, do what you need with the data.
let me know - we can surely figure out a good solution for you
Hi
without knowing the specific details of where you are in the UI, it looks to me like you are trying to create a new table directly in the KQL database
the screenshot you have from the docs looks like the one from Eventstream (or Ingestion) where you have a different UI flow.
let me know and I can help more 😊