smpa01
Super User
Source option missing

Is there a reason why I am missing the Source option?

 

Documentation 

smpa01_0-1745531429961.png

 

What I see

smpa01_1-1745531538071.png

 

 

Did I answer your question? Mark my post as a solution!
Proud to be a Super User!
My custom visualization projects
Plotting Live Sound: Viz1
Beautiful News: Viz1, Viz2, Viz3
Visual Capitalist: Working Hrs

12 REPLIES
v-kpoloju-msft
Community Support

Hi @smpa01,
Thank you for reaching out to the Microsoft Fabric Community Forum, and thanks to @datacoffee for his inputs.

You're already using the correct documentation. I can confirm that the Source step is missing from your new table wizard. This typically happens when the table is being created outside of a Real-Time Intelligence (RTI) enabled workspace or not through an Event stream-connected flow. To resolve this, please try the following:

Verify Your Workspace Type: Ensure the workspace you're working in is a Real-Time Intelligence workspace. You can confirm this in the workspace settings. If not, you'll need to create a new workspace and select Real-Time Intelligence as the type.

 

Create the Table from the Event stream Interface: Instead of creating the table directly from the KQL database view, go to Data Factory > Event stream. From there, add a destination and choose New Table. This opens the full table creation wizard with the “Source” step included.

 

Ensure Event stream is Connected to the KQL Database: The KQL database must be registered as a destination in an Event stream to support streaming ingestion.

 

Note: The Source tab only appears if a source is being used, such as an Event stream or another streaming ingestion method. If you're manually creating a table without using a streaming source, the Source step is intentionally skipped, as it’s not required.

 

If this post helps, then please give us ‘Kudos’ and consider accepting it as a solution to help other members find it more quickly.

Thank you for using Microsoft Community Forum.

"Verify Your Workspace Type: Ensure the workspace you're working in is a Real-Time Intelligence workspace. You can confirm this in the workspace settings." - what am I looking for?

 

What am I currently doing? Creating an Eventhouse -> creating a KQL DB -> I was expecting the same as 27:49 

 

Instead, I see this after creating an Eventhouse -> creating a KQL DB:

 

smpa01_0-1745589217097.png

 

But I don't get the same option if I do this

smpa01_1-1745589291347.png

 

I need to work with what I have, so circling back to this: if I stick to this approach, where I connect to a OneLake file from this UI to build a DB table, can I expect the table to auto-refresh when the OneLake file is updated? I will be updating the files in OneLake with a notebook.

smpa01_0-1745589217097.png


Hi @smpa01,

Thank you for reaching out to the Microsoft Fabric Community Forum. The solutions provided by @datacoffee are correct.

I have also identified a few alternative workarounds that may help resolve the issue. Please follow these steps:

You are correct. When you create a KQL database directly from the Eventhouse or Fabric UI, the "Source" step does not appear in the "New Table" wizard. This behaviour is intentional.


Navigate to the workspace settings by clicking the workspace name and selecting "Settings." Under "Workspace type," you should see “Real-Time Intelligence” listed. If this is not visible, you are not in an RTI workspace. Only RTI workspaces support streaming scenarios with the full table wizard, including the Source step.


Tables created manually from OneLake files will not automatically refresh when the files are updated. This is important for setting expectations and for choosing an appropriate refresh method, such as Event Streams or notebooks.

Option A: Use Event Streams

  • Create a table from Event Streams > Add Destination > New Table.
  • This enables the full wizard with the Source step and supports auto-refreshing data when new events arrive.

Option B: Use a Notebook to Manage Refresh

  • Use a notebook to drop and recreate/update the table regularly.
  • This is more manual but gives you control if Event Streams are not part of your current setup.
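The notebook approach in Option B can be sketched roughly as follows. This is a minimal sketch under stated assumptions: the table name, schema, and source query are hypothetical placeholders, and the builder only constructs the KQL management command strings; actually executing them from a notebook would typically go through a Kusto client such as azure-kusto-data's `KustoClient.execute_mgmt`, which is omitted here.

```python
# Sketch of Option B: build the KQL management commands a notebook could run
# on a schedule to drop and recreate a table. All names are hypothetical
# placeholders, not taken from the original thread.

def build_refresh_commands(table: str, schema: str, source_query: str) -> list[str]:
    """Return the management commands for a drop-and-recreate refresh."""
    return [
        f".drop table {table} ifexists",              # remove the stale copy, if any
        f".create table {table} ({schema})",          # recreate with the desired schema
        f".set-or-append {table} <| {source_query}",  # reload the data from the source query
    ]

commands = build_refresh_commands(
    table="MyTable",
    schema="dev: string, id: long, comment: string",
    source_query="externaldata(dev: string, id: long, comment: string) [...]",
)
for cmd in commands:
    print(cmd)
```

A scheduled Fabric notebook running these commands would give you periodic refresh without Event Streams, at the cost of the data being only as fresh as the schedule.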


Please refer to the documentation links below:
Get data from Event stream - Microsoft Fabric | Microsoft Learn
Create an event stream in Microsoft Fabric - Microsoft Fabric | Microsoft Learn
Real-Time Intelligence in Microsoft Fabric documentation - Microsoft Fabric | Microsoft Learn

If this post helps, then please give us ‘Kudos’ and consider accepting it as a solution to help other members find it more quickly.

Thank you for using Microsoft Community Forum.

Hi @smpa01,

 

May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will be helpful for other community members who have similar problems to solve it faster.

Thank you.

Hi @smpa01,


I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.
Thank you.

Hi @smpa01,


I hope this information is helpful. Please let me know if you have any further questions or if you'd like to discuss this further. If this answers your question, please Accept it as a solution and give it a 'Kudos' so others can find it easily.
Thank you.

Hi

 

The easiest way to build a KQL table from a OneLake file is to create a shortcut.

 

Select New in the menu and then find the shortcut option.

 

Here you can choose an accelerated shortcut or a "normal" shortcut. The accelerated option gives you KQL query speed on the data, whereas the normal option relies on the speed of the storage read in OneLake...

 

 


If you find this reply to help with your problem, please consider hitting the accept button...
----------------
Blog: https://dcode.bi
KQL ref guide: https://aka.bi/kql
LinkedIn: https://aka.bi/follow

If I do what you are suggesting, will the KQL table always reflect up-to-date data?


Yes - the KQL engine will "listen" to the underlying parquet files and update accordingly 😊



Thanks for the response. That's great, and sorry to be a pain.

I need to clarify a few more aspects before I can put this in PROD.

I am not able to create a shortcut in the KQL database for a file residing in the lakehouse that contains an output.json like this:

[
  {
    "dev": "dev_1",
    "id": 1,
    "comment": "some_comment"
  },
  {
    "dev": "dev_2",
    "id": 2,
    "comment": "some_other_comment"
  }
]

I am getting

smpa01_0-1745607353195.png

Failed to create external table "st_2" with error: Action: command cannot be executed due to an invalid argument: argument: 'compressed'; reason: "Server is not configured for compression. Compression or compression properties aren't allowed for this schema table. Either drop the compression properties or use one of the following formats: csv, tsv, json."
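One thing worth noting, as a hedged observation rather than a confirmed fix: the error message says the allowed formats here are csv, tsv, and json, and in Kusto the `json` format expects one JSON document per line (a whole JSON array like output.json corresponds to the `multijson` format instead). A minimal Python sketch of converting the array into JSON Lines before ingestion, using the sample records from this thread:

```python
import json

def array_to_jsonl(text: str) -> str:
    """Convert a JSON array (like output.json above) to JSON Lines,
    one record per line, matching KQL's line-delimited `json` format."""
    records = json.loads(text)
    return "\n".join(json.dumps(r) for r in records)

output_json = """[
  {"dev": "dev_1", "id": 1, "comment": "some_comment"},
  {"dev": "dev_2", "id": 2, "comment": "some_other_comment"}
]"""

# Each printed line is now a standalone JSON object.
print(array_to_jsonl(output_json))
```

Whether this resolves the "compression" error for the external table in your exact setup is an assumption; it simply produces the line-delimited shape the `json` format expects.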

Oh - I had the idea that it was a parquet file.

Then you can't use a shortcut.

 

let me try to understand the situation.

 

You have a JSON file whose data you want in a KQL database.

 

How is that file generated? Can it be sent to an event hub from the source instead?

If so, you have a perfect option: use Eventstream to get the data into the KQL database and, from there, do what you need with the data.

 

Let me know - we can surely figure out a good solution for you.
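If the source can send events directly, the payloads for an event hub would just be one serialized JSON record per event. A minimal sketch, with the following assumptions: the records are the sample shape from this thread, `build_event_payloads` is a hypothetical helper, and the azure-eventhub usage shown in the comments (connection string and hub name are placeholders you'd supply) is one common sending pattern, not something prescribed by this thread:

```python
import json

def build_event_payloads(records: list[dict]) -> list[str]:
    """Serialize each record as its own event payload for an event hub."""
    return [json.dumps(r) for r in records]

records = [
    {"dev": "dev_1", "id": 1, "comment": "some_comment"},
    {"dev": "dev_2", "id": 2, "comment": "some_other_comment"},
]
payloads = build_event_payloads(records)
for p in payloads:
    print(p)

# Sending could then use the azure-eventhub package, roughly:
#   from azure.eventhub import EventHubProducerClient, EventData
#   producer = EventHubProducerClient.from_connection_string(conn_str, eventhub_name=name)
#   batch = producer.create_batch()
#   for p in payloads:
#       batch.add(EventData(p))
#   producer.send_batch(batch)
```

From there an Eventstream with the event hub as source and the KQL database as destination would give the streaming ingestion path datacoffee describes.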


datacoffee
Super User

Hi

 

Without knowing the specific details of where you are in the UI, it looks to me like you are trying to create a new table directly in the KQL database.

 

The screenshot you have from the docs looks like the one from Eventstream (or Ingestion), where you have a different UI flow.

 

let me know and I can help more 😊


