Charline_74
Resolver I

How to retrieve the output of a look-up activity in a notebook

Hi,

I have a pipeline in which I perform a lookup on a table and retrieve this:

 

{ "count": 7, "value": [ { "WorkspaceID": "WKS13", "WorkspaceName": "Test Workspace 5", "UserEmail": "email" }, { "WorkspaceID": "WKS9", "WorkspaceName": "Test Workspace 9", "UserEmail": "email" }, { "WorkspaceID": "WKS9", "WorkspaceName": "Test Workspace 9", "UserEmail": "email" }, { "WorkspaceID": "WKS12", "WorkspaceName": "Test Workspace 12", "UserEmail": "email" }, { "WorkspaceID": "WKS11", "WorkspaceName": "Test Workspace 11", "UserEmail": "email" }, { "WorkspaceID": "WKS15", "WorkspaceName": "Test Workspace 11", "UserEmail": "email" }, { "WorkspaceID": "WKS16", "WorkspaceName": "Test Workspace 11", "UserEmail": "email" } ] }
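For context, the Lookup output above is a JSON object with a `count` field and a `value` array of rows. Once it reaches a notebook as a string, it parses like this (abbreviated here to the first two rows):

```python
import json

# Abbreviated copy of the Lookup output shown above (first two rows only).
lookup_output = '''
{ "count": 7,
  "value": [
    { "WorkspaceID": "WKS13", "WorkspaceName": "Test Workspace 5", "UserEmail": "email" },
    { "WorkspaceID": "WKS9",  "WorkspaceName": "Test Workspace 9", "UserEmail": "email" }
  ] }
'''

parsed = json.loads(lookup_output)
print(parsed["count"])                             # row count reported by the Lookup
print([r["WorkspaceID"] for r in parsed["value"]]) # one dict per row in "value"
```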

 

And I would like to retrieve this in a notebook.

Do you have a solution?

1 ACCEPTED SOLUTION

Hi @Charline_74,

 

I'm not sure about the error, but that part is the processing of the data within the notebook.
Instead of writing it out here, check out this video: Passing Variables to Notebooks in Fabric Metadata Driven Pipelines.
It helped me implement this pattern of passing the data to the notebook.

 

Best regards!

 


7 REPLIES
v-karpurapud
Community Support

Hi @Charline_74 

We have not received a response from you regarding the query and are following up to check whether you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions.

 

Thank You.

v-karpurapud
Community Support

Hi @Charline_74 

I would also like to thank @Mauro89 and @Gopikrsihna for your active participation and for sharing solutions within the community forum.

I hope the information provided helps resolve your issue. If you have any further questions or need additional assistance, please feel free to contact us. We are always here to help.

 

Best regards,
Community Support Team

Gopikrsihna
Frequent Visitor

Another solution: store the output in a temporary table/file and read it in the notebook.

In ADF:

1. After the Lookup activity, add a Copy Data activity.

2. Source in the Copy activity: use the Lookup's output.

    Sink in the Copy activity: write it to a temporary location, either a table, Azure Blob, or ADLS.

3. In the notebook, use this code:

# Read from the temporary location
df = spark.read.json("abfss://container@storage.dfs.core.windows.net/temp/lookup_result.json")
df.show()

 

Gopikrsihna
Frequent Visitor

Solution: Pass Lookup Output as Notebook Parameter
In your ADF pipeline, after the Lookup activity, add a Notebook activity and configure it to pass the lookup result as a parameter.

ADF Pipeline Configuration:
1. In the Notebook activity, go to Settings > Base parameters

2. Add a parameter (e.g., lookup_data) with this expression:
   @string(activity('YourLookupActivityName').output.value)
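The `@string(...)` wrapper matters because base parameters must be strings: the pipeline serializes the `value` array to JSON text, and the notebook reverses it with `json.loads`. A minimal round-trip sketch of that idea in plain Python (no pipeline needed; the sample row is illustrative):

```python
import json

# What the Lookup's output.value looks like inside the pipeline: a list of rows.
output_value = [{"WorkspaceID": "WKS13", "WorkspaceName": "Test Workspace 5", "UserEmail": "email"}]

# @string(...) in the pipeline expression serializes the list to a JSON string...
lookup_data_str = json.dumps(output_value)

# ...and json.loads in the notebook turns it back into a list of dicts.
lookup_data = json.loads(lookup_data_str)
assert lookup_data == output_value
```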

 

3. In the notebook (Python/PySpark):
import json

# Get the parameter passed from ADF
lookup_data_str = dbutils.widgets.get("lookup_data")

# Parse the JSON string
lookup_data = json.loads(lookup_data_str)

# Now you can work with the data
print(f"Total records: {len(lookup_data)}")

# Iterate through the results
for item in lookup_data:
    workspace_id = item['WorkspaceID']
    workspace_name = item['WorkspaceName']
    user_email = item['UserEmail']
    print(f"WorkspaceID: {workspace_id}, Name: {workspace_name}, Email: {user_email}")

# Or convert to a DataFrame
from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()

df = spark.createDataFrame(lookup_data)
df.show()

Charline_74
Resolver I

Hi @Mauro89
I am getting this error: 

NameError: name 'dbutils' is not defined
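One likely cause (an assumption on my side, not confirmed in the thread): `dbutils` is a Databricks API, and Microsoft Fabric notebooks do not expose it. In Fabric, a pipeline base parameter is usually received by marking a cell as a parameter cell and declaring a variable with the same name as the parameter, roughly like this:

```python
import json

# Parameter cell (in Fabric: enable "Toggle parameter cell" on this cell).
# The default below is overridden at run time by the pipeline's base parameter;
# the variable name must match the parameter name ("lookup_data" here is an example).
lookup_data = "[]"

# The rest of the notebook then parses it exactly as in the dbutils-based answer.
rows = json.loads(lookup_data)
print(f"Total records: {len(rows)}")
```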
 



Mauro89
Super User

Hi @Charline_74,

 

Yes, you can do that. Here is a common method.

 

Method: Pass the Lookup output to the Notebook as a parameter

 

1. In your pipeline

 

After the Lookup activity (for example, named LookupWorkspaces), add a Notebook activity.

 

In the Notebook activity → Base parameters, pass the lookup result:

 

Parameter name: lookupResult

Value:

@activity('LookupWorkspaces').output.value

This passes only the value array (the list of rows).

If you want the entire JSON output:

@activity('LookupWorkspaces').output
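To illustrate the difference between the two expressions: `output` is the whole object (including `count`), while `output.value` is just the row list. A small sketch with plain Python, using abbreviated sample rows:

```python
import json

# Shape of @string(activity(...).output): the whole object, count included.
whole = json.loads('{"count": 2, "value": [{"WorkspaceID": "WKS13"}, {"WorkspaceID": "WKS9"}]}')

# Shape of @string(activity(...).output.value): only the list of rows.
rows = json.loads('[{"WorkspaceID": "WKS13"}, {"WorkspaceID": "WKS9"}]')

assert whole["value"] == rows
assert whole["count"] == len(rows)
```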

 

2. In your notebook (PySpark or Python)

 

PySpark

dbutils.widgets.get("lookupResult")

 

Python

If using a Python cell:

import json

 

lookup_json = dbutils.widgets.get("lookupResult")

lookup_list = json.loads(lookup_json)

 

lookup_list

This gives you a Python list of dicts:

[
    {"WorkspaceID": "WKS13", "WorkspaceName": "Test Workspace 5", "UserEmail": "email"},
    {"WorkspaceID": "WKS9",  "WorkspaceName": "Test Workspace 9", "UserEmail": "email"},
    ...
]

You can convert it into a Spark DataFrame if needed:

df = spark.createDataFrame(lookup_list)

df.show()

 

If this helps, leave kudos and mark it as solution. 

Best regards, Mauro
