I want to store the data collected from the Get Metadata activity in a lakehouse table, along with the Copy Data activity output for each table loaded in the ForEach activity. I have tried using variables of type array, and have now been testing system variables, but I haven't had any luck figuring it out so far. Does anyone have any suggestions?
Solved! Go to Solution.
The following works:
1) Feed the output of the Get Metadata activity into a parameter of a Notebook activity
2) Have a notebook similar to the following - attach it to a lakehouse
3) When the pipeline is run, this is what gets output:
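The notebook and output screenshots from the original post don't survive here. Below is a minimal sketch of such a notebook: it takes the Get Metadata output as a string parameter, parses the `childItems` array (the field Get Metadata returns when "Child Items" is selected), and builds rows ready to write to a lakehouse table. The parameter name `meta_json` and table name `metadata_log` are assumptions, not from the post, and the sample value stands in for what the pipeline would inject.

```python
import json

# Parameter cell in the real notebook - the Notebook activity's base
# parameter overrides this value at run time. Sample value shown here.
meta_json = '{"childItems": [{"name": "customers.csv", "type": "File"}, {"name": "orders.csv", "type": "Folder"}]}'

# Get Metadata's "Child Items" output is a list of {"name": ..., "type": ...}
rows = [
    {"name": item["name"], "type": item["type"]}
    for item in json.loads(meta_json).get("childItems", [])
]

# In a Fabric notebook attached to a lakehouse, the rows could then be
# persisted to a delta table with Spark, e.g.:
# spark.createDataFrame(rows).write.mode("append").saveAsTable("metadata_log")
print(rows)
```

The same pattern works for the Copy Data activity output inside the ForEach: pass each iteration's output string to the notebook and parse whichever fields you want to log.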
If this helps, please consider Accepting as a Solution to help others find it more easily.
Hi @jpelham ,
I hope this information is helpful. Feel free to reach out if you have any further questions or would like to discuss this in more detail. If the responses provided answer your question, please accept one as a solution so other community members with similar problems can find a solution faster.
Thank you!!
Hi @jpelham ,
I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If the responses have addressed your query, please accept one as a solution and give a 'Kudos' so other members can easily find it.
Thank you!!
Hi @jpelham ,
Thank you for reaching out to Microsoft Fabric Community.
Thank you @spencer_sa and @residualcorn for the responses.
May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems solve them faster.
Thank you!!
If you explicitly want the Get Metadata activity data in a table, you can have a notebook with a string input parameter, feed the activity output into that parameter via the Notebook activity, then parse the JSON in Python and write the result to a delta table using PySpark.
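On the pipeline side, the wiring described above is a base parameter on the Notebook activity. A sketch of the expression for the parameter value is below; the activity name `Get Metadata1` and the parameter name `meta_json` are assumptions, so substitute your own names.

```
meta_json : @string(activity('Get Metadata1').output)
```

`string()` converts the activity's JSON output object into a string so it can be passed into the notebook's string parameter and parsed there with `json.loads`.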
Thanks! I will give that a try!
I've stored Web activity output (which is basically the same) in a warehouse before, using the Script activity.
It is kind of hacky though, so a proper way would definitely be better; a stored procedure should work as well, I think, and be a bit cleaner, but I haven't tried it.
Here's example code for the Script activity:
DECLARE @DynamicSQL NVARCHAR(MAX);
DECLARE @JsonData NVARCHAR(MAX) = '@{variables('data')}';
-- Replace the pipeline expression with the actual value at runtime
SET @DynamicSQL = '
IF ISJSON(''' + REPLACE(@JsonData, '''', '''''') + ''') = 1
BEGIN
INSERT INTO ApiDataTable (RowId, RandomNumber, RandomString, RandomDate)
SELECT
randomId,
randomNumber,
randomString,
randomDate
FROM OPENJSON(''' + REPLACE(@JsonData, '''', '''''') + ''')
WITH (
randomId VARCHAR(100) ''$.randomId'',
randomNumber VARCHAR(100) ''$.randomNumber'',
randomString VARCHAR(100) ''$.randomString'',
randomDate VARCHAR(100) ''$.randomDate''
);
SELECT CONCAT(''Successfully inserted '', @@ROWCOUNT, '' rows into ApiDataTable'') AS Result;
END
ELSE
BEGIN
SELECT ''Error: Invalid JSON data. Please check the format.'' AS Result;
END';
EXEC sp_executesql @DynamicSQL;
Thanks for the suggestion! I tried a stored procedure, but it wasn't quite working how I had hoped and was getting messy to work with. I'll give this a try if the other solution doesn't work.