In early 2022, Microsoft released a new activity in Azure Data Factory (ADF) called the Script activity. The Script activity allows you to execute one or more SQL statements and receive zero, one, or multiple result sets as the output. This is an advantage over the stored procedure activity that was already available in ADF, which doesn't support using the result set returned from a query in a downstream activity.
However, when I went to find examples of how to reference those result sets, the documentation was lacking. It currently just says:
“For consuming activity output resultSets in down stream activity please refer to the Lookup activity result documentation.”
Microsoft Learn documentation on the Script activity
Populate the items in the ForEach activity
Similar to a Lookup activity, the Script activity can be used to populate the items in a ForEach activity, but the syntax is a bit different.
Let’s say we have a Script activity followed by a ForEach activity. The Script activity has a single result set.
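To make the example concrete, here is a sketch of what that Script activity's JSON definition might look like. The linked service name here is made up, and the query simply returns the four colA values you'll see in the output below:

{
    "name": "SCR_ScriptActivity",
    "type": "Script",
    "linkedServiceName": {
        "referenceName": "LS_AzureSqlDb",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "scripts": [
            {
                "type": "Query",
                "text": "SELECT colA FROM (VALUES (1), (2), (3), (4)) AS t(colA)"
            }
        ]
    }
}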

When I populate the items property of my ForEach activity, I use the following expression: @activity('SCR_ScriptActivity').output.resultSets[0].rows
It starts out similar to how we reference output from a Lookup activity: I reference the activity and then the output. But then, instead of value, I use resultSets[0].rows.
This makes sense when you look at the output from the activity.
{
    "resultSetCount": 1,
    "recordsAffected": 0,
    "resultSets": [
        {
            "rowCount": 4,
            "rows": [
                { "colA": 1 },
                { "colA": 2 },
                { "colA": 3 },
                { "colA": 4 }
            ]
        }
    ],
    "outputParameters": {},
    "outputLogs": "",
    "outputLogsLocation": "",
    "outputTruncated": false,
    "effectiveIntegrationRuntime": "AutoResolveIntegrationRuntime (East US)",
    "executionDuration": 1,
    "durationInQueue": {
        "integrationRuntimeQueue": 1
    },
    "billingReference": {
        "activityType": "PipelineActivity",
        "billableDuration": [
            {
                "meterType": "AzureIR",
                "duration": 0.016666666666666666,
                "unit": "Hours"
            }
        ]
    }
}
I want the output from the first (in this case, only) result set, so that's resultSets[0]. The data returned is in the rows array in that result set, so that's resultSets[0].rows.
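Putting that together, the ForEach activity's definition looks something like the sketch below. The activities array is left empty as a placeholder for whatever you run per row:

{
    "name": "FE_ForEachRow",
    "type": "ForEach",
    "dependsOn": [
        {
            "activity": "SCR_ScriptActivity",
            "dependencyConditions": [ "Succeeded" ]
        }
    ],
    "typeProperties": {
        "items": {
            "value": "@activity('SCR_ScriptActivity').output.resultSets[0].rows",
            "type": "Expression"
        },
        "activities": []
    }
}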
How are you liking the Script activity in Data Factory? Is there anything else you wish were included in the documentation? Let me know in the comments.
Thanks Meagan, this was very helpful. The only additional thing I found was when referencing the current item in an activity I needed to specify the column name from the result set, e.g. @item().colA rather than just @item(), which makes sense in hindsight.
Thanks. Glad it was helpful. Yes, when referring to a specific column inside a ForEach activity, you do use @item().colA. That is true regardless of the input source (Script, Lookup, etc.).
I’d just like to say thanks to both of you, because between you I made short work of this. A search on the concept brought back learn.microsoft stuff and some YouTube videos, but I find that blogs like this are what pay the bills. This had me up and running in about an hour. From scratch.
Thank you, Meagan. One thing worth mentioning is that by default this setup will error if the script pulls 0 rows back. You will receive an error similar to the following:
The expression 'length(activity('').output.resultSets[0].rows)' cannot be evaluated because array index '0' cannot be selected from empty array.
One workaround to this is to union a null record to the script and add logic within the foreach loop to do nothing on the null record.
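A sketch of that workaround, with a hypothetical table, column, and filter: the UNION ALL guarantees at least one row always comes back, so resultSets[0].rows always exists, and the activities inside the ForEach then skip the sentinel row where colA is NULL.

SELECT colA
FROM dbo.SourceTable
WHERE colA > 0
UNION ALL
SELECT NULL AS colA; -- sentinel row so the result set is never empty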
It is too bad that the Script Activity has the same limitation as Lookup, which is that it can only return 5,000 rows.
Yeah, but if I need to do something more than 5,000 times, I usually start looking for a better batch solution rather than a ForEach loop in ADF… like can I run a Spark script or use a stored procedure in a database to look up values in a table and handle this in one (or a few) batches rather than 5,000 ADF activity runs.
Thanks for that, it is very helpful.
In my case, the script activity has a WHERE clause, so it can also produce zero rows; i.e., the ForEach should only get executed if a certain condition is met.
When no rows are selected from the script activity, i.e. when the resultSet is empty, I get the following error on the ForEach activity:
length(activity('').output.resultSets[0].rows)' cannot be evaluated because array index '0' cannot be selected from empty array.",
"failureType": "UserError",
"target": "ForEachTable",
"details": ""
}
How would I handle an empty resultSet?
With the Lookup activity, an empty result is not an issue, as it just skips executing the ForEach if no rows are passed.
How could you solve the empty array error when this code returns zero result sets?
@activity('SCR_ScriptActivity').output.resultSets[0].rows
Another comment suggested “One workaround to this is to union a null record to the script and add logic within the foreach loop to do nothing on the null record.”
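Another option, sketched here rather than taken from the post: since it's the resultSets array itself that comes back empty when the script returns no rows, you can guard on its length in an If Condition expression before ever touching index 0:

"expression": {
    "value": "@greater(length(activity('SCR_ScriptActivity').output.resultSets), 0)",
    "type": "Expression"
}

One caveat: ADF doesn't allow a ForEach directly inside an If Condition, so with this approach the ForEach typically moves to a child pipeline that the true branch invokes with an Execute Pipeline activity.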
How do you use the output of a Script activity (millions of rows) as input for a Copy Data activity to store it in a database?
Sorry, I haven't tried that.
This IS very helpful, Meagan – I'm inching closer! 🙂 But I'm trying to loop through all the tables in an Azure SQL DB, copying each table in a parameterized schema to ADLS Gen2 CSV files for subsequent Power BI access. If I just use item().SchemaName and item().TableName in my Copy Data Source and Sink dataset connection properties, they don't resolve.
This architecture works beautifully using the Web API to get the Azure Table Storage account table names to pull those tables IN to SQL, but it falls flat using a script task to get table names to pull OUT of SQL to copy into ADLS Gen2 files.
A couple of thoughts. 1) The item() notation is used to iterate through input in a ForEach activity, so make sure you have that part working and you are getting your items correctly as input to the FE. 2) If you are referencing a table like that, you need the @ symbol in front of item(). 3) There is a template for bulk copy from SQL DB to ADLS in ADF, so you really don't have to build it yourself. It assumes you are using a control table in the lookup to get the items, but you can change that to a query or stored proc. You can read more about the template at https://learn.microsoft.com/en-us/azure/data-factory/solution-template-bulk-copy-with-control-table. I'd have to know more about what you are doing to understand if there is a reason to use a script task over a lookup.
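To illustrate point 2, passing the current item into a parameterized source dataset from a Copy activity inside the ForEach might look like the following sketch. The dataset name is hypothetical, and SchemaName and TableName are assumed to be parameters defined on that dataset:

"inputs": [
    {
        "referenceName": "DS_AzureSqlTable_Parameterized",
        "type": "DatasetReference",
        "parameters": {
            "SchemaName": {
                "value": "@item().SchemaName",
                "type": "Expression"
            },
            "TableName": {
                "value": "@item().TableName",
                "type": "Expression"
            }
        }
    }
]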
Thanks so much for the reply! A colleague and I were able to get it working after a while of futzing around with the proper syntax for setting parameters from the Copy Data task inside the FE container, which are then passed to the destination dataset.
The 30,000-foot view is that I'm integrating data between 5 disparate systems – a couple of homegrown systems which use Azure Table Storage accounts, Salesforce, and an ERP system. We're using a central data staging SQL database to stage all the data from each source system before transforming it as it gets integrated into each destination. In the midst of all this integration work, management has decided they want a central data lake where they can do analytics, aggregating and combining data from each of the source systems, as well as terabytes of historical streamed data from field sensors, which is also persisted in Azure Table Storage accounts.
I'm using a script task because I put all the data from each source system into SQL tables in the staging DB in a schema named for the source system, and I want the ADF pipeline to be flexible enough to capture new tables from each source as the integration project develops, especially with the ERP source system. If we used a lookup task or a configuration table, we would depend on the integration developers to maintain that config table – I'd rather depend on the system tables than on my own or my colleagues' diligence to maintain metadata.
I will definitely check out the bulk copy template, however!
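For anyone wanting to follow the system-tables approach described above, here is a sketch of the kind of query the Script activity could run, with a hypothetical schema name standing in for the source system:

SELECT s.name AS SchemaName, t.name AS TableName
FROM sys.tables AS t
INNER JOIN sys.schemas AS s
    ON s.schema_id = t.schema_id
WHERE s.name = 'erp'; -- hypothetical source-system schema

Each row then feeds @item().SchemaName and @item().TableName in the ForEach, as discussed above.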