Loop Over Items
## Overview

The Loop Over Items node (Logic → Loop Operations) processes each element of an array, optionally in batches, with control over concurrency and rate limiting. Use it when you need to run the same set of steps for many items (e.g., many users, records, or URLs) without hitting API limits.

## When to use

Use this node when:

- You receive a list/array from a previous node (e.g., a DB query or an API response) and must perform the same action for each entry.
- You want to chunk a large list into manageable batches.
- You need to throttle external calls (e.g., API requests) to avoid rate-limit errors.
- You want to control sequential vs. parallel execution for reliability and performance.

## Core concepts

- **Data source** – the array that will be looped over.
- **Batch size** – the number of items processed in one batch.
- **Execution mode / concurrency** – whether items run sequentially or in parallel.
- **Per-item context** – each loop iteration exposes the current item to inner nodes.
- **Aggregate output** – collects the results of all iterations when the loop finishes.

## Node configuration

### 1. Data source

Field on node: **Data Source** (not set by default). Bind this to an array from a previous node (e.g., `{{ nodes.get_users.data }}`). The node will iterate over each element of this array. If the input is not an array, the node will not be able to loop as expected.

### 2. Batch size

Field on node: **Batch Size** (default: 1). Controls how many items are processed per batch:

- `1` – process each item individually.
- `n > 1` – process items in groups of `n`.

Batching is useful when:

- the downstream API supports small parallel workloads;
- you want to limit memory/CPU by processing fixed-size chunks.

### 3. Execution / concurrency mode

Field on node: e.g., **Sequential** (visible button/state). Determines how batches/items are executed:

- **Sequential** – processes one item/batch at a time, in order. Safest and easiest to debug; helps avoid rate limits and race conditions.
- **Concurrent / parallel** (if enabled in your workspace) – processes multiple items at once, up to a concurrency limit. Faster for large lists, but use with care for APIs with strict limits.

Some configurations may expose explicit concurrency and delay settings:

- **Concurrency** – maximum number of items processed in parallel.
- **Delay between batches** – wait time (in ms/seconds) between batches, used to throttle calls.

### 4. Loop outputs

The node exposes two main types of outputs:

- **Per-item context** – inside the loop branch, you can reference the current item (e.g., `{{ loop.item }}` or an alias defined by the platform). Use this to access properties such as `item.id`, `item.email`, `item.url`, etc.
- **Aggregate results (after completion)** – optionally, the node can output an array of results from inner nodes, success/failure counts, or other aggregate data. You can use this on the main path that continues after the loop finishes.

(The exact variable names depend on your workspace's expression syntax; follow the in-editor hints/tooltips.)

## How it works (execution flow)

1. **Input** – the node receives an array via Data Source.
2. **Batching** – the array is split into batches based on Batch Size.
3. **For each batch** – items are executed either sequentially or concurrently, based on the mode/concurrency settings. Inner nodes (connected to the loop's inner branch) run once per item, using the current item context.
4. **Completion** – once all items are processed, the node optionally aggregates outputs, and the scenario then continues along the main path.

If the loop encounters an error for one item, the behavior (stop vs. continue) follows your scenario's error-handling configuration.

## Step-by-step usage

1. **Add the node** – drag Loop Over Items into your scenario from Logic → Loop Operations.
2. **Bind Data Source** – open the node settings and set Data Source to an array from a previous node (API response, DB list, etc.).
3. **Configure batching & mode** – set Batch Size (e.g., 10 for batches of 10 items); choose Sequential or Concurrent mode and, if available, set Concurrency (max parallel items) and Delay Between Batches (to throttle requests).
4. **Design the per-item branch** – from the loop's inner output, connect the nodes that should run for each item. Inside those nodes, use the current-item variables (e.g., to send an email to each user).
5. **Handle completion** – use the loop's main output after completion to log aggregate results or trigger follow-up actions once all items are done.
6. **Test with a small list** – start with a small array and low concurrency to verify behavior; increase batch size/concurrency gradually if you need more throughput.

## Best practices

- **Validate input** – ensure the bound data is an array; log it from previous nodes if unsure.
- **Respect rate limits** – keep batch size and concurrency conservative for third-party APIs.
- **Idempotency** – inner actions should handle retries gracefully (e.g., avoid double charging or double sending).
- **Monitoring** – store per-item success/failure info in a log table or sheet for observability.
- **Fail fast vs. tolerant** – decide whether an error on a single item should stop the whole loop or be handled per item.
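The execution flow above (split into batches, run each batch sequentially or concurrently, pause between batches) can be sketched in ordinary Python. This is an illustrative model of the node's semantics, not the platform's actual implementation; the function and parameter names are invented for the example.

```python
import time
from concurrent.futures import ThreadPoolExecutor


def loop_over_items(items, process, batch_size=1, concurrency=1, delay=0.0):
    """Illustrative sketch: split `items` into batches of `batch_size`,
    run `process` on each item (up to `concurrency` at a time within a
    batch), and wait `delay` seconds between batches to throttle calls."""
    results = []  # aggregate output, available after the loop completes
    batches = [items[i:i + batch_size]
               for i in range(0, len(items), batch_size)]
    for n, batch in enumerate(batches):
        if concurrency <= 1:
            # Sequential mode: one item at a time, in order.
            results.extend(process(item) for item in batch)
        else:
            # Concurrent mode: up to `concurrency` items in parallel;
            # pool.map preserves the input order of results.
            with ThreadPoolExecutor(max_workers=concurrency) as pool:
                results.extend(pool.map(process, batch))
        if delay and n < len(batches) - 1:
            time.sleep(delay)  # delay between batches (rate limiting)
    return results


# Example: process 5 user records in batches of 2, sequentially.
users = [{"id": i, "email": f"user{i}@example.com"} for i in range(5)]
processed = loop_over_items(users, lambda u: u["id"] * 10, batch_size=2)
print(processed)  # [0, 10, 20, 30, 40]
```

Note how sequential mode (`concurrency=1`) trades speed for predictable ordering and gentler load on downstream APIs, which is why it is the recommended starting point when testing with a small list.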