[✅ Solution] How to proceed if Workflow Automations get Throttled?

I needed to run a flow that processed almost two thousand records and matched them with other client tracking records. In one branch of logic, this flow could also trigger additional flows that help confirm data accuracy.

Usually the flow only runs in small quantities (one or two records at a time), but in this instance we had just built it and needed to run it against 1,500+ existing records.

Not surprisingly, this caused the flow to get throttled.

My question is for clarification: do users need to be aware of or re-trigger anything when flows get throttled? Or will the executions simply be put into a queue and processed as time passes?

I’m sure I’ve seen documentation before on rate limits and when flows get throttled, but I couldn’t find it again today. What is the limit on automation runs per hour?

3 Likes

Hi @CarsonRedCliffLabs,

thanks for bringing this up and for laying out your questions in detail.

Throttling

First of all, it is worth noting that throttling currently applies to an entire organization, not to individual flows. This decision was made to prevent noisy-neighbor issues and to guarantee the behavior documented on our trust page.

The throttling will only last a couple of minutes and drastically reduces flow execution throughput to prevent excessive resource usage. There is nothing you need to do apart from being patient - no executions will be dropped or skipped; they will simply be queued and worked off over time. :timer_clock:

Documentation

I will add a dedicated throttling page to the developer docs’ workflow automation section and post it here.

Thanks for bringing this up! :pray:

Let me know if you have any follow-up questions or if something needs more detailed clarification.

Cheers
Tim

2 Likes

Hi @Tim, where can I find this? :point_up_2:
I need to run some flows for ~10k records.
I’ve been waiting for some time now, but it feels like way more than ‘a couple of minutes’…

Hi @Kollaborateur,

you will find information on workflow automation limitations and throttling here:

Cheers
Tim

Can someone please clarify what exactly is meant by ‘excessive action usage’ and ‘short period of time’?

How many actions are considered excessive, and within what specific timeframe?

I’m trying to find the ideal setup for regularly updating 20,000 records. Every configuration I’ve tried so far has resulted in throttling.

How many records can I safely update at once, and how long should I wait between batches to avoid being throttled?

Throttling per organization may be applied if an organization performs excessive action usage during a short period of time.

We once ran into a similar situation where we needed to update 10k records. The best solution we had at the time, and the way we would probably still do it today, was: create filtered views of around 200 records each, then create an automation triggered by date and time and run it against each saved view until all the records are updated. This way the automations keep running and the records slowly get updated without slowing down the user experience or throttling the rest of your automations :upside_down_face:
I think it also depends on whether you need to update just one field or multiple fields, and on the type of the fields.
If anyone else has a better solution I would love to hear it too.
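
To put rough numbers on that setup - a minimal back-of-the-envelope sketch, assuming the ~200 records per saved view mentioned above and a hypothetical 15-minute stagger between the date/time triggers (the stagger is an illustration, not something from the thread):

```javascript
// Rough planning math for the "filtered views + scheduled automations" approach.
const totalRecords = 10_000;        // records to update (from the post above)
const recordsPerView = 200;         // approximate size of each filtered view
const minutesBetweenTriggers = 15;  // assumed stagger between the date/time triggers

const views = Math.ceil(totalRecords / recordsPerView); // 50 saved views
const totalMinutes = views * minutesBetweenTriggers;    // 750 minutes
console.log(`${views} views, finished after ~${(totalMinutes / 60).toFixed(1)} hours`); // ~12.5 hours
```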

1 Like

Hi @Kollaborateur

Obviously you will need Leo to give you the official answer, but I would recommend the batch update if you are not already using it - it can help a lot.

As Tomaz alluded to, I believe each type of action has a different cost - for example, handling files costs more than just updating a status field, and larger files use more ‘credits’. (I should say here that I recently exported a lot of files from the other place and uploaded them to Tape, and never hit the Tape limits because I had to go soooo slow on the export.)

In this example I take 337 records and update them all in one automation (it takes 2 minutes). There are other ways to do it that would be ‘neater’ and could potentially push the updates through faster, but this is the least-code way I could think of to safely update records. If you have overnight to do your updates, you could run this automation every 15 minutes; it should process around 350 records per run, which works out to roughly 14 hours for 20,000 records. You may be able to push this up to 400 per run if you are short of time, or reduce it to 300 if you want to be safe and have, say, the whole weekend.
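
As a quick sanity check on that timing - a small sketch of the arithmetic, assuming roughly 350 records per 15-minute run as described (the numbers are only illustrative):

```javascript
// Throughput estimate for the scheduled batch-update automation described above.
const recordsToUpdate = 20_000; // the workload from the earlier question
const recordsPerRun = 350;      // roughly 7 batches of 50 per run
const minutesPerRun = 15;       // run the automation every 15 minutes

const runs = Math.ceil(recordsToUpdate / recordsPerRun); // 58 runs
const hours = (runs * minutesPerRun) / 60;               // ~14.5 hours
console.log(`${runs} runs -> about ${hours.toFixed(1)} hours overall`);
```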

The video shows it in ‘action’ - it’s quite boring:

https://share.cleanshot.com/Sfy3cQRq

  1. I search for records that need updating; we only need one record here, so the search is limited to 1.
  2. I use JSONata to count the collected records and make sure we have at least one (I use JSONata to count because it copes when the result is null).
  3. If the collection has a record, we continue by clearing the collection.
  4. I re-search for records that need updating; this time I limit it to 50 records, as that is what the batch update can handle.
  5. I extract the record IDs and build the inputs with my updates, before using tape.Record.batchUpdate to update the 50 records in one go.
  6. I then repeat steps 3-5 as many times as you dare :wink: I’ve done it 7 times in mine - a rough code-style sketch of the loop is below.
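
For anyone who prefers reading the loop as code, here is a minimal sketch of steps 1-6. It assumes a scripting-style action is available and that the batch-update helper looks roughly like the tape.Record.batchUpdate call named above - the helper names, signatures and field values here are assumptions for illustration, not the documented Tape API.

```javascript
// Sketch of the batch-update loop in steps 1-6 above (all names are illustrative).
// Assumed helpers: searchRecordsNeedingUpdate(limit) - the "search records" step -
// and tape.Record.batchUpdate(updates) as named in step 5.

async function runBatchUpdatePass() {
  // 1./2. Check that there is at least one record left to update.
  //       In the no-code flow this is a JSONata count such as: $count(collected_records)
  const probe = await searchRecordsNeedingUpdate(1);
  if (probe.length === 0) return; // nothing left to do

  // 3.-6. Work through the backlog in batches of 50, as many passes as you dare.
  const MAX_PASSES = 7; // 7 x 50 = up to 350 records per run
  for (let pass = 0; pass < MAX_PASSES; pass++) {
    // 4. Re-search, limited to 50 records - the batch-update limit mentioned above.
    const records = await searchRecordsNeedingUpdate(50);
    if (records.length === 0) break;

    // 5. Extract the record IDs and build the update inputs, then update the batch in one go.
    const updates = records.map((record) => ({
      recordId: record.id,           // assumed property name
      fields: { status: "Updated" }, // whatever field changes you need
    }));
    await tape.Record.batchUpdate(updates);
  }
}
```

Run on a 15-minute schedule, each pass handles up to 350 records, in line with the timings above.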
5 Likes