Jobs causing memory overflow do not consider $maxExceptions (and are retried endlessly if $tries=0) #58207

Laravel Version

12.40.1

PHP Version

8.4.15

Database Driver & Version

No response

Description

A job that causes a memory overflow will be retried endlessly if $tries=0, despite having $maxExceptions=1.
We also tested jobs that time out; there, everything works as expected.

We'd expect a job that dies because of a memory overflow to fail on the second pickup by the worker, because $maxExceptions is set to 1 (and I'd argue that a memory overflow constitutes an exception, or is our assumption wrong here?).

This happens only for the redis and database drivers; SQS, for example, has native protection (as per the comment below).

Steps To Reproduce

Create a job with:

  • timeout=10
  • tries=0
  • maxExceptions=1

Set 'retry_after' to something low, so you don't have to wait endlessly until the workers pick up the job again (see the config sketch below).
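For reference, a minimal sketch of the relevant connection entry in config/queue.php, assuming the redis driver (the 15-second value is just an illustrative choice; the other keys mirror the framework defaults):

// config/queue.php
'connections' => [
    'redis' => [
        'driver' => 'redis',
        'connection' => 'default',
        'queue' => env('REDIS_QUEUE', 'default'),
        'retry_after' => 15, // low value so the worker re-picks the job quickly
        'block_for' => null,
    ],
],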

The handle() method of the job should look like this:

ini_set('memory_limit', '72M'); // force a low limit so memory runs out quickly

$leak = [];
$ticks = 0;
while (true) {
    $leak[$ticks] = file_get_contents(__FILE__);
    $ticks++;
    // unset() targets the incremented index, which was never set,
    // so the array keeps growing until memory is exhausted
    unset($leak[$ticks]);
}

In our stack, the initial memory used by a queue worker is approximately 70 MB. You may need to adjust the memory limit to something closer to your worker's initial memory usage.
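Putting it together, the full job could look like this sketch (the class name is hypothetical; only the three properties and the handle() body come from the steps above):

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;

class MemoryLeakJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    public $timeout = 10;      // seconds before the worker kills the job
    public $tries = 0;         // 0 = retry indefinitely
    public $maxExceptions = 1; // expected: fail the job after one exception

    public function handle(): void
    {
        ini_set('memory_limit', '72M');

        $leak = [];
        $ticks = 0;
        while (true) {
            $leak[$ticks] = file_get_contents(__FILE__);
            $ticks++;
            unset($leak[$ticks]); // never-set index; the array keeps growing
        }
    }
}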

If you dispatch the job and watch what Horizon does, you will see that it will:

  • run the job
  • report a PHP fatal error: 'Allowed memory size of xxxx bytes exhausted'
  • retry the same job after the 'retry_after' period and run into the same issue again, forever
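For completeness, dispatching could look like this, assuming the hypothetical class above:

use App\Jobs\MemoryLeakJob;

MemoryLeakJob::dispatch(); // then observe the worker, e.g. via: php artisan horizon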
