
Conversation


@oerp-odoo oerp-odoo commented Jan 28, 2024

When a job fails because of a concurrent update error, it does not respect the max retries set by the job. The problem is that the `perform` method logic that handles retries is never called, because `runjob` in the controller that triggers jobs catches the expected exception and silences it (this is done to avoid polluting the logs).

So for now, this adds an extra check before the job is run, to make sure max retries are enforced once the limit is reached.
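The pre-run guard described above can be sketched roughly as follows. This is a self-contained illustration, not the actual addon code: the `Job` class, the `retry`/`max_retries` attribute names, and `FailedJobError` are modeled on OCA queue_job's API but simplified here for the demo.

```python
# Minimal sketch of the proposed pre-run guard. Assumed names (`retry`,
# `max_retries`, `FailedJobError`) mirror queue_job's Job API; this is an
# illustration, not the real implementation.

class FailedJobError(Exception):
    """Raised when a job must be marked as failed."""


class Job:
    def __init__(self, max_retries=5):
        self.retry = 0              # retries already consumed
        self.max_retries = max_retries
        self.state = "pending"


def check_max_retries(job):
    """Fail the job up front if it has already exhausted its retries.

    Without such a guard, the controller silences RetryableJobError
    before perform()'s own retry accounting can run, so the job keeps
    retrying past its limit.
    """
    if job.max_retries and job.retry >= job.max_retries:
        job.state = "failed"
        raise FailedJobError(f"Max. retries ({job.max_retries}) reached")


job = Job(max_retries=3)
job.retry = 3                       # simulate a job that already retried 3 times
try:
    check_max_retries(job)
except FailedJobError as err:
    print(job.state, "-", err)      # prints: failed - Max. retries (3) reached
```

The point of running the check *before* the job is performed is that it does not depend on the silenced exception ever propagating up.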

Some context:

It looks like the code that is supposed to handle max retries is never called. But I am not sure what the right way would be to propagate the exception, since there is logic here

    except RetryableJobError as err:

that explicitly does not want to re-raise that exception.
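A stripped-down sketch of the silencing pattern described above, to show why the retry counter is never checked. All names here (`perform`, `runjob`, the artificial cap) are hypothetical stand-ins; the real controller is queue_job's `runjob` and is not reproduced here.

```python
# Hypothetical illustration of the problem: the runner swallows
# RetryableJobError, so nothing downstream ever compares the retry
# count against max_retries.

class RetryableJobError(Exception):
    """Signals that the job should be retried later."""


def perform():
    # The job hits a concurrent-update error and asks to be retried.
    raise RetryableJobError("could not serialize access")


def runjob():
    attempts = 0
    while True:
        attempts += 1
        try:
            perform()
        except RetryableJobError:
            # The controller swallows the exception to keep the logs
            # clean -- but that also means no max-retries check ever
            # fires, so the loop would spin forever.
            if attempts >= 5:   # artificial stop, for this demo only
                break
    return attempts


print(runjob())  # retries until the artificial cap, never "max retries"
```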

Not having max retries enforced can be very problematic if your jobs hit many concurrent updates. I had an issue where the same job record (yes, the job record itself, not some other record the job would update) was being updated by two job runners at the same time, so it would always fail and retry. It reached over 400 retries, and the only way to stop it was to restart Odoo.

For example, without this fix we can end up in situation like this:

Selection_1050

When a job fails because of a concurrent update error, it does not
respect the max retries set by the job. The problem is that the
``perform`` method logic that handles retries is never called, because
``runjob`` in the controller that triggers jobs catches the expected
exception and silences it (done to avoid polluting the logs).

So for now, add an extra check before the job is run, to make sure
max retries are enforced once the limit is reached.
@OCA-git-bot
Contributor

Hi @guewen,
some modules you are maintaining are being modified, check this out!

@github-actions

github-actions bot commented Jun 2, 2024

There hasn't been any activity on this pull request in the past 4 months, so it has been marked as stale and it will be closed automatically if no further activity occurs in the next 30 days.
If you want this PR to never become stale, please ask a PSC member to apply the "no stale" label.

@github-actions github-actions bot added the stale PR/Issue without recent activity, it'll be soon closed automatically. label Jun 2, 2024
@oerp-odoo
Copy link
Author

@guewen can you check this?

@amh-mw amh-mw (Member) left a comment


Seems reasonable, but broad enough in scope that it should be covered by unit tests?

@github-actions github-actions bot removed the stale PR/Issue without recent activity, it'll be soon closed automatically. label Jun 9, 2024
@github-actions

There hasn't been any activity on this pull request in the past 4 months, so it has been marked as stale and it will be closed automatically if no further activity occurs in the next 30 days.
If you want this PR to never become stale, please ask a PSC member to apply the "no stale" label.

@github-actions github-actions bot added the stale PR/Issue without recent activity, it'll be soon closed automatically. label Oct 13, 2024
@github-actions github-actions bot closed this Nov 17, 2024
job.store()
env.cr.commit()

# FIXME: this exception never triggers up, so it never reaches
Contributor


reason?

Author


I don't remember the exact scenario in which this would happen, but the job would just keep running and retrying far past max retries if, say, a concurrent update error kept occurring over and over again.

@AungKoKoLin1997

I found this issue happening in one of our customer environments.
@oerp-odoo Can you address the comments?
I may work on this and supersede your PR if you don't have time to follow it up.

@simahawk Can you please reopen the PR?

@rvalyi rvalyi reopened this Dec 1, 2025
@OCA-git-bot
Contributor

Hi @guewen,
some modules you are maintaining are being modified, check this out!

@github-actions github-actions bot removed the stale PR/Issue without recent activity, it'll be soon closed automatically. label Dec 7, 2025