successfulJobsHistoryLimit Applies Namespace-Wide Instead of Per Schedule · Issue #1053 · k8up-io/k8up · GitHub


Open
FieldofClay opened this issue Mar 30, 2025 · 0 comments
Labels
bug Something isn't working

FieldofClay commented Mar 30, 2025

Description

It appears that successfulJobsHistoryLimit is applied across all Schedules within the same namespace, rather than being scoped to the Backups created by each individual Schedule.

Additional Context

  • When creating two Schedules in separate namespaces, the expected behavior occurs (each Schedule retains its own backups properly).
  • failedJobsHistoryLimit may also be affected by this issue.

Logs

Expected Behavior

Each Schedule should retain up to successfulJobsHistoryLimit backups independently, meaning we should see four backups in total (two per Schedule) from the steps listed below.

Steps To Reproduce

  1. Create two backup Schedules within the same namespace, each selecting different pods via label selectors.
  2. Set successfulJobsHistoryLimit to 2 on both Schedules.
  3. Observe that only the two most recent backups are retained across both Schedules combined, rather than two per Schedule (four in total).
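The reproduction above can be sketched as two manifests in one namespace. This is a hypothetical example, not taken from the reporter's cluster: the namespace, names, backend settings, cron expressions, and label values are assumptions, and the exact shape of the label-selector field may vary by K8up version.

```yaml
# Hypothetical reproduction sketch, assuming the k8up.io/v1 Schedule API.
# Both Schedules live in the same namespace and set
# successfulJobsHistoryLimit: 2, so the expected outcome is
# two retained backups per Schedule (four total).
apiVersion: k8up.io/v1
kind: Schedule
metadata:
  name: schedule-a        # assumed name
  namespace: demo         # assumed namespace (same for both)
spec:
  successfulJobsHistoryLimit: 2
  backup:
    schedule: "*/15 * * * *"   # assumed cron expression
  backend:
    repoPasswordSecretRef:     # assumed secret
      name: backup-repo
      key: password
    s3:
      endpoint: https://s3.example.com   # assumed backend
      bucket: backups-a
---
apiVersion: k8up.io/v1
kind: Schedule
metadata:
  name: schedule-b        # second Schedule in the SAME namespace
  namespace: demo
spec:
  successfulJobsHistoryLimit: 2
  backup:
    schedule: "*/15 * * * *"
  backend:
    repoPasswordSecretRef:
      name: backup-repo
      key: password
    s3:
      endpoint: https://s3.example.com
      bucket: backups-b
```

Per the report, with manifests like these the two limits appear to be enforced jointly across the namespace instead of independently per Schedule.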

Version of K8up

v2.12.0

Version of Kubernetes

v1.31

Distribution of Kubernetes

k3s

@FieldofClay FieldofClay added the bug Something isn't working label Mar 30, 2025