
Increase timeouts for some workspace YAML tests #4869

Closed
wants to merge 1 commit

Conversation

@imjasonh (Member)

I've seen these fail due to timeouts lately (example), and 60s seems like a very short timeout for a test that might be running in a cluster alongside lots of other tests.

This bumps the timeout from 60s -> 5m.
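
The diff itself isn't quoted in this thread, so the following is only a rough Go sketch of the kind of change being described: a per-test wait timeout raised from 60s to 5m. The helper name, package layout, and use of k8s.io/apimachinery's wait package are assumptions for illustration, not the actual Tekton test code.

```go
// Illustrative sketch only (not this PR's actual diff): raise a per-test wait
// timeout from 60s to 5m so the test tolerates a busy cluster that is running
// many other tests at the same time.
package examples_test

import (
	"context"
	"testing"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

// waitTimeout was previously 60 * time.Second.
const waitTimeout = 5 * time.Minute

// waitForRunToFinish is a hypothetical helper: poll once per second until the
// supplied condition reports done, failing the test after waitTimeout.
func waitForRunToFinish(ctx context.Context, t *testing.T, done func(context.Context) (bool, error)) {
	t.Helper()
	if err := wait.PollImmediate(time.Second, waitTimeout, func() (bool, error) {
		return done(ctx)
	}); err != nil {
		t.Fatalf("run did not finish within %v: %v", waitTimeout, err)
	}
}
```

With a helper like that, the change the PR describes amounts to a one-line constant bump.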

Changes

Submitter Checklist

As the author of this PR, please check off the items in this checklist:

  • Docs included if any changes are user facing
  • Tests included if any functionality added or changed
  • Follows the commit message standard
  • Meets the Tekton contributor standards (including
    functionality, content, code)
  • Release notes block below has been filled in
    (if there are no user facing changes, use release note "NONE")

Release Notes

NONE

@tekton-robot added the release-note-none label (Denotes a PR that doesn't merit a release note.) on May 13, 2022
@tekton-robot added the size/XS label (Denotes a PR that changes 0-9 lines, ignoring generated files.) on May 13, 2022
@abayer (Contributor) commented May 13, 2022

/lgtm

I'm not sure if this will make a huge difference - when I tried to debug/reproduce the workspace_in_sidecar failures in a kind cluster, I tried fiddling with the timeout, namely by setting it stupidly low to hopefully reproduce the failure. But while I eventually got a failure, it wasn't the one that shows up most frequently in CI. That said, this definitely won't hurt, and will probably help to at least some extent.

@tekton-robot added the lgtm label (Indicates that a PR is ready to be merged.) on May 13, 2022
@abayer (Contributor) commented May 13, 2022

I'm actually wondering if we should limit the parallelism of the YAML tests - I'm gonna try that in an experiment PR.
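
The thread doesn't say how that limit would be implemented. Since these are Go tests, one plausible shape is a counting semaphore layered on top of t.Parallel(); the sketch below is illustrative only, and none of the names come from the experiment PR.

```go
// Hypothetical sketch of capping YAML-test concurrency with a counting
// semaphore; the real approach in the experiment PR may differ.
package yaml_test

import "testing"

// yamlTestSlots limits how many YAML example tests run at once, independent
// of go test's own -parallel setting.
var yamlTestSlots = make(chan struct{}, 4)

func runYAMLExample(t *testing.T, name string, run func(t *testing.T)) {
	t.Run(name, func(t *testing.T) {
		t.Parallel()                // still eligible to run in parallel...
		yamlTestSlots <- struct{}{} // ...but at most 4 examples in flight
		defer func() { <-yamlTestSlots }()
		run(t)
	})
}
```

go test's own -parallel flag already caps how many t.Parallel() tests run simultaneously within a package, so an explicit semaphore like this mainly matters if certain tests are much heavier than the rest.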

@imjasonh (Member, Author)

/retest

@abayer (Contributor) commented May 14, 2022

/test pull-tekton-pipeline-alpha-integration-tests
/test pull-tekton-pipeline-integration-tests

I'm running the integration tests a few times to see how the relevant tests perform over multiple runs, similar to what I'm doing on #4841. I'm thinking the likely end result is a combination of increased timeouts (also for some non-YAML tests, namely the sidecar and kaniko ones; see #4863 for the latter) and an explicit parallelism limit. That won't solve all of our flakes - there will always be cases of proxy.golang.org or GitHub API burps, the GKE project hitting quotas, and whatever the root cause is for the sporadic runs that take 1.5+ hours and just barf out - but I believe the timeout and parallelism changes will drastically reduce how often particular tests flake out.

@abayer (Contributor) commented May 14, 2022

/test pull-pipeline-kind-k8s-v1-21-e2e

@abayer (Contributor) commented May 14, 2022

I’m also gonna run pull-pipeline-kind-k8s-v1-21-e2e a bunch on this PR, since that does give us a better sense of how much of the instability is GKE related…annoyingly I can’t run it effectively in my explicit parallelism PR because it doesn’t match the changed files filter!

EDIT: ooooh, actually, it looks like that does fire correctly when run via /test, but it doesn't seem to trigger automatically for any PR. Something else to investigate…

@abayer (Contributor) commented May 14, 2022

Bleh, the kind run times out waiting for installation of Pipeline to finish anyway.

@abayer (Contributor) commented May 14, 2022

/test pull-tekton-pipeline-alpha-integration-tests
/test pull-tekton-pipeline-integration-tests

2 similar comments

@tekton-robot (Collaborator)

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: vdemeester

The full list of commands accepted by this bot can be found here.

The pull request process is described here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@tekton-robot added the approved label (Indicates a PR has been approved by an approver from all required OWNERS files.) on May 17, 2022
@tekton-robot (Collaborator)

@imjasonh: The following test failed, say /retest to rerun them all:

Test name: pull-tekton-pipeline-integration-tests
Commit: abd584b
Details: link
Rerun command: /test pull-tekton-pipeline-integration-tests

Full PR test history. Your PR dashboard.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository. I understand the commands that are listed here.

1 similar comment

@imjasonh (Member, Author)

/close

@tekton-robot (Collaborator)

@imjasonh: Closed this PR.

In response to this:

/close

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository.
