
Unable to find coverage xml file in current working directory #1278

Closed · nfelt14 opened this issue Feb 7, 2024 · 12 comments

@nfelt14 commented Feb 7, 2024

After upgrading to v4, the upload step consistently fails with this message:

```
debug - 2024-02-07 17:18:18,054 -- Running preparation plugin: <class 'codecov_cli.plugins.pycoverage.Pycoverage'>
info - 2024-02-07 17:18:18,054 -- Generating coverage.xml report in /home/runner/work/tm_devices/tm_devices
debug - 2024-02-07 17:18:18,146 -- Collecting relevant files
warning - 2024-02-07 17:18:18,156 -- Some files being explicitly added are found in the list of excluded files for upload. --- {"files": [".coverage_tests.xml"]}
warning - 2024-02-07 17:18:18,175 -- Some files were not found --- {"not_found_files": [".coverage_tests.xml"]}
info - 2024-02-07 17:18:18,193 -- Found 0 coverage files to upload
Error: No coverage reports found. Please make sure you're generating reports successfully.
```

See https://github.com/tektronix/tm_devices/actions/runs/7818613874/job/21329253422 for details.

What needs to be done for v4 of this action to find the XML file that does exist at that location?

This is currently blocking tektronix/tm_devices#140
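
For reference, the upload step in the failing workflow presumably looks something like the sketch below. This is a reconstruction from the log output above, not a copy of the actual workflow file; the secret name `CODECOV_TOKEN` and the exact option values are assumptions, while the input names (`files`, `fail_ci_if_error`, `verbose`) are the ones documented for codecov/codecov-action v4.

```yaml
# Hypothetical reconstruction of the failing upload step (not the real workflow file).
- name: Upload coverage to Codecov
  uses: codecov/codecov-action@v4
  with:
    token: ${{ secrets.CODECOV_TOKEN }}   # assumed secret name
    files: ./.coverage_tests.xml          # the report shown in the directory listing below
    fail_ci_if_error: true                # makes "No coverage reports found" fail the job
    verbose: true                         # produces the debug/info/warning lines quoted above
```

Passing the report explicitly via `files` is what triggers the "Some files being explicitly added are found in the list of excluded files" warning above, so simply re-listing the file does not work around the problem.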

@nfelt14 (Author) commented Feb 7, 2024

This is what the contents of the working directory are:

```
ls -la
total 10920
drwxr-xr-x 13 runner docker     4096 Feb  7 17:18 .
drwxr-xr-x  3 runner docker     4096 Feb  7 17:16 ..
-rw-r--r--  1 runner docker 10817536 Feb  7 17:18 .coverage
-rw-r--r--  1 runner docker   210820 Feb  7 17:18 .coverage_tests.xml
drwxr-xr-x  8 runner docker     4096 Feb  7 17:16 .git
-rw-r--r--  1 runner docker      459 Feb  7 17:16 .gitattributes
drwxr-xr-x  4 runner docker     4096 Feb  7 17:16 .github
-rw-r--r--  1 runner docker     1288 Feb  7 17:16 .gitignore
-rw-r--r--  1 runner docker     4111 Feb  7 17:16 .pre-commit-config.yaml
-rw-r--r--  1 runner docker      454 Feb  7 17:16 .readthedocs.yml
drwxr-xr-x  3 runner docker     4096 Feb  7 17:18 .results_tests
drwxr-xr-x  3 runner docker     4096 Feb  7 17:17 .ruff_cache
drwxr-xr-x  6 runner docker     4096 Feb  7 17:17 .tox
-rw-r--r--  1 runner docker    15334 Feb  7 17:16 CHANGELOG.md
-rw-r--r--  1 runner docker     3276 Feb  7 17:16 CODE_OF_CONDUCT.md
-rw-r--r--  1 runner docker     6503 Feb  7 17:16 CONTRIBUTING.md
-rw-r--r--  1 runner docker    10126 Feb  7 17:16 LICENSE.md
-rw-r--r--  1 runner docker    11485 Feb  7 17:16 README.rst
-rw-r--r--  1 runner docker      320 Feb  7 17:16 SECURITY.md
-rw-r--r--  1 runner docker      164 Feb  7 17:16 codecov.yml
drwxr-xr-x  7 runner docker     4096 Feb  7 17:16 docs
drwxr-xr-x  6 runner docker     4096 Feb  7 17:16 examples
-rw-r--r--  1 runner docker    13897 Feb  7 17:16 pyproject.toml
drwxr-xr-x  2 runner docker     4096 Feb  7 17:16 python_semantic_release_templates
drwxr-xr-x  2 runner docker     4096 Feb  7 17:16 scripts
drwxr-xr-x  3 runner docker     4096 Feb  7 17:16 src
drwxr-xr-x  5 runner docker     4096 Feb  7 17:17 tests
```

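For completeness, the two coverage artifacts in that listing (`.coverage` and `.coverage_tests.xml`) are the usual coverage.py outputs. A test step along these lines would produce them; this is only a sketch of the general pattern, since the project appears to drive its tests through tox (note the `.tox` directory), so the real commands differ:

```yaml
# Sketch of a test step that produces the two artifacts listed above:
#   .coverage            -- coverage.py's data file
#   .coverage_tests.xml  -- the Cobertura XML report the upload step should find
- name: Run tests with coverage
  run: |
    python -m pip install pytest coverage
    coverage run -m pytest tests
    coverage xml -o .coverage_tests.xml
```
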
@dokempf commented Feb 8, 2024

I do have the very same issue and I can also confirm that this is a regression compared to v3.

@rohan-at-sentry (Contributor)
Hi @nfelt14, thanks for raising this. From the error message, it looks like some of the files explicitly specified for upload fall in an excluded path, so they are skipped. This is indeed a regression, and it is related to what others have reported (see codecov/feedback#265).

We're actively working on a fix for this; you can follow along at codecov/engineering-team#1143.

@thomasrockhu-codecov (Contributor)
@nfelt14 @dokempf I made a change that might fix your issue. Would you be able to try again and see if it's working now?

@nfelt14 (Author) commented Feb 27, 2024

> @nfelt14 @dokempf I made a change that might fix your issue. Would you be able to try again and see if it's working now?

@thomasrockhu-codecov
Is there a specific version of the action I should use? Re-running the jobs as-is didn't make a difference.
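
In the meantime, the usual way to pick up a specific build of the action (rather than whatever the floating v4 tag currently points at) is to pin the `uses:` reference to a release tag or a commit SHA. The tag and SHA below are placeholders only, not pointers to a release that contains the fix:

```yaml
# Placeholder refs, shown only to illustrate pinning; substitute a real release.
- uses: codecov/codecov-action@v4.x.y
# or pin to an exact commit:
- uses: codecov/codecov-action@0123456789abcdef0123456789abcdef01234567
```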

@thomasrockhu-codecov removed their assignment on Mar 14, 2024
@nfelt14 (Author) commented Mar 14, 2024

@rohan-at-sentry, is there any update on this? I saw the linked issue was fixed, but this one does not appear to be.

@rohan-at-sentry (Contributor)
@nfelt14 - We merged the fix for the linked issue two days ago. We haven't triggered a release yet, which is probably why your runs still fail (I noticed v0.4.8 as the CLI version used in a failing run on your repo from 30 minutes ago).

I'll report back here when we've released the CLI.

@nfelt14 (Author) commented Mar 17, 2024

> @nfelt14 - We merged the fix for the linked issue two days ago. We haven't triggered a release yet, which is probably why your runs still fail (I noticed v0.4.8 as the CLI version used in a failing run on your repo from 30 minutes ago).
>
> I'll report back here when we've released the CLI.

@rohan-at-sentry, any update on when a new release will be made?

@nfelt14 (Author) commented Mar 21, 2024

> @nfelt14 - We merged the fix for the linked issue two days ago. We haven't triggered a release yet, which is probably why your runs still fail (I noticed v0.4.8 as the CLI version used in a failing run on your repo from 30 minutes ago).
>
> I'll report back here when we've released the CLI.
>
> @rohan-at-sentry, any update on when a new release will be made?

@rohan-at-sentry any update on a timeline for a release that will fix this issue?

@rohan-at-sentry (Contributor)
@nfelt14 We're aiming for Thursday next week (28th) ... I'll update here when it's done

@rohan-at-sentry (Contributor)
@nfelt14 Can you please try again? The CLI has a new minor version that should help with this issue; no upgrade of the GH action should be necessary.
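
For anyone landing here later: the v4 action resolves the codecov CLI at run time, which is why a CLI release can fix this without upgrading the action itself. If you would rather control when that happens, the v4 action appears to expose a `version` input for pinning the CLI; treat the input name and the version string below as things to verify against the action's README rather than a confirmed recipe:

```yaml
- uses: codecov/codecov-action@v4
  with:
    token: ${{ secrets.CODECOV_TOKEN }}   # assumed secret name
    files: ./.coverage_tests.xml
    # Pin the codecov CLI instead of tracking the latest release.
    # Verify the input name against the v4 README; v0.x.y is a placeholder,
    # not the release that contains the fix.
    version: v0.x.y
```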

@nfelt14 (Author) commented Mar 28, 2024

> @nfelt14 Can you please try again? The CLI has a new minor version that should help with this issue; no upgrade of the GH action should be necessary.

I already tried it and it works as expected. Thanks!
