
PipelineResource for S3 buckets (for IBM COS, AWS S3, Minio ...) #323

Closed
bkuschel opened this issue Dec 11, 2018 · 11 comments
Labels
lifecycle/rotten Denotes an issue or PR that has aged beyond stale and will be auto-closed. meaty-juicy-coding-work This task is mostly about implementation!!! And docs and tests of course but that's a given

Comments

@bkuschel

Expected Behavior

Ability to use S3 for object storage as PipelineResources types

Actual Behavior

#321 exists for GCS but does not support S3

Additional Info

In multi-zone clusters, PVCs can be difficult to use, and it is sometimes preferred to transfer artifacts between tasks in a pipeline using other types of storage.

@bkuschel bkuschel changed the title PipelineResource for S3 buckets (for IBM COS, AWS S3) PipelineResource for S3 buckets (for IBM COS, AWS S3, Minio ...) Dec 11, 2018
@mustafaakin

mustafaakin commented Jan 19, 2019

I would like to contribute to this issue since we are on AWS+EKS. Considering https://github.com/knative/build-pipeline/blob/master/pkg/apis/pipeline/v1alpha1/gcs_resource.go it seems straightforward, but any pointers would be nice. I am following "Finding something to work on" in CONTRIBUTING.md.

@bobcatfish
Collaborator

Sounds good @mustafaakin ! I'm excited that you want to contribute :D!

(btw if you haven't already plz feel free to join us in slack at #build-pipeline !)

@bobcatfish
Collaborator

I think to properly support this we should expand our end-to-end tests to cover S3 as well (not to mention other clouds in general!), but there's probably a bit of work to do there. So for now I think we should add this functionality, initially without end-to-end test coverage, and create a separate issue around setting up infrastructure for end-to-end tests against S3.

(Any other thoughts @shashwathi @pivotal-nader-ziada @imjasonh @tejal29 ?)

@bobcatfish bobcatfish added the meaty-juicy-coding-work This task is mostly about implementation!!! And docs and tests of course but that's a given label Jan 22, 2019
@bobcatfish
Collaborator

initially not cover it with end to end tests

Or maybe a better idea: add an end to end test that is skipped by default, which folks can run manually?

@jstrachan

btw this Go library is a great way to work with all the different cloud blob storage providers (GCS, S3, Azure, etc.): https://github.com/google/go-cloud, via https://github.com/google/go-cloud/tree/master/blob, using a simple URL scheme
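To make the URL-scheme idea concrete, here is a minimal Go sketch of dispatching a storage location to a backend based on its scheme. It uses only the standard library; in practice `gocloud.dev/blob` does this dispatch itself via `blob.OpenBucket(ctx, "s3://bucket")` once the relevant driver (e.g. `gocloud.dev/blob/s3blob`) is imported. The function name here is hypothetical, not part of any existing resource:

```go
package main

import (
	"fmt"
	"net/url"
)

// backendForLocation inspects the scheme of a storage location URL and
// reports which backend would handle it. go-cloud's blob package uses
// the same convention: the URL scheme selects the registered driver.
func backendForLocation(location string) (string, error) {
	u, err := url.Parse(location)
	if err != nil {
		return "", err
	}
	switch u.Scheme {
	case "gs":
		return "gcs", nil
	case "s3":
		return "s3", nil
	case "azblob":
		return "azure", nil
	default:
		return "", fmt.Errorf("unsupported storage scheme %q", u.Scheme)
	}
}

func main() {
	for _, loc := range []string{"gs://artifacts/build", "s3://artifacts/build"} {
		backend, err := backendForLocation(loc)
		if err != nil {
			panic(err)
		}
		fmt.Printf("%s -> %s\n", loc, backend)
	}
}
```

With this convention a single storage resource would not need a separate type per cloud; the `location` parameter alone identifies the backend.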

@mustafaakin

Sorry for the late reply, assuming we use go-cloud, what would be the way to go? Right now there is GCS Storage resource and it would be a duplicate.

@dlorenc
Contributor

dlorenc commented Apr 23, 2019

What do you think about keeping a single StorageResource, but expanding it to support multiple backing stores? I think that's in-line with what we were thinking for #778 as well.
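A single StorageResource parameterized by backing store might look something like the sketch below. The shape mirrors the existing GCS storage resource, but the `s3` type value and the secret field names are illustrative assumptions, not a settled API:

```yaml
apiVersion: tekton.dev/v1alpha1
kind: PipelineResource
metadata:
  name: build-artifacts
spec:
  type: storage
  params:
    # "type" selects the backing store; gcs exists today,
    # s3 would be the new backend proposed in this issue.
    - name: type
      value: s3
    # go-cloud-style URL: the scheme doubles as a backend hint.
    - name: location
      value: s3://my-artifact-bucket/builds
  secrets:
    # Hypothetical: credentials supplied from a k8s Secret, mirroring
    # how the GCS resource wires in service account credentials.
    - fieldName: AWS_SECRET_ACCESS_KEY
      secretName: s3-credentials
      secretKey: secretKey
```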

@bobcatfish
Collaborator

That does sound like a good idea @dlorenc - and maybe in that case it would make sense to try out go-cloud like @mustafaakin is suggesting? (I think I was initially opposed b/c I assumed there would be too many differences, but now I'm thinking I was wrong)

@iancoffey
Member

As chance would have it, I had made a step and a small task/utility using go-cloud for S3 artifact diddling, so I reworked it into a Storage resource type-> #1258

hrishin pushed a commit to hrishin/tekton-pipeline that referenced this issue Feb 18, 2020
…ease-version

Reflect Tekton release version in Artifacts after release
@tekton-robot
Collaborator

Stale issues rot after 30d of inactivity.
Mark the issue as fresh with /remove-lifecycle rotten.
Rotten issues close after an additional 30d of inactivity.
If this issue is safe to close now please do so with /close.

/lifecycle rotten

Send feedback to tektoncd/plumbing.

@tekton-robot tekton-robot added the lifecycle/rotten Denotes an issue or PR that has aged beyond stale and will be auto-closed. label Aug 12, 2020
@bobcatfish
Copy link
Collaborator

Two possible ways to handle this:

  • Make it possible for workspaces to be backed by something other than k8s types (e.g. GCS, S3)
  • Create Tasks for this and consider it done

There is also the PipelineResource redesign in #1673

Given all of that it feels reasonable to me to close this for now.
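For reference, the "create Tasks for this" option above could look something like the following sketch: a plain Task that uploads a workspace's contents with the AWS CLI. The task name, image choice, and bucket parameter are placeholders for illustration:

```yaml
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: s3-upload
spec:
  workspaces:
    # Artifacts to upload are shared via a workspace
    # instead of a PipelineResource.
    - name: source
  params:
    - name: bucket
      type: string
      description: Destination, e.g. s3://my-bucket/path
  steps:
    - name: upload
      image: amazon/aws-cli  # any image with the aws CLI would do
      workingDir: $(workspaces.source.path)
      script: |
        aws s3 cp --recursive . "$(params.bucket)"
```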
