
Error: Error running command 'completed_apply=0 for i in seq 1 10; do \ echo "apiVersion: v1 preferences: {} kind: Config #833

Closed
keremceliker opened this issue Apr 15, 2020 · 2 comments · Fixed by #927

Comments


keremceliker commented Apr 15, 2020

I have issues

I'm submitting a...

  • [x] bug report
  • [ ] feature request
  • [x] support request - read the FAQ first!
  • [x] kudos, thank you, warm fuzzy

What is the current behavior?

module.eks-cluster.module.eks.aws_autoscaling_group.workers[0]: Creation complete after 2m47s [id=EKSK8s-worker-group-12020041519025643350000000e]

Error: Error running command 'completed_apply=0
for i in `seq 1 10`; do \
echo "apiVersion: v1
preferences: {}
kind: Config

clusters:
- cluster:
    server: https://.....gr7.us-west-2.eks.amazonaws.com
    certificate-authority-data: ....
  name: eks_EKSK8s

contexts:
- context:
    cluster: eks_EKSK8s
    user: eks_EKSK8s
  name: eks_EKSK8s

current-context: eks_EKSK8s

users:
- name: eks_EKSK8s
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1alpha1
      command: aws-iam-authenticator
      args:
        - "token"
        - "-i"
        - "EKSK8s"


" > kube_config.yaml && \
echo "apiVersion: v1
kind: ConfigMap
metadata:
  name: aws-auth
  namespace: kube-system
data:
  mapRoles: |
    - rolearn: arn:aws:iam::855030332868:role/EKSK8s20200415190242620100000005
      username: system:node:{{EC2PrivateDNSName}}
      groups:
        - system:bootstrappers
        - system:nodes




" > aws_auth_configmap.yaml && \
kubectl apply -f aws_auth_configmap.yaml --kubeconfig kube_config.yaml && \
completed_apply=1 && break || \
sleep 10; \
done; \
rm aws_auth_configmap.yaml kube_config.yaml;
if [ "$completed_apply" = "0" ]; then exit 1; fi;
': exec: "/bin/sh": file does not exist. Output:

Environment details

  • Affected module version:
  • OS: Windows 10
  • Git: Version 2.26.1.windows.1
  • Terraform version:

Terraform v0.12.24

  • provider.aws v2.57.0
  • provider.local v1.4.0
  • provider.null v2.1.2
  • provider.random v2.2.1
  • provider.template v2.1.2

kubectl version => Client Version: version.Info{Major:"1", Minor:"15+", GitVersion:"v1.15.10-eks-bac369", GitCommit:"bac3690554985327ae4d13e42169e8b1c2f37226", GitTreeState:"clean", BuildDate:"2020-02-21T23:37:18Z", GoVersion:"go1.12.12", Compiler:"gc", Platform:"windows/amd64"}

Unable to connect to the server: dial tcp [::1]:8080: connectex: No connection could be made because the target machine actively refused it.

Any idea how to fix this bug or issue?

@dpiddockcmp
Contributor

The default configuration of the module is designed for Unix-like environments such as macOS and Linux. You are trying to run it under Windows, which does not have a /bin/sh interpreter.

You need to set the correct values for wait_for_cluster_interpreter and possibly wait_for_cluster_cmd. I do not have access to a Windows machine for testing, but #795 (comment) suggests a working configuration:

  wait_for_cluster_interpreter = ["c:/git/bin/sh.exe", "-c"]
  wait_for_cluster_cmd         = "until curl -sk $ENDPOINT >/dev/null; do sleep 4; done"
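For context, these two variables are passed into the module block in your root configuration. A minimal sketch, assuming the terraform-aws-eks registry module and a Git for Windows install at its default location (the module source, cluster settings, and the sh.exe path are illustrative and depend on your setup):

  module "eks" {
    source = "terraform-aws-modules/eks/aws"  # adjust source/version to your setup

    cluster_name = "EKSK8s"
    # ... other cluster settings ...

    # Override the default /bin/sh interpreter, which does not exist on Windows.
    # Point this at wherever Git for Windows installed sh.exe.
    wait_for_cluster_interpreter = ["c:/git/bin/sh.exe", "-c"]
    wait_for_cluster_cmd         = "until curl -sk $ENDPOINT >/dev/null; do sleep 4; done"
  }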

@github-actions

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues. If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

@github-actions github-actions bot locked as resolved and limited conversation to collaborators Nov 25, 2022