Data source trying to read version when passed as a variable #3262

Open
marcosmartinezfco opened this issue Dec 30, 2024 · 0 comments
Description

╷
│ Error: reading EKS Add-On version info (ebs-csi-driver, 1.31): empty result
│ 
│   with module.eks_base.module.eks_base.data.aws_eks_addon_version.this["ebs-csi-driver"],
│   on .terraform/modules/eks_base.eks_base/main.tf line 729, in data "aws_eks_addon_version" "this":
│  729: data "aws_eks_addon_version" "this" {
│ 
╵

I'm experiencing the same issue as #2855, but I haven't managed to solve it (I assume I'm doing something wrong).
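I had a look at the module source the error points at. As far as I can tell it is roughly the following (paraphrased from main.tf around line 729, so the exact field names and filters may be slightly off). Note that addon_name falls back to the map key, so my ebs-csi-driver key is looked up verbatim against the EKS add-on catalog:

data "aws_eks_addon_version" "this" {
  for_each = { for k, v in var.cluster_addons : k => v if local.create }

  # The add-on name defaults to the cluster_addons map key
  addon_name         = try(each.value.name, each.key)
  kubernetes_version = coalesce(var.cluster_version, aws_eks_cluster.this[0].version)
  most_recent        = try(each.value.most_recent, null)
}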

Versions

  • Module version [Required]: 20.31.6

  • Terraform version: 1.9.5

  • Provider version(s): v5.82.2

Reproduction Code [Required]

module "eks_base" {
  source  = "terraform-aws-modules/eks/aws"
  version = "~> 20.0"

  cluster_name    = local.name
  cluster_version = var.cluster_version

  # Gives Terraform identity admin access to cluster which will
  # allow deploying resources (Karpenter) into the cluster
  # if not set to true, we would need a vpn connection to the cluster
  enable_cluster_creator_admin_permissions = true
  cluster_endpoint_public_access           = true

  cluster_addons = {
    coredns                = {}
    eks-pod-identity-agent = {}
    kube-proxy             = {}
    vpc-cni                = {}
    ebs-csi-driver = {
      version                  = "v1.38.1-eksbuild.1"
      service_account_role_arn = module.ebs_csi_driver_irsa.iam_role_arn
    }
  }

  vpc_id                   = module.eks_vpc.vpc_id
  subnet_ids               = module.eks_vpc.private_subnets
  control_plane_subnet_ids = module.eks_vpc.intra_subnets

  eks_managed_node_groups = {
    karpenter = {
      ami_type       = "AL2_ARM_64"
      instance_types = ["m6g.large"]

      min_size     = local.min_size_karpenter
      max_size     = local.max_size_karpenter
      desired_size = local.desired_size_karpenter

      labels = {
        # Used to ensure Karpenter runs on nodes that it does not manage
        "karpenter.sh/controller" = "true"
      }

      taints = {
        # This Taint aims to keep just EKS Addons and Karpenter running on this MNG
        # The pods that do not tolerate this taint should run on nodes created by Karpenter
        addons = {
          key    = "CriticalAddonsOnly"
          value  = "true"
          effect = "NO_SCHEDULE"
        },
      }
    }
  }

  access_entries = merge({
    AdministratorAccess_role_sandbox_account = {
      principal_arn = "arn:aws:iam::552564660527:role/aws-reserved/sso.amazonaws.com/AWSReservedSSO_AdministratorAccess_176be10a926531d3"
      policy_associations = {
        AmazonEKSClusterAdminPolicy = {
          policy_arn = "arn:aws:eks::aws:cluster-access-policy/AmazonEKSClusterAdminPolicy"
          access_scope = {
            type = "cluster"
          }
        }
      }
    }
    AdministratorAccess_role_production_account = {
      principal_arn = "arn:aws:iam::975050190559:role/aws-reserved/sso.amazonaws.com/AWSReservedSSO_AdministratorAccess_e310aa4016ad3b28"
      policy_associations = {
        AmazonEKSClusterAdminPolicy = {
          policy_arn = "arn:aws:eks::aws:cluster-access-policy/AmazonEKSClusterAdminPolicy"
          access_scope = {
            type = "cluster"
          }
        }
      }
    }
    GitHubActions_role = {
      principal_arn = "arn:aws:iam::${data.aws_caller_identity.current.account_id}:role/GithubActionsSP_ECR_and_Lambdas"
      policy_associations = {
        AmazonEKSClusterAdminPolicy = {
          policy_arn = "arn:aws:eks::aws:cluster-access-policy/AmazonEKSClusterAdminPolicy"
          access_scope = {
            type = "cluster"
          }
        }
      }
    }
  }, var.access_entries_roles)

  node_security_group_tags = {
    # NOTE - if creating multiple security groups with this module, only tag the
    # security group that Karpenter should utilize with the following tag
    # (i.e. - at most, only one security group should have this tag in your account)
    "karpenter.sh/discovery" = local.name
  }
}

Steps to reproduce the behavior:

Run terraform plan or terraform apply with the configuration above.

Expected behavior

I'd expect the cluster to be created without issues, with the ebs-csi-driver add-on installed at the pinned version.

Actual behavior

Instead, the aws_eks_addon_version data source apparently still tries to look up the add-on version even though one is pinned, and fails with the empty result error above.
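My best guess at what I'm doing wrong (unverified, going by the module docs): the add-on schema takes addon_version rather than version, and the EBS CSI add-on is published under the name aws-ebs-csi-driver, so my ebs-csi-driver key makes the data source query an add-on that doesn't exist. Something along these lines might avoid the empty result:

  cluster_addons = {
    coredns                = {}
    eks-pod-identity-agent = {}
    kube-proxy             = {}
    vpc-cni                = {}
    # Key must match the published EKS add-on name (or set name explicitly);
    # the pinned version goes in addon_version, not version
    aws-ebs-csi-driver = {
      addon_version            = "v1.38.1-eksbuild.1"
      service_account_role_arn = module.ebs_csi_driver_irsa.iam_role_arn
    }
  }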
