In our previous article, we explored how to integrate GitLab with a self-hosted HashiCorp Vault to manage secrets securely. Building on that foundation, today’s focus shifts to utilizing HashiCorp Cloud Platform (HCP) Vault as a secret storage solution for GitLab. HCP Vault offers the same robust security features as the self-hosted version but with the added benefits of being fully managed and maintained by HashiCorp. This means you can enjoy streamlined setup, reduced operational overhead, and seamless scaling. In this article, we’ll guide you through the process of integrating HCP Vault with GitLab, ensuring your secrets are stored securely in the cloud while maintaining the efficiency of your CI/CD pipelines.
Prerequisites:
- Create an Organization in the HashiCorp Cloud Platform (manual only)
- Connect and configure billing (manual only)
- Configure administrator user accounts (manual only)
- Create a Project (one per group of services if you plan to access the Vault from Consul over a private network, or vice versa). A HashiCorp Virtual Network (HVN) cannot be shared between projects, so if you want HCP services to share one HVN, place them in the same Project. Keep that in mind.
- Configure a Service Principal with the Contributor role on the Project
- If you already have an HCP organization with billing in place, all the remaining steps from this list can be automated via Terraform and the HCP provider.
All the steps marked as manual above were performed in a trial organization under a test account, without billing connected.
The Service Principal and Project were created manually; they will be added to Terraform once we move to the production setup.
The HashiCorp Virtual Network and an HCP Vault cluster in the Dev tier were created via Terraform from the repository: Volodymyr Vrublevskyy / Hcp Vault Gitlab Secretstore · GitLab
In addition to the AzureRM ones, three HCP Cloud variables have to be added to the Terraform runtime environment:
export HCP_CLIENT_ID="***"
export HCP_CLIENT_SECRET="*****"
export VAULT_NAMESPACE=admin
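With those credentials exported, the HCP provider authenticates as the Service Principal and the usual Terraform workflow can be run against the repository. A minimal sketch (the local directory name is an assumption):
# HCP_CLIENT_ID and HCP_CLIENT_SECRET are picked up from the environment by the HCP provider
cd hcp-vault-gitlab-secretstore   # hypothetical local checkout of the repository above
terraform init                    # download the hcp provider and modules
terraform plan -out=hcp.plan      # preview the HVN and dev-tier Vault cluster to be created
terraform apply hcp.plan          # create the resources in HCP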
Vault Client Configuration.
Install the Vault binary on your laptop, for example on macOS:
brew tap hashicorp/tap
brew install hashicorp/tap/vault
Export the Vault server's external address, token, and namespace:
export VAULT_ADDR='https://vault-public-vault***.z1.hashicorp.cloud:8200'
export VAULT_TOKEN='*****'
export VAULT_NAMESPACE=admin
After that, you can run commands from your laptop that will be executed against the remote HCP Vault cluster.
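As a quick sanity check, a couple of read-only commands (a minimal sketch; any read-only command will do):
# Confirm the cluster is reachable and the token/namespace are valid
vault status
vault token lookup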
Let's configure Vault for JWT token validation and GitLab CI access.
The previous post covered GitLab with a self-hosted HashiCorp Vault as secret storage; you can find more details there.
Step 1: Run the command to enable the JWT auth method:
$ vault auth enable jwt
Success! Enabled jwt auth method at: jwt/
Step 2: Configure the auth method to validate JWTs issued by GitLab:
$ vault write auth/jwt/config jwks_url="https://gitlab.com/-/jwks" bound_issuer="gitlab.com"
Success! Data written to: auth/jwt/config
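The next step stores secrets under a KV version 2 mount at the kv-v2 path. A hedged note: if that mount does not already exist on your cluster, enable it first:
# Mount the KV v2 secrets engine at the path used in the following steps
vault secrets enable -path=kv-v2 kv-v2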
Step 3: Create the variables (secrets) that will be used in GitLab CI:
vault kv put kv-v2/mcptestproject/shared/security/prisma_key key='****'
vault kv put kv-v2/mcptestproject/shared/security/prisma_secret secret='*****'
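To verify the values were stored, you can read them back (the output shows the key and secret fields written above):
vault kv get kv-v2/mcptestproject/shared/security/prisma_key
vault kv get kv-v2/mcptestproject/shared/security/prisma_secret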
Step 4: Create a Vault policy granting access to the key-value secrets. In this example there is only one policy:
vault policy write mcptestproject-dev-000 - <<EOF
path "kv-v2/data/mcptestproject/shared/security/*" {
capabilities = [ "read" ]
}
EOF
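You can confirm the policy was stored as expected:
# Print the policy document back from Vault
vault policy read mcptestproject-dev-000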
Step 5: Create a Vault role to restrict access to a particular project and namespace.
Depending on the scope, in our case we can use a single role for the entire root group, since the credentials in this example are shared and Prisma Cloud scanning runs for all projects in the GitLab organization:
vault write auth/jwt/role/mcptestproject-dev-000 - <<EOF
{
  "role_type": "jwt",
  "policies": ["mcptestproject-dev-000"],
  "token_explicit_max_ttl": 60,
  "user_claim": "user_login",
  "bound_claims_type": "glob",
  "bound_claims": {
    "namespace_path": "vovandodev/*"
  }
}
EOF
A role for a particular project can be narrowed with "project_id" instead of "namespace_path":
vault write auth/jwt/role/mcptestproject-dev-000 - <<EOF
{
  "role_type": "jwt",
  "policies": ["mcptestproject-dev-000"],
  "token_explicit_max_ttl": 60,
  "user_claim": "user_login",
  "bound_claims_type": "glob",
  "bound_claims": {
    "project_id": "48552721",
    "ref": "*",
    "ref_type": "branch"
  }
}
EOF
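If you want to test a role before wiring it into the pipeline, you can exchange a GitLab-issued JWT for a Vault token manually. A sketch, where $GITLAB_JWT is a placeholder for the ID token GitLab issues to a CI job:
# Exchange the GitLab JWT for a short-lived Vault token bound to the mcptestproject-dev-000 policy
vault write -field=token auth/jwt/login role=mcptestproject-dev-000 jwt=$GITLAB_JWT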
The entire list of bound_claims and their descriptions can be found here: Authenticating and reading secrets with HashiCorp Vault | GitLab
A few environment variables should be added in GitLab at the root group level:
- VAULT_AUTH_ROLE
- VAULT_AUTH_PATH
- VAULT_SERVER_URL
- VAULT_NAMESPACE
In our case:
- VAULT_AUTH_PATH: jwt
- VAULT_AUTH_ROLE: management-root
- VAULT_SERVER_URL: 'https://vault-public-vault-***.z1.hashicorp.cloud:8200'
- VAULT_NAMESPACE: admin
If you scoped a project or subgroup with a separate role, a new variable with the correct role name should be added at the respective level, for example:
VAULT_AUTH_ROLE: vovandodev-root
With the roles and variables in place, here is an example .gitlab-ci.yml job that pulls the Prisma Cloud credentials from HCP Vault using GitLab's native secrets keyword:
stages:
  - test

prisma-cloud-test:
  variables:
    PRISMA_API_URL: https://api.ca.prismacloud.io
    PRISMA_SCAN_DIR: $CI_PROJECT_DIR
    TF_HOST_NAME: gitlab.com
    TF_REGISTRY_TOKEN: $GL_TF_MODULE_TOKEN
  stage: test
  image:
    name: bridgecrew/checkov:latest
    entrypoint:
      - '/usr/bin/env'
      - 'PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'
  secrets:
    PRISMA_K:
      vault: mcptestproject/shared/security/prisma_key/key@kv-v2
      file: false
    PRISMA_S:
      vault: mcptestproject/shared/security/prisma_secret/secret@kv-v2
      file: false
  script:
    - checkov -d "$PRISMA_SCAN_DIR" --download-external-modules true --bc-api-key $PRISMA_K::$PRISMA_S --repo-id "$CI_PROJECT_NAMESPACE/$CI_PROJECT_NAME" --branch "$CI_COMMIT_REF_NAME" --use-enforcement-rules -o cli -o junitxml --output-file-path console,prismacloud.xml
  artifacts:
    paths:
      - prismacloud.xml
    reports:
      junit: prismacloud.xml
  allow_failure: false
In conclusion, integrating GitLab with HashiCorp Cloud Platform (HCP) Vault provides a secure, scalable, and managed solution for secrets management. By leveraging HCP Vault, you can reduce operational complexity and focus on developing and deploying your applications without worrying about the intricacies of maintaining a self-hosted secrets management system. This setup not only enhances security but also improves efficiency within your DevOps workflows. As we’ve demonstrated, configuring this integration is straightforward, enabling you to seamlessly manage your secrets in the cloud. With HCP Vault, you can ensure that your sensitive information is well-protected, allowing your team to work with confidence and peace of mind.