
Add custom certificates to a pipeline

Some organizations prefer to use custom SSL certificates instead of certificates generated by a public Certificate Authority (CA). If your organization uses internal certificates, you need to set up Harness to use these certificates.

Harness supports three workflows for using custom certificates. You can add your certs to the delegate, to individual pipelines, or to the container images you use to run your scans.

When to use this workflow

Harness STO supports three workflows for running scans with custom certificates. This workflow is recommended if either of the following are true:

  • You're using any delegate type other than Kubernetes or Docker, such as a Harness Cloud delegate.

  • You cannot access or customize your delegate directly.

You can also use this workflow if the external scanner requires additional files, such as auth scripts or license files, to run scans. For example, ZAP scans might require context files, as noted below.

Important notes

  • You must have root access to perform the workflow documented below.

  • Make sure that your certificates meet all requirements of the external scan tool. Your certificates must be valid, unexpired, and have a complete trust chain.

  • STO supports certificates in PEM and Distinguished Encoding Rules (DER) format.

  • Harness STO does not support certificate bundles. Each certificate should be specified in its own file. If you have a bundle that you want to use with an external scanner, Harness recommends that you split the bundle into individual files.

  • Store each certificate file as a Harness file secret. You can also use a third-party secrets manager such as HashiCorp Vault, Azure Key Vault, or AWS Secrets Manager. For more information, go to Harness Secrets Manager Overview.

  • You must include all required files in /shared/customer_artifacts/ or a related subfolder, as described below. You can include any number of certificates or other files in or under this folder.

  • Save each SSL certificate file to /shared/customer_artifacts/certificates/<certificate_name>.

  • If the scanner requires a license file, save it to /shared/customer_artifacts/<license_file_name>.

  • If you're running a ZAP scan that uses additional files such as auth scripts, context files, or URL files, specify the following shared folders and make sure that your Run step copies in the required files:

    • /shared/customer_artifacts/authScript/<artifact_file_name>
    • /shared/customer_artifacts/context/<artifact_file_name>
    • /shared/customer_artifacts/urlFile/<artifact_file_name>
    • /shared/customer_artifacts/hosts/<artifact_file_name>
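As noted above, STO does not support certificate bundles, so a bundle must be split into one file per certificate. One way to do this in a Run step is with a short awk script. The following is a minimal sketch: the paths, file names, and placeholder bundle are illustrative, and in a real pipeline the bundle would come from a Harness secret and `OUT_DIR` would be `/shared/customer_artifacts/certificates`.

```shell
# Sketch: split a PEM bundle into one certificate per file (paths are examples).
# In a real Run step, OUT_DIR would be /shared/customer_artifacts/certificates
# and the bundle would come from a Harness secret.
OUT_DIR="${OUT_DIR:-/tmp/customer_artifacts/certificates}"
mkdir -p "$OUT_DIR"

# Placeholder two-certificate bundle, standing in for your real bundle file.
cat > /tmp/bundle.pem <<'EOF'
-----BEGIN CERTIFICATE-----
first-cert-body
-----END CERTIFICATE-----
-----BEGIN CERTIFICATE-----
second-cert-body
-----END CERTIFICATE-----
EOF

# Start a new cert-<n>.pem file each time a BEGIN CERTIFICATE line appears.
awk -v dir="$OUT_DIR" '
  /-----BEGIN CERTIFICATE-----/ {n++}
  n > 0 {print > (dir "/cert-" n ".pem")}' /tmp/bundle.pem

ls "$OUT_DIR"
```

Each resulting file contains exactly one certificate, which is the format STO expects in the certificates folder.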

Workflow description

This workflow applies to all supported build infrastructures, both for STO on Harness SaaS and for Harness Self-Managed platforms.

  1. For each artifact that contains sensitive information, such as an SSL certificate, create a Harness secret.

  2. Go to the pipeline where you want to add the artifact.

  3. In the stage that will use the artifact, go to Overview > Shared Paths and create a folder under /shared, such as /shared/customer_artifacts.

  4. Add a Run step to the stage that adds the artifacts to the shared folder. This step needs to run before the scanner step that uses the artifact.

Example workflow

This example shows how to include a PEM file in a pipeline that runs a scan using a Security step. This workflow assumes that you have a valid PEM certificate stored as a Harness file secret.

  1. In your Harness pipeline, go to the Overview tab of the Security stage. Under Shared Paths, enter the following shared path:

    /shared/customer_artifacts/certificates

    This is the default certificate location for Harness pipelines. You can copy any number of certificates to this folder.

  2. Add a Run step that copies your PEM file to the certificates folder. In this example, NEWCERT is a step environment variable that references your file secret. Here's some example code that does this:

    set -e
    mkdir -p /shared/customer_artifacts/certificates
    printf "%s" "$NEWCERT" > /shared/customer_artifacts/certificates/certificate
  3. Set up the remaining downstream steps in your pipeline. When the pipeline runs a SonarQube scan that requires a PEM, it looks in /shared/customer_artifacts/certificates and proceeds if it finds a valid certificate.
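Before the downstream scan step runs, you can confirm that the copied file really is a usable certificate by adding a quick openssl check to the same Run step. This is a sketch, assuming openssl is available in the step image; the self-signed certificate generated here is only a stand-in for your real PEM at /shared/customer_artifacts/certificates/certificate.

```shell
# Sketch: verify that a PEM file parses and is not about to expire.
# The self-signed cert generated here is a stand-in for your real certificate.
CERT="${CERT:-/tmp/certificate.pem}"
openssl req -x509 -newkey rsa:2048 -nodes -subj "/CN=example" \
  -keyout /tmp/key.pem -out "$CERT" 2>/dev/null

# Fail if the file is not a parseable X.509 certificate...
openssl x509 -in "$CERT" -noout

# ...or if it expires within the next hour (3600 seconds).
openssl x509 -in "$CERT" -noout -checkend 3600 && echo "certificate OK"
```

With `set -e` in the step, either check failing stops the pipeline before the scanner runs against a bad certificate.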

YAML pipeline example

The following illustrates an end-to-end pipeline that copies a PEM certificate to the default location, builds an image, and then scans the image using SonarQube (authorized using the certificate).

pipeline:
  allowStageExecutions: false
  projectIdentifier: STO
  orgIdentifier: default
  identifier: jsmith_cloud_sq_mvn_with_pem_files
  name: "jsmith cloud - sq mvn with pem files"
  tags: {}
  properties:
    ci:
      codebase:
        connectorRef: dvja
        build: <+input>
  stages:
    - stage:
        name: build
        identifier: build
        type: SecurityTests
        spec:
          cloneCodebase: true
          sharedPaths:
            - /var/run
            - /shared/customer_artifacts
          serviceDependencies:
            - identifier: dind
              name: dind
              type: Service
              spec:
                connectorRef: account.harnessImage
                image: docker:dind
                privileged: true
                entrypoint:
                  - dockerd-entrypoint.sh
                resources:
                  limits:
                    memory: 4Gi
                    cpu: 1000m
          execution:
            steps:
              - step:
                  type: Run
                  name: export path
                  identifier: export_path
                  spec:
                    connectorRef: DockerNoAuth
                    image: alpine
                    shell: Sh
                    command: |-
                      pwd
                      harness_path=$(pwd)
                      export harness_path
                    outputVariables:
                      - name: harness_path
              - step:
                  type: Run
                  name: addcerts
                  identifier: addcert
                  spec:
                    connectorRef: mydocker
                    image: alpine
                    shell: Sh
                    command: |-
                      set -e
                      mkdir -p -v /shared/customer_artifacts/certificates

                      touch /shared/customer_artifacts/certificates/certificate1
                      printf "%s" "$NEWCERT" > /shared/customer_artifacts/certificates/certificate1

                      # touch /shared/customer_artifacts/certificates/certificate2
                      # printf "%s" "$NEWDUMMYCERT" > /shared/customer_artifacts/certificates/certificate2

                      ls -l /shared/customer_artifacts/certificates
                      cat /shared/customer_artifacts/certificates/certificate1 | base64
                    envVariables:
                      NEWCERT: <+secrets.getValue("sonarqube_self_signed_cert")>
                      NEWDUMMYCERT: <+secrets.getValue("my-dummy-pem")>
              - step:
                  type: Run
                  name: build
                  identifier: build
                  spec:
                    connectorRef: DockerNoAuth
                    image: maven:3.3-alpine
                    shell: Sh
                    command: |
                      mvn clean package
              - step:
                  type: Security
                  name: sonar
                  identifier: sonar
                  spec:
                    privileged: true
                    settings:
                      policy_type: orchestratedScan
                      scan_type: repository
                      product_domain: https://sonarqube-cert-test.myorg.dev/
                      product_name: sonarqube
                      product_config_name: sonarqube-agent
                      repository_project: dvja
                      repository_branch: <+codebase.branch>
                      product_access_token: MY_PROD_TOKEN
                      product_project_key: dvja
                      verify_ssl: true
                      bypass_ssl_check: true
                      workspace: <+pipeline.stages.build.spec.execution.steps.export_path.output.outputVariables.harness_path>/target
                    imagePullPolicy: Always
                    resources:
                      limits:
                        memory: 2Gi
                        cpu: 1000m
        description: sonar
        failureStrategies: []
        platform:
          os: Linux
          arch: Amd64
        runtime:
          type: Cloud
          spec: {}
        variables:
          - name: runner_tag
            type: String
            value: dev