Integrating two Docker image registries from different vendors is a bit unusual. Most companies pick a single product to host their private Docker images, but there can be special circumstances where more than one is in use.
In this case, whenever new Docker images are pushed into an Artifactory repository, we want a copy of those images pushed to Google Container Registry (GCR) as well. Two main components make this happen: 1) a user plugin that we drop into the Artifactory installation, and 2) a VM running Docker, the gcloud CLI, and Jenkins, which pulls, re-tags, and pushes the Docker images from Artifactory to GCR.
Essentially, Artifactory kicks off a Groovy user plugin whenever a new image has been pushed. The plugin makes a simple HTTP call to Jenkins to trigger a job. Jenkins in turn pulls the Docker image from Artifactory, re-tags it, pushes it to GCR, and finally deletes the local copies so they don't linger and use up disk space.
Snippet of the Groovy user plugin for Artifactory, /opt/jfrog/artifactory/etc/plugins/copyToGcr.groovy:
storage {
    // Fires after an artifact has been deployed to Artifactory
    afterCreate { item ->
        ...
        // Trigger the Jenkins job, passing the image name as a build parameter
        def req = new URL(jenkinsUri).openConnection()
        req.setRequestMethod("POST")
        req.setDoOutput(true)
        req.setRequestProperty("Content-Type", "application/x-www-form-urlencoded")
        req.setRequestProperty("Authorization", "Basic " + jenkinsAuth)
        def body = 'IMAGE=' + imageName
        req.getOutputStream().write(body.getBytes("UTF-8"))
        ...
    }
}
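The elided lines set up jenkinsUri, jenkinsAuth, and imageName, which aren't shown above. Here is a minimal sketch of what they could look like; the Jenkins URL, credentials, and repo key are placeholders, it assumes a flat image layout (image/tag/manifest.json), and the manifest.json check is one way to keep the plugin from firing once per layer:

// Hypothetical values -- replace the Jenkins URL, credentials, and repo key for your setup
def jenkinsUri  = 'https://jenkins.example.com/job/copy-to-gcr/buildWithParameters'
def jenkinsAuth = 'dXNlcjphcGktdG9rZW4='   // base64 of "user:api-token"

// A docker push deploys many files; only react to the final manifest.json
if (item.repoKey == 'docker-local' && item.name == 'manifest.json') {
    // repo path looks like "my-app/1.2.3/manifest.json" -> image name "my-app:1.2.3"
    def segments = item.repoPath.path.tokenize('/')
    def imageName = segments[-3] + ':' + segments[-2]
    // ... POST to Jenkins as shown above ...
}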
Snippet of the Jenkins pipeline job:
node {
    stage('Pull Image from Artifactory') { ... }
    stage('Re-Tag Image') { ... }
    stage('Push Image to GCR') { ... }
    stage('Clean up Image') { ... }
}
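Fleshed out, each stage can simply shell out to docker. The sketch below is illustrative rather than the exact job: the Artifactory registry host, repository name, and GCP project are placeholders, and IMAGE is the build parameter posted by the plugin:

node {
    // IMAGE arrives from the Artifactory plugin, e.g. "my-app:1.2.3"
    def source = "artifactory.example.com/docker-local/${params.IMAGE}"   // hypothetical Artifactory registry host
    def target = "gcr.io/my-gcp-project/${params.IMAGE}"                  // hypothetical GCP project

    stage('Pull Image from Artifactory') {
        sh "docker pull ${source}"
    }
    stage('Re-Tag Image') {
        sh "docker tag ${source} ${target}"
    }
    stage('Push Image to GCR') {
        // assumes docker on the VM was configured once with: gcloud auth configure-docker
        sh "docker push ${target}"
    }
    stage('Clean up Image') {
        // remove both local tags so copied images don't pile up on the VM's disk
        sh "docker rmi ${source} ${target}"
    }
}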
Additional details can be found in "Triggering Jenkins builds when deploying artifacts to Artifactory".
