
Dynamically manage apps with the Argo CD plug-in generator

Working with the ApplicationSet controller and generators in Argo CD 2.8

February 28, 2024
Tal Hason
Related topics: Containers, Helm
Related products: Red Hat OpenShift, Red Hat OpenShift Local


    Argo CD introduced us to the concepts of ApplicationSets and generators some time ago. It added several generators, such as the Git generator and the list generator (for full details on those, see the Argo CD documentation).

    The idea behind the plugin generator is to give users, developers, administrators, and platform teams a way to create their own flavor of generator: a basic web application server that exposes a POST path and returns a JSON body with the desired schema.

    So why have I brought you all here, you might be asking?

    I have created a small plug-in that uses a simple Node.js application server and a Helm chart to dynamically create, update, and delete applications from the ApplicationSet, all in keeping with GitOps practices.

    Let's begin

    The repository we will use in this tutorial can be found here. Fork it and follow the README file to learn how to use and duplicate it.

    What you need to make this work:

    • Red Hat OpenShift 4.13 cluster (get Red Hat OpenShift Local).
    • Red Hat OpenShift GitOps 1.10 (Argo CD 2.8)
    • oc client
    • Helm command-line interface (CLI)
    • yq processor
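    Before moving on, you can quickly confirm that the client tools are installed and on your PATH; a minimal sanity check (output will vary with your versions) looks like this:

    # Quick check of the required client tooling:
    oc version --client
    helm version --short
    yq --version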

    The sample application project can be found here; please fork it if you plan to make changes.

    The repository files and folders are shown in Figure 1.

    Figure 1: The repository for the sample application project.

    Let's explain the nodejs folder.

    Under it, we have:

    • The src folder, with the app.js file that contains our web application server.
    • src/config, where we store a default app.yaml file with basic configuration.
    • argocd-plugin-app.postman_collection.json, a Postman collection to assist with testing the POST request.
    • automation.sh, a script to assist with building, testing, and pushing the application image.
    • Dockerfile, a multi-stage build file for the application container image.

    The automation.sh file will make it easier to run podman build/run/push.

    The script accepts two arguments: the first is the commit message, and the second is the desired operation.

    • test: This will build and run the application image.
    • push: This will build and push the application image to the image registry, after prompting whether or not to update the Helm chart values file with the new image tag.
    • To work with the script, create an environment variable named IMAGE_NAME set to your image.registry/repo_name/image; if you forget, the script will ask you for the image name on the first run (see the sketch below).
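    As a minimal sketch of a typical workflow (the image path is illustrative, and the argument order follows the description above):

    # Point the script at your own image (registry/repo/name are placeholders):
    export IMAGE_NAME=quay.io/my-org/argocd-plugin-app

    # Build the image and run it locally for a quick test:
    ./automation.sh "testing plugin changes" test

    # Build and push the image, then choose whether to update the Helm values file:
    ./automation.sh "bump plugin image" push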

    I added a Swagger UI to the application to assist with testing the payload on the fly, directly from the server, in environments with limited resources (no Postman, for example).

    To access the Swagger UI, you need to navigate to the application route from your cluster, i.e., https://plugin-argo-plugin-openshift-gitops.<>/API-docs.

    You should be presented with the Swagger UI, as shown in Figure 2.

    Figure 2: The Swagger UI.

    The API is protected with a simple bearer token, and the deployment mechanism generates a new one after each commit. To get the latest token, run the following:

    oc get secrets plugin-argocd-app-set-plugin-token -n openshift-gitops -o yaml | yq eval '.data.token' | base64 -d

    This will print the token to the terminal. Click the Authorize button at the top right of the Swagger UI to enter it, then click Login, and you are all set (Figure 3).

    Figure 3: Log out after the token is authorized.
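    If you prefer the command line to the Swagger UI or Postman, you can exercise the endpoint with curl. The sketch below assumes the server implements the standard Argo CD plugin generator path /api/v1/getparams.execute and that the route host matches your cluster's domain; adjust both to your environment:

    # Fetch the current bearer token (regenerated on every commit):
    TOKEN=$(oc get secrets plugin-argocd-app-set-plugin-token -n openshift-gitops -o yaml | yq eval '.data.token' | base64 -d)

    # Call the generator endpoint directly (path and host are assumptions, not confirmed by the article):
    curl -k -X POST \
      -H "Authorization: Bearer ${TOKEN}" \
      -H "Content-Type: application/json" \
      -d '{"applicationSetName": "plugin-applicationset", "input": {"parameters": {}}}' \
      "https://plugin-argo-plugin-openshift-gitops.<your-cluster-domain>/api/v1/getparams.execute"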

    The GitOps folder

    In our GitOps folder, we find two subfolders:

    • Argo-Plugin: Holds our Helm chart with its values file to deploy our plug-in web application.
    • ArgoApps: Holds all our Argo CD objects with Kustomize.

    I built it this way because Helm is more flexible for manipulating application deployments, and Kustomize is better for unplanned add-ons and changes. To duplicate the plug-in, there are changes we need to make, which I will explain later on.

    Argo-Plugin folder

    Read the README file to understand the chart options; it's very basic.

    Some tips and tricks

    Folders:

    • Certificates: Paste your domain certificate here, and it will be added to the route generated by the ingress.
    • ApplicationFiles: This is the most important folder, in which we put our {Application}.yaml files, like the following example. The application supports multiple config files (e.g., batman.yaml, robin.yaml, joker.yaml), and will merge them into a single JSON payload:

    batman.yaml: 

    GenerateApplication:
      name: batman
      project: gotham-demo
      image: quay.io/gotham/batman
      tag: 4a7050d
      repoURL: https://github.com
      branch: main
      gitopsRepo: gotham-cd

    The merged JSON body payload:

    {
      "output": {
        "parameters": [
          {
            "name": "batman",
            "project": "gotham-demo",
            "image": "quay.io/gotham/batman",
            "tag": "4a7050d",
            "repoURL": "https://github.com",
            "branch": "main",
            "gitopsRepo": "gotham-cd"
          },
          {
            "name": "joker",
            "project": "gotham-demo",
            "image": "quay.io/gotham/joker",
            "tag": "8280f51",
            "repoURL": "https://github.com",
            "branch": "main",
            "gitopsRepo": "gotham-cd"
          },
          {
            "name": "robin",
            "project": "gotham-demo",
            "image": "quay.io/gotham/robin",
            "tag": "a678098",
            "repoURL": "https://github.com",
            "branch": "main",
            "gitopsRepo": "gotham-cd"
          }
        ]
      }
    }

    Each new object under the GenerateApplication key will generate an Argo CD application via the ApplicationSet.

    You can add more fields or even change the schema entirely; just adapt the ApplicationSet.yaml file to match your schema:

    apiVersion: argoproj.io/v1alpha1
    kind: ApplicationSet
    metadata:
      name: plugin-applicationset
      namespace: openshift-gitops
    spec:
      generators:
        - plugin:
            configMapRef:
              name: plugin-config
            requeueAfterSeconds: 30
      template:
        metadata:
          name: "{{name}}-{{project}}"
        spec:
          project: argocd-plugin
          source:
            helm:
              valueFiles:
                - '{{project}}/develop/values-{{name}}.yaml'
              parameters:
                - name: "image.name"
                  value: '{{image}}'
                - name: "image.tag"
                  value: '{{tag}}'
                - name: "global.namespace"
                  value: 'plugin-test'
            repoURL: '{{repoURL}}/{{project}}/{{gitopsRepo}}.git'
            targetRevision: '{{branch}}'
            path: Application
          destination:
            server: https://kubernetes.default.svc
            namespace: plugin-test
          syncPolicy:
            automated:
              prune: true
              selfHeal: true
            syncOptions:
              - CreateNamespace=true
    • {{name}}: This is the name object from each item in the GenerateApplication array (i.e., the name of the application or whatever you think is relevant).
    • {{project}}: This is the project object from each item in the GenerateApplication array (i.e., the organization name in GitHub).
    • {{image}}: This is the image object from each item in the GenerateApplication array (i.e., quay.io/gotham/batman).
    • {{tag}}: This is the tag object from each item in the GenerateApplication array (i.e., v1.0.1).
    • {{repoURL}}: This is the repoURL object from each item in the GenerateApplication array (i.e., https://github.com).
    • {{branch}}: This is the branch object from each item in the GenerateApplication array (i.e., the branch name for the GitOps repo).
    • {{gitopsRepo}}: This is the gitopsRepo object from each item in the GenerateApplication array (i.e., the repository name for the GitOps repo).
     

    Note

    These fields are for my example; you can create any fields you want or need. Just remember they all have to be consistent in all the files.

    Secrets

    We have two secrets in the template:

    • secret-ca-cert.yaml, which will be generated with the ingress (i.e., deploy.ingress.enabled=true).
    • header-secret.yaml, which auto-generates the bearer token for the web application and the ApplicationSet:
    apiVersion: v1
    kind: Secret
    metadata:
      name: '{{ .Values.global.serviceName }}-argocd-app-set-plugin-token'
      labels:
        {{- include "app.labels" . | nindent 4 }}
        app.kubernetes.io/part-of: argocd
      annotations:
        helm.sh/hook: "pre-install"
        helm.sh/hook-delete-policy: "before-hook-creation"
    type: Opaque
    data:
      token: '{{ randAlphaNum 14 | b64enc }}'

    Notes:

    • The secret will generate a new token each time a new commit is created, so always double-check your token if you are testing with Swagger or Postman.
    • If the Secret has been changed, the deployment will roll out and reload the new token.
    • The name of the token secret is part of the plug-in config in the plugin-config ConfigMap.
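    To cross-check that everything is wired together, you can compare what the plug-in ConfigMap references with the secret that actually exists (a minimal check; the names below follow this article's chart defaults):

    # Show the plug-in generator configuration that the ApplicationSet references:
    oc get configmap plugin-config -n openshift-gitops -o yaml

    # Confirm the token secret it points to is present:
    oc get secret plugin-argocd-app-set-plugin-token -n openshift-gitops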

    ArgoApps folder

    Under the ArgoApps folder, we have the following:

    • A Plugin folder, which holds the plug-in Argo CD application that points to the Helm chart folder in the Git repo.
    • Project.yaml, an Argo project.
    • AppofApps.yaml, an app of apps application to bootstrap the plug-in and ApplicationSet.
    • kustomization.yaml, an easy way to deploy all YAMLs in one command.

    To bootstrap the plug-in generator, just run the following command from the root of the repo:

    oc apply -k GitOps/ArgoApps

    This will install the Argo project and the app of apps, and then the plug-in application and the ApplicationSet (Figure 4).

    Figure 4: The Argo user interface.
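    To verify the bootstrap, you can list the Argo CD objects it created (resource names will follow this repository's defaults):

    # List the Argo project, app of apps, plug-in application, and ApplicationSet:
    oc get appprojects.argoproj.io,applications.argoproj.io,applicationsets.argoproj.io -n openshift-gitops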

    How to create support for multiple schemas and patterns

    This plug-in supports a single schema, defined in our YAML files under the ApplicationFiles folder in the Helm chart. If we want to support multiple patterns, we can simply duplicate our Helm chart and define a new schema in it. Here is how to do it:

    1. Duplicate the Argo-Plugin folder under the GitOps folder and rename it (a command-line sketch follows this list).
    2. Update the Chart.yaml with the new folder name under name:.
    3. Under ArgoApps/Plugin, duplicate the ApplicationSet-Plugin.yaml and the Plugin-Application.yaml, rename them, and update the paths to your new plug-in deployment under the GitOps folder.
    4. Update the values.yaml file under your newly created folder with global.serviceName={new name}.
    5. In the ApplicationSet, adjust the template to the new schema that you apply in the new config files and update the config name to the new ConfigMap name (tip: the ConfigMap name has the service name at the beginning).
    6. Sync the app of apps, and a new plug-in will be created with a new ApplicationSet.
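    As a rough sketch, steps 1, 2, and 4 might look like this on the command line (the new folder and service names are illustrative):

    # Step 1: duplicate the chart folder under GitOps and rename it:
    cp -r GitOps/Argo-Plugin GitOps/Argo-Plugin-v2

    # Step 2: update the chart name to match the new folder name:
    yq eval -i '.name = "Argo-Plugin-v2"' GitOps/Argo-Plugin-v2/Chart.yaml

    # Step 4: set a new service name in the values file:
    yq eval -i '.global.serviceName = "argo-plugin-v2"' GitOps/Argo-Plugin-v2/values.yaml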

    Now you can add your new application YAML files into it and see your application generated.

    Watch this demo video to see it in action.

    Using the Plugin-Generator to create a Pull-Request generator

    Under a branch named Pull-Request, there is an example of how to combine Tekton Pipelines and the Plugin-Generator. It's a very simple pipeline that listens for PR webhooks, and for every newly opened PR, the pipeline creates a new file under the ApplicationFiles folder. See Figures 5 and 6.

    Then a new application is created from the new branch, and when the PR is closed, the pipeline deletes the file and the branch's application is deleted.

    Figure 5: Flow architecture.

    Okay, so let's go over that mess of a topology. From left to right:

    1. On top, we have my locally running Git server (Gitea, running inside my OpenShift instance).
    2. Under that, we have the plug-in repository that is stored in GitHub.
    3. In the OpenShift instance, I have created a namespace named pullrequest. In it, I have created the following:

      • Tekton EventListener
      • Tekton TriggerTemplate
      • Tekton TriggerBinding
      • Tekton Pipeline
      • Tekton Task to create the branch file
      • Tekton Task to delete the branch file
      • Tekton Task to push changes to the repository with the git CLI
      • A Secret with my .gitconfig and .git-credentials files, which will be mounted to the git-cli task.
    4. For testing, I set up an Argo CD ApplicationSet that deploys my PR application to this namespace.
    5. In the OpenShift instance, under the openshift-gitops namespace (the default instance), the ApplicationSet generates applications from a GitOps repository hosted in GitHub, with an Argo CD application that deploys:

      • The plug-in generator application
      • The Tekton YAMLs
      • The ApplicationSet
    Figure 6: The PR pipeline.

    For this demo, I have two application repositories in the Gitea Git server: Batman and Joker.

    I have added a webhook to each of those repositories (Figure 7).

    Figure 7: Webhook configuration.

    In the config, I have checked the Pull Request checkbox, which sends an event when a pull request is opened, reopened, or closed. I have also checked the Pull request synchronized checkbox, so the pipeline will be triggered for every change to the commit head of the branch.

    Note: I have another pipeline with an event listener that is set as a webhook on the Gitea repository and builds a new image for each new commit, so any change to the Git repo will create a new image in the image registry.

    Tekton folder

    The Tekton folder holds the following files:

    • event-listener.yaml: This file holds the Tekton event listener that creates the endpoint to be set as the webhook target in our source repo's webhook config.
    • gitea-trigger-binding.yaml: This file maps the webhook payload to Tekton parameters that can later be used in the pipeline.
    • pull-request-pipeline.yaml: The Tekton pipeline config and structure.
    • task-create-branch.yaml: A Tekton task that will create a file in the Argo-Plugin folder from the pull request event payload data.
    • task-delete-branch.yaml: A Tekton task that will delete the relevant file when the pull request is closed.
    • trigger_template-PipelineRun.yaml: A Tekton PipelineRun template that will be triggered by the event listener.
    • trigger_template-taskRun.yaml: A Tekton TaskRun template to test the payload data.

    The application is shown in Figure 8.

    Figure 8: The Tekton application.

    To access all the files, go to my repo: Plugin-Generator, branch: pull-request.

    Demo time

    Feel free to clone or fork the repo and try it out. You can also open issues and contact me if you have questions or need any assistance.

    Links:

    • Git repository (note that you are in the pull-request branch)
    • Argo CD Plugin Generator manual
    • Argo CD webhooks for the applications
    • Argo CD webhooks for ApplicationSet