Resume a Pipeline

You can resume a pipeline that has failed. A pipeline can stop before completing for many reasons, and restarting it from scratch can be costly and time consuming. Resuming a pipeline lets you pick up where it stopped and run only the remaining jobs.

  1. Create new jobs for your pipeline or use existing jobs, making sure that each job contains the following script in its command line:

    #!/bin/bash
    if [ -z "$(echo "$DONT_EXECUTE" | grep "<job_alias>")" ]; then
        <script_execution>
    else
        echo " Variable DONT_EXECUTE contains this job. No execution."
    fi

    Where

    • <script_execution> must be replaced with the job execution script, that is, the command that runs the code file uploaded in the job's package when the job was created.

    • <job_alias> must be replaced with the alias of the job. You can find it on the job's Overview page.

      You can modify the command line of existing jobs by upgrading them.
  2. Create a new pipeline or use an existing one.

    If you use an existing pipeline, check that its jobs have the required script in their command line, as described in the previous step.
  3. Run your pipeline.

  4. Depending on the result, create the $DONT_EXECUTE pipeline environment variable containing the list of job aliases that you do not want to run.

    You can modify this variable as many times as required. It is up to you to update its value according to the results of your pipeline. You do not need to create a new variable every time you run your pipeline. However, if you have several pipelines, you do need to create an environment variable at the corresponding pipeline level for each pipeline.

    This selective execution lets you skip jobs that already succeeded instead of rerunning them, and thus avoids restarting the entire pipeline.
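To make the mechanism above concrete, here is a minimal sketch of the wrapper from step 1 with the placeholders filled in. The aliases `clean_data` and `train_model`, and the echoed stand-ins for the real execution scripts, are invented for illustration:

```shell
#!/bin/bash
# Simulate the pipeline environment variable: skip "clean_data" and "load_db".
export DONT_EXECUTE="clean_data load_db"

# Wrapper for a job whose alias IS in the skip list ("clean_data"):
if [ -z "$(echo "$DONT_EXECUTE" | grep "clean_data")" ]; then
    echo "running clean_data"    # the real <script_execution> would go here
else
    echo "Variable DONT_EXECUTE contains this job. No execution."
fi

# Wrapper for a job whose alias is NOT in the skip list ("train_model"):
if [ -z "$(echo "$DONT_EXECUTE" | grep "train_model")" ]; then
    echo "running train_model"   # the real <script_execution> would go here
else
    echo "Variable DONT_EXECUTE contains this job. No execution."
fi
```

Run with both wrappers in place, the first job prints the "No execution" message while the second runs normally, because `grep` finds `clean_data` in $DONT_EXECUTE but not `train_model`.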

There are several solutions that can be considered to automate this process.

  • Each job could update the $DONT_EXECUTE environment variable at the start of its processing. To do this, it needs access to the Saagie API, which requires authentication.

  • The jobs could be asked to update an external file or database at the start and end of processing.
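The second approach can be sketched as follows, assuming a shared file that every job can read and write (the path, the alias `clean_data`, and the placeholder for the real execution script are all hypothetical; this is not part of the Saagie product itself):

```shell
#!/bin/bash
# Hypothetical shared status file reachable by every job in the pipeline.
STATUS_FILE="/shared/pipeline_status.txt"

# At the start of the job: skip if a previous run already completed it.
if grep -q "clean_data" "$STATUS_FILE" 2>/dev/null; then
    echo "clean_data already completed. No execution."
    exit 0
fi

# ... the real <script_execution> would run here ...

# At the end: record the completion so a resumed run skips this job.
echo "clean_data" >> "$STATUS_FILE"
```

With this pattern, resuming the pipeline requires no manual bookkeeping: any job that reached its final line on a previous run exits immediately on the next one.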