Test
From the root folder of your repository, you can test the tools and validate the tool configuration locally instead of running them on the HPC. This section builds on the example repository set up in the Development section.
Create the parameters JSON
The platform generates params.json using your SSH credentials and injects values into shell environment variables.
To simulate this behavior locally, create the file tests/params.json with the following example content:
{
  "name": "Thanh-Giang Tan Nguyen 09/07/2025 11:05:01 PM",
  "cpu": 1,
  "memory": 2,
  "time": 3600,
  "tag": "4.4.2-3",
  "profile": "",
  "git": "valid_git_repo",
  "bucket_name": "genomics",
  "AWS_ACCESS_KEY_ID": "invalid",
  "AWS_SECRET_ACCESS_KEY": "invalid",
  "AWS_REGION_NAME": "invalid",
  "AWS_ENDPOINT_URL": "invalid",
  "file": "workspace/test.txt",
  "dir": "workspace",
  "integer": 1,
  "number": 0.3,
  "string": "string1",
  "boolean": true,
  "outdir": "batch-template-analysis"
}
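Before wiring the file into the script below, you can sanity-check it with jq (any local copy will do; the script also installs one). This is just a quick verification sketch using the tests/params.json path above:
jq . tests/params.json                                                   # fails loudly if the JSON is malformed
jq -r 'to_entries[] | "\(.key)=\(.value|tostring)"' tests/params.json    # preview the variables the script will export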
Create the run_local script
Create the tests/run_local.sh bash script. It is similar to the script that will be generated on the HPC, but instead of running under a scheduled job, it runs in your local environment. With pixi, it ensures consistency between your local environment and the HPC.
###########################################################################################################################
# simulate job directory
export BASEDIR=$PWD
export RIVER_HOME=$PWD/work
export job_id="job_id"
export PIXI_HOME=$RIVER_HOME/.pixi
mkdir -p $RIVER_HOME/jobs/job_id
cp $PWD/tests/params.json $RIVER_HOME/jobs/job_id/params.json
cd $RIVER_HOME/jobs/job_id
###########################################################################################################################
# === Install pixi ===
which pixi || curl -fsSL https://pixi.sh/install.sh | sh
# pixi places its binary and globally-installed tools under $PIXI_HOME/bin when PIXI_HOME is set
export PATH=$PATH:$PIXI_HOME/bin
pixi config append default-channels bioconda --global
pixi config append default-channels conda-forge --global
pixi global install nextflow jq git singularity python=3.14
# Setup networking
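# Bind to port 0 so the OS assigns a free ephemeral port, then record it for the job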
export PORT=$(python -c "import socket; s=socket.socket(); s.bind(('',0)); print(s.getsockname()[1]); s.close()")
echo $PORT > $RIVER_HOME/jobs/job_id/job.port
echo $(hostname) > $RIVER_HOME/jobs/job_id/job.host
# Load parameters from params.json into environment variables
while IFS='=' read -r key value; do
export "$key=$value"
done < <(jq -r 'to_entries|map("\(.key)=\(.value|tostring)")|.[]' params.json)
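# Every key in params.json is now exported as an environment variable
# (tag, profile, bucket_name, and outdir are used below); note this simple
# KEY=value split assumes values contain no newlines.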
# Create symlink to analysis directory
ln -sf $BASEDIR $RIVER_HOME/jobs/job_id/analysis
git=$(git remote get-url origin 2>/dev/null)
repo_name=$(basename -s .git "$git")
owner=$(basename "$(dirname "$git")")
local_dir="$RIVER_HOME/tools/$owner/$repo_name/$tag"
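# local_dir mirrors the path where the HPC clones the tool
# ($RIVER_HOME/tools/<owner>/<repo>/<tag>); locally it is computed only for
# reference, since the analysis symlink above already points at this repo.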
if [[ "$git" == *"nf-"* ]]; then
profiles="${profile:+singularity,$profile}"
profiles="${profiles:-singularity}"
nextflow run "$owner/$repo_name" \
-r "$tag" \
-c river.config \
-profile "$profiles" \
-process.executor slurm \
-process.shell 'bash' \
--outdir "s3://$bucket_name/$outdir/job_id" \
-with-report "s3://$bucket_name/$outdir/job_id/report.html" \
-resume
else
bash $BASEDIR/river/main.sh
fi
On the HPC, tools are cloned at their respective tags into the RIVER_HOME location you defined when you created your SSH credential. The tool is run by creating a symlink from the tool source in RIVER_HOME to the job working directory. In the local environment, the symlink is created inside this repository, so delete it afterwards to avoid committing it.
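For example, assuming the defaults above (RIVER_HOME set to ./work inside the repository), you can clean up with:
rm -f work/jobs/job_id/analysis    # remove the symlink created by run_local.sh
echo "work/" >> .gitignore         # or keep the whole simulated job tree out of git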
Run the test from the repository root
bash tests/run_local.sh
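After the run, you can inspect the simulated job directory to confirm the setup worked, for example:
ls -l work/jobs/job_id             # expect params.json, job.port, job.host, and the analysis symlink
cat work/jobs/job_id/job.host work/jobs/job_id/job.port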