diff --git a/docs/en/docs/side_quests/img/nf-test/snapshot-vs-content.excalidraw.svg b/docs/en/docs/side_quests/img/nf-test/snapshot-vs-content.excalidraw.svg new file mode 100644 index 0000000000..ac3d81e7b7 --- /dev/null +++ b/docs/en/docs/side_quests/img/nf-test/snapshot-vs-content.excalidraw.svg @@ -0,0 +1,126 @@ + + + + + + Snapshot vs Content Assertions + + + vs + + + + + Snapshot Assertions + + + + assert snapshot(process.out) + .match() + + + How it works + + First run: + Save output as .snap + + + + + Next runs: + Compare to .snap file + + + Strengths + + Catches ANY change in output + + Easy to set up (default behavior) + + Comprehensive coverage + + + Weaknesses + + Brittle: breaks on ANY change + + Opaque: unclear what matters in the output + + Must update snapshots when valid changes occur + + + + Best for: + Stable outputs where any change + should be flagged for review + + + + --update-snapshot + to accept new output as reference + + + + Content Assertions + + + + assert path(process.out[0][0]) + .readLines().contains('hello') + + + How it works + + Every run: + Check specific values + + + + + No snapshot + file needed + + + Strengths + + Explicit: clear what is being tested + + Resilient to irrelevant changes + + Better error messages when tests fail + + + Weaknesses + + More code to write + + May miss unexpected changes in output + + Must know what to check for in advance + + + + Best for: + Outputs with variable parts + (timestamps, random IDs, etc.) 
+ + + + readLines(), contains(), getText() + and other Groovy/Java methods + + + + + + + diff --git a/docs/en/docs/side_quests/img/nf-test/test-execution-flow.excalidraw.svg b/docs/en/docs/side_quests/img/nf-test/test-execution-flow.excalidraw.svg new file mode 100644 index 0000000000..398df84032 --- /dev/null +++ b/docs/en/docs/side_quests/img/nf-test/test-execution-flow.excalidraw.svg @@ -0,0 +1,119 @@ + + + + + + + + + + + + + + + + + + nf-test Execution Flow + + + + 1 + + Run test command + nf-test test tests/main.nf.test + + + + + + 2 + + Create isolated directory + .nf-test/tests/<hash>/ + + + + + + 3 + + Apply when {} block + Set params and/or process inputs + + + + + + 4 + + Run Nextflow + Pipeline/process executes in isolation + + + + + + 5 + + Evaluate then {} block + Check all assertions + + + + + PASSED + + + + FAILED + + + + Why isolation matters + + + + Project directory + main.nf + greetings.csv + tests/ + results/ + + + + + + + Test directory + .nf-test/tests/ + <hash>/ + meta/ + (empty!) + + + + Common pitfall + Files like greetings.csv are NOT + available in the test directory. + Use ${projectDir} to reference + files from the original project. + + + + Solution: + params { + input_file = "${projectDir}/greetings.csv" + } + diff --git a/docs/en/docs/side_quests/img/nf-test/test-structure-anatomy.excalidraw.svg b/docs/en/docs/side_quests/img/nf-test/test-structure-anatomy.excalidraw.svg new file mode 100644 index 0000000000..ad276fdce4 --- /dev/null +++ b/docs/en/docs/side_quests/img/nf-test/test-structure-anatomy.excalidraw.svg @@ -0,0 +1,89 @@ + + + + + + nf-test Structure: Pipeline vs Process Tests + + + Pipeline-level test + + + + nextflow_pipeline { + + + + name "Test Workflow main.nf" + script "main.nf" + + + + test("Should run successfully...") { + + + + when { + + params { + input_file = "${projectDir}/..." 
+ } + + + + then { + assert workflow.success + assert workflow.trace + .tasks().size() == 6 + assert file(...).exists() + } + + + Process-level test + + + + nextflow_process { + + + + name "Test Process sayHello" + script "main.nf" process "sayHello" + + + + test("Should produce correct output") { + + + + when { + + process { + input[0] = "hello" + } + + + + then { + assert process.success + assert snapshot(process.out) + .match() + } + + + + Key difference: + Pipeline tests use + params {} + to configure the whole run. Process tests use + process {} + to provide direct inputs. + diff --git a/docs/en/docs/side_quests/img/nf-test/testing-strategy.excalidraw.svg b/docs/en/docs/side_quests/img/nf-test/testing-strategy.excalidraw.svg new file mode 100644 index 0000000000..696edec07b --- /dev/null +++ b/docs/en/docs/side_quests/img/nf-test/testing-strategy.excalidraw.svg @@ -0,0 +1,95 @@ + + + + + + + + + + + + + + + Complete Testing Strategy + nf-test test . + + + + Pipeline-level tests + tests/main.nf.test + + + + Test 1: Correct process count + assert workflow.success + assert workflow.trace.tasks().size() == 6 + + + + Test 2: Correct output files + assert file("$launchDir/results/ + Hello-output.txt").exists() + ... 
for all 6 expected files + + + + Pipeline tests cover: + End-to-end execution + File publishing to results/ + Process orchestration (all 6 tasks) + + + + Process-level tests + + + + tests/main.sayhello.nf.test + + + Test: Expected greeting content + Content assertion (readLines, contains) + + + input: "hello" + + + + + hello-output.txt + + + + tests/main.converttoupper.nf.test + + + Test: Correct uppercase output + Snapshot assertion (.snap file) + + + input: greetings.csv + + + + + UPPER-*.txt + + + + + + + SUCCESS: Executed 4 tests + 2 pipeline + 1 sayHello + 1 convertToUpper + diff --git a/docs/en/docs/side_quests/img/nf-test/workflow-data-flow.excalidraw.svg b/docs/en/docs/side_quests/img/nf-test/workflow-data-flow.excalidraw.svg new file mode 100644 index 0000000000..41f007caee --- /dev/null +++ b/docs/en/docs/side_quests/img/nf-test/workflow-data-flow.excalidraw.svg @@ -0,0 +1,100 @@ + + + + + + Workflow Data Flow + + + + greetings.csv + Hello, Bonjour, Hola + + + + splitCsv + + + + Channel + + + "Hello" + + + "Bonjour" + + + "Hola" + + + + + + + + + sayHello + + + Hello-output.txt + + + Bonjour-output.txt + + + Hola-output.txt + + + + + + + + + convertToUpper + + + UPPER-Hello-output.txt + + + UPPER-Bonjour-output.txt + + + UPPER-Hola-output.txt + + + + + + + results/ + 6 output files + published here + + + Legend: + + Input data + + Channel items + + sayHello outputs + + convertToUpper outputs + + Published results + + + + + + + + diff --git a/docs/en/docs/side_quests/nf-test.md b/docs/en/docs/side_quests/nf-test.md index 62c18e184f..5f83a4d3d2 100644 --- a/docs/en/docs/side_quests/nf-test.md +++ b/docs/en/docs/side_quests/nf-test.md @@ -104,6 +104,14 @@ The workflow we'll be testing is a subset of the Hello workflow built in [Hello In this side quest, we use an intermediate form of the Hello workflow that only contains the first two processes. The subset we'll be working with is composed of two processes: `sayHello` and `convertToUpper`. 
+The following diagram shows how data flows through the workflow. + + +
+--8<-- "docs/en/docs/side_quests/img/nf-test/workflow-data-flow.excalidraw.svg" +
+ + You can see the full workflow code below. ??? example "Workflow code" @@ -385,6 +393,12 @@ ERROR ~ No such file or directory: /workspaces/training/side-quests/nf-test/.nf- So what was the issue? Remember that the pipeline has a `greetings.csv` file in the project directory. When nf-test runs the pipeline, it looks for this file but can't find it. The file is there, so what's happening? If we look at the path, we can see the test is running inside `.nf-test/tests/longHashString/`. Just like Nextflow, nf-test creates a new directory for each test to keep everything isolated. The data file is not in that directory, so we must correct the file path in the test. + +
+--8<-- "docs/en/docs/side_quests/img/nf-test/test-execution-flow.excalidraw.svg" +
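+The fix the diagram points to is anchoring the input path at the project root rather than in the isolated test directory. A minimal sketch of the corrected `when` block (assuming the pipeline takes an `input_file` parameter, as this workflow does) looks like:
+
+```groovy
+when {
+    params {
+        // ${projectDir} resolves to the pipeline project root,
+        // not to the isolated .nf-test/tests/<hash>/ working directory
+        input_file = "${projectDir}/greetings.csv"
+    }
+}
+```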
+ + Let's go back to the test file and fix the path to the file in the `when` block. You may be wondering how we're going to point to the root of the pipeline project from inside the test. Since this is a common situation, nf-test provides a range of global variables to make our lives easier. You can find the full list [here](https://www.nf-test.com/docs/testcases/global_variables/); for now we'll use the `projectDir` variable, which points to the root of the pipeline project. @@ -693,6 +707,14 @@ nextflow_process { As before, we start with the test details, followed by the `when` and `then` blocks. However, we also have an additional `process` block, which allows us to define the inputs to the process. +The following diagram compares the structure of pipeline-level and process-level tests side by side. + +
+--8<-- "docs/en/docs/side_quests/img/nf-test/test-structure-anatomy.excalidraw.svg" +
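+The key difference highlighted in the diagram can be reduced to the two `when` blocks alone. As a side-by-side sketch (reusing the example inputs shown above):
+
+```groovy
+// Pipeline-level test: configure the whole run via params
+when {
+    params {
+        input_file = "${projectDir}/greetings.csv"
+    }
+}
+
+// Process-level test: hand inputs directly to the process
+when {
+    process {
+        input[0] = "hello"  // first input channel of sayHello
+    }
+}
+```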
+ + Let's run the test to see if it works. ```bash title="nf-test pipeline pass" @@ -859,6 +881,12 @@ Success! The test passes because the `sayHello` process ran successfully and the ### 2.3. Alternative to Snapshots: Direct Content Assertions + +
+--8<-- "docs/en/docs/side_quests/img/nf-test/snapshot-vs-content.excalidraw.svg" +
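+As a concrete sketch of the right-hand pattern in the diagram (the output index and expected string are illustrative, not prescriptive):
+
+```groovy
+then {
+    assert process.success
+    // Inspect the file contents directly instead of snapshotting everything
+    assert path(process.out[0][0]).readLines().contains('hello')
+}
+```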
+ + While snapshots are great for catching any change in output, sometimes you want to verify specific content without requiring the entire file to match. For example: - When parts of the output might change (timestamps, random IDs, etc.) but certain key content must be present @@ -1108,6 +1136,12 @@ Running nf-test on each component is fine, but laborious and error-prone. Can't Yes we can! + +
+--8<-- "docs/en/docs/side_quests/img/nf-test/testing-strategy.excalidraw.svg" +
+ + Let's run nf-test on the entire repo. ### 3.1. Run nf-test on the entire repo