Commit 979b938c authored by nf-core-bot

Template update for nf-core/tools version 2.6

parent 0fb3c263
Branches nf-core-template-merge-2.6
Showing with 305 additions and 31 deletions
@@ -28,3 +28,7 @@ jobs:
"outdir": "s3://${{ secrets.AWS_S3_BUCKET }}/hic/results-${{ github.sha }}"
}
profiles: test_full,aws_tower
- uses: actions/upload-artifact@v3
with:
name: Tower debug log file
path: tower_action_*.log
@@ -23,3 +23,7 @@ jobs:
"outdir": "s3://${{ secrets.AWS_S3_BUCKET }}/hic/results-test-${{ github.sha }}"
}
profiles: test,aws_tower
- uses: actions/upload-artifact@v3
with:
name: Tower debug log file
path: tower_action_*.log
email_template.html
adaptivecard.json
.nextflow*
work/
data/
......
@@ -13,8 +13,8 @@ authors:
given-names: Johannes
- family-names: Wilm
given-names: Andreas
- family-names: Ulysse Garcia
given-names: Maxime
- family-names: Garcia
given-names: Maxime Ulysse
- family-names: Di Tommaso
given-names: Paolo
- family-names: Nahnsen
@@ -39,8 +39,8 @@ prefered-citation:
given-names: Johannes
- family-names: Wilm
given-names: Andreas
- family-names: Ulysse Garcia
given-names: Maxime
- family-names: Garcia
given-names: Maxime Ulysse
- family-names: Di Tommaso
given-names: Paolo
- family-names: Nahnsen
......
{
"type": "message",
"attachments": [
{
"contentType": "application/vnd.microsoft.card.adaptive",
"contentUrl": null,
"content": {
"\$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
"msteams": {
"width": "Full"
},
"type": "AdaptiveCard",
"version": "1.2",
"body": [
{
"type": "TextBlock",
"size": "Large",
"weight": "Bolder",
"color": "<% if (success) { %>Good<% } else { %>Attention<%} %>",
"text": "nf-core/hic v${version} - ${runName}",
"wrap": true
},
{
"type": "TextBlock",
"spacing": "None",
"text": "Completed at ${dateComplete} (duration: ${duration})",
"isSubtle": true,
"wrap": true
},
{
"type": "TextBlock",
"text": "<% if (success) { %>Pipeline completed successfully!<% } else { %>Pipeline completed with errors. The full error message was: ${errorReport}.<% } %>",
"wrap": true
},
{
"type": "TextBlock",
"text": "The command used to launch the workflow was as follows:",
"wrap": true
},
{
"type": "TextBlock",
"text": "${commandLine}",
"isSubtle": true,
"wrap": true
}
],
"actions": [
{
"type": "Action.ShowCard",
"title": "Pipeline Configuration",
"card": {
"type": "AdaptiveCard",
"\$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
"body": [
{
"type": "FactSet",
"facts": [<% out << summary.collect{ k,v -> "{\"title\": \"$k\", \"value\" : \"$v\"}"}.join(",\n") %>
]
}
]
}
}
]
}
}
]
}
id: "nf-core-hic-methods-description"
description: "Suggested text and references to use when describing pipeline usage within the methods section of a publication."
section_name: "nf-core/hic Methods Description"
section_href: "https://github.com/nf-core/hic"
plot_type: "html"
## TODO nf-core: Update the HTML below to your preferred methods description, e.g. add publication citation for this pipeline
## You can inject any metadata from the Nextflow '${workflow}' object
data: |
<h4>Methods</h4>
<p>Data was processed using nf-core/hic v${workflow.manifest.version} ${doi_text} of the nf-core collection of workflows (<a href="https://doi.org/10.1038/s41587-020-0439-x">Ewels <em>et al.</em>, 2020</a>).</p>
<p>The pipeline was executed with Nextflow v${workflow.nextflow.version} (<a href="https://doi.org/10.1038/nbt.3820">Di Tommaso <em>et al.</em>, 2017</a>) with the following command:</p>
<pre><code>${workflow.commandLine}</code></pre>
<h4>References</h4>
<ul>
<li>Di Tommaso, P., Chatzou, M., Floden, E. W., Barja, P. P., Palumbo, E., & Notredame, C. (2017). Nextflow enables reproducible computational workflows. Nature Biotechnology, 35(4), 316-319. <a href="https://doi.org/10.1038/nbt.3820">https://doi.org/10.1038/nbt.3820</a></li>
<li>Ewels, P. A., Peltzer, A., Fillinger, S., Patel, H., Alneberg, J., Wilm, A., Garcia, M. U., Di Tommaso, P., & Nahnsen, S. (2020). The nf-core framework for community-curated bioinformatics pipelines. Nature Biotechnology, 38(3), 276-278. <a href="https://doi.org/10.1038/s41587-020-0439-x">https://doi.org/10.1038/s41587-020-0439-x</a></li>
</ul>
<div class="alert alert-info">
<h5>Notes:</h5>
<ul>
${nodoi_text}
<li>The command above does not include parameters contained in any configs or profiles that may have been used. Ensure the config file is also uploaded with your publication!</li>
<li>You should also cite all software used within this run. Check the "Software Versions" section of this report to get version information.</li>
</ul>
</div>
@@ -3,9 +3,11 @@ report_comment: >
analysis pipeline. For information about how to interpret these results, please see the
<a href="https://nf-co.re/hic" target="_blank">documentation</a>.
report_section_order:
software_versions:
"nf-core-hic-methods-description":
order: -1000
"nf-core-hic-summary":
software_versions:
order: -1001
"nf-core-hic-summary":
order: -1002
export_plots: true
@@ -237,6 +237,14 @@ See the main [Nextflow documentation](https://www.nextflow.io/docs/latest/config
If you have any questions or issues, please send us a message on [Slack](https://nf-co.re/join/slack) on the [`#configs` channel](https://nfcore.slack.com/channels/configs).
## Azure Resource Requests
To be used with the `azurebatch` profile, specified with `-profile azurebatch`.
We recommend setting the compute VM size via `params.vm_type` to `Standard_D16_v3` by default, but this can be changed if required.
Note that the choice of VM size depends on your quota and the overall workload during the analysis.
For a thorough list, please refer to [Sizes for virtual machines in Azure](https://docs.microsoft.com/en-us/azure/virtual-machines/sizes).
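For example, the VM size could be overridden at launch through the parameter above. A minimal sketch, assuming the standard nf-core `--input`/`--outdir` parameters (the sample sheet and output directory values are placeholders):

    # placeholder input/outdir; --vm_type maps to the params.vm_type setting described above
    nextflow run nf-core/hic -profile azurebatch --vm_type 'Standard_D16_v3' --input samplesheet.csv --outdir <OUTDIR>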
## Running in the background
Nextflow handles job submissions and supervises the running jobs. The Nextflow process must run until the pipeline is finished.
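One way to keep the process alive after you close your terminal is to launch with Nextflow's `-bg` option, which detaches the run; a sketch (a terminal multiplexer such as `screen` or `tmux` works equally well, and the profile and paths below are placeholders):

    # -bg detaches the run from the terminal; redirect stdout to keep a copy of the console log
    nextflow run nf-core/hic -profile <PROFILE> --input samplesheet.csv --outdir <OUTDIR> -bg > hic.run.log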
......
@@ -145,6 +145,61 @@ class NfcoreTemplate {
output_tf.withWriter { w -> w << email_txt }
}
//
// Construct and send adaptive card
// https://adaptivecards.io
//
public static void adaptivecard(workflow, params, summary_params, projectDir, log) {
def hook_url = params.hook_url
def summary = [:]
for (group in summary_params.keySet()) {
summary << summary_params[group]
}
def misc_fields = [:]
misc_fields['start'] = workflow.start
misc_fields['complete'] = workflow.complete
misc_fields['scriptfile'] = workflow.scriptFile
misc_fields['scriptid'] = workflow.scriptId
if (workflow.repository) misc_fields['repository'] = workflow.repository
if (workflow.commitId) misc_fields['commitid'] = workflow.commitId
if (workflow.revision) misc_fields['revision'] = workflow.revision
misc_fields['nxf_version'] = workflow.nextflow.version
misc_fields['nxf_build'] = workflow.nextflow.build
misc_fields['nxf_timestamp'] = workflow.nextflow.timestamp
def msg_fields = [:]
msg_fields['version'] = workflow.manifest.version
msg_fields['runName'] = workflow.runName
msg_fields['success'] = workflow.success
msg_fields['dateComplete'] = workflow.complete
msg_fields['duration'] = workflow.duration
msg_fields['exitStatus'] = workflow.exitStatus
msg_fields['errorMessage'] = (workflow.errorMessage ?: 'None')
msg_fields['errorReport'] = (workflow.errorReport ?: 'None')
msg_fields['commandLine'] = workflow.commandLine
msg_fields['projectDir'] = workflow.projectDir
msg_fields['summary'] = summary << misc_fields
// Render the JSON template
def engine = new groovy.text.GStringTemplateEngine()
def hf = new File("$projectDir/assets/adaptivecard.json")
def json_template = engine.createTemplate(hf).make(msg_fields)
def json_message = json_template.toString()
// POST
def post = new URL(hook_url).openConnection();
post.setRequestMethod("POST")
post.setDoOutput(true)
post.setRequestProperty("Content-Type", "application/json")
post.getOutputStream().write(json_message.getBytes("UTF-8"));
def postRC = post.getResponseCode();
if (! postRC.equals(200)) {
log.warn(post.getErrorStream().getText());
}
}
//
// Print pipeline summary on completion
//
......
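The function above reads the target endpoint from `params.hook_url` and POSTs the rendered card there. A hedged sketch of supplying that endpoint at launch, assuming it is exposed as a `--hook_url` command-line parameter (the webhook URL is a placeholder, not a real endpoint):

    # the webhook URL below is a placeholder for a real incoming-webhook endpoint
    nextflow run nf-core/hic -profile test,docker --outdir <OUTDIR> --hook_url 'https://example.webhook.office.com/webhookb2/...'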
@@ -21,19 +21,26 @@ class Utils {
}
// Check that all channels are present
def required_channels = ['conda-forge', 'bioconda', 'defaults']
def conda_check_failed = !required_channels.every { ch -> ch in channels }
// This channel list is ordered by required channel priority.
def required_channels_in_order = ['conda-forge', 'bioconda', 'defaults']
def channels_missing = ((required_channels_in_order as Set) - (channels as Set)) as Boolean
// Check that they are in the right order
conda_check_failed |= !(channels.indexOf('conda-forge') < channels.indexOf('bioconda'))
conda_check_failed |= !(channels.indexOf('bioconda') < channels.indexOf('defaults'))
def channel_priority_violation = false
def n = required_channels_in_order.size()
for (int i = 0; i < n - 1; i++) {
channel_priority_violation |= !(channels.indexOf(required_channels_in_order[i]) < channels.indexOf(required_channels_in_order[i+1]))
}
if (conda_check_failed) {
if (channels_missing | channel_priority_violation) {
log.warn "~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n" +
" There is a problem with your Conda configuration!\n\n" +
" You will need to set-up the conda-forge and bioconda channels correctly.\n" +
" Please refer to https://bioconda.github.io/user/install.html#set-up-channels\n" +
" NB: The order of the channels matters!\n" +
" Please refer to https://bioconda.github.io/\n" +
" The observed channel order is \n" +
" ${channels}\n" +
" but the following channel order is required:\n" +
" ${required_channels_in_order}\n" +
"~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~"
}
}
......
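The order this check expects is the standard Bioconda setup; one way to configure it is sketched below (each `conda config --add` prepends, so the last channel added ends up with the highest priority, giving conda-forge > bioconda > defaults):

    # prepend channels in reverse priority order so conda-forge ends up first,
    # matching required_channels_in_order above
    conda config --add channels defaults
    conda config --add channels bioconda
    conda config --add channels conda-forge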
@@ -2,6 +2,8 @@
// This file holds several functions specific to the workflow/hic.nf in the nf-core/hic pipeline
//
import groovy.text.SimpleTemplateEngine
class WorkflowHic {
//
@@ -42,6 +44,23 @@ class WorkflowHic {
yaml_file_text += "data: |\n"
yaml_file_text += "${summary_section}"
return yaml_file_text
}
public static String methodsDescriptionText(run_workflow, mqc_methods_yaml) {
// Convert to a named map so it can be used with the familiar NXF ${workflow} variable syntax in the MultiQC YML file
def meta = [:]
meta.workflow = run_workflow.toMap()
meta["manifest_map"] = run_workflow.manifest.toMap()
meta["doi_text"] = meta.manifest_map.doi ? "(doi: <a href=\'https://doi.org/${meta.manifest_map.doi}\'>${meta.manifest_map.doi}</a>)" : ""
meta["nodoi_text"] = meta.manifest_map.doi ? "": "<li>If available, make sure to update the text to include the Zenodo DOI of version of the pipeline used. </li>"
def methods_text = mqc_methods_yaml.text
def engine = new SimpleTemplateEngine()
def description_html = engine.createTemplate(methods_text).make(meta)
return description_html
}
//
// Exit pipeline if incorrect --genome key provided
//
......
@@ -4,7 +4,8 @@
nf-core/hic
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Github : https://github.com/nf-core/hic
Website: https://nf-co.re/hic
Website: https://nf-co.re/hic
Slack : https://nfcore.slack.com/channels/hic
----------------------------------------------------------------------------------------
*/
......
@@ -2,20 +2,21 @@
"name": "nf-core/hic",
"homePage": "https://github.com/nf-core/hic",
"repos": {
"nf-core/modules": {
"git_url": "https://github.com/nf-core/modules.git",
"https://github.com/nf-core/modules.git": {
"modules": {
"custom/dumpsoftwareversions": {
"git_sha": "e745e167c1020928ef20ea1397b6b4d230681b4d",
"branch": "master"
},
"fastqc": {
"git_sha": "e745e167c1020928ef20ea1397b6b4d230681b4d",
"branch": "master"
},
"multiqc": {
"git_sha": "e745e167c1020928ef20ea1397b6b4d230681b4d",
"branch": "master"
"nf-core": {
"custom/dumpsoftwareversions": {
"branch": "master",
"git_sha": "5e34754d42cd2d5d248ca8673c0a53cdf5624905"
},
"fastqc": {
"branch": "master",
"git_sha": "5e34754d42cd2d5d248ca8673c0a53cdf5624905"
},
"multiqc": {
"branch": "master",
"git_sha": "5e34754d42cd2d5d248ca8673c0a53cdf5624905"
}
}
}
}
......
process CUSTOM_DUMPSOFTWAREVERSIONS {
label 'process_low'
label 'process_single'
// Requires `pyyaml` which does not have a dedicated container but is in the MultiQC container
conda (params.enable_conda ? "bioconda::multiqc=1.11" : null)
conda (params.enable_conda ? 'bioconda::multiqc=1.13' : null)
container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
'https://depot.galaxyproject.org/singularity/multiqc:1.11--pyhdfd78af_0' :
'quay.io/biocontainers/multiqc:1.11--pyhdfd78af_0' }"
'https://depot.galaxyproject.org/singularity/multiqc:1.13--pyhdfd78af_0' :
'quay.io/biocontainers/multiqc:1.13--pyhdfd78af_0' }"
input:
path versions
......
@@ -44,4 +44,16 @@ process FASTQC {
END_VERSIONS
"""
}
stub:
def prefix = task.ext.prefix ?: "${meta.id}"
"""
touch ${prefix}.html
touch ${prefix}.zip
cat <<-END_VERSIONS > versions.yml
"${task.process}":
fastqc: \$( fastqc --version | sed -e "s/FastQC v//g" )
END_VERSIONS
"""
}
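The stub block above produces placeholder outputs when the workflow is launched with Nextflow's `-stub-run` option, which is handy for exercising the pipeline wiring without running FastQC itself; a sketch (profile and output directory are placeholders):

    # -stub-run executes the stub: blocks instead of the real commands
    nextflow run nf-core/hic -profile test,docker --outdir <OUTDIR> -stub-run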
process MULTIQC {
label 'process_medium'
label 'process_single'
conda (params.enable_conda ? 'bioconda::multiqc=1.12' : null)
conda (params.enable_conda ? 'bioconda::multiqc=1.13' : null)
container "${ workflow.containerEngine == 'singularity' && !task.ext.singularity_pull_docker_container ?
'https://depot.galaxyproject.org/singularity/multiqc:1.12--pyhdfd78af_0' :
'quay.io/biocontainers/multiqc:1.12--pyhdfd78af_0' }"
'https://depot.galaxyproject.org/singularity/multiqc:1.13--pyhdfd78af_0' :
'quay.io/biocontainers/multiqc:1.13--pyhdfd78af_0' }"
input:
path multiqc_files
path multiqc_files, stageAs: "?/*"
path(multiqc_config)
path(extra_multiqc_config)
path(multiqc_logo)
output:
path "*multiqc_report.html", emit: report
@@ -20,8 +23,27 @@ process MULTIQC {
script:
def args = task.ext.args ?: ''
def config = multiqc_config ? "--config $multiqc_config" : ''
def extra_config = extra_multiqc_config ? "--config $extra_multiqc_config" : ''
"""
multiqc -f $args .
multiqc \\
--force \\
$args \\
$config \\
$extra_config \\
.
cat <<-END_VERSIONS > versions.yml
"${task.process}":
multiqc: \$( multiqc --version | sed -e "s/multiqc, version //g" )
END_VERSIONS
"""
stub:
"""
touch multiqc_data
touch multiqc_plots
touch multiqc_report.html
cat <<-END_VERSIONS > versions.yml
"${task.process}":
......
@@ -12,11 +12,25 @@ tools:
homepage: https://multiqc.info/
documentation: https://multiqc.info/docs/
licence: ["GPL-3.0-or-later"]
input:
- multiqc_files:
type: file
description: |
List of reports / files recognised by MultiQC, for example the html and zip output of FastQC
- multiqc_config:
type: file
description: Optional config yml for MultiQC
pattern: "*.{yml,yaml}"
- extra_multiqc_config:
type: file
description: Second optional config yml for MultiQC. Will override common sections in multiqc_config.
pattern: "*.{yml,yaml}"
- multiqc_logo:
type: file
description: Optional logo file for MultiQC
pattern: "*.{png}"
output:
- report:
type: file
@@ -38,3 +52,4 @@ authors:
- "@abhi18av"
- "@bunop"
- "@drpatelh"
- "@jfy133"