Pulling Container with Apptainer at BASECALL:DORADO_BASECALL:downloadModel step fails #119

Open · ashdederich opened this issue May 30, 2025 · 0 comments

I am working to get the Master of Pores pipeline running on my HPC cluster. Apptainer is our HPC's preferred containerization tool. I have tried running the MOP pipeline as follows:

module load nextflow
module load apptainer
module load cuda/11.8.0

nextflow run mop_preprocess.nf -with-apptainer -params-file params.yaml -w $scratchdir -profile slurm

Pulling the container at the BASECALL:DORADO_BASECALL:downloadModel step always fails, with this error message:

Command error:
  INFO:    gocryptfs not found, will not be able to use gocryptfs
  FATAL:   container creation failed: while getting /proc/self/ns/user file descriptor: while requesting file descriptors send: read unix @->@: read: connection reset by peer
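
For reference, a minimal standalone check like the one below (run on a compute node; the alpine image is just an arbitrary test image, not anything MOP uses) is what I'd use to confirm whether Apptainer alone can create containers outside of Nextflow:

module load apptainer
# Exercise the same user-namespace / container-creation path the Nextflow
# task hits, but without Nextflow in the picture
apptainer exec docker://alpine:3.19 echo "apptainer can create containers"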

However, if I switch to the singularity module and run the command like this:

module load nextflow
module load singularity
module load cuda/11.8.0

nextflow run mop_preprocess.nf -with-singularity -params-file params.yaml -w $scratchdir -profile slurm

The container successfully downloads. My question is: Nextflow supports Apptainer, and reading through the MOP docs it looks like MOP supports the use of Apptainer as well. So why does the container creation step fail 100% of the time when I switch to -with-apptainer?
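
For what it's worth, my understanding from the Nextflow docs is that -with-apptainer should be roughly equivalent to enabling the apptainer scope in the config, something like the sketch below (the autoMounts and cacheDir values are placeholders I made up, not what the slurm profile actually sets):

apptainer {
    enabled    = true
    autoMounts = true
    // placeholder cache location on shared scratch; not MOP's actual setting
    cacheDir   = '/scratch/general/nfs1/apptainer_cache'
}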

Lastly, I'll add that with the -with-singularity flag the MOP pipeline fails a few steps further along (due to a versioning issue with Singularity). Because of that error and my HPC's strong preference for Apptainer over Singularity, I switched to trying -with-apptainer instead.
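
In case the versioning detail matters, this is roughly how I'd check what each module actually exposes on a compute node (module names are just what our cluster calls them; output not shown):

module load singularity
# Report which binary/version the singularity module provides
# (on some clusters this is Apptainer's compatibility wrapper)
singularity --version

module purge
module load apptainer
# Report the Apptainer version provided by the apptainer module
apptainer --version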

Here is my params.yml file:

pod5: "/scratch/general/nfs1/fleming_mop_input/pod5_converted_files/*.pod5"
fastq: ""
## This can be empty, but then you need to specify the kit and flowcell via the command-line options inside pars_tools
conffile: "final_hek_tnfa_1-1.txt"
# Basecalling can be either NO, dorado, dorado-mod or dorado-duplex
basecalling: "dorado"
## Can be OFF / cuda10 / cuda11. Newer versions of Guppy may require cuda11
GPU: "cuda11"
# OFF may not be a legitimate option for demultiplexing and demulti_fast5.
demultiplexing: "NO"
demulti_fast5: "OFF"
demulti_pod5: "OFF"
### Number of fast5 files basecalled per parallel job
granularity: 1
### File with the list of accepted barcodes. It can be empty
barcodes: ""

# Needed for fastq input
## fast5: ""

# Common
reference: "/scratch/general/nfs1/u0432542/all_ref_rrna_human.fasta"
## Can be transcriptome / genome
ref_type: "transcriptome"
annotation: ""
## command line options
pars_tools: "${projectDir}/tool_opts/drna_tool_splice_opt.tsv"
## Cut off quality for QC
qualityqc: 5
## Can be nanoq / nanofilt
filtering: "nanoq"
## Can be graphmap / graphmap2 / minimap2 / bwa
mapping: "minimap2"
## Can be nanocount for transcriptome / htseq for genome
counting: "nanocount"
## Can be NO / bambu / isoquant
discovery: "NO"
## Convert bam to cram
cram_conv: "NO"
subsampling_cram: 50

progPars:
  basecalling:
    dorado: "fast"
    dorado-duplex: ""
    dorado-mod: ""
  demultiplexing:
    seqtagger: ""
    dorado: ""
  filtering:
    seqkit: ""
    nanoq: ""
  mapping:
    graphmap: ""
    graphmap2: ""
    minimap2: ""
    bwa: ""
    winnowmap: ""
  counting:
    htseq: ""
    nanocount: ""
  discovery:
    bambu: ""
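
And just in case it's relevant, this is the quick sanity check I'd run before launching, to make sure the pod5 glob in the params file actually resolves to files (output omitted):

# Count how many pod5 files the input glob matches; should be non-zero
ls -1 /scratch/general/nfs1/fleming_mop_input/pod5_converted_files/*.pod5 | wc -l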