
Deeplexicon run error.. #31

Open

Seongmin-Jang-1165 opened this issue Oct 15, 2024 · 9 comments

Seongmin-Jang-1165 commented Oct 15, 2024

Hello again!

I ran Deeplexicon, but an error occurred. How can I solve this?

[screenshot of the error]

Psy-Fer (Owner) commented Oct 15, 2024

This looks related to the numpy version. Double-check that it's at the correct version (not 2.0).
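For example (a sketch; pin to whatever version the deeplexicon requirements actually specify):

```bash
# Show the numpy version in the environment deeplexicon uses
python -c "import numpy; print(numpy.__version__)"

# If it reports 2.x, drop back below 2.0
# (the exact pin should match deeplexicon's requirements file)
pip install "numpy<2"
```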

Seongmin-Jang-1165 (Author)

@Psy-Fer Thanks for the advice!

I encountered another error: a core dump. It may be an issue with my operating environment, since I ran Deeplexicon on CPU.

Does it work well on CPU, or does it only work with a GPU?

Psy-Fer (Owner) commented Oct 16, 2024

It was designed to use a GPU, but it can run on CPU; however, it will be very slow.

It is also very sensitive to package and CUDA/driver versions, as this all predates TensorFlow 2.0.

I would highly recommend using these docker containers:

https://github.com/Psy-Fer/deeplexicon/tree/master/dockerfiles
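As a rough sketch of that route (the dockerfile name and image tag below are placeholders; pick the dockerfile in that directory that matches your GPU/CPU setup):

```bash
# Build an image from one of the provided dockerfiles
# (<chosen_dockerfile> and the tag "deeplexicon" are placeholders)
git clone https://github.com/Psy-Fer/deeplexicon
cd deeplexicon/dockerfiles
docker build -t deeplexicon -f <chosen_dockerfile> .

# Open a shell in the container with your fast5 directory mounted,
# then run the deeplexicon command from the README inside it
docker run --rm -it -v /path/to/fast5:/data deeplexicon bash
```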

James

Seongmin-Jang-1165 (Author)

@Psy-Fer

I cannot use Docker because I don't have the required permissions on the server, so I tried again and another error occurred:

[screenshot of the error]

How can I resolve this problem? I've attached the list of installed tools:

[screenshot: installed package versions]

Psy-Fer (Owner) commented Oct 16, 2024

Ahh, this looks to be an h5py issue, it seems:
#13 (comment)

Try using h5py==2.10
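For example (a sketch; install it into whichever environment deeplexicon actually runs in):

```bash
# Downgrade h5py in the deeplexicon environment
pip install "h5py==2.10.0"

# Confirm the version actually being imported
python -c "import h5py; print(h5py.__version__)"
```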

James

Seongmin-Jang-1165 (Author)

@Psy-Fer

Thank you, James. I downgraded h5py and re-ran:

[screenshot: re-run after downgrading h5py]

But there is still an error, like below:

[screenshot of the error]

While processing my job, I saw an error message saying that ont-fast5-api==4.1.3 cannot work with h5py==2.10, and the installation was cancelled.

So I re-installed ont-fast5-api, and h5py was automatically upgraded, like below:

[screenshot: h5py auto-upgraded during ont-fast5-api installation]

Is this related to my problem? It seems like Deeplexicon cannot read my raw data...
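Maybe I should keep ont-fast5-api in a separate environment from Deeplexicon? A rough sketch of what I mean (environment names are placeholders):

```bash
# Environment 1: file conversion tools (ont-fast5-api pulls in a newer h5py)
python3 -m venv convert-env
convert-env/bin/pip install ont-fast5-api

# Environment 2: deeplexicon with the old pins it expects
# (plus the rest of its requirements, e.g. the pre-2.0 tensorflow from its README)
python3 -m venv deeplexicon-env
deeplexicon-env/bin/pip install "h5py==2.10.0" "numpy<2"
```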

Psy-Fer (Owner) commented Oct 16, 2024

Deeplexicon was written in 2018/19, and a lot has changed since then in terms of file schemas, libraries, and data.
Could it be related to VBZ compression of the fast5 files?
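A quick way to check (a sketch; VBZ shows up as the HDF5 user-defined filter 32020, and the file name is a placeholder):

```bash
# Dump the dataset storage properties and look for the VBZ filter (ID 32020)
h5dump -p -H your_reads.fast5 | grep -A 3 -i "filter"
```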

@enovoa has a working tool that she is happy to share with researchers (tool in review atm) if deeplexicon can't work on your data. See other issues for more details about the current limited support for deeplexicon.

Regards,
James

enovoa (Collaborator) commented Oct 16, 2024

Hi @Seongmin-Jang-1165,

In addition to what @Psy-Fer mentions, I'd just like to add that DeePlexiCon and EpiNano (for which you also opened a GitHub issue recently) are both available within MasterOfPores, a Nextflow workflow designed precisely to overcome these installation/dependency issues. I'd suggest you try it.

I'd also suggest giving the dockerfile a go, as @Psy-Fer suggests, but running it via Singularity, which is typically the solution used at institutes with computing clusters. You might want to try running some other dockerfile as a "test" to check whether your issue is related to running dockerfiles in general, or to the deeplexicon dockerfile per se.
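As a rough sketch (the image reference below is a placeholder, since the repo ships dockerfiles rather than a published image; you'd need the image available in a registry, or a .sif built on a machine where Docker is allowed):

```bash
# Pull/convert a Docker image into a Singularity image (.sif)
# <registry>/<image>:<tag> is a placeholder, not a published deeplexicon image
singularity pull deeplexicon.sif docker://<registry>/<image>:<tag>

# Open a shell in the container with the fast5 directory bound
singularity shell --bind /path/to/fast5:/data deeplexicon.sif
```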

Best, Eva

Seongmin-Jang-1165 (Author)

@Psy-Fer @enovoa
Thanks for the kind replies, I will take a look at MOP.

I already e-mailed @enovoa about the new demultiplexing tool in development for direct RNA-seq data and got a friendly response. Thank you so much, @enovoa!

But my professor doesn't seem too thrilled about meeting all the requirements to use that tool, so I'm trying out other methods instead.

I thought the error was about VBZ compression, so I tried to pre-process my data like below (a rough sketch of the commands follows the list):

POD5 (raw)
--> convert to fast5
--> split into single-read fast5s and merge back into one multi-read fast5 (multi_to_single_fast5, single_to_multi_fast5)
--> repack the data with VBZ compression (h5repack)
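Roughly, the commands were along these lines (a sketch; directory and file names are placeholders, and the last step could equally be done with ont-fast5-api's compress_fast5 instead of hand-crafted h5repack filter arguments):

```bash
# POD5 -> multi-read fast5 (pod5 package)
pod5 convert to_fast5 reads.pod5 --output fast5_multi/

# multi-read fast5 -> single-read fast5s, then merge back (ont-fast5-api)
multi_to_single_fast5 --input_path fast5_multi/ --save_path fast5_single/
single_to_multi_fast5 --input_path fast5_single/ --save_path fast5_merged/

# VBZ recompression; compress_fast5 avoids passing h5repack the raw
# user-defined filter (32020) parameters by hand
compress_fast5 --input_path fast5_merged/ --save_path fast5_vbz/ --compression vbz
```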

So I ended up with one fast5 file and ran Deeplexicon.

But, as always, an error occurred, like below:

[screenshot of the error]

How can I solve this?

I read several other issues, and I think I have to decompress the VBZ compression and use the VBZ plugin...
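Something like this is what I'm planning to try (a sketch; the plugin path is a placeholder, and the gzip route uses ont-fast5-api's compress_fast5):

```bash
# Option 1: make ONT's VBZ HDF5 plugin visible to h5py/HDF5
# (the path is a placeholder; the plugin comes from ONT's vbz_compression releases)
export HDF5_PLUGIN_PATH=/path/to/ont_vbz_plugin/lib

# Option 2: drop VBZ entirely by recompressing the reads to gzip
compress_fast5 --input_path fast5_vbz/ --save_path fast5_gzip/ --compression gzip
```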
