coancestry() resulting in fatal error with large datasets #16
Hi Cory:
Yes, I'm afraid that this is a known problem that does not have an easy fix.
I wrote related quite a long time ago, when microsatellites were still the main marker of choice. As a result, I did not put any thought into how the package uses memory. This was never a problem until people started trying to use related to analyze SNP data sets (with hundreds to thousands of loci). Fixing this problem will require a complete rewrite of the package, which I have not had time to do. I am, however, hoping to get a student this academic year to try to help with this, so hopefully a fix is in the not-too-distant future (but still probably a few months).
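The memory problem described here comes from the number of pairwise comparisons growing quadratically with sample size: with hundreds of individuals and hundreds of loci, holding intermediates for all n(n-1)/2 pairs at once exhausts memory. A minimal Python sketch of the streaming alternative (this is an illustration of the general idea only, not the estimators related actually implements; the function and the toy allele-sharing score are hypothetical):

```python
import numpy as np

def pairwise_sharing(geno):
    """Toy pairwise allele-sharing score, computed one pair at a time.

    geno: array of shape (n_individuals, n_loci, 2) of integer allele codes.
    Streaming over pairs keeps peak memory at O(n * n_loci), instead of
    materializing intermediate arrays for all n*(n-1)/2 pairs at once.
    """
    n = geno.shape[0]
    out = {}
    for i in range(n):
        for j in range(i + 1, n):
            a, b = geno[i], geno[j]  # each (n_loci, 2)
            # fraction of loci at which the two genotypes share >= 1 allele
            shared = (a[:, :, None] == b[:, None, :]).any(axis=(1, 2)).mean()
            out[(i, j)] = float(shared)
    return out

rng = np.random.default_rng(0)
g = rng.integers(0, 4, size=(6, 14, 2))  # 6 diploid individuals, 14 loci
scores = pairwise_sharing(g)
print(len(scores))  # 15 pairs for 6 individuals
```

Real relatedness estimators additionally correct for population allele frequencies, but the same one-pair-at-a-time loop structure keeps memory flat as the dataset grows.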
-Tim
______________________________
Timothy R. Frasier
Coordinator: Forensic Sciences Program
Professor: Biology
Saint Mary's University
923 Robie Street
Halifax, Nova Scotia B3H 3C3
Canada
Tel: (902) 491-6382
E-mail: ***@***.***
frasierlab.ca
________________________________
From: fournier-c ***@***.***>
Sent: Thursday, August 17, 2023 6:03 PM
To: timothyfrasier/related ***@***.***>
Cc: Subscribed ***@***.***>
Subject: [timothyfrasier/related] coancestry() resulting in fatal error with large datasets (Issue #16)
Hi Timothy,
I have been using related successfully for the past two years with datasets consisting of 100-200 individuals and 10-15 loci. However, I am now trying to use it with a dataset consisting of ~500 individuals and 14 loci and when I use the coancestry function, it always returns a fatal error. I am using R 4.3.1.
Any help with this would be appreciated.
Thanks,
Cory
Hi Timothy,
I am trying to run the same function with a dataset of 511 SNPs for 100 individuals. My R session keeps aborting, and I figured this is indeed because of memory allocation. Since your reply to this issue was in 2023, I wonder if you were able to work on an update for this package. If not, would you recommend any other program similar to related for analysing pairwise relatedness?
Thank you!
Ingrid
Hi Ingrid:
No, I'm afraid that this has not been fixed yet - I just haven't had the time.
In the short term, I'd be happy to give you temporary access to a server in my lab that should(!) be able to handle this data set without crashing. You could transfer your data to the server, run the analyses, and then transfer your results back.
Also, I am in discussion with the group who created the dartR (https://green-striped-gecko.github.io/dartR/) R package, and they are planning to "take over" the functionality of related within their package. I don't know where that stands yet, though.
-Tim
________________________________
From: Ingrid Bunholi ***@***.***>
Sent: Tuesday, April 15, 2025 2:25 PM
To: timothyfrasier/related ***@***.***>
Cc: Timothy Frasier ***@***.***>; Comment ***@***.***>
Subject: Re: [timothyfrasier/related] coancestry() resulting in fatal error with large datasets (Issue #16)
I would really appreciate it if I could run this on your server. Installing related on my university's server would require too many sudo permissions - that's why I was trying to run it on my local machine. My email is ingrid.bunholi@utexas.edu
Thank you very much!
Ingrid