Re: Similarity analysis using "find similar compounds..." - slow analysis of libraries [message #1135 is a reply to message #1133]
Tue, 24 November 2020 21:33
SM2020
Junior Member | Messages: 13 | Registered: June 2020 | Location: UK
Update:
My original post was a bit light on detail, and I am aware there are many factors (not necessarily DW-related) that could be contributing to the slow performance I am seeing.
Looking at the macOS Activity Monitor, the processor is not fully utilised (~70% idle) and RAM is not fully consumed (~8 GB of the 16 GB in use, of which only ~4-5 GB is used by DW; this allocation could presumably be increased by the user), yet the task is currently sitting at ~50 hrs. It is progressing, but slowly.
The macOS service manager, launchd, appears very disk-heavy during this task (>11 GB written vs. ~500 MB read), as does kernel_task (~10 GB written / ~100 MB read).
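In case it is useful to anyone trying to reproduce these numbers outside Activity Monitor, below is a small Python sketch using the third-party psutil library (nothing to do with DW itself) that logs the CPU and resident-memory use of the DW process. The process-name match ("datawarrior") is an assumption about how the process appears on macOS.

import psutil

# Find processes whose name looks like the DW application (name match is an assumption)
for proc in psutil.process_iter(attrs=["pid", "name"]):
    name = proc.info.get("name") or ""
    if "datawarrior" in name.lower():
        cpu = proc.cpu_percent(interval=1.0)        # % of a single core, sampled over 1 s
        rss_gb = proc.memory_info().rss / 1024**3   # resident memory in GB
        print(f"pid {proc.info['pid']}: CPU {cpu:.0f}%, RSS {rss_gb:.1f} GB")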
Ultimately, I would like to get some general feedback to help distinguish whether the task I have set DW is indeed very resource intensive (particularly for my computer) or whether there are other likely issues causing bottlenecks in task execution.
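For anyone weighing in on the "is this inherently expensive?" question: as I understand it, a "find similar compounds" run compares every query structure against every library compound, so the number of descriptor comparisons grows with the product of the two set sizes. The rough Python/RDKit sketch below is only an illustration of that scaling; DW uses its own descriptors (e.g. FragFp) and search code, and the SMILES lists here are placeholders.

# Rough illustration only, not DW's actual method.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

library_smiles = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O"]  # placeholder library
query_smiles = ["CCN", "c1ccncc1"]                             # placeholder queries

def fingerprint(smiles):
    mol = Chem.MolFromSmiles(smiles)
    return AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048) if mol else None

library_fps = [fp for fp in (fingerprint(s) for s in library_smiles) if fp is not None]
query_fps = [fp for fp in (fingerprint(s) for s in query_smiles) if fp is not None]

# Every query is compared against every library compound, so the total work is
# len(query_fps) * len(library_fps) Tanimoto calculations.
for i, q_fp in enumerate(query_fps):
    scores = DataStructs.BulkTanimotoSimilarity(q_fp, library_fps)
    hits = sum(1 for s in scores if s >= 0.7)
    print(f"query {i}: {hits} library compounds with Tanimoto >= 0.7")

Each individual comparison is cheap, but multiplied across two large libraries the total can become very large, which is presumably part of why these runs take so long.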
More generally, have any other users experienced comparable (>>24 hr) run times when running similar analyses on libraries of similar size, and if so, have any system upgrades or changes reduced the computation time?
Best
PS. In case it is not clear, I think DW is a fantastic program!