openmolecules.org
Can DW handle big data? (multithreading, low cost of memory) [message #682] Tue, 22 October 2019 08:00
greatzdl
Messages: 1
Registered: March 2019
Junior Member
Dear Thomas,
I have used DataWarrior for many years, ever since it was released as an open-source program. Thank you very much for your contribution.

During these years, I have found it hard to open files with more than 3 million rows, especially those with structure columns. Do you have any solutions to this problem? Do you have plans to use multithreading to open large data files?

Hope to get your reply.

Best wishes

DaRong