Can DW handle big data? [message #682]
Tue, 22 October 2019 08:00 |
greatzdl
Messages: 1 Registered: March 2019
Junior Member
Dear Thomas,
I have used DataWarrior for many years, since it was first released as an open-source program. Thank you very much for your contribution.
Over these years I have found it hard to open files with more than about 3 million rows, especially when they contain structure columns. Do you have a solution to this problem? Do you have plans to use multithreading to open large data files?
Hope to get your reply.
Best wishes
DaRong
Re: Can DW handle big data? [message #694 is a reply to message #688]
Sun, 27 October 2019 17:01 |
nbehrnd
Messages: 224 Registered: June 2019
Senior Member
In addition, DaRong, if you are working on Linux and limited by the RAM available
on your computer, you can supplement «working memory» with a swap partition. While
it will not match a true RAM module in read/write speed, especially if it sits on
an HDD platter, it offers a noticeable benefit and is quickly set up
(e.g. in an Ubuntu session on AWS).
A possible primer may be
https://linuxize.com/post/how-to-add-swap-space-on-ubuntu-18-04/
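For orientation, the guide boils down to a few standard commands creating a swap
file (a file is simpler than a dedicated partition); this is only a sketch, and
the 4G size is an arbitrary example to adjust to your data set:

  # create a 4 GiB file and restrict access to root
  sudo fallocate -l 4G /swapfile
  sudo chmod 600 /swapfile
  # format the file as swap space and enable it for the running session
  sudo mkswap /swapfile
  sudo swapon /swapfile
  # verify the new swap space is active
  sudo swapon --show

To keep the swap file across reboots, additionally append the line
«/swapfile none swap sw 0 0» to /etc/fstab.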
Norwid