Search found 2 matches • Page 1 of 1
- Mon Feb 26, 2018 12:14 pm
- Forum: Bader
- Topic: Bader Error when reading very large cube file
- Replies: 12
- Views: 5839
Hello, I'm having the same problem with a CHGCAR of 2.7 GB. This is what I get from the ulimit -a command:

    core file size          (blocks, -c) 0
    data seg size           (kbytes, -d) unlimited
    scheduling priority             (-e) 0
    file size               (blocks, -f) unlimited
    pending signals                 (-i) 515338
    max locked memory       (kbytes, -l) 40861...
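Since the post is checking resource limits with ulimit -a, a common workaround sketch for crashes when reading very large CHGCAR/cube files is to raise the shell's stack size limit before running the analysis. This is only an assumption that the failure is a stack-size issue rather than a file-size or memory limit; the bader invocation shown is the usual command name and is left commented out here.

```shell
# Sketch, assuming the crash is caused by a too-small stack limit.
# Try to lift the soft stack limit to unlimited; fall back to the
# hard limit if the system does not allow unlimited.
ulimit -s unlimited 2>/dev/null || ulimit -s hard

# Show the stack limit now in effect for this shell session.
ulimit -s

# Then rerun the analysis in the same shell (hypothetical invocation):
# bader CHGCAR
```

Note that ulimit changes apply only to the current shell and its children, so the analysis must be launched from the same session (or the ulimit line placed in the job script).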