Bader Error when reading very large cube file

Bader charge density analysis


fangliu
Posts: 3
Joined: Mon Feb 05, 2018 3:22 pm

Bader Error when reading very large cube file

Post by fangliu »

I am using Bader to process some large cube files (grid density >= 450*450*450). I found that the following error always appears at the beginning of the Bader run:

GRID BASED BADER ANALYSIS (Version 1.03 11/13/17)

OPEN ... largegrid.cub
GAUSSIAN-STYLE INPUT FILE
At line 92 of file cube_mod.f90 (unit = 100, file = 'largegrid.cub')
Fortran runtime error: End of file

I believe that the cube file has no format error, as I loaded it successfully in another program, Multiwfn, which reported the correct total number of electrons. However, the other programs I tried could not run the Bader charge analysis.

I'm wondering whether this is a bug in the Bader code, or whether the memory on my machine is insufficient for processing this. Any potential solution to this problem is highly appreciated.

Thanks!
Fang
graeme
Site Admin
Posts: 2256
Joined: Tue Apr 26, 2005 4:25 am
Contact:

Re: Bader Error when reading very large cube file

Post by graeme »

I can test it if you can post the file somewhere that I can download it from.
fangliu
Posts: 3
Joined: Mon Feb 05, 2018 3:22 pm

Re: Bader Error when reading very large cube file

Post by fangliu »

Thanks! I have sent you a private message with the Dropbox link to the cube file.
graeme
Site Admin
Posts: 2256
Joined: Tue Apr 26, 2005 4:25 am
Contact:

Re: Bader Error when reading very large cube file

Post by graeme »

It looks ok to me. The memory required is about 2.8 GB. I'll attach the output.

When I have seen this problem before, it has been related to operating-system limits on the memory that any one process can use. Try setting these to unlimited. In bash, this is done with the ulimit settings.
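As a sketch (the filename is illustrative; which limit actually bites depends on the system, but the stack and virtual-memory limits are the usual suspects):

```shell
# Inspect all per-process limits for the current shell
ulimit -a

# Lift the limits most likely to stop a large Bader run;
# these apply to this shell and any process it launches
ulimit -s unlimited   # stack size
ulimit -v unlimited   # virtual memory / address space

# Run bader from the same shell so it inherits the raised limits
./bader largegrid.cub
```

Note that `ulimit` cannot raise a limit above the hard limit set by the administrator; check `ulimit -H -a` if the commands above are refused.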

----------

# X Y Z CHARGE MIN DIST ATOMIC VOL
--------------------------------------------------------------------------------
1 0.000540 0.000040 0.000120 24.892638 1.907460 65.328571
2 -0.000590 0.000090 4.094950 9.289729 1.439315 595.545831
3 3.993800 -0.000970 -0.000220 9.307332 1.441146 613.445918
4 0.000490 -4.106820 -0.000040 9.296819 1.431886 583.732765
5 0.000000 0.000190 -4.094690 9.289823 1.439401 595.640542
6 -3.992770 0.000610 0.000400 9.306523 1.441621 612.445054
7 -0.001080 4.106920 -0.000200 9.296612 1.431699 583.949389
8 -0.000580 -1.450010 5.199210 0.328744 0.291693 183.541345
9 -0.000620 1.450240 5.199270 0.328807 0.291920 183.523149
10 5.073900 -0.001130 -1.466680 0.319062 0.272702 183.782463
11 5.074040 -0.000940 1.466110 0.318746 0.272361 178.457754
12 0.000620 -5.214610 -1.445190 0.328036 0.279314 189.490078
13 0.000450 -5.214820 1.444820 0.328032 0.279029 189.347591
14 -0.000110 -1.449950 -5.198970 0.328711 0.291595 183.520490
15 0.000010 1.450240 -5.198950 0.328750 0.291859 183.263275
16 -5.072900 0.000710 -1.466100 0.319031 0.271609 179.022571
17 -5.072980 0.000620 1.466780 0.319236 0.272177 184.390391
18 -0.001070 5.214710 -1.445290 0.328132 0.279503 189.410250
19 -0.001150 5.214900 1.444680 0.328037 0.279022 189.224605
--------------------------------------------------------------------------------
VACUUM CHARGE: 0.0000
VACUUM VOLUME: 0.0000
NUMBER OF ELECTRONS: 84.5828
fangliu
Posts: 3
Joined: Mon Feb 05, 2018 3:22 pm

Re: Bader Error when reading very large cube file

Post by fangliu »

Great! Thank you very much.
mlem
Posts: 2
Joined: Thu Feb 08, 2018 2:23 pm

Re: Bader Error when reading very large cube file

Post by mlem »

Hello,

I have exactly the same error, but I am calculating on a machine with 50 GB of RAM. Should I include some keywords, or why does it give such an error?
graeme
Site Admin
Posts: 2256
Joined: Tue Apr 26, 2005 4:25 am
Contact:

Re: Bader Error when reading very large cube file

Post by graeme »

Check for any user limits placed on memory usage. In bash, these are controlled with the ulimit commands.
mlem
Posts: 2
Joined: Thu Feb 08, 2018 2:23 pm

Re: Bader Error when reading very large cube file

Post by mlem »

I even checked with the system administrator, and he told me that there is no limit; I can use 50 GB. He wanted to look into what was happening, but I only have the bader binary. If I use a smaller cube file, it calculates fine.
ypsilon
Posts: 2
Joined: Mon Feb 26, 2018 10:27 am

Re: Bader Error when reading very large cube file

Post by ypsilon »

Hello,

I'm having the same problem with a CHGCAR file of 2.7 GB.
This is what I get using the ulimit -a command:

core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 515338
max locked memory (kbytes, -l) 4086160
max memory size (kbytes, -m) unlimited
open files (-n) 1048576
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) unlimited
cpu time (seconds, -t) unlimited
max user processes (-u) 1024
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited

I don't know which of these limits could be causing this...
graeme
Site Admin
Posts: 2256
Joined: Tue Apr 26, 2005 4:25 am
Contact:

Re: Bader Error when reading very large cube file

Post by graeme »

Also check that the code is 64-bit; a 32-bit build will have a 2 GB array limit.

You can also try compiling with -mcmodel=large.
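A sketch of both checks (the binary name, source layout, and compiler are illustrative; the Bader distribution provides makefiles for gfortran and ifort, which are the reliable way to build):

```shell
# 'file' reports the word size of the executable; a 32-bit build
# cannot address a density array larger than about 2 GB
file ./bader
# a healthy build reports something like:
#   ELF 64-bit LSB executable, x86-64 ...

# When rebuilding from source with gfortran, the large code model
# permits static data sections above 2 GB
# (module compilation order matters; the shipped makefile handles it)
gfortran -O2 -mcmodel=large *.f90 -o bader
```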

If anyone finds a fix or has an idea, please post.
ypsilon
Posts: 2
Joined: Mon Feb 26, 2018 10:27 am

Re: Bader Error when reading very large cube file

Post by ypsilon »

An older version (0.28a) of the bader binary worked for me, while version 1.02 failed.
spot
Posts: 1
Joined: Thu Apr 26, 2018 7:13 am

Re: Bader Error when reading very large cube file

Post by spot »

I had the same problem with a large CHGCAR file (>4 GB). Version 1.02 failed, but a self-compiled version of 1.03 did the trick for me. Hope this helps.