Re: experiences and lessons learned with a fresh EUMETCast install for Windows and Linux

Markus Kempf

Ernst,
I can observe an increase in memory usage. Right now the service reports 5.1 GB of memory in use:
root@openmediavault:~# service tellicast-client status|grep Memory:
Memory: 5.1G
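As far as I know, systemd's Memory: line is the cgroup total, which also counts page cache for data the service has written, so it can be far larger than the process itself. To compare the two figures directly (using the PID from below):

systemctl show tellicast-client -p MemoryCurrent   # cgroup figure, includes page cache
grep VmRSS /proc/4679/status                       # resident memory of the process alone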
The memory map of the main running tc-cast-client process:
root@openmediavault:~# pmap 4679
4679: /usr/local/bin/tc-cast-client -c /etc/cast-client_bas.ini
0000000000400000 2760K r-x-- tc-cast-client
00000000007b1000 3036K rw--- tc-cast-client
0000000000aa8000 492K rw--- [ anon ]
0000000001a3f000 83204K rw--- [ anon ]
00007f661828e000 664K rw--- [ anon ]
00007f6618334000 12K r---- libnss_files-2.28.so
00007f6618337000 28K r-x-- libnss_files-2.28.so
00007f661833e000 8K r---- libnss_files-2.28.so
00007f6618340000 4K ----- libnss_files-2.28.so
00007f6618341000 4K r---- libnss_files-2.28.so
00007f6618342000 4K rw--- libnss_files-2.28.so
00007f6618343000 36K rw--- [ anon ]
00007f661834c000 136K r---- libc-2.28.so
00007f661836e000 1312K r-x-- libc-2.28.so
00007f66184b6000 304K r---- libc-2.28.so
00007f6618502000 4K ----- libc-2.28.so
00007f6618503000 16K r---- libc-2.28.so
00007f6618507000 8K rw--- libc-2.28.so
00007f6618509000 16K rw--- [ anon ]
00007f661850d000 52K r---- libm-2.28.so
00007f661851a000 636K r-x-- libm-2.28.so
00007f66185b9000 852K r---- libm-2.28.so
00007f661868e000 4K r---- libm-2.28.so
00007f661868f000 4K rw--- libm-2.28.so
00007f6618690000 4K r---- libdl-2.28.so
00007f6618691000 4K r-x-- libdl-2.28.so
00007f6618692000 4K r---- libdl-2.28.so
00007f6618693000 4K r---- libdl-2.28.so
00007f6618694000 4K rw--- libdl-2.28.so
00007f6618695000 8K rw--- [ anon ]
00007f6618699000 72K rw--- [ anon ]
00007f66186ab000 4K r---- ld-2.28.so
00007f66186ac000 120K r-x-- ld-2.28.so
00007f66186ca000 32K r---- ld-2.28.so
00007f66186d2000 4K r---- ld-2.28.so
00007f66186d3000 4K rw--- ld-2.28.so
00007f66186d4000 4K rw--- [ anon ]
00007fff5f5bb000 232K rw--- [ stack ]
00007fff5f5fc000 12K r---- [ anon ]
00007fff5f5ff000 4K r-x-- [ anon ]
total 94112K
root@openmediavault:~# grep VmPeak /proc/4679/status
VmPeak: 94168 kB
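
So the process itself peaked at only about 94 MB, which suggests the 5.1 GB systemd shows is mostly page cache. One quick way to test that would be to drop the caches and check the service figure again:

sync
echo 3 > /proc/sys/vm/drop_caches   # drops page cache (root only; harmless, but I/O is briefly slower)
service tellicast-client status | grep Memory: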

It slowly creeps up; I will check again in a few hours, after some garbage collection has perhaps taken place.
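
To track it over time, something like this could run in the background (assuming the PID stays 4679; the log path is just an example):

while sleep 600; do
    # log the resident size of the tc-cast-client every 10 minutes
    echo "$(date '+%F %T') $(grep VmRSS /proc/4679/status)"
done >> /root/tc-rss.log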

Markus
