Message: Re: Too many particles causing Segmentation Error

Forum: Particles
Re: Question Too many particles causing Segmentation Error (Daniel O'Brien)
Re: Note Re: Too many particles causing Segmentation Error (Juan)
Date: 19 Mar, 2009
From: Paul Nicholas Colin Gloster <Paul Nicholas Colin Gloster>

Juan wrote on 19 March 2009:
|---------------------------------------------------------------------|
|"Hi Daniel,                                                          |
|                                                                     |
|I have tried your code.                                              |
|                                                                     |
|I think the problem is from your LINACPrimaryGenerator.cc            |
|                                                                     |
|you are creating too many particles in an event.                     |
|                                                                     |
|      G4int n_particle = 300000;                                     |
|      particleGun = new G4ParticleGun(n_particle);                   |
|                                                                     |
|Running this, I see my memory usage is 30%                           |
|                                                                     |
|When I put n_particle = 1 in LINACPrimaryGenerator.cc and in the main|
|LINACModel.cc I set                                                  |
|                                                                     |
|  runManager->BeamOn(300000);                                        |
|                                                                     |
|The program runs well and uses only 4% of the memory.                |
|                                                                     |
|Juan"                                                                |
|---------------------------------------------------------------------|

Dear all,

I have not looked at the code, but I agree with Juan that more memory
was probably being used than was available, because I ran the program
under Valgrind, which reported:
"[..]
[..] Stack overflow in thread 1: can't grow stack to 0x7fe801ff8
[..]"
while the segmentation fault was happening. The stack is an important
part of a process's memory.
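
For reference, a message of that form is produced simply by running the
unmodified executable under Valgrind, for example with a command such
as
valgrind ./LINACModel
(assuming that the executable built from LINACModel.cc is called
LINACModel).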

Daniel O'Brien may doubt this, as he reported:
!----------------------------------------------------------------------!
!"[..]                                                                 !
!                                                                      !
![..] Using the Gnome system monitor I checked to see if the RAM was an!
!issue but the RAM usage reached about 70% and never increased beyond  !
!that during the course of the program.                                !
!                                                                      !
![..]"                                                                 !
!----------------------------------------------------------------------!

However, Daniel O'Brien was probably running the program LINACModel
from Bash (the Bourne Again SHell) or another shell whose default
settings restrict resources for child processes (in particular the
stack size), whereas the Gnome system monitor was probably showing
usage for the whole system rather than what was actually available to
LINACModel.
For example, on one GNU/Linux system with GNU bash, version
3.1.17(2)-release (x86_64-pc-linux-gnu), the
ulimit -a
command reported:
"[..]
stack size              (kbytes, -s) 8192
[..]"
and running Daniel O'Brien's example without Juan's improvement
resulted in:
"[..]

annihil:   for  e+    SubType= 5
      Lambda tables from 100 eV  to 100 TeV in 84 bins, spline: 1
      ===== EM models for the G4Region  DefaultRegionForTheWorld ======
            eplus2gg :     Emin=          0 eV         Emax=   100 TeV
Segmentation fault"
whereas increasing the stack size by using the command
ulimit -s 900000
allowed the program to run apparently without a crash:
"[..]

annihil:   for  e+    SubType= 5
      Lambda tables from 100 eV  to 100 TeV in 84 bins, spline: 1
      ===== EM models for the G4Region  DefaultRegionForTheWorld ======
            eplus2gg :     Emin=          0 eV         Emax=   100 TeV".
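
To see which stack limit the LINACModel process itself inherits, rather
than what a system-wide monitor displays, the limit can also be queried
from inside a program. The following is only an illustrative sketch
using the POSIX getrlimit call and is not part of Daniel O'Brien's code:

    #include <sys/resource.h>
    #include <cstdio>

    int main()
    {
      // Query the soft (current) and hard (maximum) stack limits that
      // this process inherited from the shell which started it.
      struct rlimit stackLimit;
      if (getrlimit(RLIMIT_STACK, &stackLimit) == 0)
      {
        if (stackLimit.rlim_cur == RLIM_INFINITY)
          std::printf("stack size: unlimited\n");
        else
          std::printf("stack size: %lu kbytes\n",
                      (unsigned long)(stackLimit.rlim_cur / 1024));
      }
      return 0;
    }

On the system described above, such a program would be expected to
print 8192 kbytes until the shell's limit is raised with ulimit -s.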

However, Juan's suggestion seems to be the best approach to try first.
If "much more than 300000 electrons" are eventually used, it might also
be beneficial to increase the stack size as described above.
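
For concreteness, the change Juan describes amounts to something like
the following sketch. The class and file names (LINACPrimaryGenerator,
LINACModel.cc) are taken from this thread; the surrounding Geant4 code
is assumed rather than copied from Daniel O'Brien's program:

    // In LINACPrimaryGenerator.cc: create one primary per call to the
    // gun instead of 300000 primaries in a single event.
    G4int n_particle = 1;
    particleGun = new G4ParticleGun(n_particle);

    // In the main program LINACModel.cc: ask the run manager for
    // 300000 events instead, so the work is spread over many events.
    runManager->BeamOn(300000);

Each event then contains only one primary particle, which keeps the
per-event memory footprint small, as Juan's 4% memory figure suggests.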

Regards,
Colin Paul

Follow-ups:
  Re: Too many particles causing Segmentation Error (Daniel O'Brien - 19 Mar, 2009)
    Feedback: Re: Too many particles causing Segmentation Error (Gumplinger Peter - 19 Mar, 2009)
    Agree: Re: Too many particles causing Segmentation Error (Daniel O'Brien - 20 Mar, 2009)