Message: Re: INCLXX plus ABLA/PRECO

Forum: Hadronic Processes
Date: 04 Mar, 2016
From: Davide Mancusi

Alexey,

What you report sounds like a memory leak. Can you please post your
message as a new thread? I would say the "Physics List" sub-category is
the most appropriate.

Also, it would be great if you could provide (by e-mail, perhaps) a
minimal working example that reproduces the problem, so that I can
investigate.
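
To give an idea of the level of detail that helps, something along the
lines of the skeleton below would already be useful: a trivial water
box, the QGSP_INCLXX reference list taken from G4PhysListFactory, and a
simple gun (a proton here, standing in for your carbon beam). The class
names, material and energy are placeholders, not your actual code; the
sketch only illustrates what a self-contained reproducer looks like.

#include "G4RunManager.hh"
#include "G4PhysListFactory.hh"
#include "G4VModularPhysicsList.hh"
#include "G4VUserDetectorConstruction.hh"
#include "G4VUserPrimaryGeneratorAction.hh"
#include "G4NistManager.hh"
#include "G4Material.hh"
#include "G4Box.hh"
#include "G4LogicalVolume.hh"
#include "G4PVPlacement.hh"
#include "G4ParticleGun.hh"
#include "G4ParticleTable.hh"
#include "G4Event.hh"
#include "G4ThreeVector.hh"
#include "G4SystemOfUnits.hh"

// Trivial geometry: a single water box, enough to make the cascade fire.
class MinimalDetector : public G4VUserDetectorConstruction
{
public:
  G4VPhysicalVolume* Construct() override
  {
    G4Material* water =
      G4NistManager::Instance()->FindOrBuildMaterial("G4_WATER");
    G4Box* worldBox = new G4Box("World", 0.5*m, 0.5*m, 0.5*m);
    G4LogicalVolume* worldLog = new G4LogicalVolume(worldBox, water, "World");
    return new G4PVPlacement(nullptr, G4ThreeVector(), worldLog, "World",
                             nullptr, false, 0);
  }
};

// Simple gun; a 290 MeV proton stands in for the real carbon beam.
class MinimalGun : public G4VUserPrimaryGeneratorAction
{
public:
  MinimalGun() : fGun(new G4ParticleGun(1))
  {
    fGun->SetParticleEnergy(290.*MeV);
    fGun->SetParticleMomentumDirection(G4ThreeVector(0., 0., 1.));
    fGun->SetParticlePosition(G4ThreeVector(0., 0., -0.4*m));
  }
  ~MinimalGun() override { delete fGun; }

  void GeneratePrimaries(G4Event* event) override
  {
    // Look the particle up here, after the physics list has filled the
    // particle table.
    fGun->SetParticleDefinition(
      G4ParticleTable::GetParticleTable()->FindParticle("proton"));
    fGun->GeneratePrimaryVertex(event);
  }

private:
  G4ParticleGun* fGun;
};

int main()
{
  G4RunManager* runManager = new G4RunManager;
  runManager->SetUserInitialization(new MinimalDetector);

  // Take the INCL++-based reference physics list from the factory.
  G4PhysListFactory factory;
  runManager->SetUserInitialization(
    factory.GetReferencePhysList("QGSP_INCLXX"));

  runManager->SetUserAction(new MinimalGun);
  runManager->Initialize();
  runManager->BeamOn(1000);  // increase until the memory growth shows up

  delete runManager;
  return 0;
}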

Thanks for reporting this anyway.
Davide

On Fri, Mar 04, 2016 at 03:15:47AM -0800, Alexey Solovyev wrote:
> 
> *** Discussion title: Hadronic Processes
> 
> I'm running my simulation on the IHEP (ihep.su) cluster and, since I'm
> not a cluster administrator, it may take a while to retrieve the full
> logs of previous runs (I can ask for the qsub logs if needed), but here
> is what I noticed over the last several runs:
> 
> 0. I'm trying to reproduce the data presented on p. 24 of
> http://geant4.slac.stanford.edu/Space06/presentations/03_Wednesday/05-Sasaki_ValidationOfIonPhysicsInGeant4AgainstCarbon.pdf
> for different energies, and also to separate the contribution of each
> isotope. So I have a lot of ROOT histograms, a DetectorSD that fills
> data depending on
> step->GetTrack()->GetParticleDefinition()->GetAtomicNumber/Mass, and a
> 1D replica of 0.5-1 mm G4Tubs in my geometry. For example, for 1e7
> primary particles in a single thread on Windows 7 (my development
> machine), the peak memory consumption of this simulation is approx.
> 1.2 GB.
> 
> 1. I have a 5 GB vmem limit (on the cluster) for the whole task (no
> matter how many nodes I use). First I found that all my tasks using
> QGSP_INCLXX were killed by the memory limit once the code was linked
> against 10.2.
> 
> 2. While investigating this issue I also found that, for example,
> QGSP_BIC/QBBC/Shielding are not killed by the memory limit in 10.2.
> 
> 3. For the chosen nps (1e7), QGSP_BIC took about 300 hrs of CPU time in
> 10.2 (on 5 nodes, 60 hrs wall time) and 330 hrs in 10.1. QGSP_INCLXX
> took about 322 hrs of CPU time in 10.1 and was killed by the memory
> limit in 10.2. When it was killed it had completed approx. 3e6 nps, and
> while similar tasks were running I saw with qstat -f that after 144 hrs
> of CPU time one had used almost 4.3 GB (and was killed a bit later;
> again, I can't directly access the kill logs right now, so I can't say
> how much CPU/wall time passed before it was killed, nor how much vmem
> the previously completed tasks used, but the completed tasks definitely
> did not exceed the 5 GB vmem limit). In fact, all my simulations with
> QGSP_INCLXX at the same nps were killed by the vmem limit in all my
> geometries in 10.2 (1e3 on the UI node and 1e6 worked successfully, by
> the way). I have a single file for all cases, switching the geometry
> via a messenger or the command line, but I don't think that is
> essential here, and all the other geometries are more sophisticated.
> 
> 4. I'm not entirely sure this is an INCLXX bug, because I also store
> all the secondary particles in a SteppingAction, the same way as in the
> Hadr03 example, and if INCLXX generates lots of excited particles that
> storage could exceed the memory limit; but in any case this does not
> happen in 10.1, nor with the other physics lists in 10.2.
> 
> I can ask for the run logs if needed, and maybe we had better use
> e-mail so as not to flood this topic.
> 
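
If it helps in trimming things down, the per-isotope scoring you
describe in point 0 can, for the purpose of a minimal example, be
reduced to something like the sketch below. The class and member names
are made up for illustration and are not taken from your code; the
point is that a tally keyed on GetAtomicNumber()/GetAtomicMass() holds
one counter per (Z, A) and therefore stays small by construction.

#include <map>
#include <utility>

#include "G4VSensitiveDetector.hh"
#include "G4Step.hh"
#include "G4Track.hh"
#include "G4ParticleDefinition.hh"
#include "G4TouchableHistory.hh"

// Illustrative sensitive detector that tallies energy deposit per (Z, A),
// standing in for one ROOT histogram per isotope.
class IsotopeSD : public G4VSensitiveDetector
{
public:
  explicit IsotopeSD(const G4String& name) : G4VSensitiveDetector(name) {}

  G4bool ProcessHits(G4Step* step, G4TouchableHistory*) override
  {
    const G4ParticleDefinition* pd =
      step->GetTrack()->GetParticleDefinition();
    const G4int Z = pd->GetAtomicNumber();  // charge number
    const G4int A = pd->GetAtomicMass();    // baryon number
    const G4double edep = step->GetTotalEnergyDeposit();
    if (edep > 0.) fEdepByIsotope[std::make_pair(Z, A)] += edep;
    return true;
  }

  const std::map<std::pair<G4int, G4int>, G4double>& GetTally() const
  { return fEdepByIsotope; }

private:
  // Energy deposit keyed by (Z, A); a real implementation would also bin
  // in the replica copy number to get the depth profile.
  std::map<std::pair<G4int, G4int>, G4double> fEdepByIsotope;
};

It would be registered with G4SDManager and attached to the replica's
logical volume in the usual way.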
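
Concerning your point 4, one way to check whether the bookkeeping of
secondaries is itself responsible would be to switch, for a single run,
from storing the tracks to merely counting them. A stacking action does
this cleanly, since it sees every new track exactly once; the sketch
below uses made-up names and is not meant to be your actual code.

#include <map>

#include "G4UserStackingAction.hh"
#include "G4Track.hh"
#include "G4ParticleDefinition.hh"

// Illustrative replacement for stored secondaries: count new secondary
// tracks per particle name, so the memory used by the bookkeeping is
// bounded by the number of distinct particle types.
class SecondaryCounter : public G4UserStackingAction
{
public:
  G4ClassificationOfNewTrack ClassifyNewTrack(const G4Track* track) override
  {
    if (track->GetParentID() != 0) {  // parent ID 0 marks a primary track
      ++fCounts[track->GetParticleDefinition()->GetParticleName()];
    }
    return fUrgent;                   // keep the default tracking order
  }

  const std::map<G4String, unsigned long long>& GetCounts() const
  { return fCounts; }

private:
  std::map<G4String, unsigned long long> fCounts;  // particle name -> count
};

It would be registered with runManager->SetUserAction(new
SecondaryCounter) in place of the current storage. If the memory growth
disappears with 10.2 and QGSP_INCLXX in that configuration, the problem
lies in the stored secondaries rather than in the model itself.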

-- 
Davide Mancusi
CEA-Saclay
DEN/DM2S/SERMA/LTSD
91191 Gif-sur-Yvette CEDEX
France
