Message: energy spectra too ideal?
I'm simulating a CsI(Tl) detector, and I'm interested in protons of 1-100 MeV and electrons of 1-30 MeV.
My detector completely stops low-energy protons of, say, 4 MeV. When I run a simulation of 100,000 protons at 4 MeV, I get a sharp spike at 4 MeV. But in reality, due to the intrinsic detector efficiency, the spectrum should be a Gaussian with a mean energy deposition slightly below 4 MeV, because of fluctuations in the number of ionisations per event.
Shouldn't GEANT incorporate this INTRINSIC detector efficiency while carrying out the simulation? Or is there some option by which I can tell GEANT to incorporate this efficiency?
Thanks in advance, Athreya