Claire's Awesome Page of Random Useful Stuff

When I first started working through the ATLAS Computing and Physics Analysis Workbooks I made myself a "quick-reference" file containing a bunch of useful bits of information... notes on the ATLAS data structure, useful Athena commands, etc. Anyway, I thought that this may be useful to not just me, so that's what this page is all about. I will try to update it as I go along, feel free to add your own stuff too.

Useful Athena Commands

At startup

To set up the athena environment (must have proper requirements file - see here for that):

> source cmthome/setup.sh -tag=14.5.2

To check that everything is set up correctly, you can do:

> echo $CMTPATH

Checking out packages

First, see which version of the package you should check out (this depends on your version of athena):

> cmt show versions PhysicsAnalysis/AnalysisCommon/UserAnalysis

If you don't know the full package path:

> cmt show packages | grep 'UserAnalysis'

> cmt show versions | grep 'UserAnalysis'

To check out a package: (cmt co -r <package version> <package path>)

> cmt co -r UserAnalysis-00-13-09 PhysicsAnalysis/AnalysisCommon/UserAnalysis

If you are working with several different versions of athena, it is useful to mkdir 14.5.2 (or whatever version you need) and check the package out in that directory.

Once you have checked out a package

> cd ..packagepath../cmt
> cmt config
> source setup.sh
> gmake

When checking out a large number of packages, it is better to do the following at the top of your work area:

> setupWorkArea.py
> cd WorkArea/cmt
> cmt config
> cmt broadcast gmake

The first script generates a temporary package that depends on all the packages you have checked out, and the last command runs gmake in each package in the proper order. If a plain gmake failed, it was most likely because the packages were not built in the right order.

Athena analysis

To rebuild package after making a change:

> cd ..packagepath../cmt
> cmt config
> source setup.sh  [or .csh]
> gmake

You only need to do cmt config and source setup.sh the first time - after that you can just do gmake. Once you have compiled the package with your change, go back to UserAnalysis/run (where your jobOptions file is) and run:

> athena.py myjobOptions.py

If you have a lot of output and you want to direct it to a file (call it whatever you want):

> athena.py myjobOptions.py > myoutfile.out

If you want to have the output on the screen as well as in the file do:

> athena.py myjobOptions.py | tee myoutfile.out

If you want to submit an interactive job and run it in the background:

> athena.py myjobOptions.py >& myoutfile.log &

Athena package structure

cmt : contains requirements + setup files. This is where you build your code.
<PackageName> : contains the C++ header files.
src : contains the C++ source files.
share : contains job option (.py) files.
run : where you run your jobs. This is also where you can get_files HelloWorldOptions.py and run it.
python : any additional python source.
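The layout above can be sketched as a directory tree (using the UserAnalysis package as an example):

```
PhysicsAnalysis/AnalysisCommon/UserAnalysis/
    cmt/            requirements + setup files (build here)
    UserAnalysis/   C++ header files
    src/            C++ source files
    share/          jobOption (.py) files
    run/            run your jobs here
    python/         any additional python source
```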


Grid Commands

Setting up the grid environment

On lxplus (haven't got this running for the UJ cluster yet!):

> export PATHENA_GRID_SETUP_SH=/afs/cern.ch/project/gd/LCG-share/current/etc/profile.d/grid_env.sh
> source /afs/cern.ch/atlas/offline/external/GRID/ddm/DQ2Clients/setup.sh
> voms-proxy-init -voms atlas

and type in your cert password

Finding and getting stuff off the grid

List all files in a particular dataset (use -r for listing replicas of the dataset):

> dq2-ls -f dataset

To list, for example, only the root files:

> dq2-ls -f user08.KeteviAAssamagan.005008.CavernInputSF05.ESD.pool.v2 | grep root

To get a particular file from a dataset:

> dq2-get -f file dataset

Here's a bash script (getfiles.sh) I made once to get a whole bunch of files from the grid:

#!/bin/bash
cd /tmp/claire/cavernonly/
for X in 01 37 23 33 29 26 11 18 36 14 19 27 24 34 06 15 17 32 07 13 31 35 10 39 16 08 20 04 21 22 30 12 05 28 09 03 38
do
        dq2-get -p lcg1 -f user08.KeteviAAssamagan.005008.CavernInputSF05.NTUP.pool.v1.AANT._000${X}.root user08.KeteviAAssamagan.005008.CavernInputSF05.NTUP.pool.v1
done

Submitting jobs on the grid

The command goes something like: pathena --inDS <input dataset> --outDS <output dataset> --tmpDir <temp directory> <your jobOptions file>

> pathena --inDS user08.KeteviAAssamagan.005300.PythiaH130zz4l.ESD.pool.v9 --outDS user08.ClaireAlexandraLee.005300.PythiaH130zz4l.NTUP.pool.v9a --tmpDir /tmp MuonHits_jobOptions.py

Make sure your name in the output dataset is the same as that on your certificate, and that you include the user08 prefix (perhaps it's 09 now? not sure...)


UJ Cluster Info

Available athena installation kit & path:

/nfs/osg/app/atlas_app/atlas_rel/14.5.2/

Getting packages on the cluster:

Log in to CERN, go to your test directory, check out the packages, and then make a tar file to bring back:

> tar -cvlpf checkout.tar Control PhysicsAnalysis

Then on your home computer:

> scp lxplus.cern.ch:testarea/14.2.23/checkout.tar .
> tar -xf checkout.tar

and proceed as above.


ATLAS Data Structure

Byte-stream Data is a persistent representation of the event data flowing from the trigger system.

Raw Data Object (RDO) data is a C++ object representation of the byte-stream information.

Event Summary Data (ESD) contains the detailed output of the detector reconstruction and is produced from the raw data. It contains sufficient information to allow particle identification, track re-fitting, jet calibration etc. thus allowing for the rapid tuning of reconstruction algorithms and calibrations. The target size for the ESD is 500 kB per event.

Analysis Object Data (AOD) is a summary of the reconstructed event, and contains sufficient information for common analyses. Several tailor-made streams of AODs are foreseen for the different needs of the physics community. The AOD can be produced from the ESD, making it generally unnecessary to navigate back to and reprocess the raw data, which brings significant cost and time benefits. The target size for the AOD is 100 kB per event.

TAG format is a simple and efficient way to search for events of interest. One can run on the TAG files and very quickly "filter" events that one is interested in.


Analyzing the Data

Athena tools

checkFile.py : this script inspects a POOL file and dumps its 'high level' content on screen. It will tell you what kinds of containers have been stored (eg: ElectronContainer, TruthParticleContainer, ...) but not more detailed properties (such as pt or eta distributions). It is useful to run this on your input file to make sure you have all the containers you need.

Or, adding the following line:

ServiceMgr.StoreGateSvc.Dump = True

at the end of your jobOptions.py will print out a list of all available athena containers, specifying which are used and which are not. This is very useful for determining the required (and superfluous) content of a DPD for a given purpose.

ROOT analysis (this should probably form its own page at some point)

To make ROOT .C and .h files from a root file, in root do:

> CollectionTree->MakeClass("myclass")

This will make a header file containing all the branches, and a C file that loops over all the events in the tree, and over all the branches per event. You can edit these to analyse the root file however you wish.

Run your analysis (in root) like:

> .L myclass.C
> myclass t
> t.Loop()

If you have a whole bunch of root files you want to run over, you can add them all together in a chain, and loop over that chain. Here's an example of a script to do that:

{
  gROOT->LoadMacro("CavernOnly.C");
  TChain *chain = new TChain("CollectionTree");
  chain->Add("/tmp/claire/cavernonly/user08.KeteviAAssamagan.005008.CavernInputSF05.NTUP.pool.v1/user08.KeteviAAssamagan.005008.CavernInputSF05.NTUP.pool.v1.AANT._0000*.root");
  CavernOnly t(chain);
  t.Loop();
  chain->Draw("RpcClusterEta");
}

(at the end I just asked it to draw me one of the histograms I made)


Monte Carlo Particle Codes

bosons:
  gluon     21
  gamma     22
  Z0        23
  W+        24
  SM Higgs  25
  Z'        32

fermions:
  e   11
  mu  13
  p   2212
  n   2112
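If you need these codes in an analysis script, a minimal Python lookup could look like the following (names and codes are taken from the table above, with N read as the neutron; the dict and function names are my own):

```python
# PDG Monte Carlo particle codes from the table above.
PDG_CODES = {
    "gluon": 21,
    "gamma": 22,
    "Z0": 23,
    "W+": 24,
    "SM Higgs": 25,
    "Z'": 32,
    "e": 11,
    "mu": 13,
    "p": 2212,
    "n": 2112,
}

def pdg_name(code):
    """Return the particle name for a given PDG code, or None if unknown."""
    for name, pdg in PDG_CODES.items():
        if pdg == code:
            return name
    return None
```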

Pythia_i for Athena

For more pythia details, read the manual.

Pythia_i commands

This is basically to show you the structure of the commands when using pythia in athena. For more options, refer to the pythia manual (above).

Turn off all processes:

'pysubs msel 0',

Setting particle mass (eg sets Z' mass to 20 GeV):

'pydat2 pmas 32 1 20',

Setting particle width (eg sets Z' width to 0.0000000027 GeV)... turns out this is overridden:

'pydat2 pmas 32 2 0.0000000027',

Particle generation processes (eg for higgs):

'pysubs msel 16', -> Turns on all h generation subprocesses (mainly f f' -> f f' h)
'pysubs msub 102 1', -> Turn on only gg->h subprocess

Turn off Z0/gamma* interference:

'pypars mstp 43 2',

Setting decay channels:

'pydat3 mdme 218 1 1', -> putting a 1 in the last column turns this channel ON (ee)
'pydat3 mdme 225 1 0', -> putting a 0 in the last column turns this channel OFF (Z0 Z0)

Messing with the decay channels:

'pydat3 kfdp 218 1 32', -> changes h->ee first particle decay to a Z'
'pydat3 kfdp 218 2 32', -> changes the h->ee second decay particle to a Z', so now we have h->Z'Z'

Playing with couplings (eg Z' couplings):

'pydat1 paru 121 -0.0693',


Useful ROOT Titbits

TAttMarker: Histogram marker colours, styles & sizes


Page last modified on December 07, 2009, at 12:40 pm