Claire's Awesome Page of Random Useful Stuff

When I first started working through the ATLAS Computing and Physics Analysis Workbooks I made myself a "quick-reference" file containing a bunch of useful bits of information... notes on the ATLAS data structure, useful Athena commands, etc. Anyway, I thought this may be useful to more than just me, so that's what this page is all about. I will try to update it as I go along; feel free to add your own stuff too.

Useful Athena Commands

At startup
To set up the athena environment (you must have a proper requirements file - see here for that), you can do:
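> source ~/cmthome/setup.sh -tag=14.5.2
(a sketch assuming the usual cmthome setup script; 14.5.2 is the release used elsewhere on this page, so adjust the tag to your release)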
Checking out packages
First, see which version of the package you should check out (dependent on your version of athena).
If you don't know the full package path:
> cmt show packages | grep 'UserAnalysis'
> cmt show versions | grep 'UserAnalysis'
To check out a package:
> cmt co -r <package version> <package path>
If you are working with different versions of athena it is useful to do mkdir 14.5.2 (or whatever) and check the package out in that directory.

Once you have checked out a package:
> cd ..packagepath../cmt
> cmt config
> source setup.sh
> gmake

When checking out a large number of packages, it is better to do the following at the top of your work area:
> setupWorkArea.py
> cd WorkArea/cmt
> cmt config
> cmt broadcast gmake
The first script generates a temporary package that depends on all the packages you have checked out, and the last command runs gmake in each package in the proper order. If your plain gmake failed, it was most likely because it was not done in the right order.

Athena analysis
To rebuild a package after making a change:
> cd ..packagepath../cmt
> cmt config
> source setup.sh [or .csh]
> gmake
You only need to do the cmt config and source setup.sh steps the first time; after that a gmake is enough.
If you have a lot of output and you want to direct it to a file (call it whatever you want):
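> athena MyJobOptions.py > output.txt
(MyJobOptions.py and output.txt are illustrative names)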
If you want to have the output on the screen as well as in the file do:
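> athena MyJobOptions.py | tee output.txt
(same assumed names as above; tee copies the output to the screen as well as the file)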
If you want to submit an interactive job and run it in the background:
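> athena MyJobOptions.py > output.txt 2>&1 &
(the 2>&1 also catches the error messages; the trailing & puts the job in the background)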
Athena package structure
cmt : contains requirements + setup files. This is where you build your code.

Grid Commands

Setting up the grid environment
On lxplus (haven't got this running for the UJ cluster yet!):
> export PATHENA_GRID_SETUP_SH=/afs/cern.ch/project/gd/LCG-share/current/etc/profile.d/grid_env.sh
> source /afs/cern.ch/atlas/offline/external/GRID/ddm/DQ2Clients/setup.sh
> voms-proxy-init -voms atlas
and type in your cert password.

Finding and getting stuff off the grid
List all files in a particular dataset (use -r for listing replicas of the dataset):
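> dq2-ls -f <dataset name>
(the standard DQ2 client command)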
To list, for eg, only the root files:
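> dq2-ls -f <dataset name> | grep .root
(just grep the file listing)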
To get a particular file from a dataset:
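> dq2-get -f <file name> <dataset name>
(the same -f pattern as in the script below)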
Here's a bash script (getfiles.sh) I made once to get a whole bunch of files from the grid:
#!/bin/bash
cd /tmp/claire/cavernonly/
for X in 01 37 23 33 29 26 11 18 36 14 19 27 24 34 06 15 17 32 07 13 31 35 10 39 16 08 20 04 21 22 30 12 05 28 09 03 38
do
    dq2-get -p lcg1 -f user08.KeteviAAssamagan.005008.CavernInputSF05.NTUP.pool.v1.AANT._000${X}.root user08.KeteviAAssamagan.005008.CavernInputSF05.NTUP.pool.v1
done

Submitting jobs on the grid
The command goes something like:
> pathena --inDS <input dataset> --outDS <output dataset> --tmpDir <temp directory> <your jobOptions file>
Make sure your name in the output dataset is the same as that on your certificate, and that you include the user08 prefix (perhaps it's 09 now? not sure...).

UJ Cluster Info
Available athena installation kit & path: /nfs/osg/app/atlas_app/atlas_rel/14.5.2/

Getting packages on the cluster
Log in to CERN, go to your test directory, check out the packages, and then make a tar file to bring back:
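> tar -cf checkout.tar <the package directories you checked out>
(illustrative; the checkout.tar name matches the scp command below)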
Then on your home computer:
> scp lxplus.cern.ch:testarea/14.2.23/checkout.tar .
> tar -xf checkout.tar
and proceed as above.

ATLAS Data Structure
Byte-stream Data is a persistent presentation of the event data flowing from the trigger system.
Raw Data Object Data (RDO) is a C++ object representation of the byte-stream information.
Event Summary Data (ESD) contains the detailed output of the detector reconstruction and is produced from the raw data. It contains sufficient information to allow particle identification, track re-fitting, jet calibration etc., thus allowing for the rapid tuning of reconstruction algorithms and calibrations. The target size for the ESD is 500 kB per event.
Analysis Object Data (AOD) is a summary of the reconstructed event, and contains sufficient information for common analyses. Several tailor-made streams of AODs are foreseen for the different needs of the physics community. The AOD can be produced from the ESD, which in general makes it unnecessary to navigate back and process the raw data, bringing significant cost and time benefits. The target size for the AOD is 100 kB per event.
TAG format is a simple and efficient way to search for events of interest. One can run on the TAG files and very quickly "filter" the events one is interested in.

Analyzing the Data

Athena tools
checkFile.py : this script can inspect a POOL file and dump its 'high level' content on screen. It will tell you what kind of containers have been stored (eg ElectronContainer, TruthParticleContainer, ...) but it won't tell you more detailed properties (such as pt, eta distributions). Useful to run on your input file to make sure you have all the containers you need. Alternatively, adding the following line:
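StoreGateSvc = Service( "StoreGateSvc" )  # an assumption: StoreGate's Dump flag, which prints what is recorded in StoreGate at the end of the job
StoreGateSvc.Dump = True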
at the end of your jobOptions.py will print out a list of all available athena containers, specifying those that are used and those that are not. This is very useful to determine the required (and superfluous) content of a DPD for a given purpose.

ROOT analysis (this should probably form its own page at some point)
To make ROOT .C and .h files from a root file, in root do:
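> CollectionTree->MakeClass("myclass")
(assuming you've opened your root file and its tree is called CollectionTree, as in the chain example below; "myclass" matches the commands that follow)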
This will make a header file containing all the branches, and a C file that loops over all the events in the tree, and over all the branches per event. You can edit these to make them analyse the root file as you wish. Run your analysis (in root) like:
> .L myclass.C
> myclass t
> t.Loop()
If you have a whole bunch of root files you want to run over, you can add them all together in a chain, and loop over that chain. Here's an example of a script to do that:
{
    gROOT->LoadMacro("CavernOnly.C");
    TChain *chain = new TChain("CollectionTree");
    chain->Add("/tmp/claire/cavernonly/user08.KeteviAAssamagan.005008.CavernInputSF05.NTUP.pool.v1/user08.KeteviAAssamagan.005008.CavernInputSF05.NTUP.pool.v1.AANT._0000*.root");
    CavernOnly t(chain);
    t.Loop();
    chain->Draw("RpcClusterEta");
}
(at the end I just asked it to draw one of the histograms I made)

Monte Carlo Particle Codes
Pythia_i for Athena
For moar pythia, read the manual.

Pythia_i commands
This is basically to show you the structure of the commands when using pythia in athena. For more options, refer to the pythia manual (above).

Turn off all processes:
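Pythia.PythiaCommand += ["pysubs msel 0"]
(these snippets are sketches of the standard Pythia_i jobOptions syntax: the string names the Pythia common block, then the variable, its indices and the value; msel 0 switches everything off so you pick subprocesses by hand)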
Setting particle mass (eg sets Z' mass to 20 GeV):
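Pythia.PythiaCommand += ["pydat2 pmas 32 1 20."]
(32 is the Z' code and pmas(kc,1) is the mass, going by the standard Pythia 6 tables)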
Setting particle width (eg sets Z' width to 0.0000000027 GeV)... turns out this is overridden:
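Pythia.PythiaCommand += ["pydat2 pmas 32 2 0.0000000027"]
(pmas(kc,2) is the width)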
Particle generation processes (eg for higgs):
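Pythia.PythiaCommand += ["pysubs msub 102 1"]
(a sketch: msub 102 switches on gg -> h0; other higgs production subprocesses have their own msub numbers)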
Turn off Z0/gamma* interference:
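Pythia.PythiaCommand += ["pypars mstp 44 3"]
(mstp(44) selects which gamma*/Z0/Z' pieces enter the cross section; 3 keeps only the pure Z' contribution, as given in the Pythia 6 manual)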
Setting decay channels:
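Pythia.PythiaCommand += ["pydat3 mdme 297 1 1"]
(mdme(idc,1) = 1 opens decay channel idc; 297 should be Z' -> e+e- in the Pythia 6 decay table, assuming the Z' example again)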
Messing with the decay channels:
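# a sketch: close every Z' channel (idc 289-310 in the Pythia 6 table, an assumption), then reopen just e+e-
Pythia.PythiaCommand += ["pydat3 mdme %d 1 0" % idc for idc in range(289, 311)]
Pythia.PythiaCommand += ["pydat3 mdme 297 1 1"]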
Playing with couplings (eg Z' couplings):
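Pythia.PythiaCommand += ["pydat1 paru 121 0.", "pydat1 paru 122 0."]
(paru(121)-paru(128) hold the Z' couplings to the fermions in Pythia 6; zeroing a pair decouples that fermion, as an illustration rather than a recipe)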
Useful ROOT Titbits
TAttMarker: Histogram marker colours, styles & sizes