Commit 9f74eabf authored by Matthias Schnepf

add examples with CamelCase HTCondor attributes

parent a66c9e2d
############################
BELLE 2 example
############################
The BELLE 2 experiment needs special software. This software is provided by the
HTCondor batch system via a Docker image. This image runs only on resources
that advertise "ProvidesBELLE2".
The Python script create_jdl.py creates a directory for the JDL file and the
executable, with subdirectories for log, error, and output files.
Change the arguments, executable, and requirements for your job in
create_jdl.py, then run the script.
In this example the job runs the bash script job.sh. This script can set up
your software and start it.
Switch into the directory with the JDL file. There you can submit your jobs with
"condor_submit JDL..."
With "condor_q (your username)" you can see the status of your jobs.
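The requirements string set in create_jdl.py is an HTCondor ClassAd expression built from boolean machine attributes. A minimal sketch of how such an expression can be assembled; the helper `build_requirements` is hypothetical and not part of JDLCreator:

```python
def build_requirements(attributes):
    """Join boolean HTCondor ClassAd attributes into one requirements expression."""
    return " && ".join("(TARGET.{} == True)".format(name) for name in attributes)


# the CamelCase attributes used in the BELLE 2 example
req = build_requirements(["ProvidesCPU", "ProvidesEKPResources", "ProvidesBELLE2"])
print(req)
```

This prints the same expression that the BELLE 2 create_jdl.py assigns to jobs.requirements.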
#!/usr/bin/env python
import os
import sys
sys.path.append(os.path.dirname('../../classes')) # change path to classes directory
from classes.JDLCreator import JDLCreator # import the class to create and submit JDL files
def main():
    arguments = [x for x in range(0, 5)]
    jobs.arguments = arguments  # set arguments for condor job
    # requirement: CPU job for BELLE 2 at EKP (CamelCase HTCondor attributes)
    jobs.requirements = "(TARGET.ProvidesCPU == True) && (TARGET.ProvidesEKPResources == True) && (TARGET.ProvidesBELLE2 == True)"
    jobs.wall_time = 1 * 60 * 60  # set walltime to 1 h in seconds
    jobs.memory = 2048  # set memory to 2048 MB
    jobs.job_folder = "condor_jobs"  # set name of the folder where files and information are stored
......
#!/bin/bash
echo "hello world"
echo "hostname: " `hostname`
echo "who am I? " `id`
pwd
echo "start basf"
basf2 .....
############################
CMS local example
############################
The standard software environment is the SLC6 environment.
So that your jobs have access to the file server or the home folders of the
portal machines, you need to set the requirement "TARGET.ProvidesEKPResources".
The Python script create_jdl.py creates a directory for the JDL file and the
executable, with subdirectories for log, error, and output files.
Change the arguments, executable, and requirements for your job in
create_jdl.py, then run the script. The script calculates 5 asymptotic limits.
The argument list defines which limits should be calculated.
!!! Change the variable OUTPUT_PATH at the beginning of the script !!!
In this example the job runs the bash script job.sh. This example script sets
up the CMSSW software and combine, and starts a short job.
Switch into the directory with the JDL file. There you can submit your jobs with
"condor_submit JDL..."
With "condor_q (your username)" you can see the status of your jobs. Each job
should run for about 10 minutes.
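The wall_time and memory values in create_jdl.py are plain numbers: seconds and megabytes. That is why the 10-hour limit below is written as an arithmetic expression. A small sketch of the conversion; the helper `hours_to_seconds` is hypothetical:

```python
def hours_to_seconds(hours):
    """Convert a wall-time limit in hours to the seconds HTCondor expects."""
    return hours * 60 * 60


print(hours_to_seconds(10))  # wall time used in the CMS example
print(hours_to_seconds(1))   # wall time used in the BELLE 2 example
```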
#!/usr/bin/env python
import os
import sys
sys.path.append(os.path.dirname('../../classes')) # change path to classes directory
from classes.JDLCreator import JDLCreator # import the class to create and submit JDL files
def main():
    """Submit a simple example job"""
    jobs = JDLCreator("condocker")  # default (no cloud site supplied): Docker with SLC6 image
    # Some example sites:
    # site_name='ekpsupermachines'  "Super Machines", IO intensive jobs
    jobs.executable = "job.sh"  # name of the job script
    jobs.wall_time = 10 * 60 * 60  # set walltime to 10 h in seconds
    jobs.memory = 2048  # our regular 2048 MB per slot
    # build list of arguments: 0,1,2,3,4
    arguments = [x for x in range(0, 5)]
    # you can also build a regular list via arg = []; arg.append(value)
    jobs.arguments = arguments  # set arguments for condor job
    # Our job requires lots of CPU resources and needs access to the local EKP resources
    jobs.requirements = "(TARGET.ProvidesCPU == True) && (TARGET.ProvidesEKPResources == True)"
    jobs.job_folder = "condor_jobs"  # set name of the folder where files and information are stored
    jobs.WriteJDL()  # write a JDL file and create folders for log files


if __name__ == "__main__":
    main()
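Each entry in jobs.arguments becomes one queued job, and that entry reaches job.sh as its first command-line argument ($1). A minimal sketch of this expansion, assuming one queued job per argument; the JDL layout shown here is a hypothetical rendering, the file JDLCreator actually writes may differ:

```python
arguments = [x for x in range(0, 5)]  # 0,1,2,3,4 -> five jobs

# hypothetical rendering of the per-job section of a JDL file
jdl_lines = []
for arg in arguments:
    jdl_lines.append("arguments = {}".format(arg))  # becomes $1 inside job.sh
    jdl_lines.append("queue")

print("\n".join(jdl_lines))
```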
#!/bin/bash
OUTPUT_PATH="/storage/a/USER/results/"
echo "hostname: " `hostname`
echo "who am I? " `id`
pwd
SPAWNPOINT=`pwd`
## setup CMSSW
VO_CMS_SW_DIR=/cvmfs/cms.cern.ch
source $VO_CMS_SW_DIR/cmsset_default.sh
SCRAM_ARCH=slc6_amd64_gcc481
scramv1 project CMSSW_7_1_19
cd CMSSW_7_1_19/src
eval `scramv1 runtime -sh`
## setup combine
git clone https://github.com/cms-analysis/HiggsAnalysis-CombinedLimit.git HiggsAnalysis/CombinedLimit
echo 'after git clone'
cd HiggsAnalysis/CombinedLimit
git fetch origin
git checkout v5.0.2
scramv1 b clean; scramv1 b
echo "arguments:"
for a in "$@" ; do
echo -n "$a "
done
cd ${SPAWNPOINT}
pwd
ls
# copy files into the work directory of the worker node
cp /storage/a/mschnepf/tHq/13tev/full_workdir/${1}/tH_comb_${1}.txt .
cp /storage/a/mschnepf/tHq/13tev/full_workdir/${1}/tH_?m_${1}.root .
cp /storage/a/mschnepf/tHq/13tev/full_workdir/${1}/tH_?m_${1}.txt .
# start limit calculation
combine -M Asymptotic tH_comb_${1}.txt >> limits_${1}.txt
ls -la
# move result files to file server
mv limits_${1}.txt ${OUTPUT_PATH}/limits_${1}.txt
mv higgsCombineTest.Asymptotic.mH120.root ${OUTPUT_PATH}/limits_${1}.root
echo '### end of job ###'
## end of script
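job.sh moves its results to the file server under names derived from the job argument. A small sketch of that naming scheme; the paths are the ones from the script (with the USER placeholder left as-is), and the helper `result_names` is hypothetical:

```python
def result_names(arg, output_path="/storage/a/USER/results/"):
    """Build the destination file names job.sh moves its results to."""
    return (output_path + "limits_{}.txt".format(arg),
            output_path + "limits_{}.root".format(arg))


print(result_names(3))
```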