manageTier3SW package

eGroup and Help

Please subscribe to the eGroup:

atlas-adc-tier3-managers

This is the forum to ask questions about this package or ATLASLocalRootBase, request help or new features, and to read announcements.

Archives of the forum can be found here.

Brief Overview

This package will install, update and remove software such as ATLASLocalRootBase, AtlasSetup, DQ2, Ganga, gLite, Pacman, PandaClient, ROOT and other tools on your laptop, desktop or cluster. The versions installed or updated are those tested by ATLAS Canada User Support.

The software installation closely follows the developers' instructions; if you want to understand the procedure, please refer to the individual packages' installation guides. The manageTier3SW, ATLASLocalRootBase and configuration packages are checked out from git, so the code is available for inspection. Because the developers' installation procedures are followed, some cron jobs may end up running at your site; this is the case for gLite, where cron jobs update certificates on a regular basis.

Note that this provides a fast (~1 hour!) way to set up your computer.

The rest of this document is meant for the local ATLAS administrator to install manageTier3SW; if you are a local ATLAS user, please refer to ATLASLocalRootBase on how to use the user interface software.

Installation

For installation instructions, please see the README file. We strongly recommend installing this in a non-privileged account (e.g. atlasadmin) that is not used for any other purpose.

manageTier3SW / ATLASLocalRootBase is also installed on cvmfs; if you have a uniform 64-bit OS environment and cvmfs is mounted, you can access it from there and avoid a local installation.

Note that

  • Only the non-privileged atlasadmin account's directory and the $ATLAS_LOCAL_ROOT_BASE directory will be modified by this package.
  • You can install this on an NFS-exported file system to share it with desktops at your site.
  • Although this installation has only been tested on Scientific Linux 6 and 7, there is an option to make it work on other platforms; for example, use the option for pacman as shown here: updateManageTier3SW.sh --pacmanOptions="-pretend-platform SLC-6".

Warnings

  • Do not delete or change any of the directories in $ATLAS_LOCAL_ROOT_BASE.
  • You may only add directories/files of your own if their names begin with "local".
  • Please do not modify ("hack") the installed files, or user support will no longer be possible; if you need changes or fixes, please request them in the eGroup.
  • manageTier3SW and ATLASLocalRootBase are designed so that the only directory path that users need to know is the environment variable ATLAS_LOCAL_ROOT_BASE. Explicit directory paths are hidden from users; if your site requires users to define or change other directory paths, you may not be using this package correctly.

The updateManageTier3SW.sh script will install all the software needed by an ATLAS user at a Tier3 or on a personal desktop or laptop. It will also remove obsolete versions (see the Cleanup section below). Athena kits are not installed; these can be accessed from the centrally managed cvmfs (recommended - see the Mount cvmfs section below). Please also read the Post-installation and How-to sections below after installation.

Post-installation

Daily Updating

To maintain your software, simply log in to the ATLAS administrator account (atlasadmin) and type updateManageTier3SW.sh. It is recommended that you run this script daily, e.g. as a cron job like the one sketched below.
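
As an illustration only (the script path, log path and schedule below are assumptions; adjust them to wherever updateManageTier3SW.sh actually lives in your atlasadmin account), a crontab entry for the atlasadmin user could look like:

# run the daily update at 03:00 and keep a log of each run (paths are examples)
0 3 * * * /home/atlasadmin/manageTier3SW/updateManageTier3SW.sh >> /home/atlasadmin/logs/updateManageTier3SW.log 2>&1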

Specify Frontier-squid

There are example files in $ATLAS_LOCAL_ROOT_BASE/exampleLocalConfig (localFrontierSquid.sh and localFrontierSquid.csh); please copy them:

cp $ATLAS_LOCAL_ROOT_BASE/exampleLocalConfig/localFrontierSquid.sh $ATLAS_LOCAL_ROOT_BASE/config/localFrontierSquid.sh 
cp $ATLAS_LOCAL_ROOT_BASE/exampleLocalConfig/localFrontierSquid.csh $ATLAS_LOCAL_ROOT_BASE/config/localFrontierSquid.csh 

Next, edit the files to point to your Tier3 squid server (if you have one) or to your nearest friendly squid server (if another Tier2 or Tier3 allows you access).
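
As a sketch only (the variable name FRONTIER_SERVER and the URLs below are assumptions, not taken from this page; follow the comments inside the shipped example files for the actual settings), the edited localFrontierSquid.sh might export something like:

# hypothetical example - replace the proxyurl with your own or your nearest friendly squid
export FRONTIER_SERVER="(serverurl=http://frontier.example.org:8000/atlr)(proxyurl=http://squid.mysite.example.org:3128)"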

If you want to install your own Tier3 squid server, please see these Frontier-squid Instructions.

Mount cvmfs (Recommended)

ATLASLocalRootBase is designed to work with cvmfs, which provides Athena releases and CDB pool files; it will automatically make these available if cvmfs is mounted on the computing node. It is recommended that cvmfs be mounted on every machine that supports interactive and batch users; please see the cvmfs Instructions.

Testing

There are a few tests you can do to check that your installation and setup are good; first, do

setupATLAS
diagnostics

  • run checkOS to see if you have any missing rpms (you can ignore strace64).
  • run db-fnget to check that your Frontier-squid server settings work.
  • run db-readReal to check that Frontier-squid and CDB pool file access work inside a release. To do this:
    asetup 16.6.7,slc5
    db-readReal

Proof-on-Demand (PoD)

If you want to disable Proof-on-Demand (I recommend against this; it is only useful if you have a separate PROOF cluster that you prefer to use), you can do one of the following (the corresponding commands are sketched after the list):

  • For sites installing manageTier3SW, create a file $ATLAS_LOCAL_ROOT_BASE/config/localNoPOD or
  • For sites using ATLASLocalRootBase from cvmfs, create a file $ALRB_localConfigDir/localNoPOD or
  • You can set an environment variable ALRB_noPOD for all users.
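
A minimal sketch of the corresponding commands (the value assigned to ALRB_noPOD below is an assumption; this page does not state which values are recognised):

touch $ATLAS_LOCAL_ROOT_BASE/config/localNoPOD     # site installation of manageTier3SW
touch $ALRB_localConfigDir/localNoPOD              # ATLASLocalRootBase from cvmfs
export ALRB_noPOD=YES                              # environment variable for all users (value is an assumption)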

How-to

Cleanup (auto-cleanup)

The command updateManageTier3SW.sh will do an auto-cleanup of obsolete software versions. Installations and cleanups are usually announced in the eGroup atlas-adc-tier3sw-install@cern.ch.

Logging

Information on installations, removals, patching, etc. can be found in the file $ATLAS_LOCAL_ROOT_BASE/logDir/installed.
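
For example, to glance at the most recent entries:

tail -n 20 $ATLAS_LOCAL_ROOT_BASE/logDir/installed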

In addition, sites can write their own customized logs. The way to do this is outlined in the file $ATLAS_LOCAL_ROOT_BASE/utilities/example_localLogFile.sh. The steps are:

  • cp $ATLAS_LOCAL_ROOT_BASE/utilities/example_localLogFile.sh $ATLAS_LOCAL_ROOT_BASE/config/localLogFile.sh
  • edit $ATLAS_LOCAL_ROOT_BASE/config/localLogFile.sh only in Section2.

Post setup scripts

Sites have the option to run local scripts after a user does setupATLAS. Simply create two files in $ATLAS_LOCAL_ROOT_BASE/config named localPostUserSetup.sh and localPostUserSetup.csh. This facility should be used with great care and tested by the site admin, as it will affect all users on the site.

Users can skip running these local files with the option setupATLAS --noLocalPostSetup.
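
As a sketch only (the message and variable below are purely illustrative; the actual contents are entirely up to the site), localPostUserSetup.sh could contain something like:

# hypothetical site-specific additions; keep this fast and free of side effects
echo "Welcome to the MySite Tier3 - local documentation: https://mysite.example.org/tier3"
export MYSITE_SCRATCH=/scratch/$USER    # hypothetical site variable

Remember to provide the equivalent localPostUserSetup.csh for csh-family shells.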

Condition Pool Files

For Tier3s, a snapshot that should satisfy all normal users is available on a FUSE-mounted volume (cvmfs), and ATLASLocalRootBase uses it automatically.

As these conditions pool files are large, ever increasing in size, and require frequent updating, I prefer to avoid supporting local installations; they may lead to disk-space issues (if not carefully monitored) that then cascade into more serious problems.

However, if you need it locally, you can install it in $ATLAS_LOCAL_ROOT_BASE/localATLASConditionsData and create your PoolFileCatalog.xml as directed in the ATLAS instructions. Copy the PoolFileCatalog.xml to $ATLAS_LOCAL_ROOT_BASE/Athena/conditions/poolcond/PoolFileCatalog.xml (yes, for this one case, you may create the parent directories even though their names do not begin with "local"). When your users next do setupATLAS, the environment will point to those conditions pool files if cvmfs is not available.
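
A minimal sketch of the copy step (the source path of your prepared PoolFileCatalog.xml is a placeholder):

mkdir -p $ATLAS_LOCAL_ROOT_BASE/Athena/conditions/poolcond
cp /path/to/your/PoolFileCatalog.xml $ATLAS_LOCAL_ROOT_BASE/Athena/conditions/poolcond/PoolFileCatalog.xml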

FAQ

Requirements for a Tool to be Installed

  • must be needed or potentially useful for many ATLAS users
  • must work on lxplus as well as outside CERN at Tier3s
  • should not be available through lcgenv (/cvmfs/sft.cern.ch)
  • must have a clear support structure (i.e. a responsible person/people or an eGroup)
  • needs a Twiki page if it is not clear what the tool does
  • installation requirements:
    • on vanilla SL6 and SL7 machines with HepOSLibs (no virtualenv, pip, etc.)
    • relocatable (i.e. once installed on /cvmfs, it must still work if /cvmfs is mounted at e.g. /myMount/cvmfs)
    • strict versioning: the version installed today must be identical to that installed at any time in the future
    • installed files at any site should be identical (except for local paths, which should be relocatable through environment variables)

Release Notes

Release notes are found here.

-- AsokaDeSilva - 25 Jun 2008
