ATLAS

ATLAS-related resources on Greenplanet

There are currently several ATLAS/UCLHC/OSG resources operating on the cluster. Some are restricted to certain groups and users.

  • The general ATLAS login node gpatlas1.ps.uci.edu can submit jobs to the local "atlas" Slurm partition, as well as to ATLAS resources on the Open Science Grid (which includes OSG jobs running on the local atlas compute nodes).
  • The "atlas" partition compute nodes: seven 8-core systems with 24 GB RAM each, plus one 12-core system with 24 GB RAM.
  • The Taffard Group node uclhc-1.ps.uci.edu can submit jobs to the local "atlas" Slurm partition and, through HTCondor, to the local uclhc pool and to ATLAS resources on the Open Science Grid (which includes OSG jobs running on the local atlas compute nodes).
  • The Taffard Group node uclhc-2.ps.uci.edu is like uclhc-1, but it also runs the local HTCondor instance. This node additionally hosts the node-locked FPGA simulator software.
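As an illustration, a job can be submitted to the local partition from gpatlas1 or the uclhc nodes with a standard Slurm batch script. Only the partition name "atlas" comes from this page; the job name, resource requests, and payload below are hypothetical placeholders:

```shell
#!/bin/bash
# Hypothetical batch script for the local "atlas" Slurm partition.
# Only the partition name is taken from this page; everything else
# (job name, resources, time limit, payload) is illustrative.
#SBATCH --partition=atlas
#SBATCH --job-name=atlas-test
#SBATCH --ntasks=1
#SBATCH --mem=2G
#SBATCH --time=01:00:00

# Trivial payload: report which compute node ran the job.
hostname
```

Save as (for example) job.sh, submit with `sbatch job.sh`, and monitor with `squeue -p atlas`. For the HTCondor path from the uclhc nodes, a minimal submit description file might look like the following sketch (file names and the executable are placeholders):

```shell
# Hypothetical HTCondor submit description; submit with `condor_submit`
# from uclhc-1 or uclhc-2. All file names here are placeholders.
universe   = vanilla
executable = test.sh
output     = test.out
error      = test.err
log        = test.log
queue
```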

User accounts are shared across gpatlas1, uclhc-1, uclhc-2, and the compute nodes; all of them mount the user home directories and /DFS-L.

The general-purpose Greenplanet login nodes (gplogin2 and gplogin3) can also be used for account and file access, but they do not have the ATLAS software stack installed. They also require the UCI VPN for access from off campus.

All login nodes are attached to the Science DMZ "LightPath" for high-throughput, low-bottleneck connections to the Internet. They sit outside the UCI campus firewall but still accept inbound connections from UCI. The gpatlas and uclhc nodes additionally allow CERN- and UCSD-related traffic.