Re: Multiple Jobs

From: Vermaas, Josh (vermaasj_at_msu.edu)
Date: Sun Feb 06 2022 - 10:52:58 CST

What I do is use symlinks. All of my directories have a "system.psf" that points to the correct psf in a "build" directory where I make all of my systems. Same story for "system.pdb", and you can get really creative with this. I use generic output names that increment based on how many dcd files are already present in the directory, so something like:

set numrun [llength [glob -nocomplain run*dcd]]
set outputname [format "run.%03d" $numrun]

You could also do it with environment variables. I think everyone has their own way of doing this, but basically anyone who has hundreds of simulations to manage develops some sort of workflow script (or scripts) to keep everything straight.
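As a sketch of the environment variable route (the names here are just illustrative, assuming the job script exports a SYSNAME variable; NAMD config files are parsed by Tcl, so the standard env array is available):

# SYSNAME is a hypothetical variable exported by the submission script,
# e.g. via "sbatch --export=SYSNAME=protein42 min.slurm"
set sysname $env(SYSNAME)
structure build/$sysname.psf
coordinates build/$sysname.pdb
outputname min.$sysname

One generic config plus a different SYSNAME per job then covers all the systems.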

-Josh

From: <owner-namd-l_at_ks.uiuc.edu> on behalf of "McGuire, Kelly" <mcg05004_at_byui.edu>
Reply-To: "namd-l_at_ks.uiuc.edu" <namd-l_at_ks.uiuc.edu>, "McGuire, Kelly" <mcg05004_at_byui.edu>
Date: Sunday, February 6, 2022 at 3:47 AM
To: "namd-l_at_ks.uiuc.edu" <namd-l_at_ks.uiuc.edu>
Subject: namd-l: Multiple Jobs

Has anyone ever submitted lots of jobs (say, 100 minimization jobs for 100 different protein systems) in parallel with SLURM using only one minimization configuration file, defining environment variables for the coordinates, structure, consref, conskfile, outputname, dcdfile, and restartname that are passed from your bash script to the configuration file?

Or is there no way around making a separate minimization file for each of the 100 protein systems, plus 100 annealing files and 100 equilibration files?
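(For concreteness, the config side of that idea might look like the sketch below; each variable name is illustrative and would have to match whatever the bash script exports.)

# Hypothetical environment variables set by the SLURM submission script
coordinates $env(COORFILE)
structure $env(PSFFILE)
consref $env(CONSREF)
conskfile $env(CONSKFILE)
outputname $env(OUTPUTNAME)
dcdfile $env(DCDFILE)
restartname $env(RESTARTNAME)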


Dr. Kelly L. McGuire

PhD Biophysics

Department of Physiology and Developmental Biology

Brigham Young University

LSB 3050

Provo, UT 84602

