Add Urania machine and validate flag to scheduler #6184
Conversation
Force-pushed 2700b51 to 5a846f9
Force-pushed 5a846f9 to 9ab841b
Force-pushed 6f41978 to 455043b
{{ super() -}}
#SBATCH -D ./
#SBATCH --nodes {{ num_nodes | default(1) }}
#SBATCH --ntasks-per-node=1
We have found that 2 tasks per node work well on other clusters of this size, maybe even 3. I see you currently use 1 task but 2 comm threads. Have you tried 2 tasks with 1 comm thread each? Not sure if there's a difference. @knelli2 @nilsdeppe, do you know?
Typically each task = an MPI rank = a comm core
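For illustration, a sketch of what the suggested alternative layout could look like in the submit script. The node size, `+ppn` value, and executable name are assumptions for the example, not values from this PR; `+ppn` is the Charm++ SMP option for worker threads per process.

```shell
# Hypothetical layout per the review suggestion: 2 tasks (= 2 MPI ranks
# = 2 comm cores) per node, 1 comm thread each, instead of 1 task with
# 2 comm threads. Values below are illustrative only.
#SBATCH --nodes 1
#SBATCH --ntasks-per-node=2
#SBATCH --cpus-per-task=36   # assumes a 72-core node; adjust to the hardware

# In Charm++ SMP mode each rank uses +ppn worker threads plus one
# communication thread, so reserve one core per rank for comm.
srun -n ${SLURM_NTASKS} ./MyExecutable +ppn $((SLURM_CPUS_PER_TASK - 1)) input.yaml
```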
I haven't really experimented with this. So I don't know.
Force-pushed eddd75c to 3d4f862
nilsvu left a comment
You can squash
support/Environments/urania.sh (Outdated)
spectre_setup_charm_paths() {
    # Define Charm paths
    export CHARM_ROOT=/u/guilara/charm_impi_2/mpi-linux-x86_64-smp
    export PATH=$PATH:/u/guilara/charm_impi_2/mpi-linux-x86_64-smp/bin
I suggest you create a module file like /u/guilara/modules/charm/7.0.0-impi-2 that sets these paths. Then you can module load it below and don't have to call this extra spectre_setup_charm_paths function.
I don't think I want to tackle this right now. Installing Charm was a bit tricky.
Then it's better to set these paths in spectre_load_modules; otherwise you always have to call this extra function?
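A minimal sketch of what folding the Charm++ paths into spectre_load_modules could look like. The surrounding function body and the `impi/2021.7` module name are assumptions based on this conversation, not copied from the PR:

```shell
spectre_load_modules() {
    # Load cluster modules (guarded so the sketch also runs where the
    # `module` command is unavailable). Module name is illustrative.
    if command -v module > /dev/null; then
        module load impi/2021.7
    fi

    # Set the Charm++ paths directly here instead of in a separate
    # spectre_setup_charm_paths function, so one call suffices.
    export CHARM_ROOT=/u/guilara/charm_impi_2/mpi-linux-x86_64-smp
    export PATH=$PATH:${CHARM_ROOT}/bin
}

spectre_load_modules
```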
support/Environments/urania.sh (Outdated)
    -D Python_EXECUTABLE=${SPECTRE_HOME}/env/bin/python \
    -D Catch2_DIR=/u/guilara/repos/Catch2/install_dir/lib64/cmake/Catch2 \
    -D MPI_C_COMPILER=/mpcdf/soft/SLE_15/packages/skylake\
/impi/gcc_11-11.2.0/2021.7.1/bin/mpigcc \
So cmake doesn't find these without help even though the impi/2021.7 module is loaded?
I just tried removing them. I must have added them because it couldn't find it.
-- Could NOT find MPI_C (missing: MPI_C_LIB_NAMES MPI_C_HEADER_DIR MPI_C_WORKS)
-- Could NOT find MPI (missing: MPI_C_FOUND C)
Is MPI needed? If everything works without explicitly linking MPI then just remove these lines.
I think we do need it, no?
No, we don't link to it explicitly; only Charm is built with it.
Force-pushed 2c0688e to 1438c07
knelli2 left a comment
I think this looks good for now. If you want to change anything later, you can.
Please squash this all into two commits (after @nilsvu has looked at this again): one for the new validate option in Python, and one for adding the Urania env, machine, and submit files.
support/SubmitScripts/Urania.sh (Outdated)
source ${SPECTRE_HOME}/support/Environments/urania.sh
spectre_load_modules
spectre_setup_charm_paths
[optional] You might also want to run `module list` after this so you can see what's loaded.
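A sketch of the submit-script preamble with the suggested `module list` added, so the loaded environment is recorded in the job's output (lines are taken from the snippet above; only the last line is the suggested addition):

```shell
# Set up the SpECTRE environment for this job, then record the loaded
# modules in the job's stdout for later debugging.
source ${SPECTRE_HOME}/support/Environments/urania.sh
spectre_load_modules
spectre_setup_charm_paths
module list
```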
support/Environments/urania.sh (Outdated)
source /urania/u/guilara/repos/spack/var/spack/environments\
/env3_spectre_impi/loads
# Load python environment
source /u/guilara/envs/spectre_env
missing "bin/activate" ?
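If that path is indeed a virtualenv directory, as the comment suggests, the activation line would presumably read:

```shell
# Load python environment (assuming /u/guilara/envs/spectre_env is a
# virtualenv directory, per the review comment above).
source /u/guilara/envs/spectre_env/bin/activate
```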
Force-pushed aa58e5e to 303d00e
@nilsvu I know it's been a while. I've rebased this. Hopefully we can get this merged. Unfortunately, spack loads doesn't work for me on either Urania or Viper, so I've kept the "slow" way of activating the spack environment.
Head branch was pushed to by a user without write access
Force-pushed dd3303e to bfc8b53
@nilsvu I've just removed the function to set the charm path here.
Add no-validate flag to scheduler
Change name of option
Add environment file
Clean script
Fix validate flag
Fix
Force-pushed bfc8b53 to 032dec9
Proposed changes
Add submit script information for Urania. Add a no-validate flag to the scheduler to prevent it from attempting to run MPI on the head node (#6183).
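A minimal sketch of how such a flag can be wired into a Python CLI. The function and option names here are illustrative, not the actual SpECTRE scheduler API; the point is only the default-on/opt-out pattern:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """CLI sketch: input-file validation is on by default and can be
    disabled on clusters where validation would launch MPI on the
    head node."""
    parser = argparse.ArgumentParser(description="Schedule a simulation.")
    parser.add_argument(
        "--no-validate",
        dest="validate",
        action="store_false",
        help="Skip validating the input file before submitting the job.",
    )
    return parser
```

With `action="store_false"`, omitting the flag leaves `args.validate` as `True`, so existing workflows keep validating unless they opt out.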
Upgrade instructions
Code review checklist
- The code is documented: run `make doc` to generate the documentation locally into `BUILD_DIR/docs/html`, then open `index.html`.
- The code follows the code review guide.
- The PR is labeled `bugfix` or `new feature` if appropriate.

Further comments