Dear Yvan,
I am currently running on a Cray EX HPC system, and I am having an MPI issue. The installation instructions are written for Cray X machines (https://www.code-saturne.org/documentat ... utotoc_md0), and I am not sure whether this documentation applies to, or could be updated for, the EX version of Cray.
I have code_saturne v8.2 running across nodes, and the calculation itself seems to run well. My issue comes from the post-processing fields that can be called up after runtime (https://www.code-saturne.org/documentat ... names.html). An example of such a field name is algo:rij_pressure_strain_correlation.
My suspicion is that the colon (i.e. the "algo:" prefix) is reserved for MPI-related purposes, so an MPI error occurs in the post-processing stage of the code. To test this hypothesis, I changed the post-processing call to algo.rij_pressure_strain_correlation, and no error occurred.
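For context, the way I request this output from my user routines looks roughly like the sketch below. This is only illustrative: the helper name request_rij_output is my own, and I am assuming the algo: name is exposed as a regular field (if it is provided some other way, the lookup simply returns NULL).

/* Minimal sketch: request post-processing output of the field by name,
 * e.g. from cs_user_parameters() in cs_user_parameters.c.
 * "request_rij_output" is a hypothetical helper name. */

#include "cs_headers.h"

static void
request_rij_output(void)
{
  cs_field_t *f
    = cs_field_by_name_try("algo:rij_pressure_strain_correlation");

  if (f != NULL)
    cs_field_set_key_int(f, cs_field_key_id("post_vis"),
                         CS_POST_ON_LOCATION);
}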
Have other users encountered this as well? Is this an MPI installation issue, or a code_saturne installation issue?
Any advice on navigating this issue would be greatly appreciated.
Best regards,
Sean Hanrahan
MPI Installation on Cray EX HPC
Re: MPI Installation on Cray EX HPC
Hello,
Yes, we have encountered this, and modified the src/fvm/fvm_to_ensight_case.c file (now .cpp) accordingly to replace ":" with ".".
This is due to MPICH's ROMIO MPI-IO library reserving ":" in file names for its own purposes (i.e. some interpretation of metadata).
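In essence, the fix sanitizes the name before it is used to build the EnSight file names, along the lines of the sketch below (illustrative only; the actual change is in fvm_to_ensight_case.c/.cpp).

/* Illustrative sketch of the kind of substitution applied:
 * replace ':' with '.' in a name used to build EnSight file names,
 * so that ROMIO does not interpret the ':' as file-access metadata. */

static void
_replace_colons(char *name)
{
  for (char *p = name; *p != '\0'; p++) {
    if (*p == ':')
      *p = '.';
  }
}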
I just merged this to the 8.2 branch.
You can work around this in v8.2 by disabling MPI-IO.
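If rebuilding is not convenient, one possible way to switch output away from MPI-IO from the user routines is sketched below. This follows the cs_file examples shipped with cs_user_performance_tuning.c; please double-check that cs_file_set_default_access and the CS_FILE_STDIO_SERIAL option are available with this signature in your 8.2 install before relying on it.

/* Sketch (to be checked against your version): force standard serial
 * I/O instead of MPI-IO for output files, bypassing ROMIO's handling
 * of ':' in file names. Would go in cs_user_performance_tuning.c. */

#include "cs_headers.h"

void
cs_user_parallel_io(void)
{
  cs_file_set_default_access(CS_FILE_MODE_WRITE,
                             CS_FILE_STDIO_SERIAL,
                             MPI_INFO_NULL);
}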
Best regards,
Yvan
Re: MPI Installation on Cray EX HPC
Thanks for your help with this. I will modify the code on my git branch and reinstall. I'll post here if I encounter any issues.
Best regards,
Sean Hanrahan
Re: MPI Installation on Cray EX HPC
Hi Yvan,
I have tested this update and it worked for me.
Just a quick question about the algo:tke_production field. Is there a simple way to implement this for the k-omega model in code_saturne v8.2.2? The production field algorithms are available for epsilon-based models, but not omega-based ones, and it would be convenient to calculate production in the same way as the solver does.
Best regards,
Sean Hanrahan
For the interested reader who isn't using code_saturne v8.2.2:
The correction for this issue is documented here: https://github.com/code-saturne/code_sa ... 682e50R426