Hi,
- Can I work with code_aster as a Python module outside of the container, i.e. by executing a ".py" file on the host desktop? How can I import the module outside of the container?
- Will salome_meca be released only with Singularity, or will a universal binary also be released?
- How do I add macros to the Singularity folders?
- How do I run "./as_run" of the aster frontend in salome_meca?
- Will you continue to release full source code packages for manual installation?
Last edited by ing.nicola (2021-10-01 20:51:48)
Offline
Hi Nicola,
You will be able to do all of that from within the Singularity container of the LGPL version:
> Running a Python script located on the host machine (the user's home is automatically mounted in the container)
> Running as_run (or any other app) using the salome shell
> Adding macros to the salome installation by using the bind features of Singularity
Not sure if a "universal" installer will be produced for the LGPL version.
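For example, from the host (a minimal sketch following the pattern used later in this thread; the image name and paths are placeholders to adapt):
# run a Python script stored in your home directory (home is auto-mounted)
singularity run <salome_meca_image>.sif shell -- python3 /home/<user>/my_script.py
# run as_run on an export file through the same salome shell
singularity run <salome_meca_image>.sif shell -- as_run --run /home/<user>/case.export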
Offline
What about importing code_aster in Visual Studio and running it outside of the container?
Offline
> Running a Python script located on the host machine (the user's home is automatically mounted in the container)
When I try to run a Python script in shell mode, "import code_aster" fails: ImportError: No module named 'code_aster'.
> Running as_run (or any other app) using the salome shell
In shell mode, "./as_run --test forma01a" gives me error 4: module numpy not found.
> Adding macros to the salome installation by using the bind features of Singularity
In shell mode, when I try to copy macro files into the code_aster folders, it gives the error: "read-only file system".
I am very worried.
I use salome_meca both from the interface and from the terminal via as_run, and I have written many macros to facilitate some operations.
If only the Singularity version is released, I will have to continue using the 2020 version.
Offline
A request to the Code_Aster experts:
Kindly make a video tutorial on installation. A lot of queries would get solved.
Offline
> A request to the Code_Aster experts:
> Kindly make a video tutorial on installation. A lot of queries would get solved.
1. Aptly said by sameer21101970.
2. I hope the developers/experts (within/outside EDF) will help us soon with the installation process.
3. This is necessary since many of us here, like me, are engineers with no formal training in IT or software engineering.
4. The container being very new, we do need professional help to get started.
Hope to hear from the moderators/experts soon.
With Best Regards
Sukumar
Offline
The container version of SM2021 (Singularity) is much more robust and is also easy to install.
Developer of Code_Aster
Offline
I have installed macros by creating a writable overlay, as described in //sylabs.io/guides/3.5/user-guide/persistent_overlays.html
So, using
singularity exec --bind <path_local>:<path_container> -w <name_file_sif>.sif cp <path_container>/macro_ops.py /opt/..../MacroCommands/macro_ops.py
I installed the macros.
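An alternative (an untested sketch; the paths are placeholders and the MacroCommands location must match your installation) is to use the bind feature mentioned earlier in the thread: mount a host directory containing your macros over the corresponding directory in the container, which avoids making the image writable:
singularity run -B /home/<user>/my_macros:<path_to_MacroCommands_in_container> <salome_meca_image>.sif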
> The container version of SM2021 (Singularity) is much more robust and is also easy to install.
I agree, an excellent solution, but some problems arise.
I keep trying to run "./as_run" of the code_aster frontend, but it doesn't work: it keeps giving me "can't import numpy".
An option like this would be useful:
./<salome_meca_launcher> --as_run <file.export>
It would be wonderful to have the MPI version too (no more struggles with installations!).
Last edited by ing.nicola (2021-10-05 10:48:47)
Offline
The MPI prerequisites are included in the container. You only have to recompile code_aster inside the container.
Offline
> I keep trying to run "./as_run" of the code_aster frontend, but it doesn't work: it keeps giving me "can't import numpy".
singularity run -B /tmp:/local00/tmp ~/containers/xxxxx.sif shell -- as_run --run case.export
Works like a charm
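If you want something like the "--as_run <file.export>" launcher option wished for above, a tiny host-side wrapper is enough (a sketch; the image path is an assumption, point it at your own .sif):
#!/bin/bash
# run_as_run.sh: forward an export file to as_run inside the container
IMAGE=~/containers/salome_meca-lgpl-2021.0.0-1-20210811-scibian-9.sif
singularity run -B /tmp:/local00/tmp "$IMAGE" shell -- as_run --run "$1"
Usage: ./run_as_run.sh case.export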
Offline
> singularity run -B /tmp:/local00/tmp ~/containers/xxxxx.sif shell -- as_run --run case.export
> Works like a charm
It works for me! How the hell did you do that!?
> The MPI prerequisites are included in the container. You only have to recompile code_aster inside the container.
I think the prerequisites are these?:
Singularity> cd /opt/public/20210811/gcc8-openblas-ompi2
Singularity> ls
asrun-2021.0.0-1 grace-0.0.1 med-4.1.0 mfront-3.4.0 mumps-5.2.1_consortium_aster3 scalapack-2.1.0 VERSION
ecrevisse-3.2.2 hdf5-1.10.3 medcoupling-V9_7_0 miss3d-6.7_aster6 parmetis-4.0.3_aster3 scibian9_mpi.sh
gmsh-2.12.0-Linux64 homard-11.12_aster2 metis-5.1.0_aster4 mpi4py-3.0.3 petsc-3.15.3_aster3 scotch-6.0.4_aster7
- Do I have to compile both the sequential and the MPI version in a new folder?
- How do I compile only the MPI version with links to the existing prerequisite libraries?
- How do I make the MPI version visible to salome_meca, so that it can be chosen among stable, oldstable, ..., stable_mpi?
- What if the MPI version was already installed but I can't see it?
Last edited by ing.nicola (2021-10-06 10:31:18)
Offline
> - Do I have to compile both the sequential and the MPI version in a new folder?
You only need to compile the MPI version.
> - What if the MPI version was already installed but I can't see it?
There is no pre-installed MPI version.
> What about importing code_aster in Visual Studio and running it outside of the container?
Visual Studio supports only Docker for remote development.
> - How do I compile only the MPI version with links to the existing prerequisite libraries?
I built the version without PETSc; issues occur with PETSc.
I followed this procedure:
- Download the stable version from sourceforge.net/p/codeaster/src/ci/15.4.0/tree/
- Place the unzipped folder, renamed "aster-src", in /home/
- It is necessary to add an overlay to the .sif image (see the Singularity site)
- In a terminal:
sudo singularity run --bind /home:/home -w <full_path_of_image>.sif shell
Singularity> export TOOLS="/opt/salome_meca/Salome-V2021-s9/tools"
Singularity> export ASTER_ROOT="${TOOLS}/Code_aster_15_4_0_mpi"
Singularity> cd /home/aster-src
Singularity> ./waf_mpi configure --prefix=${ASTER_ROOT} --disable-petsc --install-tests --jobs=4
Singularity> ./waf_mpi build --jobs=4
Singularity> ./waf_mpi install
Singularity> echo "vers : stable_mpi:/opt/salome_meca/Salome-V2021-s9/tools/Code_aster_15_4_0_mpi/share/aster" >> ${TOOLS}/Code_aster_frontend-2021001/etc/codeaster/aster
That's all ! (?)
> - How do I make the MPI version visible to salome_meca, so that it can be chosen among stable, oldstable, ..., stable_mpi?
The last instruction in the code above makes the MPI version visible to salome_meca and to as_run.
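To check that the registration worked, you can simply re-read the file that was appended to (and, if your frontend provides it, "as_run --info" should also list the new version):
Singularity> cat ${TOOLS}/Code_aster_frontend-2021001/etc/codeaster/aster
# the last line should be the "vers : stable_mpi:..." entry added above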
Excited about the singularity version of salome!
I'm testing it ..
Last edited by ing.nicola (2021-10-11 11:41:47)
Offline
Hello Ing. Nicola!
Hats off, if that works. Unfortunately, if I execute
sudo singularity run --bind /home:/home -w ~/salome_meca-lgpl-2021.0.0-1-20210811-scibian-9.sif shell
I get:
FATAL: no SIF writable overlay partition found in /home/simulation2/salome_meca-lgpl-2021.0.0-1-20210811-scibian-9.sif
What am I doing wrong (the file is writeable, if that is your first idea..)?
Thank you,
Mario.
EDIT: If I add a tmpfs I get:
sudo singularity run --bind --writable-tmpfs /home:/home -w ~/salome_meca-lgpl-2021.0.0-1-20210811-scibian-9.sif shell
FATAL: could not open image /home:/home: failed to retrieve path for /home:/home: lstat /home:: no such file or directory
Last edited by mf (2021-10-11 10:53:47)
Offline
> - It is necessary to add an overlay to the .sif image (see the Singularity site)
See sylabs.io/guides/3.5/user-guide/persistent_overlays.html, section "Overlay embedded in SIF"; it is necessary for writing files.
The overlay is embedded, so you can delete overlay.img afterwards. Use at least 1000 MB, not 500 MB.
dd if=/dev/zero of=overlay.img bs=1M count=1000 && mkfs.ext3 overlay.img
singularity sif add --datatype 4 --partfs 2 --parttype 4 --partarch 2 --groupid 1 <full_path_to_sif_file>.sif overlay.img
"-w" goes after the bind option:
sudo singularity run --bind /home:/home -w <full_path_of_image>.sif shell
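To confirm the overlay partition really ended up embedded in the image (a sketch; assuming the "singularity sif list" subcommand is available in your Singularity version):
singularity sif list <full_path_to_sif_file>.sif
# the ext3 overlay partition added above should appear among the listed data objects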
Last edited by ing.nicola (2021-10-11 15:11:19)
Offline
Hello again,
I encountered some errors which I was able to solve. For anyone who wants to recreate this, I write down the corrected commands below:
dd if=/dev/zero of=overlay.img bs=1M count=1000 && mkfs.ext3 overlay.img
singularity sif add --datatype 4 --partfs 2 --parttype 4 --partarch 2 --groupid 1 <full_path_to_sif_file>.sif overlay.img (without 'ubuntu' in between)
sudo singularity run --bind /home:/home -w <full_path_of_image>.sif shell
Singularity> export TOOLS="/opt/salome_meca/Salome-V2021-s9/tools"
Singularity> export ASTER_ROOT="${TOOLS}/Code_aster_15_4_0_mpi"
Singularity> cd /home/aster-src
Singularity> ./waf_mpi configure --prefix=${ASTER_ROOT} --disable-petsc --install-tests --without-hg --jobs=4 (without '--without-hg' gave an error)
Singularity> ./waf_mpi build --jobs=4
Singularity> ./waf_mpi install
Singularity> echo "vers : stable_mpi:/opt/salome_meca/Salome-V2021-s9/tools/Code_aster_15_4_0_mpi/share/aster" >> ${TOOLS}/Code_aster_frontend-2021001/etc/codeaster/aster
Amazing! Grazie ingegnere! I will do some tests now,
thanks a lot,
Mario.
Last edited by mf (2021-10-11 19:03:18)
Offline
Hello,
Too bad; although the compilation was successful, I still get an error using 'stable_mpi': a config file is missing. Did you leave anything out in your description?
Mario.
Offline
To use PETSc with the MPI version, you need a recent version of code_aster (at least 16.0.6).
To have code_aster in Visual Studio Code, source profile.sh in $INSTALL_DIR/std/share/aster, or add what you need to your PYTHONPATH.
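For example, from the container shell (a sketch; the install prefix shown is the one used earlier in this thread, and whether "import code_aster" succeeds may depend on the code_aster version):
Singularity> export ASTER_ROOT="/opt/salome_meca/Salome-V2021-s9/tools/Code_aster_15_4_0_mpi"
Singularity> source ${ASTER_ROOT}/share/aster/profile.sh
Singularity> python3 -c "import code_aster"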
Offline
> Hello,
> Too bad; although the compilation was successful, I still get an error using 'stable_mpi': a config file is missing. Did you leave anything out in your description?
> Mario.
Attached is my config.txt file. Try placing it in "/opt/salome_meca/Salome-V2021-s9/tools/Code_aster_15_4_0_mpi/share/aster".
Maybe you have to modify some paths in the file (the SRC rows are probably not needed).
> To have code_aster in Visual Studio Code, source profile.sh in $INSTALL_DIR/std/share/aster, or add what you need to your PYTHONPATH.
How do you source a file in the container? It is inside the .sif file.
Last edited by ing.nicola (2021-10-11 15:04:58)
Offline
> To use PETSc with the MPI version, you need a recent version of code_aster (at least 16.0.6).
I think MPI is work in progress for EDF...
Last edited by ing.nicola (2021-10-11 15:10:46)
Offline
Hello,
Thank you, but that didn't help either. Now I am just getting EXIT_CODE=0. I am missing the whole
/opt/salome_meca/Salome-V2021-s9/tools/Code_aster_15_4_0_mpi/
directory in the container; it was not installed. I suppose that is because I activated --without-hg (better: had to activate --without-hg)?
I will repeat the compilation; something is not right.
Mario.
Last edited by mf (2021-10-11 15:18:13)
Offline
When you are in the container in shell mode ("./salome_meca_lgpl --shell"), it is exactly as if you were in a terminal, so "source profile.sh" works.
PETSc often changes its C API, so if you want to use the MPI version of code_aster with PETSc, you have to be up to date.
Offline
> Hello,
> Thank you, but that didn't help either. Now I am just getting EXIT_CODE=0. I am missing the whole
> /opt/salome_meca/Salome-V2021-s9/tools/Code_aster_15_4_0_mpi/
> directory in the container; it was not installed. I suppose that is because I activated --without-hg (better: had to activate --without-hg)?
> I will repeat the compilation; something is not right.
> Mario.
I got the same error about the missing "config.txt" file.
I honestly don't remember how I solved it; yes, maybe I recompiled without the --without-hg option.
I also made several attempts... I have only reported the latest version of the instructions I used.
Offline
Hello again,
I started over, but I cannot compile without setting --without-hg (which in turn may be the cause of the missing directory...).
The command
./waf_mpi configure --prefix=${ASTER_ROOT} --disable-petsc --install-tests --jobs=4
gives me the error in the attached image.
So, at the moment I am stuck.
Ingegnere, could you please take a look at your history? Something may be missing. How did you download the CA sources? With
hg clone http: //hg.code.sf.net/p/codeaster/src codeaster-src ?
thank you
Mario.
ADDENDUM: I tried mercurial from 'sudo snap install mercurial' and 'sudo apt-get install mercurial'. They are different versions, both lead to this error.
ADDENDUM2: I also cloned the repo with the hg option --config format.sparse-revlog=no. No success either.
ADDENDUM3: I think the ./waf configure line is not entirely correct....
Last edited by mf (2021-10-11 17:03:36)
Offline
I have retried the whole procedure and it works.
I downloaded the sources from the https //sourceforge.net/p/codeaster/src/ci/15.4.0/tree/ link above (version 15_4_0) by pressing the "Download Snapshot" button.
Erase everything and start from scratch.
- Created the overlay
- First compile with the --without-hg option; it doesn't create the config.txt file (run "find /opt -name config.txt" to check)
- I compiled a second time without the --without-hg option. The error appears, but I ignored it and kept running "./waf_mpi build" and "./waf_mpi install". Since everything is already compiled, it is much faster. Now it creates the config.txt file (rerun "find /opt -name config.txt")
- echo "vers : stable_mpi:/opt/salome_meca/Salome-V2021-s9/tools/Code_aster_15_4_0_mpi/share/aster" >> ${TOOLS}/Code_aster_frontend-2021001/etc/codeaster/aster
- Then:
singularity run -w --app install <path_image>.sif
- I launched salome_meca from the script ./<image_name>
Last edited by ing.nicola (2021-10-11 18:29:30)
Offline
Hello,
I retried on another machine (@home) now and it works! Even with --without-hg it writes the config.txt now, but using the zipped CodeAster-SRC instead of the hg clone. I am not sure; maybe I made a mistake somewhere in my last try. On this machine I have Singularity 3.7.0 installed (but I don't think that is the reason; the previous machine had V3.5.3).
Tomorrow I will try the other workstation again from scratch. Maybe I will write a little tutorial on this with your permission.
Thank you very much, have a nice evening,
Mario.
Last edited by mf (2021-10-11 21:08:55)
Offline