Hi there,
I get the following error trying to run salome in GPU mode:
not exist: /etc/krb5.conf
/usr/bin/nvidia-smi
**************************************
INFO : Running salome_meca in GPU mode
**************************************
runSalome running on ruy-Precision-T7610
Searching for a free port for naming service: 2811 - OK
Searching Naming Service + found in 0.1 seconds
Searching /Kernel/Session in Naming Service ++++SALOME_Session_Server: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by /.singularity.d/libs/libGLX.so.0)
Traceback (most recent call last):
File "/opt/salome_meca/Salome-V2021-s9/modules/KERNEL_V9_7_0/bin/salome/orbmodule.py", line 181, in waitNSPID
os.kill(thePID,0)
ProcessLookupError: [Errno 3] No such process
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/salome_meca/appli_V2021/bin/salome/runSalome.py", line 694, in useSalome
clt = startSalome(args, modules_list, modules_root_dir)
File "/opt/salome_meca/appli_V2021/bin/salome/runSalome.py", line 639, in startSalome
session=clt.waitNSPID("/Kernel/Session",mySessionServ.PID,SALOME.Session)
File "/opt/salome_meca/Salome-V2021-s9/modules/KERNEL_V9_7_0/bin/salome/orbmodule.py", line 183, in waitNSPID
raise RuntimeError("Process %d for %s not found" % (thePID,theName))
RuntimeError: Process 11753 for /Kernel/Session not found
I'm on Ubuntu 22.04 LTS
It runs fine in --soft mode.
Last edited by ruy (2022-08-01 23:12:04)
Offline
Hi Ruy,
It seems that you have the Nvidia drivers installed: the launch script detects /usr/bin/nvidia-smi. However, when it tries to run in graphical mode, it seems it cannot connect to the Nvidia drivers. This may be because your computer has dual GPUs.
For example, my laptop's Intel CPU comes with a low-power GPU (Intel Graphics), but I also have an Nvidia GPU. By default, the Intel drivers are loaded on my system, which makes it more power-efficient. If this is your case, you want to make sure your system is using the Nvidia graphics. This, however, varies from system to system and driver version... Since you are using the newest Ubuntu LTS version, I would guess you can use the new recommended method for newer Nvidia drivers:
Prefix your singularity command with: __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia
This will tell your system to activate the Nvidia drivers for that command. You can test this with the "glxinfo" command (on Ubuntu it is provided by the mesa-utils package). Run glxinfo with and without the variables above and see whether the vendor is Intel or Nvidia.
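For example, a minimal check (assuming glxinfo is installed) would be to compare:
glxinfo | grep -i "opengl vendor"
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia glxinfo | grep -i "opengl vendor"
The first command should report your Intel GPU; the second should report Nvidia if the offload variables are being picked up.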
If that does not work, tell me and I can share more info.
Best regards,
Fernando Oleo Blanco
Offline
hello Irvise
i have some trouble with the GUI of Salome-Meca and i tried to follow your advice by typing glxinfo
on my laptop this produces an output of over 500 lines
which ones are the interesting ones?
thanks
jean pierre aubry
consider reading my book
freely available here https://framabook.org/beginning-with-code_aster/
Offline
Hello Jean-Pierre,
first things first, thank you for your book, it helped me quite a bit.
Secondly, yes, the output of glxinfo can be overwhelming. I would recommend filtering it with grep. Quick and useful commands would be "glxinfo | grep -i vendor" and "glxinfo | grep -i device". The actually useful lines are more or less at the top of the output of glxinfo, under the sections "Extended renderer info" and "OpenGL". You can get a bit more useful output with "glxinfo | grep -i opengl".
Could you copy the output of those commands? That way I can see what graphics driver is being used.
Best regards,
Fer
Offline
I ran the singularity command with the prefix
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia singularity run --app install salome_meca-lgpl-2021.0.0-2-20211014-scibian-9.sif
But I still get the same error when running salome:
not exist: /etc/krb5.conf
/usr/bin/nvidia-smi
**************************************
INFO : Running salome_meca in GPU mode
**************************************
runSalome running on ruy-Precision-T7610
Searching for a free port for naming service: 2810 - OK
Searching Naming Service + found in 0.1 seconds
Searching /Kernel/Session in Naming Service +++++SALOME_Session_Server: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by /.singularity.d/libs/libGLX.so.0)
Traceback (most recent call last):
File "/opt/salome_meca/Salome-V2021-s9/modules/KERNEL_V9_7_0/bin/salome/orbmodule.py", line 181, in waitNSPID
os.kill(thePID,0)
ProcessLookupError: [Errno 3] No such process
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/salome_meca/appli_V2021/bin/salome/runSalome.py", line 694, in useSalome
clt = startSalome(args, modules_list, modules_root_dir)
File "/opt/salome_meca/appli_V2021/bin/salome/runSalome.py", line 639, in startSalome
session=clt.waitNSPID("/Kernel/Session",mySessionServ.PID,SALOME.Session)
File "/opt/salome_meca/Salome-V2021-s9/modules/KERNEL_V9_7_0/bin/salome/orbmodule.py", line 183, in waitNSPID
raise RuntimeError("Process %d for %s not found" % (thePID,theName))
RuntimeError: Process 15226 for /Kernel/Session not found
Offline
Hi Ruy,
run "singularity run --app install salome_meca-lgpl-2021.0.0-2-20211014-scibian-9.sif" as is alone. It should then create a Python excecutable with the same name as the container in that folder. Then run
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia ./salome_meca-lgpl-2021.0.0-2-20211014-scibian-9
It will most likely fail, however... Your error is related to GLIBC having issues with the GLX library. It happened to me a few times. I just tried several times and then it finally worked in a virtualised Linux Mint 21 (same base as Ubuntu 22.04). GLIBC errors are a bit tricky to fix because GLIBC is the "core" of the system; if a library has incompatibilities with it... Nonetheless, it should work. Ubuntu 22.04 has a newer GLIBC version that should be compatible as far as I know...
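If you want to see the version gap yourself, you can compare the host and container Glibc versions, e.g. (a quick sketch, assuming the .sif file is in the current directory):
ldd --version | head -n 1
singularity exec salome_meca-lgpl-2021.0.0-2-20211014-scibian-9.sif ldd --version | head -n 1
The container (Scibian 9, a Debian 9 derivative) should report a much older Glibc than your host's.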
Regards,
Fer
Offline
hello Irvise
it is good to know that my writing helped
concerning Salome-Meca here are my findings
first
glxinfo | grep -i vendor
returns nothing
second
glxinfo | grep -i device
Device: Mesa DRI Intel(R) HD Graphics 520 (Skylake GT2) (0x1921)
and finally
dedalus@stephen:~> glxinfo | grep -i opengl
OpenGL vendor string: Intel Open Source Technology Center
OpenGL renderer string: Mesa DRI Intel(R) HD Graphics 520 (Skylake GT2)
OpenGL core profile version string: 4.5 (Core Profile) Mesa 18.3.2
OpenGL core profile shading language version string: 4.50
OpenGL core profile context flags: (none)
OpenGL core profile profile mask: core profile
OpenGL core profile extensions:
OpenGL version string: 3.0 Mesa 18.3.2
OpenGL shading language version string: 1.30
OpenGL context flags: (none)
OpenGL extensions:
OpenGL ES profile version string: OpenGL ES 3.2 Mesa 18.3.2
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.20
OpenGL ES profile extensions:
in addition i attach a view of the Salome-Meca window with a mesh opened
jean pierre aubry
Last edited by jeanpierreaubry (2022-08-06 07:43:36)
consider reading my book
freely available here https://framabook.org/beginning-with-code_aster/
Offline
Hi Jean-Pierre,
as you can see from the output, your graphics provider is Intel; see the first line of "glxinfo | grep -i opengl".
From the output of glxinfo and your wallpaper, I think I can infer that you are using OpenSUSE Leap or SLES. Your Mesa driver is quite old (v18); bleeding-edge Linux distributions are already on v22 (I personally use OpenSUSE Tumbleweed and Leap). In the newest version of OpenSUSE Leap, 15.4, Mesa is at v21.2.
So, here are my recommendations:
- Make sure that you have your NVIDIA drivers installed. I am assuming you do have an NVIDIA graphics card! Luckily, OpenSUSE does have an official NVIDIA repository, but it has to be enabled manually. You can see the instructions here: en.opensuse.org/SDB:NVIDIA_drivers
- The process is simple and painless, though a couple of its steps may be convoluted. However, if you have doubts or questions, just ask them.
- Then we need to make sure that your computer actually picks the Nvidia drivers. This is a bit tougher... Since your system is quite old, it will most likely need the bumblebee/prime-select approach; see the sketch after this list. You can read more about it here: en.opensuse.org/SDB:NVIDIA_SUSE_Prime
- This step may be a little tougher to pull off correctly with the expected outcome. If you have any issues, feel free to write me here or directly.
- Afterwards, depending on the method that actually makes your computer use the Nvidia drivers, you should be set and everything should just run.
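As a rough sketch of the prime-select route (assuming the suse-prime package from the page above is installed):
sudo prime-select nvidia      # switch to the Nvidia GPU; log out and back in afterwards
sudo prime-select get-current # check which GPU is currently selected
If those commands are not available on your older release, the page above also documents the bumblebee alternative.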
Best regards,
Fer
Offline
thanks for that prompt answer
yes i have openSuSE 15.1
would it be more efficient to update straight away to 15.4?
however a question is puzzling me for quite a long time now:
why does a software like Gmsh, with quite good graphics capabilities, run straight away out of the box?
and why does Salome-Meca keep bringing such difficulties year after year?
and why has the singularity container, as far as i can see, complicated things even more?
thanks again
consider reading my book
freely available here https://framabook.org/beginning-with-code_aster/
Offline
Hi Jean-Pierre,
I am not part of the CA team, so I cannot answer your questions regarding their decisions; however, I can shed some light on the technical side below.
Now, regarding OpenSUSE Leap. Version 15.1 has been EOL (End Of Life) for a while now! Older Leap versions tend to be supported for about 9 months after the new release comes out, so only 15.3 and 15.4 are currently supported. I would assume that you cannot (normally) update your system, as most repositories for version 15.1 have been offline for a while. So you may indeed need to jump to 15.3 or 15.4 directly. This is bad, as normally only sequential point upgrades are supported: 15.1 to 15.2, 15.2 to 15.3, etc. So I would expect a bit of pain during the update... Depending on how much data you have on your system and on your backups, you may want to wipe 15.1 and install 15.4 directly, but I leave that decision to you. Nonetheless, I would make a backup whichever option you take.
Regarding graphical problems:
- Most software that we use comes from our operating system's repositories. It is built with the tools that our OS provides. You probably installed GMSH from your repositories and it just works, because it is compiled with the tools and libraries of your system. Even if you compile it locally, it still uses the libraries that your system has, so everything should work nicely.
- Salome (and by extension Salome_Meca) is a very complex and large program. I do not know of any distribution that actually provides packages for Salome. This means that the binaries we can download from this page or Salome's are built with the libraries and tools that the developers chose. This may introduce incompatibilities, as your system may not have the same versions as the developers'.
- Singularity was introduced to fix that. Singularity is a container technology. It basically runs an entire operating system internally (that is why the file is so large!). In this case, it runs Scibian 9, the Debian 9 derivative that the CA people seem to work with. The CA people built Salome_Meca in that container in a way that they know works... However, containers are also problematic... Most containers are just like lightweight virtualised systems. This means that programs running in containers are not simply running on your computer; they run in their own container. The container in turn has to plug itself into the hardware and the underlying operating system for some functionality, such as graphical hardware access...
- This is where the problem comes from... Singularity is a container technology developed for HPC (High Performance Computing). This means it tries to get as much power from the host operating system as possible while itself consuming the least amount of resources. But at the same time it has to keep the container software from doing weird stuff to the host (for security, stability, etc.). This means that graphical access is not as straightforward as it would seem... Sadly, Singularity only officially supports graphics acceleration for Nvidia and AMD cards, the latter being a bit experimental and not supported by the CA container. Intel cards, which are great in Linux, are not supported by Singularity, as they are not found in HPC environments... This is sad, I know.
I hope this answers some of your questions. Feel free to ask more if I was not clear or more issues pop up.
Regards,
Fer
Offline
Hi Ruy,
run "singularity run --app install salome_meca-lgpl-2021.0.0-2-20211014-scibian-9.sif" as is alone. It should then create a Python excecutable with the same name as the container in that folder. Then run
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia ./salome_meca-lgpl-2021.0.0-2-20211014-scibian-9
It will most likely fail, however... Your error is related to GLIBC having issues with the GLX library. It happened to me a few times. I just tried several times and then it finally worked in a virtualised Linux Mint 21 (same base as Ubuntu 22.04). GLIBC errors are a bit tricky to fix because GLIBC is the "core" of the system; if a library has incompatibilities with it... Nonetheless, it should work. Ubuntu 22.04 has a newer GLIBC version that should be compatible as far as I know...
Regards,
Fer
Thanks for the help. I tried out the procedure, but I still get the error. I checked my GLIBC version:
ldd (Ubuntu GLIBC 2.35-0ubuntu3.1) 2.35
Could that be the issue?
I ran
readelf -V /lib/x86_64-linux-gnu/libc.so.6 | grep GLIBC_2.34
And got several hits:
010: 26 (GLIBC_PRIVATE) 26 (GLIBC_PRIVATE) 10h(GLIBC_2.12) 23 (GLIBC_2.34)
014: 23 (GLIBC_2.34) 1dh(GLIBC_2.28) 2 (GLIBC_2.2.5) 2 (GLIBC_2.2.5)
018: 23 (GLIBC_2.34) 25 (GLIBC_PRIVATE) 2h(GLIBC_2.2.5) 25 (GLIBC_PRIVATE)
01c: 23 (GLIBC_2.34) 2 (GLIBC_2.2.5) 18 (GLIBC_2.23) 2h(GLIBC_2.2.5)
[... several hundred similar lines of the symbol version table trimmed; GLIBC_2.34 appears throughout ...]
0x04b8: Rev: 1 Flags: none Index: 35 Cnt: 2 Name: GLIBC_2.34
0x04f8: Parent 1: GLIBC_2.34
Last edited by ruy (2022-08-06 16:33:40)
Offline
Hi Ruy,
yes, the mismatch between your newer Glibc and the container's older one is most likely causing the issue. However, I intentionally left out a detail in my last message, hoping that it would not be required. There is a new version of the Singularity container; you can download it here: code-aster.org/FICHIERS/singularity/salome_meca-lgpl-2021.1.0-1-20220405-scibian-9.sif That is the version that I managed to get running on the new version of Linux Mint. It gave me the same error as yours a few times, but then it managed to work after a few tries. I hope this helps.
Regards,
Fer
Offline
Hi Fer,
Still no luck, get the same error:
"Searching /Kernel/Session in Naming Service ++++SALOME_Session_Server: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by /.singularity.d/libs/libGLX.so.0)"
Maybe an older version, Ubuntu 20.04 or something, may work?
Thanks a lot
Ruy
Offline
hey ruy, the GLIBC_2.34 not found error on my system was fixed by installing nvidia-container-toolkit from the official nvidia github location: https://nvidia.github.io/libnvidia-container . Now the singularity container is working with the nvidia hw rendering. Also the member emeff's instructions to add the mpi version of code aster worked; they are at https://github.com/emeff/Code-Aster-MPI... hope this helps.
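For reference, the Debian/Ubuntu repository setup described on that page at the time went roughly like this (a sketch only; the steps may have changed, so check the page itself):
distribution=$(. /etc/os-release; echo $ID$VERSION_ID)
curl -s -L https://nvidia.github.io/libnvidia-container/gpgkey | sudo apt-key add -
curl -s -L https://nvidia.github.io/libnvidia-container/$distribution/libnvidia-container.list | sudo tee /etc/apt/sources.list.d/libnvidia-container.list
sudo apt-get update && sudo apt-get install -y nvidia-container-toolkit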
Offline
Regarding graphical problems:
- Most software that we use comes from our operating system's repositories. It is built with the tools that our OS provides. You probably installed GMSH from your repositories and it just works, because it is compiled with the tools and libraries of your system. Even if you compile it locally, it still uses the libraries that your system has, so everything should work nicely.
no repository for me
Gmsh comes as binaries for linux, mac and windows in the form of an archive which you just unpack somewhere and run, without worrying about your distribution or your graphics card
it also comes as source files, and there the compilation requires many prerequisites and can be quite difficult
there are more possibilities, as can be seen on the gmsh web site
as a matter of fact salome also comes as a similar archive that can be run much more easily than salome-meca with singularity
concerning why EDF developed the singularity container i am well aware of the underlying reasons
i just regret the old days of stand-alone code_aster with its small footprint
and to make things even worse the download of the scibian file is such a pain in the neck
i have never been able to download it in less than half a day without a good dozen wget restarts
for the rest i do all my work with stand-alone 15.2, which does the job well!!
consider reading my book
freely available here https://framabook.org/beginning-with-code_aster/
Offline
Hi Ruy,
run "singularity run --app install salome_meca-lgpl-2021.0.0-2-20211014-scibian-9.sif" as is alone. It should then create a Python excecutable with the same name as the container in that folder. Then run
__NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia ./salome_meca-lgpl-2021.0.0-2-20211014-scibian-9
It will most likely fail, however... Your error is related to GLIBC having issues with the GLX library. It happened to me a few times. I just tried several times and then it finally worked in a virtualised Linux Mint 21 (same base as Ubuntu 22.04). GLIBC errors are a bit tricky to fix because GLIBC is the "core" of the system; if a library has incompatibilities with it... Nonetheless, it should work. Ubuntu 22.04 has a newer GLIBC version that should be compatible as far as I know...
Regards,
Fer
Hi Fernando,
I am facing the same error even after trying to run Salome in GPU mode; is there a fix for this?
below is the error:
not exist: /etc/krb5.conf
/usr/bin/nvidia-smi
**************************************
INFO : Running salome_meca in GPU mode
**************************************
runSalome running on linux
Searching for a free port for naming service: 2814 - OK
Searching Naming Service + found in 0.1 seconds
Searching /Kernel/Session in Naming Service ++++SALOME_Session_Server: /lib/x86_64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by /.singularity.d/libs/libGLX.so.0)
Traceback (most recent call last):
File "/opt/salome_meca/Salome-V2021-s9/modules/KERNEL_V9_7_0/bin/salome/orbmodule.py", line 181, in waitNSPID
os.kill(thePID,0)
ProcessLookupError: [Errno 3] No such process
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/salome_meca/appli_V2021/bin/salome/runSalome.py", line 694, in useSalome
clt = startSalome(args, modules_list, modules_root_dir)
File "/opt/salome_meca/appli_V2021/bin/salome/runSalome.py", line 639, in startSalome
session=clt.waitNSPID("/Kernel/Session",mySessionServ.PID,SALOME.Session)
File "/opt/salome_meca/Salome-V2021-s9/modules/KERNEL_V9_7_0/bin/salome/orbmodule.py", line 183, in waitNSPID
raise RuntimeError("Process %d for %s not found" % (thePID,theName))
RuntimeError: Process 10903 for /Kernel/Session not found
--- Error during Salome launch ---
Offline
Hi aadicfd,
It seems some people have been successful in fixing the GLIBC issue by following what osman indicated. I am quoting him here; his post is two posts above yours:
hey ruy, the GLIBC_2.34 not found error on my system was fixed by installing nvidia-container-toolkit from the official nvidia github location: https://nvidia.github.io/libnvidia-container . Now the singularity container is working with the nvidia hw rendering. Also the member emeff's instructions to add the mpi version of code aster worked; they are at https://github.com/emeff/Code-Aster-MPI... hope this helps.
Regards,
Fer
Offline
Personally, installing the nvidia-container-toolkit didn't solve the glibc-related issue. If someone has another solution, I would greatly appreciate it.
Last edited by CharlesM (2022-09-04 15:17:40)
Offline
After unsuccessful attempts with 22.04, I rolled back to Ubuntu 20.04 and it works.
Offline
Hi.
A solution that worked for me on Fedora 36 is
1. Search for a file named `nvliblist.conf` in your installation. It should be under your Singularity installation directory somewhere.
2. Make a back-up of this file `mv nvliblist.conf nvliblist.conf.bak`.
3. Open the file `nvliblist.conf` using a text editor.
4. Delete the lines `libGLX.so.0`, `libglx.so.0`, and `libGLdispatch.so`.
Try rerunning the Salome container; it should work this time.
Reference: https://github.com/apptainer/apptainer/issues/598
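If you prefer, the same edit can be scripted (a sketch, assuming the file sits at /etc/singularity/nvliblist.conf as in most distro packages; adjust the path to wherever you found the file in step 1):
sudo cp /etc/singularity/nvliblist.conf /etc/singularity/nvliblist.conf.bak
sudo sed -i -e '/^libGLX\.so/d' -e '/^libglx\.so/d' -e '/^libGLdispatch\.so/d' /etc/singularity/nvliblist.conf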
Offline
Hi,
I did none of the solutions mentioned above, except installing the Nvidia driver and activating it using Prime (I wanted to do that anyway because it does improve the performance of my laptop). This, however, did not solve my problem.
Finally, I read in one of the posts that some users ran the Salome singularity container directly, without installing it (or generating the python script). So I deleted the already generated python script and then ran the container directly using the command:
singularity run /home/user/Downloads/salome_meca-lgpl-2021.0.0-0-20210601-scibian-9.sif
And it simply worked.
The only thing that bugged me was that it threw some errors during startup, but overall the functionality has been as expected, so those errors can apparently be safely ignored. I have been running a meshing process on it and so far it has been working properly. I can only use NETGEN so far, but that is probably because the other meshing options require a separate license.
Note: I am using Ubuntu 22.04 (jammy) and my singularity version is 3.6.4 with go version 1.13.
I hope this helps.
Cheers,
Aadi
Offline
Using sudo nano /etc/singularity/nvliblist.conf to comment out libGLX.so, as shown below, worked for me on Ubuntu 22.04:
....
libglx.so
#libGLX.so
libnvcuvid.so
....
Offline
Here is a general fix to test on recent Linux distributions.
At least it works on Ubuntu 22.04.3:
code-aster.org/forum2/viewtopic.php?pid=69607#p69607
Offline