Clarification: baseline GLIBC version for 2.8.x releases #2012
luisbarrancos asked this question in Q&A (unanswered, 0 replies)
Hi,

I'm trying to build FlashAttention 2.8.3 from source in a Docker image based on Ubuntu 22.04 LTS, with Python 3.10, CUDA 12.4.1, and PyTorch 2.6.0. The build completes successfully, but importing flash_attn fails with an undefined symbol error that seems related to GLIBC and, in particular, to the CXX11 ABI.

There are many reports of this error, but the discussions all revolve around which wheel to use or which version to download. What I'd like to know is exactly which baseline GLIBC version FlashAttention 2.8.x is meant to target, and whether the CXX11 ABI is meant to be enabled or disabled. I assume, perhaps incorrectly, that there may be an incompatibility with PyTorch 2.6.0 using the pre-CXX11 ABI, since the mangled symbol resolves to something like std::__cxx11::basic_string, which is only used when the CXX11 ABI is active.

Any clarification would be most welcome.
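For what it's worth, here is the quick diagnostic I've been running inside the container to see both the runtime glibc version and which ABI PyTorch was built with. This is just a sketch assuming a glibc-based system like Ubuntu 22.04; the torch check is guarded in case torch isn't importable in a given environment:

```python
# Diagnostic sketch: report the runtime glibc version and, if PyTorch is
# importable, whether it was compiled with the CXX11 ABI.
import ctypes
import platform

# glibc version as seen by the Python interpreter
libc_name, libc_version = platform.libc_ver()
print(f"libc: {libc_name} {libc_version}")

# Ask glibc directly (gnu_get_libc_version only exists on glibc systems,
# hence the guard)
try:
    libc = ctypes.CDLL(None)
    libc.gnu_get_libc_version.restype = ctypes.c_char_p
    print("gnu_get_libc_version:", libc.gnu_get_libc_version().decode())
except (OSError, AttributeError):
    print("not a glibc-based system, or symbol unavailable")

try:
    import torch
    # True  -> torch built with _GLIBCXX_USE_CXX11_ABI=1
    # False -> pre-CXX11 ABI
    print("torch CXX11 ABI:", torch.compiled_with_cxx11_abi())
except ImportError:
    print("torch not installed; skipping ABI check")
```

In my case this reports glibc 2.35 (the Ubuntu 22.04 default), and the torch line is what made me suspect the pre-CXX11 ABI mismatch in the first place.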
Thanks in advance