variable python version #2152
Comments
There are a few questions here, so let me answer them.
To change the Python version, you indeed need to build all the images in the correct order and set this variable when building; see the sketch below.
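To make that concrete, here is a minimal sketch, not an official recipe. It assumes the repo's `images/` layout, the usual parent chain, a `PYTHON_VERSION` build arg in docker-stacks-foundation, and that downstream Dockerfiles select their parent via `REGISTRY`/`OWNER` build args; check the Dockerfiles and the Makefile for the exact names.

```bash
# Rebuild the whole hierarchy locally with a different Python version.
# Assumed build order: docker-stacks-foundation -> base-notebook ->
# minimal-notebook -> scipy-notebook -> pyspark-notebook -> all-spark-notebook.
set -euo pipefail

PY=3.12            # desired Python version
REGISTRY=localhost # hypothetical local registry/namespace
OWNER=me

# The foundation image is where the Python version is actually pinned.
docker build ./images/docker-stacks-foundation \
    --build-arg PYTHON_VERSION="${PY}" \
    --tag "${REGISTRY}/${OWNER}/docker-stacks-foundation"

# Rebuild each downstream image on top of the locally built parent
# (assumes the Dockerfiles compose the parent tag from REGISTRY/OWNER).
for img in base-notebook minimal-notebook scipy-notebook \
           pyspark-notebook all-spark-notebook; do
    docker build "./images/${img}" \
        --build-arg REGISTRY="${REGISTRY}" \
        --build-arg OWNER="${OWNER}" \
        --tag "${REGISTRY}/${OWNER}/${img}"
done
```

The point of rebuilding the whole chain is that downstream packages are resolved against whatever Python the foundation image pins, so passing the arg only to all-spark-notebook has no effect.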
Unless you can provide more details, I don't think we can help here; it might have been a transient network error, which sometimes happens.
As far as I understand, your main goal is to have an image with a newer Python. If you really want to use python3.12 with Spark right now, I think you can modify our script so it installs it. Hope this helps.
Thank you, mathbunnyru!
Great. I think we can close this issue then.
What docker image(s) is this feature applicable to?
docker-stacks-foundation, all-spark-notebook, scipy-notebook
What change(s) are you proposing?
Hi
It's actually a question, but it may also be a feature request.
The Python version really needs to be configurable, because even a minor-version mismatch with Spark's Python prevents the driver from working.
I'm aware that docker-stacks-foundation has an ARG for the Python version.
But when I build the top-level image, all-spark-notebook, with this arg, the Python version inside the container does not change.
(The reason, I think, is that all the intermediate images are already pre-built in the repo.)
I even tried to build the whole hierarchy one by one on my Docker Desktop, pointing each image at my local builds, but it did not help: there is a line in scipy-notebook where the local build fails without a proper error message.
Thus I cannot change Python from 3.11 to 3.12.
Do you have any suggestions?
Thanks.
(In the end, I just built the entire hierarchy with the newer Python.)
How does this affect the user?
You cannot really work with a Spark installation that runs on a different Python minor version.
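For anyone who hits this, here is a rough illustration, not taken from this issue, of how the mismatch shows up and how to rule it out; the local image tag is hypothetical, and the /opt/conda path is assumed from the stacks' usual layout.

```bash
# PySpark refuses to start when driver and workers run different Python
# minor versions, typically failing with an error like:
#   "Python in worker has different version 3.11 than that in driver 3.12,
#    PySpark cannot run with different minor versions."

# Check which Python a (hypothetical) locally built image actually ships:
docker run --rm localhost/me/all-spark-notebook python --version

# Pin driver and workers to the same interpreter to rule out mismatches
# (interpreter path assumed from the stacks' usual conda location):
export PYSPARK_PYTHON=/opt/conda/bin/python
export PYSPARK_DRIVER_PYTHON=/opt/conda/bin/python
```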
Anything else?
No response