I’m using the command docker run -e GRB_WLSACCESSID=xxxxxxx
to set environment variables for Gurobi authorization. The OS of the container is Ubuntu 16.04. This works fine if I log into the container via SSH interactively and read the environment variables from Python with os.getenv().
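For reference, a minimal sketch of this setup (the image name, SSH port, and variable value below are placeholders, not taken from my actual setup):

# Start the container and inject the Gurobi credential as an environment variable
docker run -d -p 2222:22 -e GRB_WLSACCESSID=xxxxxxx my-gurobi-image

# Log in interactively, then read the variable from Python
ssh -p 2222 root@x.x.x.x
python -c 'import os; print(os.getenv("GRB_WLSACCESSID"))'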
But when I add this container as a remote SSH interpreter in PyCharm and run the Python code through PyCharm, I can’t read those environment variables.
Eventually I found that the problem is that the environment variables set by docker run -e
can only be read from an interactive shell. This can be verified by comparing the output of ssh root@x.x.x.x env
with the output of running env
interactively after logging into the container: the former lists fewer variables.
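Concretely, the comparison looks like this (x.x.x.x is the container’s address; the exact output will vary):

# non-interactive command: GRB_WLSACCESSID does not appear
ssh root@x.x.x.x env | grep GRB

# interactive session: GRB_WLSACCESSID does appear
ssh root@x.x.x.x
env | grep GRB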
One possible workaround is to write some configuration manually after the container is created, e.g., set the variables in /etc/environment
(as suggested by this).
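A rough sketch of that workaround, run from the host after the container is up (the container name my-container is a placeholder):

# Append the variable to /etc/environment inside the running container;
# /etc/environment expects plain KEY=value lines, without "export"
docker exec my-container bash -c 'echo "GRB_WLSACCESSID=xxxxxxx" >> /etc/environment'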
Another possible workaround is to add the variables manually in the PyCharm run configuration (Edit Configurations).
Is there a more elegant solution? 🙁
Answer
I finally understood what the related answers actually mean.
It means: in the remote VM or container, create a Linux shell script named mypython
as the Python wrapper, with the following content:
#!/bin/bash -l
/path/to/interpreter/bin/python "$@"

where /path/to/interpreter/bin/python
is the path to the Python interpreter. For a conda interpreter, it might look like /root/miniconda3/envs/py37/bin/python.
The script mypython
should be placed in the same directory as the python
binary, i.e., /root/miniconda3/envs/py37/bin/mypython
Then add execute permission to mypython
:
chmod +x /root/miniconda3/envs/py37/bin/mypython
Alternatively, the above two steps can be performed directly from the command line:
echo '#!/bin/bash -l
/root/miniconda3/envs/py37/bin/python "$@"' > /root/miniconda3/envs/py37/bin/mypython
chmod +x /root/miniconda3/envs/py37/bin/mypython
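To check that the wrapper actually exposes the variables to non-interactive sessions, you can run something like the following (the address is a placeholder; it should print the value rather than None):

ssh root@x.x.x.x /root/miniconda3/envs/py37/bin/mypython -c 'import os; print(os.getenv("GRB_WLSACCESSID"))'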
Finally, add the SSH interpreter in PyCharm and make sure the interpreter path is /root/miniconda3/envs/py37/bin/mypython
The problem is then solved.