Emulating Bash's "source" in Python

I have a script that looks like this:

export foo=/tmp/foo
export bar=/tmp/bar

Every time I build, I run 'source init_env' (where init_env is the script above) to set up some variables.

To accomplish the same thing in Python, I was running this code:

import os
import re

reg = re.compile(r'export (?P<name>\w+)(=(?P<value>.+))*')
for line in open(file):
    m = reg.match(line)
    if m:
        name = m.group('name')
        value = ''
        if m.group('value'):
            value = m.group('value')
        os.putenv(name, value)

But then someone decided it would be a good idea to add a line like the following to the init_env file:

export PATH="/foo/bar:/bar/foo:$PATH"

Obviously my Python script broke. I could modify it to handle this line, but then it will just break again the next time someone comes up with a new feature to use in the init_env file.

The question is: is there an easy way to run a Bash command and let it modify my os.environ?


Rather than having your Python script source the bash script, it would be simpler and more elegant to have a wrapper script source init_env and then run your Python script with the modified environment.

#!/bin/bash
source init_env
/run/python/script.py

The problem with your approach is that you are trying to interpret bash scripts. First you just try to interpret the export statement. Then you notice people are using variable expansion. Later people will put conditionals in their files, or process substitutions. In the end you will have a full blown bash script interpreter with a gazillion bugs. Don't do that.

Let Bash interpret the file for you and then collect the results.

You can do it like this:

#!/usr/bin/env python3


import os
import pprint
import shlex
import subprocess


command = shlex.split("env -i bash -c 'source init_env && env'")
proc = subprocess.Popen(command, stdout=subprocess.PIPE, text=True)
for line in proc.stdout:
    (key, _, value) = line.rstrip('\n').partition("=")
    os.environ[key] = value
proc.communicate()


pprint.pprint(dict(os.environ))

Make sure that you handle errors in case bash fails to source init_env, or bash itself fails to execute, or subprocess fails to execute bash, or any other errors.
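A sketch of that error handling, assuming bash is available on the system (the helper name source_env is mine, not from the answer):

```python
import os
import subprocess
import tempfile


def source_env(path):
    """Source `path` in a clean bash and return the resulting environment.

    Raises RuntimeError for the failure modes mentioned above: bash (or env)
    cannot be executed, or sourcing the file fails.
    """
    try:
        proc = subprocess.run(
            ["env", "-i", "bash", "-c", "source '%s' && env" % path],
            stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True)
    except OSError as exc:  # env/bash itself could not be launched
        raise RuntimeError("failed to launch bash: %s" % exc)
    if proc.returncode != 0:  # e.g. the file is missing or has a syntax error
        raise RuntimeError("sourcing %s failed: %s" % (path, proc.stderr.strip()))
    env = {}
    for line in proc.stdout.splitlines():  # note: breaks on multiline values
        key, _, value = line.partition("=")
        env[key] = value
    return env


# Quick demonstration with a throwaway script:
with tempfile.NamedTemporaryFile("w", delete=False) as f:
    f.write("export foo=/tmp/foo\n")
    script = f.name
env = source_env(script)
os.unlink(script)
print(env["foo"])  # /tmp/foo
```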

The env -i at the beginning of the command line creates a clean environment, which means you will only get the environment variables from init_env. If you want the inherited system environment, omit env -i.

Read the documentation on subprocess for more details.

Note: this will only capture variables set with the export statement, as env only prints exported variables.

Enjoy.

Note that the Python documentation says that if you want to manipulate the environment you should manipulate os.environ directly instead of using os.putenv(). I consider that a bug, but I digress.
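The difference that note describes is easy to observe: os.putenv changes the process environment behind os.environ's back, while writing to os.environ does both (DEMO_PUTENV_VAR is a made-up name):

```python
import os

# A name unlikely to be inherited from the parent environment.
assert os.environ.get("DEMO_PUTENV_VAR") is None

# os.putenv changes the C-level environment only; the os.environ
# mapping does not see the change.
os.putenv("DEMO_PUTENV_VAR", "via-putenv")
print(os.environ.get("DEMO_PUTENV_VAR"))  # None

# Writing to os.environ updates the mapping AND calls putenv, so
# child processes inherit the value as well.
os.environ["DEMO_PUTENV_VAR"] = "via-environ"
print(os.environ["DEMO_PUTENV_VAR"])  # via-environ
```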

Using pickle:

import os, pickle, subprocess
# For clarity, the source and dump commands are kept in separate strings
source = 'source init_env'
dump = 'python3 -c "import os,sys,pickle;sys.stdout.buffer.write(pickle.dumps(dict(os.environ)))"'
proc = subprocess.run(['/bin/bash', '-c', '%s && %s' % (source, dump)],
                      stdout=subprocess.PIPE, check=True)
os.environ.update(pickle.loads(proc.stdout))

Updated:

This uses json, subprocess, and explicitly uses /bin/bash (for ubuntu support):

import os, subprocess as sp, json
source = 'source init_env'
dump = 'python3 -c "import os, json; print(json.dumps(dict(os.environ)))"'
pipe = sp.Popen(['/bin/bash', '-c', '%s && %s' % (source, dump)],
                stdout=sp.PIPE, text=True)
env = json.loads(pipe.stdout.read())
os.environ.update(env)

Updated @lesmana's answer for Python 3. Notice the use of env -i which prevents extraneous environment variables from being set/reset (potentially incorrectly given the lack of handling for multiline env variables).

import os, subprocess

if os.path.isfile("init_env"):
    # bash, not sh: `source` is a bashism and fails under dash
    command = 'env -i bash -c "source init_env && env"'
    for line in subprocess.getoutput(command).split("\n"):
        # partition, not split: values may themselves contain "="
        key, _, value = line.partition("=")
        os.environ[key] = value
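One of the "potentially incorrect" cases hinted at above is a value containing an embedded newline, which breaks any line-by-line parse. A sketch that sidesteps this, assuming GNU coreutils' env supports -0 (NUL-terminated output):

```python
import os
import subprocess
import tempfile

# A throwaway init_env whose value contains an embedded newline.
with tempfile.NamedTemporaryFile("w", delete=False) as f:
    f.write('export multi="first\nsecond"\n')
    path = f.name

# env -0 terminates each entry with a NUL byte instead of a newline,
# so values containing newlines survive the round trip.
out = subprocess.run(
    ["env", "-i", "bash", "-c", 'source "%s" && env -0' % path],
    stdout=subprocess.PIPE, check=True).stdout
os.unlink(path)

env = {}
for entry in out.split(b"\0"):
    if entry:
        key, _, value = entry.partition(b"=")
        env[key.decode()] = value.decode()

print(repr(env["multi"]))  # 'first\nsecond'
```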

Example wrapping @Brian's excellent answer in a function:

import json
import subprocess


def env_from_sourcing(file_to_source_path, include_unexported_variables=False):
    """Return a dictionary of the environment variables resulting from sourcing a file."""
    source = '%ssource %s' % ("set -a && " if include_unexported_variables else "", file_to_source_path)
    dump = 'python3 -c "import os, json; print(json.dumps(dict(os.environ)))"'
    pipe = subprocess.Popen(['/bin/bash', '-c', '%s && %s' % (source, dump)],
                            stdout=subprocess.PIPE, text=True)
    return json.loads(pipe.stdout.read())

I'm using this utility function to read aws credentials and docker .env files with include_unexported_variables=True.
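For instance, with a throwaway docker-style .env file (DB_HOST is a made-up name), the effect of include_unexported_variables can be seen directly. The function is repeated here so the demo runs on its own:

```python
import json
import os
import subprocess
import tempfile


def env_from_sourcing(file_to_source_path, include_unexported_variables=False):
    # set -a marks every subsequent assignment for export
    source = "%ssource %s" % ("set -a && " if include_unexported_variables else "", file_to_source_path)
    dump = 'python3 -c "import os, json; print(json.dumps(dict(os.environ)))"'
    pipe = subprocess.Popen(["/bin/bash", "-c", "%s && %s" % (source, dump)],
                            stdout=subprocess.PIPE, text=True)
    return json.loads(pipe.stdout.read())


# A docker-style .env file assigns variables without `export`:
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("DB_HOST=localhost\n")
    path = f.name

without_flag = env_from_sourcing(path)
with_flag = env_from_sourcing(path, include_unexported_variables=True)
os.unlink(path)

print("DB_HOST" in without_flag)  # False - plain assignments are not exported
print(with_flag["DB_HOST"])       # localhost - set -a exports them
```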

The best workaround I found is this:

  • Write a wrapper bash script that calls your Python script
  • In that bash script, source the environment script first, so the Python script inherits the modified environment