You can set sys.stdout = Logger() where Logger is a class whose write method (either immediately, or accumulating until a \n is detected) calls logging.info (or logs in any other way you want). An example of this in action:
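(A minimal sketch; the buffer-until-newline behaviour and the basicConfig() call here are just one reasonable way to set it up.)

import logging
import sys

class Logger:
    # File-like object: buffer writes and emit one log record per complete line.
    def __init__(self):
        self._buffer = ""

    def write(self, text):
        self._buffer += text
        while "\n" in self._buffer:
            line, self._buffer = self._buffer.split("\n", 1)
            if line:
                logging.info(line)

    def flush(self):
        # Log whatever is left over (e.g. output without a trailing newline).
        if self._buffer:
            logging.info(self._buffer)
            self._buffer = ""

logging.basicConfig(level=logging.INFO)  # the default handler writes to stderr
sys.stdout = Logger()
print("this line becomes a logging.info record instead of going to stdout")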
I'm not sure what you mean by "a given" process (who's given it, what distinguishes it from all others...?), but if you mean you know what process you want to single out that way at the time you instantiate it, then you could wrap its target function (and that only) -- or the run method you're overriding in a Process subclass -- into a wrapper that performs this sys.stdout "redirection" -- and leave other processes alone.
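If that's the case, here's a rough sketch of that wrapping; the logged() helper and the per-PID filename are purely illustrative (any redirection, including the Logger above, would work the same way):

import os
import sys
from multiprocessing import Process

def work(name):
    print('hello', name)

def logged(target, *args):
    # Illustrative wrapper, executed inside the chosen child process only:
    # swap out its stdout, then call the real target.
    sys.stdout = open(str(os.getpid()) + ".out", "w")
    return target(*args)

if __name__ == '__main__':
    p = Process(target=logged, args=(work, 'bob'))  # this process is redirected
    q = Process(target=work, args=('fred',))        # this one is left alone
    p.start()
    q.start()
    p.join()
    q.join()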
Maybe if you nail down the specs a bit I can help in more detail...?
The easiest way might be to just override sys.stdout. Slightly modifying an example from the multiprocessing manual:
from multiprocessing import Process
import os
import sys
def info(title):
    print(title)
    print('module name:', __name__)
    print('parent process:', os.getppid())
    print('process id:', os.getpid())

def f(name):
    # Redirect this child process's stdout to a file named after its PID.
    sys.stdout = open(str(os.getpid()) + ".out", "w")
    info('function f')
    print('hello', name)

if __name__ == '__main__':
    p = Process(target=f, args=('bob',))
    p.start()
    q = Process(target=f, args=('fred',))
    q.start()
    p.join()
    q.join()
And running it:
$ ls
m.py
$ python m.py
$ ls
27493.out 27494.out m.py
$ cat 27493.out
function f
module name: __main__
parent process: 27492
process id: 27493
hello bob
$ cat 27494.out
function f
module name: __main__
parent process: 27492
process id: 27494
hello fred
There are only two things I would add to @Mark Rushakoff's answer. When debugging, I found it really useful to change the buffering parameter of my open() calls to 0 (note that Python 3 only allows unbuffered mode for binary files, so for text files line buffering, buffering=1, is the closest equivalent).
Also, for convenience you might dump that into a separate Process subclass if you wish:
import os
import sys
from multiprocessing import Process

class MyProc(Process):
    def run(self):
        # run() is MyProc's entry point once it is .start()-ed:
        #   p = MyProc()
        #   p.start()
        # so set up the logging redirection here, inside the child process.
        self.initialize_logging()
        print('Now output is captured.')
        # Now do stuff...

    def initialize_logging(self):
        # Line-buffered text files; Python 3 only allows buffering=0 for binary files.
        sys.stdout = open(str(os.getpid()) + ".out", "a", buffering=1)
        sys.stderr = open(str(os.getpid()) + "_error.out", "a", buffering=1)
        print('stdout initialized')
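You can then use it like any other Process; a quick usage sketch (assuming the class above), where each instance writes to its own <pid>.out and <pid>_error.out files:

if __name__ == '__main__':
    procs = [MyProc() for _ in range(2)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()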
multiprocessing has a convenient module-level function to enable logging called log_to_stderr(). It sets up a logger object using logging and adds a handler so that log messages are sent to the standard error channel. By default, the logging level is set to NOTSET so no messages are produced. Pass a different level to initialize the logger to the level of detail desired.
import logging
from multiprocessing import Process, log_to_stderr

print("Running main script...")

def my_process(my_var):
    print(f"Running my_process with {my_var}...")

if __name__ == '__main__':
    # Initialize logging for multiprocessing.
    log_to_stderr(logging.DEBUG)

    # Start the process and wait for it to finish.
    my_var = 100
    process = Process(target=my_process, args=(my_var,))
    process.start()
    process.join()
Running this shows both print() statements on stdout, while multiprocessing's own debug messages about the child process starting and shutting down go to stderr thanks to log_to_stderr(logging.DEBUG).