You can use shell redirection while executing the Python file:
python foo_bar.py > file
This writes everything the script prints to stdout into the file named file instead of the terminal.
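If you also want to see the output in the terminal while it is written to the file, one option (assuming a Unix-like shell that provides the tee utility) is to pipe through tee instead:
python foo_bar.py | tee file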
Or if you want logging from within the script:
import sys

class Logger(object):
    def __init__(self):
        self.terminal = sys.stdout
        self.log = open("logfile.log", "a")
    def write(self, message):
        self.terminal.write(message)
        self.log.write(message)
    def flush(self):
        # this flush method is needed for python 3 compatibility.
        # this handles the flush command by doing nothing.
        # you might want to specify some extra behavior here.
        pass

sys.stdout = Logger()
Now you can use:
print "Hello"
This will write "Hello" to both stdout and the logfile.
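If you later want to stop logging, here is a minimal sketch (my addition, not part of the original answer) of how you could restore the original stream and close the file:
logger = sys.stdout            # the Logger instance installed above
sys.stdout = logger.terminal   # put the real stdout back
logger.log.close()             # close the log file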
Here is a way to redirect the output to the console and to a text file simultaneously:
import sys
te = open('log.txt', 'w')  # File where you need to keep the logs

class Unbuffered:
    def __init__(self, stream):
        self.stream = stream
    def write(self, data):
        self.stream.write(data)
        self.stream.flush()
        te.write(data)  # Write the data of stdout here to a text file as well
    def flush(self):
        # delegate flush so callers that flush sys.stdout keep working
        self.stream.flush()

sys.stdout = Unbuffered(sys.stdout)
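Note that te is never closed in the snippet above; a short usage sketch (my addition, not the original answer's):
print('This line goes to the console and to log.txt')
te.close()  # close the log file when you are done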
To redirect output to a file and to the terminal without changing how your Python script is used from the outside, you could make the script spawn itself with pty.spawn():
#!/usr/bin/env python
"""Redirect stdout to a file and a terminal inside a script."""
import os
import pty
import sys

def main():
    print('put your code here')

if __name__ == "__main__":
    sentinel_option = '--dont-spawn'
    if sentinel_option not in sys.argv:
        # run itself copying output to the log file
        with open('script.log', 'wb') as log_file:
            def read(fd):
                data = os.read(fd, 1024)
                log_file.write(data)
                return data

            argv = [sys.executable] + sys.argv + [sentinel_option]
            rc = pty.spawn(argv, read)
    else:
        sys.argv.remove(sentinel_option)
        rc = main()
    sys.exit(rc)
If the pty module is not available (e.g., on Windows), you could replace it with the teed_call() function, which is more portable but provides ordinary pipes instead of a pseudo-terminal; this may change the behaviour of some programs.
The advantage of the pty.spawn- and subprocess.Popen-based solutions over replacing sys.stdout with a file-like object is that they capture output at the file descriptor level, e.g., if the script starts other processes that also produce output on stdout/stderr. See my answer to the related question: Redirect stdout to a file in Python?
I devised an easier solution: just define a function that prints to a file, to the screen, or to both. In the example below, the user can pass the output file name as an argument, but that is not mandatory:
OutputFile = args.Output_File        # args comes from argparse (see the sketch below)
if args.Output_File:
    OF = open(OutputFile, 'w')

def printing(text):
    print(text)
    if args.Output_File:
        OF.write(text + "\n")
After this, all that is needed to print a line to the file and/or the screen is:
printing(Line_to_be_printed)
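For completeness, here is a minimal sketch of the argparse setup that args.Output_File assumes; the argument name is taken from the snippet above, the rest is my assumption:
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('Output_File', nargs='?', default=None)  # optional positional argument
args = parser.parse_args()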
Here is how I managed to log to a file and to the console/stdout:
import logging
logging.basicConfig(level=logging.INFO,
                    format='%(asctime)s - %(levelname)s - %(message)s',
                    filename='logs_file',
                    filemode='w')
# Until here logs only to file: 'logs_file'
# define a new Handler to log to console as well
console = logging.StreamHandler()
# optional, set the logging level
console.setLevel(logging.INFO)
# set a format which is the same for console use
formatter = logging.Formatter('%(asctime)s - %(levelname)s - %(message)s')
# tell the handler to use this format
console.setFormatter(formatter)
# add the handler to the root logger
logging.getLogger('').addHandler(console)
# Now we can log to both the file and the console
logging.info('Jackdaws love my big sphinx of quartz.')
logging.info('Hello world')
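On Python 3.3+ the same result can be obtained in a single call by passing both handlers to basicConfig; a minimal sketch (the file name 'logs_file' is kept from above):
import logging

logging.basicConfig(level=logging.INFO,
                    format='%(asctime)s - %(levelname)s - %(message)s',
                    handlers=[logging.FileHandler('logs_file', mode='w'),
                              logging.StreamHandler()])
logging.info('This goes to both the file and the console')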
from IPython.utils.io import Tee
from contextlib import closing
print('This is not in the output file.')
with closing(Tee("outputfile.log", "w", channel="stdout")) as outputstream:
    print('This is written to the output file and the console.')
    # raise Exception('The file "outputfile.log" is closed anyway.')
print('This is not written to the output file.')
# Output on console:
# This is not in the output file.
# This is written to the output file and the console.
# This is not written to the output file.
# Content of file outputfile.log:
# This is written to the output file and the console.
The Tee class in IPython.utils.io does what you want, but it lacks the __enter__ and __exit__ methods needed to call it in the with-statement. Those are added by contextlib.closing.
I've tried a few solutions here and didn't find one that writes to the file and to the console at the same time, so here is what I did (based on this answer):
import sys

class Logger(object):
    def __init__(self):
        self.terminal = sys.stdout
    def write(self, message):
        # open the log file for every write so nothing is lost if the program crashes
        with open("logfile.log", "a", encoding='utf-8') as self.log:
            self.log.write(message)
        self.terminal.write(message)
    def flush(self):
        # this flush method is needed for python 3 compatibility.
        # this handles the flush command by doing nothing.
        # you might want to specify some extra behavior here.
        pass

sys.stdout = Logger()
This solution uses more computing power, but it reliably saves all of the data from stdout into the log file and uses less memory. For my needs, I've also added a timestamp to self.log.write(message). Works great.
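A minimal sketch of that timestamp variant, replacing the write method above (the timestamp format is my assumption, not the original author's):
from datetime import datetime

def write(self, message):
    # note: print() calls write() twice (text, then newline), so each call gets a stamp
    with open("logfile.log", "a", encoding='utf-8') as self.log:
        self.log.write(f"{datetime.now():%Y-%m-%d %H:%M:%S} {message}")
    self.terminal.write(message)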
This way worked very well in my situation. I just added some modifications based on other code presented in this thread.
import sys, os

orig_stdout = sys.stdout   # capture original state of stdout
te = open('log.txt', 'w')  # File where you need to keep the logs

class Unbuffered:
    def __init__(self, stream):
        self.stream = stream
    def write(self, data):
        self.stream.write(data)
        self.stream.flush()
        te.write(data)     # Write the data of stdout here to a text file as well

sys.stdout = Unbuffered(sys.stdout)
#######################################
## Feel free to use print function ##
#######################################
print("Here is an Example =)")
#######################################
## Feel free to use print function ##
#######################################
# Stop capturing printouts of the application from Windows CMD
sys.stdout = orig_stdout # put back the original state of stdout
te.flush() # forces python to write to file
te.close() # closes the log file
# read all lines at once and capture it to the variable named sys_prints
with open('log.txt', 'r+') as file:
    sys_prints = file.readlines()
# erase the file contents of log file
open('log.txt', 'w').close()
Based on @Arnold Suiza's answer, here is a function you can run once at the beginning; afterwards everything is immediately printed to stdout and to the file:
import sys

def print_to_file(filename):
    orig_stdout = sys.stdout               # capture original state of stdout

    class Unbuffered:
        def __init__(self, filename):
            self.stream = orig_stdout
            self.te = open(filename, 'w')  # File where you need to keep the logs
        def write(self, data):
            self.stream.write(data)
            self.stream.flush()
            self.te.write(data)            # Write the data of stdout here to a text file as well
            self.te.flush()

    sys.stdout = Unbuffered(filename)
Now just run print_to_file('log.txt') at program start and you're good to go!
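For example (a minimal usage sketch):
print_to_file('log.txt')
print('This line appears on the console and is also written to log.txt')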