It would be better to have a dictionary of such functions than to look in globals().
The usual approach is to write a class with such functions:
class Cleaner(object):
    def clean_name(self):
        pass
and then use getattr to get access to them:
cleaner = Cleaner()
for f in fields:
    getattr(cleaner, 'clean_%s' % f)()
You could even move further and do something like this:
class Cleaner(object):
    def __init__(self, fields):
        self.fields = fields

    def clean(self):
        for f in self.fields:
            getattr(self, 'clean_%s' % f)()
Then inherit from it, declare your clean_<name> methods in a subclass, and use it like this:
cleaner = Cleaner(['one', 'two'])
cleaner.clean()
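The subclass itself isn't shown in the answer; a minimal sketch, assuming hypothetical field names one and two (the call above would then instantiate this subclass rather than the base Cleaner):

class MyCleaner(Cleaner):
    # hypothetical clean_<name> methods, one per field
    def clean_one(self):
        print('cleaning one')

    def clean_two(self):
        print('cleaning two')

cleaner = MyCleaner(['one', 'two'])
cleaner.clean()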
Actually, this can be extended even further to make it cleaner. The first step would probably be adding a hasattr() check to verify that such a method exists on your class, as sketched below.
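A minimal sketch of that check, folded into the clean() method above:

class Cleaner(object):
    def __init__(self, fields):
        self.fields = fields

    def clean(self):
        for f in self.fields:
            name = 'clean_%s' % f
            # skip fields that have no clean_<name> method defined
            if hasattr(self, name):
                getattr(self, name)()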
I would use a dictionary that maps field names to cleaning functions. If some fields don't have a corresponding cleaning function, the for loop that handles them can be kept simple by providing a default function for those cases. Here's what I mean:
fields = ['name', 'email', 'subject']

def clean_name():
    pass

def clean_email():
    pass

# (one-time) field to cleaning-function map construction
def get_clean_func(field):
    try:
        return eval('clean_' + field)
    except NameError:
        return lambda: None  # do nothing

clean = dict((field, get_clean_func(field)) for field in fields)

# sample usage
for field in fields:
    clean[field]()
The code above constructs the function dictionary dynamically by checking whether a function named clean_<field> exists for each entry in the fields list. You would likely only have to execute it once, since the mapping stays valid as long as the field list and the available cleaning functions don't change.
If you don't want to use globals or vars, and don't want to make a separate module and/or class to encapsulate the functions you want to call dynamically, you can call them as attributes of the current module:
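The code for that answer does not appear in this excerpt; a minimal sketch of the idea, assuming illustrative field and function names, uses sys.modules[__name__] to reach the current module:

import sys

fields = ['name', 'email']

def clean_name():
    print('cleaning name')

def clean_email():
    print('cleaning email')

# the current module is itself an object whose attributes include the
# module-level functions, so getattr works on it much like on a class
this_module = sys.modules[__name__]
for f in fields:
    getattr(this_module, 'clean_%s' % f)()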
I have come across this problem twice now, and finally came up with a safe and not ugly solution (in my humble opinion).
RECAP of previous answers:
globals is the hacky, fast & easy method, but you have to be super consistent with your function names, and it can break at runtime if variables get overwritten. Also it's un-pythonic, unsafe, unethical, yadda yadda...
Dictionaries (i.e. string-to-function maps) are safer and easy to use... but it annoys me to no end that I have to spread dictionary assignments across my file, which are easy to lose track of.
Decorators made the dictionary solution come together for me. Decorators are a pretty way to attach side-effects & transformations to a function definition.
Example time
fields = ['name', 'email', 'address']

# set up our function dictionary
cleaners = {}

# this is a parameterized decorator
def add_cleaner(key):
    # this is the actual decorator
    def _add_cleaner(func):
        cleaners[key] = func
        return func
    return _add_cleaner
Whenever you define a cleaner function, add this to the declaration:
@add_cleaner('email')
def email_cleaner(email):
    # do stuff here; the normalization below is just an example
    result = email.strip().lower()
    return result
The functions are added to the dictionary as soon as their definition is executed and can be called like this:
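The original call-site example is missing from this excerpt; given the cleaners dictionary above, it would presumably look something like this (the email_address value is just an illustration):

email_address = ' Someone@Example.COM '
cleaned = cleaners['email'](email_address)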
A variant (sketched below) uses the cleaner function's own name as its dictionary key.
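Assuming the same cleaners dictionary, that variant might look like this; the decorator takes no key argument, so the function has to be named after the field it cleans:

def add_cleaner(func):
    # key the function by its own __name__
    cleaners[func.__name__] = func
    return func

@add_cleaner
def email(value):
    # the function must be named exactly like the field it handles
    return value.strip().lower()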
It is more concise, though I think the method names become a little awkward.
Pick your favorite.
I had a requirement to call different methods of a class from a method of the same class, based on a list of method names passed as input (for running periodic tasks in FastAPI). For executing methods of Python classes, I have expanded the answer provided by @khachik. Here is how you can achieve it from inside or outside of the class:
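The answer's code is not included in this excerpt; a minimal sketch of the idea, with purely illustrative class and method names:

class TaskRunner:
    def __init__(self, method_names):
        self.method_names = method_names

    def task_one(self):
        print('running task_one')

    def task_two(self):
        print('running task_two')

    # from inside the class: iterate over the supplied names and call each method
    def run_all(self):
        for name in self.method_names:
            getattr(self, name)()

runner = TaskRunner(['task_one', 'task_two'])
runner.run_all()               # calls the methods from inside the class
getattr(runner, 'task_two')()  # or call one by name from outside the class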