I have an application that runs on Postgres and MySQL. Each program checks the database type and then imports either postgres_db as db_util or mysql_db as db_util. This works without a problem as long as all code referencing db_util lives in the __main__ module. When I started putting code in classes and importing those classes, the class would throw an error: “global name ‘db_util’ is not defined”. What follows is a solution to this problem. I don’t know if it is Pythonic and would like to hear any comments.
When a Python script starts, the main script is loaded and given the __name__ of ‘__main__’; it is a module unto itself. If you import a file, Python creates a module with its own namespace, and the two do not share names unless explicitly told to.
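That separation can be seen in a few lines. This is a minimal sketch in Python 3 syntax; the `helper` module is made up for the example and built with `types.ModuleType` so it runs stand-alone, standing in for a real imported file:

```python
import types

# Stand-in for an imported file: a module with its own namespace
helper = types.ModuleType('helper')
exec("du = 'the db_util instance'", helper.__dict__)

print(helper.du)        # visible through the module's namespace

try:
    du                  # not visible as a bare name here...
except NameError:
    print('du is not defined in this module')

du = helper.du          # ...until we explicitly bind it
print(du)
```

The bare name `du` only works after the explicit binding, which is exactly the situation the classes below run into.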
What must be done is to develop a pattern that forces the modules to share. As the application developer, you must decide whether classes or instances are to be shared, and whether the shared object is a global within the module (referenced by name) or held by the class (referenced by self.name).
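The two choices can be sketched side by side. This is illustrative Python 3; the names DBUtil, UsesGlobal, and UsesAttribute are invented for the example:

```python
class DBUtil:
    calls = 0
    def m1(self):
        DBUtil.calls += 1
        print('m1 called')

du = DBUtil()                # shared as a module-level global

class UsesGlobal:
    def run(self):
        du.m1()              # looked up by name in module globals

class UsesAttribute:
    def __init__(self, db):
        self.du = db         # instance handed in and stored on self
    def run(self):
        self.du.m1()         # looked up via the attribute

UsesGlobal().run()
UsesAttribute(du).run()
```

The attribute version makes the dependency explicit in the constructor, which fits the stand-alone-module concern discussed next.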
Another concern is that modules and classes should be stand-alone; if there are dependencies, these need to be explicitly defined in the module. That way, everything needed for the code to run is defined in the code. In the long run, this should make debugging much easier.
The code samples that follow pass the instance; the commented-out lines show how to pass the class instead.
The structure is:
|-classes/dbi.py (imports the db module)
|-classes/dbfunc.py (uses the db module)
#!/usr/bin/python
''' test of scope '''
import classes.dbi as dbi            #get db_util based on dbtype
#du=dbi.db_util                      #add class to module global
du=dbi.du                            #add instance du to module global
from classes.dbfunc import dbfunc as fn

print 'globals:',dir()
print 'modules','fn:',fn,'du:',du
print ''
print 'from main'
du.m1()        #method 1
du.m2()        #method 2
print ''
print 'from imports'
fd=fn()
fd.ref_1()
fd.ref_2()
#dbi.py - database import
#put logic to get db type here
#db='MySQL'
db='Postgres'

if db == 'MySQL':
    from mysql_db import db_util
elif db == 'Postgres':
    from postgres_db import postgres_db as db_util

du=db_util()        #create the instance

print 'dbi globals:',dir()
print ''
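An alternative to the if/elif import chain is a lookup table driven by importlib. This sketch is in Python 3 syntax, and it fakes the two driver modules with `types.ModuleType` so it can run stand-alone; in the real application they would be the actual mysql_db.py and postgres_db.py files:

```python
import importlib
import sys
import types

# Fake stand-ins for mysql_db.py and postgres_db.py so the sketch runs
for name, cls in [('mysql_db', 'db_util'), ('postgres_db', 'postgres_db')]:
    mod = types.ModuleType(name)
    exec("class %s:\n    def m1(self): return 'm1 from %s'" % (cls, name),
         mod.__dict__)
    sys.modules[name] = mod

# One mapping instead of an if/elif chain per database type
DB_CHOICES = {'MySQL': ('mysql_db', 'db_util'),
              'Postgres': ('postgres_db', 'postgres_db')}

def load_db_util(db):
    mod_name, cls_name = DB_CHOICES[db]
    return getattr(importlib.import_module(mod_name), cls_name)

du = load_db_util('Postgres')()
print(du.m1())   # m1 from postgres_db
```

Adding a third database then means adding one dictionary entry rather than another elif branch.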
#mysql_db.py
class db_util(object):
    _db='mysql'
    def __init__(self):
        print 'created',self._db,'class'
    def m1(self):
        print "this is m1 from",self._db
    def m2(self):
        print "this is m2 from",self._db
#postgres_db.py
class postgres_db(object):
    _db='postgres'
    def __init__(self):
        print 'created',self._db,'class'
    def m1(self):
        print "this is m1 from",self._db
    def m2(self):
        print "this is m2 from",self._db
#dbfunc.py
import dbi as dbi        #get db_util based on dbtype
#db_util=dbi.db_util     #add class to module global
du=dbi.du                #add the instance as a global

class dbfunc(object):
    ''' a unique function or process that justifies being
        a class and needs the db_util instance '''
    def print_globals(self):
        print 'from dbfunc'
        print 'globals:',dir()
        print 'modules','du:',du
        print ''
    def ref_1(self):
        print 'from dbfunc ref_1 ==>',
        du.m1()
    def ref_2(self):
        print 'from dbfunc ref_2 ==>',
        du.m2()
The main program imports the dbi module, which determines the database type, imports the appropriate db_util file, and then creates an instance of db_util as du. In any module that imports dbi.py, the db_util instance defined in dbi is bound into that module's globals with the statement du=dbi.du.
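The key point is that du=dbi.du does not copy anything; it binds a second name to the same object. A small sketch in Python 3 syntax (the dbi module is faked with `types.ModuleType` here so the example is self-contained):

```python
import sys
import types

# Stand-in for dbi.py: it creates the instance at import time
dbi = types.ModuleType('dbi')
exec(
    "class db_util:\n"
    "    def m1(self): return 'this is m1'\n"
    "du = db_util()",
    dbi.__dict__,
)
sys.modules['dbi'] = dbi

# du=dbi.du binds a second name to the same object
du = dbi.du
print(du is dbi.du)   # True: one instance, shared by name
print(du.m1())
```

Every module that does du=dbi.du therefore talks to the one shared instance, which is what makes the pattern work.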
Once you understand that each module is a namespace, and that there is no single ‘global’ scope as in C/C++, this all makes sense.