PYTHON: 2.3, 2.4
PLATFORMS: Solaris, Linux, Cygwin, Win32
Local sibling modules appear to hide global ones (from the
standard Python library) when the import occurs in a
submodule. This even holds for indirect imports performed
by the standard Python library itself.
FILE STRUCTURE for EXAMPLES:
- my/
+-- __init__.py
+-- main.py
+-- main2.py
+-- symbol.py
\-- types.py
EXAMPLE 1: Local submodule shadows global one.
# -- file: my/main.py
# COMMAND-LINE: python my/main.py
# MY INTENTION: Import standard module "types".
import types #< FAILURE: Imports my.types
if __name__ == "__main__":
print types.StringTypes #< EXCEPTION: StringTypes is not known
# -- FILE-END
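The effect can be reproduced without the my/ package at all. The sketch below (written so it also runs on current Python 3; the temp directory stands in for my/) plants a local "types.py" at the front of sys.path, exactly as running "python my/main.py" does with the script's directory:

```python
import os, sys, tempfile

# A local "types.py" at the front of sys.path wins the name lookup
# over the standard-library module of the same name.
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "types.py"), "w") as f:
    f.write("SHADOW = True\n")

sys.path.insert(0, tmp)                # mimics "python my/main.py"
real = sys.modules.pop("types", None)  # force a fresh name lookup
import types
shadowed = getattr(types, "SHADOW", False)  # True: local file won

# restore the real standard-library module
sys.path.remove(tmp)
del sys.modules["types"]
if real is not None:
    sys.modules["types"] = real
```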
EXAMPLE 2: An indirect import picks up "my.symbol" instead.
# -- file: my/main2.py
# COMMAND-LINE: python my/main2.py
# MY INTENTION: Import standard module "compiler".
# NOTE: Module "compiler" imports module "symbol"
import compiler #< FAILURE: Imports my.symbol instead
if __name__ == "__main__":
pass
# -- FILE-END
NOTE: Import problems of this kind can be diagnosed with
"python -v", which traces every module as it is imported.
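A quicker spot check than "python -v" is a module's __file__ attribute, which names the file that was actually imported; if it points into my/ instead of the standard library, the local module won:

```python
import types

# __file__ reveals which source file this module was loaded from
print(types.__file__)
```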
I have not found a work-around that lets me choose whether
to import the global module or the local one. The only
solution seems to be to relocate the module that contains
"__main__" to another place where no such import conflict
occurs.
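One partial work-around can at least be sketched (hypothetical, not from the report, and it must run before the conflicting import): drop the script's own directory, which Python prepends as sys.path[0], so the standard library wins the name lookup.

```python
import sys

# Remove the script's directory from the module search path, then
# import; the name now resolves in the standard library.
saved_path = list(sys.path)
del sys.path[0]        # the script's directory (or '' for the cwd)
import types           # resolved from the standard library
sys.path = saved_path  # restore so later imports are unaffected
```

The obvious drawback: after the restore, any later bare import of a conflicting name shadows again, so every affected import has to go through this dance.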
If my analysis is correct, the directory of the "main"
script acts as another ROOT for Python libraries that is
searched before the standard library paths. If this is
true, module names at this level must be UNIQUE in a
GLOBAL namespace (that is only partly under my control),
which I consider BAD.
NOTE: In C++ I have the "::" prefix to indicate that
I want the global/default namespace (= module) and not
a sub-namespace. I am not aware of such an idiom in
Python.
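For the record, Python later grew exactly such an idiom: PEP 328 absolute imports, enabled by a __future__ directive from Python 2.5 on (so not yet available on the 2.3/2.4 versions above) and the default in Python 3. A sketch of how my/main.py could then disambiguate:

```python
from __future__ import absolute_import  # a no-op on Python 3, where it is the default

# With absolute imports, a bare "import types" always names the
# standard-library module; the local sibling would have to be named
# explicitly, e.g. "from my import types" (matching the report's layout).
import types
```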