try:
    file = open('...', 'r')
    data = process_file(file)
except IOError:
    data = process_something_else()
The assumption is that we don't know for sure whether process_file will
raise any exceptions, but we don't _expect_ it to, and if it does we're
so screwed up that the only sensible thing to do is to give up (at this
point -- it doesn't preclude a panic handler up the call chain from
cleaning up).
For example, maybe you got process_file from me, and I documented it as
never raising any exceptions, but because of an error in my code -- or
even a hardware failure -- it does raise IOError. The problem we're
trying to solve is how to stop _unexpected_ exceptions from getting
caught unintentionally in a try block. E.g., I've been burned by a
statement as innocuous as
    n = n + 1
in a try block that expected to need to handle overflow in very rare
cases -- but never in the "n=n+1" statement (& the overflow in that
statement, which was an error in the design of the program, got handled
silently by mistake).
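Here's a small runnable sketch of that kind of silent masking. The names
are made up for illustration (and since ints no longer overflow in
modern Python, a KeyError plays the role of the unexpected exception):

```python
# Hypothetical example: the handler was written for a missing user,
# but a typo inside the same try block raises the same exception
# type and gets swallowed by mistake.
settings = {"color": "blue"}
users = {"alice": 1}

def lookup_user(name):
    return users[name]              # may raise KeyError -- expected

try:
    uid = lookup_user("alice")      # succeeds, uid == 1
    theme = settings["colour"]      # typo! unexpected KeyError
except KeyError:
    uid = None                      # meant only for a missing user

print(uid)                          # None -- the design error was masked
```

The typo's KeyError is indistinguishable from the one the handler was
written for, so the program carries on with a bogus uid instead of
failing loudly at the buggy line.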
I think you're right that, today, if people _do_ anticipate getting
IOError from process_file, and want to treat it differently than an
IOError from open, they handle it (as you suggest) inside process_file,
or (as you also suggest) use a nested form:
try:
    file = open('...', 'r')
    try:
        data = process_file(file)
    except IOError:
        do something about process_file's IOError
except IOError:
    do something about open's IOError
But that's a different problem. Jaap is right that, today, when people
are trying to solve _his_ problem <grin>, they do it as he sketched (with
an extra variable and a conditional block after the try) -- or worse but
probably more common, leave the code unsafe because the extra-variable-
plus-conditional-block business is perceived as too painful.
I agree it's unpleasant to separate the non-exceptional-case code from
the try block, but if it's in the try block it may be unsafe, and
try:
    file = open('...', 'r')
except IOError:
    data = process_something_else()
else:
    data = process_file(file)
is a good deal more obvious than the current
try:
    file = open('...', 'r')
    openworked = 1
except IOError:
    data = process_something_else()
    openworked = 0
if openworked:
    data = process_file(file)
especially as the blocks grow larger.
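For the record, here's a runnable version of the else form, using a temp
file so the open actually succeeds; process_file and the fallback string
are stand-ins of my own, not anything from the original code:

```python
import os
import tempfile

# Stand-in for the post's process_file -- just read the whole file.
def process_file(f):
    return f.read()

# Create a real file so the open below succeeds.
fd, path = tempfile.mkstemp()
os.write(fd, b"hello")
os.close(fd)

try:
    file = open(path, "r")
except IOError:
    data = "fallback"           # stand-in for process_something_else()
else:
    data = process_file(file)   # runs only when the open succeeded
    file.close()

os.remove(path)
print(data)                     # hello
```

The else suite runs only when the try suite finished without raising, so
an IOError escaping process_file propagates instead of being caught by a
handler meant for open.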
> I _am_ assuming that this isnt someones idea of a 'test' condition for
> the files existance.
Bad assumption, John! This is the way people (usually) do it in Python.
Opening a file for reading can fail for a number of reasons (like doesn't
exist, or does exist but inadequate permission), and most programmers
don't usually care exactly _which_ reason caused a failure. Having open
raise an exception meets the most common use fine (and programmers who
_do_ care about the reason can find the details in IOError's "detail"
tuple).
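In today's Python, IOError is an alias of OSError, and the details live
on the exception object rather than in a bare tuple; a sketch (the path
here is built fresh so it's guaranteed not to exist):

```python
import errno
import os
import tempfile

# A path guaranteed not to exist (file inside a fresh empty directory).
missing = os.path.join(tempfile.mkdtemp(), "no-such-file")

try:
    f = open(missing, "r")
except IOError as e:            # IOError is an alias of OSError now
    # The failure reason is available for programmers who care.
    reason = e.errno

print(reason == errno.ENOENT)   # True: the file doesn't exist
```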
> It is more efficient to do the stat() & branch then to go through the
> exception handling mechanism.
Two problems with that:
1) By actual timing, it's much cheaper for Python to set up a try block
than it is to make a stat system call (I measured this under SunOS).
2) More fundamentally, if Python's posix.stat can't find a file, it
raises Python's posix.error exception! I.e., like a failing Python
open, a failing Python stat doesn't return a failure value. So even
if a stat call were cheaper, it wouldn't get us out of the exception-
handling business.
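Both points still hold in modern Python: os.stat raises OSError (of
which FileNotFoundError is a subclass) on a missing file, and entering a
try block is far cheaper than a stat system call. A sketch (the exact
timings are platform-dependent):

```python
import os
import tempfile
import timeit

# Point 2: a failing stat raises an exception rather than returning
# a failure value, just like a failing open.
missing = os.path.join(tempfile.mkdtemp(), "no-such-file")
try:
    os.stat(missing)
    stat_raised = False
except OSError:                 # posix.error in the Python of the post
    stat_raised = True

print(stat_raised)              # True

# Point 1: entering a no-op try block costs almost nothing compared
# with making an actual stat system call.
t_try = timeit.timeit("try:\n    pass\nexcept IOError:\n    pass",
                      number=100_000)
t_stat = timeit.timeit("os.stat('.')", setup="import os",
                       number=100_000)
print(t_try < t_stat)           # True: the syscall dominates
```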
struck-by-the-fundamental-interconnectedness-of-all-things-ly y'rs - tim
Tim Peters tim@ksr.com
not speaking for Kendall Square Research Corp