Re: New version of C-like interpreter available

Steven D. Majewski
Sat, 9 Apr 1994 04:49:29 GMT

In article <>,
Leslie Mikesell <> wrote:
>The one part of "unix" programming that I never felt was captured
>properly in perl4 was the concept of piping through distinct
>entities. I haven't looked at perl5 so perhaps this has been fixed.
>What I mean by this is that you commonly attack problems at different
>times in different ways (or perhaps different people do different
>parts). Then you find that you can combine the solutions into
>a pipeline of several operations at once without making any changes
>to the individual programs. You can, of course, do this with perl
>by continuing to run multiple processes with real pipes, but since
>it can do everything else internally it just "feels" wrong. It
>seems like there should be a way to simulate a pipeline of programs
>from a set of separate scripts within a single process. You can,
>of course, rewrite the program to pass $_ around to subroutines,
>but that's not the way unix people think. The unix tradition is
>to write each piece separately with dozens of command line switches
>and expect it to run indepentently of anything else except it's
>i/o streams. Having to write subroutines and packages that allow
>re-use but don't run independently doesn't quite fit.

In Python, I wanted to reuse a piece of code - a Python module -
which printed its output, rather than returning an output value.
The object-oriented nature of Python made it easy to write an
output redirector that caused printed output from a function to
be returned as a list of strings. Any object that supports read/
write/readline(s)/writelines/etc. can be used in place of a file,
including stdin and stdout.
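Such a redirector needs nothing more than an object with the right
methods; a minimal sketch in modern Python (StringCollector is a
name made up here for illustration):

```python
class StringCollector:
    """Minimal file-like object: anything 'written' to it is saved,
    so it can stand in for sys.stdout or any output file."""
    def __init__(self):
        self.parts = []

    def write(self, s):
        self.parts.append(s)

    def writelines(self, lines):
        for s in lines:
            self.write(s)

    def getvalue(self):
        return ''.join(self.parts)
```

Because nothing here checks types, any code that only calls write()
or writelines() on its output file will happily accept it.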

tolines( func, arg1 [, ... argn ] )
applies func to args, returning stdout output as a list of lines,
and is typically used in the calling program as:

for line in tolines( func, arg1 [, ... argn ] ):
    # process line
    # and write output or accumulate results
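One possible implementation of tolines along these lines, in modern
Python syntax (greet is just a made-up example function):

```python
import sys

def tolines(func, *args):
    """Apply func to args, returning whatever it prints to
    stdout as a list of lines (hypothetical reimplementation)."""
    class _Collector:
        def __init__(self):
            self.parts = []
        def write(self, s):
            self.parts.append(s)

    saved, collector = sys.stdout, _Collector()
    sys.stdout = collector        # printed output now lands in collector
    try:
        func(*args)
    finally:
        sys.stdout = saved        # always restore stdout
    return ''.join(collector.parts).splitlines()

def greet(name):                  # made-up example function
    print("hello,", name)
```

The try/finally makes sure stdout is restored even if func raises,
so a failing function can't leave the interpreter silently swallowing
all further output.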

( In the particular case that inspired this, I wanted to insert
source code statements into the output of the python byte-code
disassembler, but I didn't want to have to read and understand
the disassembler to modify it. )

If one were planning for code reuse in the first place, rather
than printing output, it would be better to return an object
that could be processed further if desired, but which had a
formatted print representation, so that one could either:

print func_a( args ) # printed objects show their repr string, or

func_b( func_a( args )) # compose its result with another function.
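For instance, such a result object might look like this (Lines,
func_a, and func_b are hypothetical names, in modern Python):

```python
class Lines:
    """Hypothetical result object: holds lines for further
    processing, but prints as formatted text via its repr."""
    def __init__(self, lines):
        self.lines = list(lines)

    def __repr__(self):
        return '\n'.join(self.lines)

def func_a(words):    # made-up producer: uppercases its input
    return Lines(w.upper() for w in words)

def func_b(result):   # made-up consumer: decorates each line
    return Lines(line + '!' for line in result.lines)
```

print(func_a(['spam'])) shows the formatted text directly, while
func_b(func_a(['spam'])) keeps composing on the structured lines.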

But nested functions would get awkward if they got to be nested very
deeply. It would be possible to return a class that had a 'pipe' method,

fa( args ).pipe( fb, args ).pipe( fc, args ).pipe( fd, args ).print()

or maybe, to make it look more regular:
Mkpipe( fa,args ).pipe( fb, args )...

with a helper function like tolines to coerce output from functions
that weren't written to return the proper object.
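One way that chaining could be sketched (Pipe and mkpipe are
hypothetical names, and plain values stand in for line lists here
for brevity):

```python
class Pipe:
    """Wraps a running value; .pipe(f, *args) applies f and
    rewraps, so stages chain left to right instead of nesting."""
    def __init__(self, value):
        self.value = value

    def pipe(self, func, *args):
        return Pipe(func(self.value, *args))

def mkpipe(func, *args):
    """Start a chain from a producer function (an analogue of Mkpipe)."""
    return Pipe(func(*args))

# mkpipe(source).pipe(stage).pipe(stage) reads like a shell pipeline:
result = mkpipe(lambda: [1, 2, 3]) \
    .pipe(lambda xs: [x * 2 for x in xs]) \
    .pipe(sum)
```

Each stage stays an ordinary function of its input; only the thin
Pipe wrapper knows about chaining, which is what lets functions not
written for pipelines be dropped in unchanged.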

Python also has a module that sets up pipeline templates for external
commands, fills in filenames or other variables into the template and
executes them. It should be possible to do the same sort of thing
with internal functions - build a pipeline template and then
execute it with args.
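For internal functions, such a template might be no more than a
function composer - a rough sketch (make_pipeline is a made-up name,
not part of any library):

```python
def make_pipeline(*stages):
    """Hypothetical template builder: remembers a sequence of
    functions now, runs them as a pipeline when called later."""
    def run(value):
        for stage in stages:
            value = stage(value)    # output of one stage feeds the next
        return value
    return run

normalize = make_pipeline(str.strip, str.upper)   # template built once
```

The template is built once and can then be executed any number of
times with different arguments, much as the external-command module
reuses its templates with different filenames.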

-- Steve Majewski (804-982-0831) <sdm7g@Virginia.EDU> --
-- UVA Department of Molecular Physiology and Biological Physics --
"Cognitive Science is where Philosophy goes when it dies, if it hasn't
been good" - Jerry Fodor.

