- Subject: Re: Macros and Token Filters Revisited
- From: steve donovan <steve.j.donovan@...>
- Date: Wed, 16 Jun 2010 18:04:22 +0200
On Wed, Jun 16, 2010 at 5:50 PM, Jerome Vuarand
<jerome.vuarand@gmail.com> wrote:
> the filter). So chaining filters may be of little use, unless the
> upstream filter is more lax (i.e. lets through token chains that
> the compiler wouldn't accept).
That's the basic problem - there are no guarantees that chaining
arbitrary filters will work, and the results would often be
order-dependent. It would very much be a 'caveat emptor' situation.
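To make the order-dependence concrete, here is a rough sketch - not the
real token-filter patch API; tokens are modelled as plain strings and a
filter as a function from a token list to a token list:

  -- Hypothetical filter 1: rewrites the nonstandard token '!=' into '~='.
  local function expand_noteq(tokens)
    local out = {}
    for _, t in ipairs(tokens) do
      out[#out + 1] = (t == "!=") and "~=" or t
    end
    return out
  end

  -- Hypothetical filter 2: a strict filter that rejects anything the
  -- stock compiler would not accept.
  local function reject_nonstandard(tokens)
    for _, t in ipairs(tokens) do
      assert(t ~= "!=", "nonstandard token: " .. t)
    end
    return tokens
  end

  local input = { "if", "x", "!=", "y", "then", "end" }

  -- strict filter downstream of the rewriter: fine, '!=' is already gone.
  reject_nonstandard(expand_noteq(input))

  -- the other order errors, because '!=' reaches the strict filter first:
  -- expand_noteq(reject_nonstandard(input))

So the same two filters either compose happily or blow up, purely
depending on which one runs upstream.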
> While saving bytecode reduces portability, what prevents you from
> re-serializing the output of the token filter in source form?
Nothing, actually. There are a few straight source-to-source macro
systems for Lua already; I've always preferred the direct token-filter
approach because there's no separate compile step, but it's not a bad
way to solve the distribution problem.
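Re-serializing is simple enough in principle - a rough sketch, assuming
the filtered tokens have already been reduced to plain strings (the real
filter hands you rather more structure than that):

  local function tokens_to_source(tokens)
    return table.concat(tokens, " ")
  end

  local filtered = { "local", "x", "=", "1", "+", "2" }
  local src = tokens_to_source(filtered)   --> "local x = 1 + 2"

  -- the result is ordinary, portable Lua source, not version-specific
  -- bytecode, so it can be written out and shipped as-is.
  local chunk = assert(loadstring(src))    -- Lua 5.1; load() on later versions
  chunk()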
One thing that doesn't get carried over with a source translation is
the original line numbers.
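One workaround, assuming each token still carries the line the lexer saw
it on (a {value=..., line=...} shape, which is my assumption here, not
the filter API), is to pad the output with newlines so every token lands
back on its original line:

  local function tokens_to_source_keeping_lines(tokens)
    local out, line = {}, 1
    for _, t in ipairs(tokens) do
      while line < t.line do        -- pad up to the token's original line
        out[#out + 1] = "\n"
        line = line + 1
      end
      out[#out + 1] = t.value .. " "
    end
    return table.concat(out)
  end

Then errors in the generated file at least point at the same lines as
the original, which a naive concatenation of tokens would lose.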
steve d.