Hi there,

>> If you use the wrong get() function ... lua throws a memory exception.

LHdF> Yes, I've already noticed that. It's in my TODO list. I think there are
LHdF> also GC issues. Anyway, that code is experimental code at best.

I think I should read the docs more thoroughly, especially when they
come with something that is marked as experimental.

Nonetheless, I've experimented further with the experimental stuff,
writing something vaguely similar to Scheme's syntax-rules, i.e.
something that allows one to extend Lua's syntax, but not (or only to
a much lesser degree) change it. It seems to work reasonably well.
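
For anyone who has not played with the patch: as far as I can tell,
FILTER receives the token source and returns the lexer function that
is used in its stead. A do-nothing filter then looks roughly like this
(the exact signatures are my reconstruction from experimenting, so
take them with a grain of salt):

    -- sketch of a pass-through filter; I am assuming that get() yields
    -- a token type plus its value, which may not match the patch exactly
    function FILTER(get)
      return function()
        local token, value = get()  -- pull the next native token
        return token, value         -- and forward it unchanged
      end
    end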

I still think this token filtering mechanism is brilliant. It does not
give one as much control as a preprocessor tied into the parser would,
but for a simple preprocessor you can get away with minimal knowledge
of the Lua grammar. A few things I found along the way:

You cannot define a token filter in the same file you use it in, as
the filter has to be installed by calling a function, and functions
only ever run after the current file has been completely parsed.
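
In other words, this cannot work in a single file (same assumed
interface as above):

    function FILTER(get)   -- only installed when this chunk *runs*...
      return function() return get() end
    end
    -- ...but everything below here was already parsed with the stock
    -- lexer, so extended syntax at this point is still a syntax error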

The same goes for require(): you cannot define a token filter in a
file, require that file, and then use the new syntax in the requiring
file, since require() only evaluates the loaded file at run time,
after all parsing has finished.
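
So this, too, is ineffective:

    require("mynewsyntax")  -- hypothetical module installing FILTER; it
                            -- only runs after this file has been parsed,
                            -- so extended syntax below it still fails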

Using -l works, though, as it actually runs the files it loads before
the main script is parsed.

For the same reasons outlined above, luac would also not be able to
make use of anything implemented using the token filter.

Because of this, using syntactic extensions currently requires calling
your program like so: lua -l mynewsyntax myfile.lua (worse if you
separate the underlying mechanism from the actual rules). This can get
a bit tedious, but is not really a biggie.
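
For concreteness, the layout I use looks more or less like this (file
names arbitrary, same assumed interface as above):

    -- mynewsyntax.lua: installs the filter as a side effect of being
    -- loaded via -l, i.e. before myfile.lua gets parsed
    function FILTER(get)
      return function()
        return get()  -- identity; real rules would rewrite tokens here
      end
    end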
However, I wonder if it might be a sensible idea to change require
such that it evaluates the loaded file as soon as the call is parsed,
rather than at run time. This would make using stuff that builds on
the token filter much easier. However, I see no other need for such a
change, and it would probably break luac.

Another undesirable feature I found: if there is a runtime error in
the token filtering code (which includes the conversion code for all
custom syntactic extensions), Lua more often than not exits silently.
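
As a stopgap, I now run the filter body protected and report errors
myself, along these lines (again assuming the interface sketched
above):

    function FILTER(get)
      return function()
        -- pcall the pull-and-transform step so that an error in the
        -- filter is at least reported instead of vanishing silently
        local ok, token, value = pcall(get)
        if not ok then
          io.stderr:write("token filter error: ", tostring(token), "\n")
          os.exit(1)
        end
        return token, value
      end
    end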

Next, if I may, I'll make a few remarks on the todo list:

A word about the core dump: FILTER actually does seem to be called
again after a syntax error, yielding a fresh lexer function. The core
dump occurs when the code calls the previous, now invalid, lexer
function. Unless you do that, all is (or at least seems to be) fine.
If an invalidated lexer function simply returned nil, everything would
be fine as well.

The points concerning a full-blown macro facility are not that
pressing, IMO. It is not too hard to build one yourself on top of the
token filter mechanism.

Also, I'm not quite sure what the INIT and FINI functions are supposed
to do. I am also not convinced that there is an actual need for extra
callbacks: initialisation is neatly signalled by receiving a lexer
function from the call to FILTER, and the need for finalisation is
signalled by the lexer function returning an eof token (or another
lexer function...).
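
That is, a filter can handle its own setup and teardown along these
lines (the '<eof>' spelling is an assumption on my part):

    function FILTER(get)
      local state = {}               -- per-chunk initialisation
      return function()
        local token, value = get()
        if token == "<eof>" then
          state = nil                -- per-chunk finalisation
        end
        return token, value
      end
    end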

To sum it all up:

After implementing a somewhat complex token filter based solely on the
FILTER function, I feel that this is all that needs to be exposed to
Lua (renamed, but I already mentioned that earlier on...). I would
really like to see this become a standard Lua feature, as it opens up
a lot of possibilities with a rather small change to the core.

Finally, I hope that somebody out there is interested in this;
otherwise I would just be spamming the list with enormous posts... ;-)

Have a nice day,

Gunnar