- Subject: Re: What's up with token filters (Re: New operators?)
- From: Asko Kauppi <askok@...>
- Date: Sat, 14 Apr 2007 10:46:12 +0300
I think Lua parsing happens stream-wise (which is Very Good): the
complete source does not need to be there before parsing of the
program can start.
Where such a difference counts is, e.g., when sending data (or code)
over a (slow) network connection.
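As a small illustration of that stream-wise behaviour: Lua's load() accepts a reader function that it calls repeatedly for the next piece of source, so compilation can begin before the whole chunk has arrived. The "network" below is just a table of fragments, for the sake of example.

```lua
-- Simulate source arriving in pieces (e.g. over a slow socket).
local pieces = { "local x = ", "1 + 2\n", "return x" }
local i = 0

-- load() pulls fragments one at a time via the reader function;
-- returning nil signals end of stream.
local chunk = assert(load(function()
  i = i + 1
  return pieces[i]
end))

print(chunk())  -- 3
```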
What I would see as the best fit for the issue is:
- generalizing the lexer so that it works with a 'queue' of lexer
components
(like token filters built on coroutines do now)
- having "lua core" and "lua default sugar" always present in such a queue
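A minimal sketch of what such a queue of lexer components could look like, assuming each component is a coroutine that pulls tokens from the stage before it and yields (possibly rewritten) tokens onward. The token representation and the filter names here are invented for illustration; they are not Lua's actual lexer API.

```lua
-- Base stage: feeds raw tokens into the queue.
local function source(tokens)
  return coroutine.wrap(function()
    for _, t in ipairs(tokens) do coroutine.yield(t) end
  end)
end

-- A "sugar" stage: rewrites the (made-up) token '!=' into Lua's '~='.
local function ne_filter(upstream)
  return coroutine.wrap(function()
    for t in upstream do
      if t == "!=" then t = "~=" end
      coroutine.yield(t)
    end
  end)
end

-- Build the queue: core token source first, sugar filters stacked on top.
local queue = ne_filter(source{ "a", "!=", "b" })
for t in queue do io.write(t, " ") end   -- a ~= b
```

Stacking further filters is just more function composition over the same iterator shape, which is why the default lexer and add-on lexers could share one interface.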
The main point here is that the _default_ lexer would then serve as a
sample of how add-on lexers can be made; they would essentially be the
same.
Whether lexers should produce VM bytecode right away... maybe. My
syntax-aware token filter actually has the _whole_ Lua syntax built in.
Once I had that done, I wondered the same. Basically, what we'd have
ahead of us is an "any syntax" to Lua VM converter.
Weird. But Lua is...? ;)
-asko
Fabien wrote on 14.4.2007 at 0:42:
LHF wrote:
I think that many filters would just need to maintain one (or
several) token queues.
Rather than hacking token queues to translate from one arbitrary
concrete syntax to another, wouldn't you think that the right level
of abstraction is the abstract syntax tree, with some tree
manipulation library, and a generic mechanism for plugging stuff into
the parser during compilation? :)
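To make Fabien's alternative concrete: at the AST level, a "filter" is just a function mapped over tree nodes. The node shapes below are invented for illustration (Metalua uses a similar { tag = ..., ... } convention); this is a sketch, not any existing library's API.

```lua
-- Apply f to a node, then recurse into its table-valued children.
local function map(node, f)
  node = f(node)
  for i, child in ipairs(node) do
    if type(child) == "table" then node[i] = map(child, f) end
  end
  return node
end

-- Example transform: constant-fold additions of two number literals.
local function fold(node)
  if node.tag == "Op" and node[1] == "add"
     and node[2].tag == "Number" and node[3].tag == "Number" then
    return { tag = "Number", node[2][1] + node[3][1] }
  end
  return node
end

-- AST for the expression 1 + 2, in the made-up node convention above.
local ast = { tag = "Op", "add", { tag = "Number", 1 }, { tag = "Number", 2 } }
ast = map(ast, fold)
print(ast.tag, ast[1])  -- Number  3
```

Transforms like this compose by ordinary function composition, without any of the lookahead bookkeeping that token queues need.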