In my tests, using the -l loading for token filters was absolutely okay; 'require' not required. ;) I can see the problems arising from a module that relies on some token-modded behaviour, but such modules simply should not be made. I don't think this is a problem in the generic sense: people would use token filters for small syntax "enhancements" easing the application-level work they do, knowing which filters they load in _their_ environment. Modules and redistributable code should use plain Lua only.

Combining different token filters has already been solved (I think you knew), based on lhf's and my code. So has syntax awareness of the filtering (_without_ metalua).

I am _not_ pushing for a premature verdict on the technology; for instance, there need to be guidelines as to when code that uses token-filtered syntax can still be said to be "Lua". Can it ever? Do other languages have this feature available, and how did they avoid (did they?) the potential fragmentation of the source code base? I agree, it truly is a whole new world... Maybe call it something other than Lua, until we map that world?

-asko
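PS. A minimal sketch of what I mean by "combining" filters, assuming tokens are passed around as simple tables; this is not lhf's patch API nor our actual code, just the composition idea: each filter wraps a token iterator and returns a new one, so chaining them is plain function composition.

  -- Hypothetical sketch only: a token here is a table { type = ..., value = ... }.
  -- A filter takes a token iterator (a function returning the next token,
  -- or nil at end of stream) and returns a new iterator.

  -- Chain several filters into one by wrapping the iterator with each in turn.
  local function chain(filters)
    return function(next_token)
      for _, filter in ipairs(filters) do
        next_token = filter(next_token)
      end
      return next_token
    end
  end

  -- Example filter: rewrite a hypothetical '!=' token into Lua's '~='.
  local function bang_neq(next_token)
    return function()
      local tok = next_token()
      if tok and tok.type == "!=" then
        tok = { type = "~=", value = "~=" }
      end
      return tok
    end
  end

  -- Usage: local filter = chain{ bang_neq, some_other_filter }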
Roberto Ierusalimschy wrote on 12.4.2007 at 19:31: