- Subject: RE: Lua presentation at XGDC3.0
- From: Mathew Hendry <Mathew.Hendry@...>
- Date: Wed, 19 Sep 2001 11:03:44 +0100
> From: Joshua Jensen [mailto:jjensen@workspacewhiz.com]
>
> I concur with your "Lua vs. Rolling-Your-Own" slide. If I were to
> write a language, it would almost be Lua. :) The only things I'd
> change are:
>
> ...
>
> 2) I'd change the default number type from double to float. Yeah,
> yeah, I know I can #define it, but the truth is, it doesn't work in
> Lua 4.1 Alpha. :) (Amped crashes...)
I've been looking at this recently with v4.0, and it seems it would be
rather awkward. I haven't delved too deeply yet, but the Number typedef is
used for both floating-point and integer values in struct Value - and a
32-bit float cannot represent every 32-bit int exactly, since it has only a
24-bit significand. What you may be seeing is "collisions" between nearby
integer values above 2^24.
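For illustration (this snippet is mine, not from Lua itself), a minimal C
program demonstrating the collision; 16777217 (2^24 + 1) is the first
integer a 32-bit float cannot hold:

    #include <stdio.h>

    int main(void) {
        int a = 16777216;               /* 2^24     */
        int b = 16777217;               /* 2^24 + 1 */
        float fa = (float)a;
        float fb = (float)b;            /* rounds back down to 2^24 */
        printf("%d -> %.1f\n", a, fa);  /* 16777216 -> 16777216.0 */
        printf("%d -> %.1f\n", b, fb);  /* 16777217 -> 16777216.0 */
        printf("fa == fb: %s\n", fa == fb ? "yes" : "no");  /* yes */
        return 0;
    }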
Having everything as double also affects performance as well as footprint:
on the PS2, one of the platforms I'm working with, double precision has to
be emulated in software.
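Roughly, because a double is twice the size of a float, every tagged value
slot pays for it. A simplified mock-up (these are hypothetical structs in
the spirit of Lua's TObject, not the actual 4.0 definitions):

    #include <stdio.h>

    struct ValueD { int tag; double n; };   /* default Number = double  */
    struct ValueF { int tag; float  n; };   /* Number #defined to float */

    int main(void) {
        /* Exact sizes depend on padding and ABI, but the double-based
           slot is typically twice as wide (e.g. 16 vs. 8 bytes). */
        printf("double slot: %u bytes\n", (unsigned)sizeof(struct ValueD));
        printf("float slot:  %u bytes\n", (unsigned)sizeof(struct ValueF));
        return 0;
    }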
-- Mat.