Our system uses a lot of randomly generated uint64s that can't be
truncated, and while we have a UINT64 class, using it is a pain: the
values are too big to be numeric literals, it complicates the C
bindings, etc.
It occurs to me that long double (with its 64-bit mantissa on x86) is
big enough to represent a UINT64 without truncation.
I'm working through the changes right now. I think the conversion on
input is working, but some locations in the code are down-casting
lua_Number before printing it. I'm ferreting them out, and I'll post
the diffs when I get it working, but I could use some advice:
- is this a bad idea that I should forget about? will it make it more
difficult down the road to integrate 3rd party lua extensions, for
example?
- can anybody suggest some simple benchmarks to get a feel for what,
if any, impact this has on general (non-numeric-intensive) lua
performance? I was thinking of using life.lua and stubbing out the io.
- Would extending the numeric literal syntax to include hex
initializers be a bad idea? I'd really like 0xffff to be a valid
number in lua; decimal isn't so readable for the kind of stuff we do a
lot of.
- Does that fast rounding trick have an equivalent asm for long
double?
Sam