On Mon, Jun 16, 2008 at 6:06 AM, Duck <duck@roaming.ath.cx> wrote:
Under Linux (gcc 4.x and associated libraries)
Lua 5.1.3 Copyright (C) 1994-2008 Lua.org, PUC-Rio
> = 0x12345678
305419896
> = -0x12345678
-305419896
> = tonumber(-0x12345678)
-305419896
> return tonumber('-0x12345678')
-305419896
>
Under Windows (MinGW 3.x and associated libraries)
Lua 5.1.3 Copyright (C) 1994-2008 Lua.org, PUC-Rio
> = 0x12345678
305419896
> - 0x12345678
stdin:1: unexpected symbol near '-'
> = -0x12345678
-305419896
> = tonumber(-0x12345678)
-305419896
> return tonumber('-0x12345678')
3989547400
>
The Windows string conversion seems to be "unsigned 32-ing" the converted
number, whereas the Linux one is not. I presume the negation of hex constants
in source code is done by the Lua compiler itself (unary minus applied after
the positive constant has been converted), which is why those results are
consistent on both platforms.
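
For what it's worth, here is how I presume the string path goes wrong (a guess
from reading the stock 5.1.3 sources, so the details below are my assumptions):
luaO_str2d() in lobject.c calls lua_str2number() (strtod() by default) first
and only falls back to strtoul(s, &endptr, 16) when the conversion stops at the
'x'. glibc's C99 strtod() consumes the whole signed hex string itself, whereas
MinGW's (msvcrt) strtod() apparently does not, so on Windows the signed string
ends up in strtoul() and wraps modulo 2^32. A small standalone C demonstration
of that presumed fallback:

  /* demo.c -- what strtoul() makes of a signed hex string when
     unsigned long is 32 bits wide (the presumed MinGW/msvcrt path) */
  #include <stdio.h>
  #include <stdlib.h>

  int main(void) {
    const char *s = "-0x12345678";
    unsigned long u = strtoul(s, NULL, 16);  /* '-' negates in unsigned arithmetic */
    double n = (double)u;                    /* what tonumber() would end up returning */
    /* with a 32-bit unsigned long this prints 3989547400,
       i.e. 2^32 - 305419896; with a 64-bit one it is larger still */
    printf("%lu -> %.0f\n", u, n);
    return 0;
  }
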
I think it can be considered a bug that (de)hexadecimalisation isn't coded
inside Lua itself, which would make hex and decimal numbers equal-class lexical
citizens on all platforms instead of relying on runtime-library vagaries.
Is there some luaconf.h finessing I can do to make this work under MinGW on
Windows?
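
In case it helps to show what I mean, here is the sort of thing I was thinking
of (untested sketch; the helper name hexfriendly_str2number is just my own
placeholder). The idea is to replace the stock

  #define lua_str2number(s,p)  strtod((s), (p))

in luaconf.h with a helper that handles an optional sign before the 0x prefix
itself, so the strtoul() fallback in luaO_str2d() never fires. I guard it with
lobject_c because, as far as I can see, lobject.c is the only user of the macro
and it defines that symbol (and includes <stdlib.h>) before pulling in lua.h:

  #if defined(lobject_c)

  #include <stdlib.h>

  /* signed-hex-aware replacement for strtod(); assumes lua_Number is double */
  static double hexfriendly_str2number (const char *s, char **endptr) {
    const char *p = s;
    int neg = 0;
    while (*p == ' ' || *p == '\t' || *p == '\n') p++;  /* skip leading blanks */
    if (*p == '+' || *p == '-') { neg = (*p == '-'); p++; }
    if (p[0] == '0' && (p[1] == 'x' || p[1] == 'X')) {
      /* convert the unsigned digits, then apply the sign in floating point,
         so no unsigned 32-bit wraparound can occur */
      double d = (double)strtoul(p + 2, endptr, 16);
      if (*endptr == p + 2) *endptr = (char *)s;        /* "0x" with no digits: fail */
      return neg ? -d : d;
    }
    return strtod(s, endptr);                           /* not hex: the usual path */
  }

  #define lua_str2number(s,p)  hexfriendly_str2number((s), (p))

  #else

  #define lua_str2number(s,p)  strtod((s), (p))

  #endif

If there is a cleaner, officially blessed way to get the same effect, I'd be
glad to hear about it.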