- Subject: Table Garbage Collection
- From: "Nick Nugent" <nugent@...>
- Date: Wed, 22 Nov 2006 13:32:11 -0800
I'm trying to wrap my head around some garbage collection basics here,
and I'm having a hard time really getting to the bottom of things. I'm
hoping someone can explain (or point to an explanation for) the
following behavior. For the record, I am running this example with the
Lua 5.1.1 standalone interpreter on Linux.
Consider the following chunk:
-- BEGIN
collectgarbage("collect");
print(collectgarbage("count"));
do
  for i = 1, 1000000 do
    local t = {};
  end
end
print(collectgarbage("count"));
collectgarbage("collect");
print(collectgarbage("count"));
io.read();
-- END
In this case, everything works as I would expect. The output shows
moderate memory growth before the final garbage collection and then a
return to the VM's initial size.
If, however, I replace the contents of the do block with the
following, I see something very different.
-- BEGIN
do
  local g = {};
  for i = 1, 1000000 do
    local t = {};
    g[i] = t;
  end
end
-- END
Now, as I understand it, at the end of this block all the strong
references should be out of scope, and the garbage collector should be
able to reclaim everything I've just allocated. The script output
appears to agree, showing significant growth before the collection and
a return to the initial size afterward. In this second case, however,
inspecting the size of the interpreter's process in the host operating
system shows that very little of the total memory allocated by the
script is ever returned to the OS.
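For what it's worth, here is a rough sketch of how I'm comparing the
collector's internal count with the process size the OS reports. It is
Linux-specific, and the parsing of the VmRSS line in /proc/self/status
is my own assumption about that file's format:

```lua
-- Compare Lua's internal byte count with the OS-visible resident size.
-- Linux-specific sketch: reads the VmRSS field from /proc/self/status.
local function rss_kb()
  local f = io.open("/proc/self/status", "r")
  if not f then return nil end
  local kb
  for line in f:lines() do
    -- VmRSS is reported in kB, e.g. "VmRSS:     1234 kB"
    kb = line:match("^VmRSS:%s*(%d+)%s*kB")
    if kb then break end
  end
  f:close()
  return tonumber(kb)
end

-- collectgarbage("count") returns kilobytes in use by Lua itself.
print("lua count (KB):  ", collectgarbage("count"))
print("process RSS (KB):", rss_kb())
```

The discrepancy between the two numbers is what I'm asking about: the
first drops back after a full collection, while the second stays high.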
Am I missing something glaring in either my script or my analysis?
I've searched fairly extensively online and in the list archives for
an answer, but I haven't found it yet, so I thought I'd ask.
Thanks.
--Nick