|
It has always seemed to me that the root of the issue is that a sequence is not allowed to have an element whose value is nil. That is odd, because every other part of Lua is happy with nil as a value: locals, upvalues, arguments, return values, and even globals (if you allow that an undefined global reads as nil).

So if you *do* allow nil in a sequence, what are the consequences?

- The definition of a sequence changes (obviously).
- There is no algorithmic way to determine sequence length by examining the table contents.
- So there needs to be a declarative way to set sequence length; hence #t becomes read/write.
- (Probably) every table is also always a sequence, but most/many tables carry an empty sequence (#t == 0).
- Reading #t becomes O(1), since it is just stored state.
- The need for ".n" in table.pack() goes away, and pack/unpack become symmetrical.
- ipairs() respects #t, and there is no longer any discontinuity between length, iteration, and the behavior of __len metamethods when using ipairs() etc.

But...

- How should constructors set the length? What are the lengths of {}, {1, 2}, and {1, 2, [3]=3}?
- Existing shallow table copies will copy the contents but not the length (table.copy(), anyone?).
- Decreasing the length of a sequence would orphan keys; what should happen to those?
- Code relying on ipairs() stopping at nil will break.
- Code relying on the current #t behavior will (sometimes) break.
- Code relying on ".n" from table.pack() will break.
- table.insert()/table.remove() semantics would need revisiting (should they modify the length?).

The last four items are backward-compatibility issues. My opinion, fwiw, is that if you ignored those, the declarative length scheme would be more coherent than the existing one. Once you factor in how much existing code would break, though, things are far less clear.

--Tim
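P.S. For anyone who wants to see the status quo being discussed, here is a small demonstration of current (Lua 5.4) behavior around nil holes: #t is only defined up to "any border", ipairs() stops at the first nil, and table.pack() records the true argument count in ".n" precisely because # cannot be trusted:

```lua
-- Current Lua behavior when a sequence contains nil.
local t = {1, 2, nil, 4}

-- #t may legitimately be 4 or 2 here: with a nil hole, any
-- "border" is a valid result, so the length is not well defined.
print(#t)

-- ipairs() stops at the first nil, regardless of what #t reports:
local count = 0
for i, v in ipairs(t) do
  count = count + 1
end
print(count)  --> 2

-- table.pack() stores the real argument count in the field "n":
local p = table.pack(1, nil, 3)
print(p.n)    --> 3
print(#p)     -- may be 1 or 3; again, not well defined
```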
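And as a rough sketch of the declarative-length idea, here is an emulation one can write in present-day Lua: a wrapper that keeps the length as stored state (so nil elements are unremarkable), with an O(1) __len and an iterator that honors the declared length. The names (Seq, setlen) are mine for illustration, not part of any proposal:

```lua
-- Emulating declarative sequence length in current Lua.
local Seq = {}
Seq.__index = Seq
Seq.__len = function(s) return s.n end  -- reading the length is O(1) stored state

function Seq.new(...)
  -- like table.pack(), except the count *is* the length
  local s = setmetatable({...}, Seq)
  s.n = select("#", ...)
  return s
end

function Seq:setlen(n)
  -- the declarative write to the length; keys beyond n are simply
  -- orphaned, left in place but outside the sequence
  self.n = n
end

function Seq:ipairs()
  -- iteration respects the declared length, visiting nils too
  local i = 0
  return function()
    i = i + 1
    if i <= self.n then return i, self[i] end
  end
end

local s = Seq.new(1, nil, 3)
print(#s)  --> 3 (via __len; a nil at index 2 changes nothing)
for i, v in s:ipairs() do print(i, v) end  -- visits indices 1, 2, 3
```

Of course this only shows the scheme is coherent, not that it is compatible; the breakage points listed above are exactly where such a wrapper and real tables part ways.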