In practice, using maxn is almost always a mistake. It has inferior performance (and the bigger your table is, the worse the gap gets), and any design where you care about the difference between a sequence and a non-sequence but don't know in advance which you're looking at is dubious at best. Seriously, if you're dealing in sequences, then you ought to know you're dealing in sequences, at which point you should be using #t for the performance. And if you're dealing in arbitrary tables that may or may not be sequences, you shouldn't be trying to treat them as sequences at all, but rather explicitly tracking a length or something.
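For concreteness, a quick Lua 5.1 sketch (table.maxn was removed in 5.2) of how the two behave once a table stops being a sequence; note that the result of # on a table with a hole is implementation-defined, since either border is legal:

    local t = {1, 2, 3}
    t[5] = 5                 -- creates a hole at index 4; no longer a sequence
    print(#t)                -- 3 or 5: either border is a valid result of #
    print(table.maxn(t))     -- 5, but found by scanning every key: O(n)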
As I’ve said before, all this is true *if* you have control of the entire set of Lua code running. It’s not clear to me how an author of a library/API that consumes a sequence can efficiently ensure that he is passed a valid sequence. Of course, the API could just assume validity and fail in unexpected ways, but I don’t regard this as good API design.
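The only general option I can see is an explicit O(n) check on entry, which is exactly the cost in question. A minimal sketch, assuming the usual working definition that a sequence's positive integer keys are exactly 1..n (the is_sequence name is just illustrative, not any standard function):

    -- Hypothetical helper, not part of any standard library: checks
    -- that the positive integer keys of t are exactly 1..n.
    local function is_sequence(t)
      local n = 0
      for k in pairs(t) do
        -- count the positive integer keys
        if type(k) == "number" and k >= 1 and math.floor(k) == k then
          n = n + 1
        end
      end
      for i = 1, n do
        if t[i] == nil then
          return false       -- a hole inside 1..n: not a sequence
        end
      end
      return true            -- n positive integer keys, all within 1..n
    end

Since that costs a full traversal on every call, presumably a library would only run it behind some kind of debug or paranoid-validation switch.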
—Tim