On 6/23/16 3:10 PM, Martin wrote:
> On 16-06-23 04:48 AM, Viacheslav Usov wrote:
>> 1. (More important) The reference manual should point out that {...} is the best way, overall, to deal with variadic arguments, and select is deprecated.
>
> You're ignoring memory allocation. As I understand it, select(i, ...) requires a constant amount of memory, while {...} allocates a table and consumes memory proportional to the length of "...". For example, the World of Warcraft Lua API has many functions that return dozens of results rather than a single table of results, perhaps because returning multiple values is faster in C and doesn't create a table that occupies memory until the next garbage collection. Also, if your addon needs only a couple of those results, you're better off using select(), again to avoid creating a temporary table.

(Lua-L doesn't always like my mailhost, so I'll respond directly to you.)

Since I was heavily involved in the WoW UI community at the time this change was made, I can comment on your observation. The shift from tables to varargs wasn't really about the cost at creation time; it was about avoiding the buildup of garbage, since GC passes could happen at inopportune moments, resulting in a poor player experience. Because the WoW UI is an environment essentially running dozens (hundreds, in some extreme cases) of independent small applications, including some from Blizzard that needed to be protected from the others, the normal "single application" approaches to avoiding GC (shared result tables, work-table re-use) didn't really apply.

A secondary "benefit" of varargs from a security perspective is that they don't actually exist as objects, so one cannot capture a reference to a vararg object and then manipulate it later.

So, accepting the (slight, in most cases) additional overhead of working with varargs to get predictable stack-based allocation and cleanup was preferable to using tables and having to deal with unpredictable GC.
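To make the allocation difference concrete, here is a minimal sketch. UnitInfo() is a hypothetical stand-in for a WoW-style API call that returns several values; it is not a real API function.

```lua
-- Hypothetical stand-in for a WoW-style API call returning many values.
local function UnitInfo()
  return "name", "realm", 70, "Mage", 12345
end

-- Table approach: allocates a fresh table on every call; the table
-- becomes garbage as soon as you are done with it.
local t = { UnitInfo() }
local level = t[3]

-- select() approach: no table is created; the results stay on the
-- stack and only the wanted value is kept.
local level2 = select(3, UnitInfo())

assert(level == level2)  -- both are 70
```

Note that select(3, ...) returns everything from the third result onward; assigning it to a single local keeps only the first of those values.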
Certainly one can't avoid garbage collection entirely, but not allocating tons of small tables many times a second greatly reduced the frequency and size of the pauses. In practice, for performance-sensitive code making heavy use (or re-use) of the result of a particular UI call, it was generally better to stick it in a table (ideally a re-usable scratch table), though the exact sweet spot was something you had to determine for each situation.

Daniel (Iriel).
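The "re-usable scratch table" pattern mentioned above can be sketched as follows; fillScratch is an illustrative helper, not part of any real API. One table is recycled across calls instead of allocating a new one each time, so no garbage builds up:

```lua
-- One long-lived table, recycled on every call.
local scratch = {}

-- Copy a vararg list into the scratch table, clearing any leftovers
-- from a previous, longer result list.
local function fillScratch(...)
  local n = select("#", ...)
  for i = 1, n do
    scratch[i] = select(i, ...)
  end
  for i = n + 1, #scratch do
    scratch[i] = nil
  end
  return scratch, n
end

local t, n = fillScratch("a", "b", "c")
assert(t[2] == "b" and n == 3)

t, n = fillScratch("x")          -- same table, old entries cleared
assert(t[1] == "x" and t[2] == nil and n == 1)
```

As the post notes, this single-application trick assumes no untrusted code can grab a reference to the shared table, which is exactly the guarantee the WoW environment could not make between addons.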