On Tue, Apr 03, at 03:25 Oliver Kroth wrote:
On 03.04.2018 at 15:02, Ahmed Charles wrote:
Written before I was born, EWD831
(http://www.cs.utexas.edu/users/EWD/transcriptions/EWD08xx/EWD831.html)
describes, better than I can, why 0-based ranges are superior. It has
nothing to do with programming languages and everything to do with math.
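For what it's worth, the gist of EWD831 fits in a few lines. Here is a
minimal sketch in C, purely for illustration (this thread is about Lua, and
the names below are made up, not taken from the mails): with a 0-based,
half-open range [lo, hi) the length is just hi - lo, the empty range needs
no special case, and splitting a range needs no +1/-1 fixups.

    #include <stdio.h>

    /* Sketch only: 0-based, half-open ranges [lo, hi) as in EWD831.
       Length is hi - lo, the empty range is lo == hi, and adjacent
       ranges [a, b) and [b, c) meet with no overlap and no gap. */
    int main(void) {
        int tags[] = {101, 102, 103, 104};            /* four arbitrary items */
        int n = (int)(sizeof tags / sizeof tags[0]);

        /* 0 <= i < n: no +1 or -1 adjustments anywhere */
        for (int i = 0; i < n; i++)
            printf("item %d of %d has tag %d\n", i + 1, n, tags[i]);

        /* splitting [0, n) at any mid gives [0, mid) and [mid, n);
           the two lengths add up to n exactly */
        int mid = n / 2;
        printf("%d + %d = %d\n", mid, n - mid, n);
        return 0;
    }

With the closed 1-based form a..b, the same facts need b - a + 1 for the
length and a convention like b < a for the empty range, which is exactly the
kind of asymmetry EWD831 complains about.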
Actually, Dijkstra just prefers to have items counted as 0,...,N-1 instead
of 1,...,N. Odd.
When I count things, let's say cows, I start with the first cow and say
"one", the second cow "two", the third cow "three"...
I would not start with the first cow and say "zero"; I say "zero" when there
is no cow at all.
A zeroth cow is the cow in the empty corral, and is actually "not a cow"...
I'm going to be another one that bites the dust here.
So, let's say that we could talk with a just-born human child and asked him:
how old are you?
The answer is not one, but it is not zero either. He actually is zero plus an offset.
Back in the old times we learned to say that we are __at__ [age], or walking
at [age].
In this regard, in a humanish way of thinking and without second
thoughts, Lua got it right. But I do not think that machines think the
same way.