Naming considered MOSTLY harmless

Posted by Venerable High Pope Swanage I, Cogent Animal of Our Lady of Discord 20 June 2014 at 08:10PM

This is a continuation of a thought I briefly mulled over on Twitter, to some reaction. No point in summarizing, as it's only a few characters: "In Clojure I find it best to try and avoid naming anything except for the tools (fns and constants) used to construct a running system."

The reactions were great, and more or less what I expected: people trying to get at why I would say such a thing. What are examples of things I would avoid naming? By naming, do I mean assigning to a Var, or something else? My examples are primarily stateful things; is this meant to cure live-programming woes?

I'll do my best to expound on my reasoning in more detail, become more crisp in defining what I mean, and perhaps persuade you to strive in the vicinity of this same goal.

Fun With Maths

Posted by Venerable High Pope Swanage I, Cogent Animal of Our Lady of Discord 10 June 2014 at 08:13PM

I have lived in Baltimore County, Maryland, exclusively since November of 2004.

In that time I have received only one jury summons; it was for tomorrow, and my call-in number was 365. I ducked getting called, as they only wanted numbers 1 through 225.

Looking at the call-in messages for today and for tomorrow, it appears that ~400 individuals are chosen each day. If one presumes the courts operate only on standard business days, that yields about 248 days of 400 individuals each. Every year, then, roughly 99,200 individuals are randomly summoned to appear as jurors.

Presently the population of Baltimore County is approximately 817,455. If the proportion of summoned jurors to population has held constant over the past ten years (a groundless assumption), then each year an individual has a ~12.13% chance of being summoned (99,200 / 817,455), and consequently a ~87.87% chance of not being summoned in any given year.

The odds of successfully avoiding a summons for 9 years in a row are ~31.22% (.8787^9 ≈ .3122). By purely random chance, one could expect to get a jury summons roughly once every five to six years: after five years the odds of having avoided one are still .8787^5 ≈ .5238, and they drop below 50% in the sixth year (.8787^6 ≈ .4603). The odds of getting summoned two years in a row are ~1.5% (.1213^2 ≈ .0147).
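
For anyone who wants to check the arithmetic, here is the whole calculation spelled out as a quick sketch in Swift. The figures are only the rough estimates above, so the results are rough too:

    import Foundation

    // Figures are the rough estimates from the post.
    let summonedPerYear = 400.0 * 248.0                  // ~99,200 summonses per year
    let population      = 817_455.0
    let pSummoned       = summonedPerYear / population   // ~0.1213
    let pAvoided        = 1.0 - pSummoned                // ~0.8787

    print(pow(pAvoided, 9))    // ~0.312 -- no summons for 9 years running
    print(pow(pAvoided, 5))    // ~0.524 -- still unsummoned after 5 years
    print(pow(pAvoided, 6))    // ~0.460 -- but probably summoned by year 6
    print(pow(pSummoned, 2))   // ~0.015 -- summoned two years in a row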

nil, null, let's call the whole thing off.

Posted by Venerable High Pope Swanage I, Cogent Animal of Our Lady of Discord 04 June 2014 at 01:30AM

Apple has recently announced the Swift programming language, along with language specifications and documents to support it. I'm not particularly interested in learning much about this language, in no small part because it seems like it's tightly coupled to Apple's walled garden. However, I am interested in discussing nil and/or null, which I will call nil from now on, because Swift calls it that, Clojure calls it that, and Ruby calls it that.

Swift promises to disallow nil in a variable except in those places where someone has specifically opted into allowing it, be it a parameter, a variable declaration, or what have you.
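
As a rough sketch of how I read that rule from the published documentation (the names here are placeholders of mine, not anything from Apple's docs):

    var name: String? = nil        // declared optional, so nil is permitted
    // var title: String = nil     // rejected at compile time: a non-optional can never hold nil

    // An optional has to be checked (unwrapped) before the value inside can be used.
    func greet(_ who: String?) -> String {
        if let someone = who {
            return "Hello, \(someone)"
        }
        return "Hello, whoever you are"
    }

    print(greet(name))     // Hello, whoever you are
    name = "Swift"
    print(greet(name))     // Hello, Swift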

Having written zero Swift programs, I'll grant my opinion on this is tangentially valuable at best. But I have written many programs in many other languages, and every one of those languages has had some concept of nil; many of them were birthed well after nil was thoroughly explored, and their designers understood the costs of nil and chose to include it anyway. I'd like to discuss some arguments for why I don't find Swift's approach valuable, and also some other ways languages have approached the problems nil introduces.