
oh god the data miscellany



The reason to use quantitative methods wherever possible* is not that numbers are generally better representations — they're not, for psychological or social or art phenomena — but because of what they do to your method: first, they minimise the space that our raging biases get to act in; and, more, because they force the enquirer to think clearly about the Thing.

("What about this Thing can be counted?" implies the prior questions "What are the distinct features of the Thing?" and "From which of its features arise which features?")


* And they are possibly always possible.


****************************************************************************************


The quantitative omits most of the lived world; the qualitative includes all of its bullshit. (The former is leashed to one rich dimension, numbers; the latter is leashed to an insensitive kludge, human perception and language.)


****************************************************************************************


In the course of learning to code, one faces up to metaphysics; even trivial programs require a solid grasp of the differences between type, token, sub-type, condition, participation, inherence and all that good stuff.

I admit that not many students would benefit from having object-oriented ideas rendered in old conceptual dress. But I really would've, so:


[   CONCEPTUAL CONVERSIONS LOOKUP TABLE  ]
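For what it's worth, here are my own guesses at a few rows, rendered as code (these are not the table's actual contents; the class names are mine):

```java
import java.util.ArrayList;
import java.util.List;

// Old conceptual dress -> Java, on my own reading (NOT the table's actual rows):
class Substance { }                      // a type
class Dog extends Substance {            // a sub-type of that type
    String colour = "brown";             // inherence: the colour exists *in* a dog
}
class Pack {                             // participation: dogs take part in a pack
    List<Dog> members = new ArrayList<>();
}

public class Conversions {
    public static void main(String[] args) {
        Dog fido = new Dog();            // a token of the type Dog
        if (fido.colour.equals("brown")) // a condition on the token
            System.out.println("a brown token");
        Pack p = new Pack();
        p.members.add(fido);             // fido now participates in p
    }
}
```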



****************************************************************************************


Alisdair wouldn’t stay after that... there was nothing more inhospitable than leaving your television on in front of a visitor.

- Iain Crichton Smith, who died before smartphones



****************************************************************************************


Even in Java – which is not an explicitly philosophical programming language* – there are five different sorts of nothingness:
  1. zero (that is, a variable of type int with value 0),
  2. null (that is, a reference that points nowhere),
  3. uninitialised^ (that is, a variable with spooky undetermined contents),
  4. 'void' (that is, a placeholder telling you to expect no results),
  5. and Void (some clothing for the small-v void).


* Compare Haskell or Lisp if you can, which I can’t.

^ Only local variables are ever left uninitialised in Java, and the compiler won't let you read one before it has a value. But still the nought is there.
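All five fit in one file. A minimal sketch (the class and method names are mine):

```java
// Java's five nothings in one place.
public class Nothings {
    static int zero = 0;                 // 1. zero: a real int value
    static String nowhere = null;        // 2. null: a reference to no object

    static void nothingBack() { }        // 4. void: "expect no result"

    // 5. Void: the boxed, unobtainable stand-in for void. Useful where
    //    generics demand a type, e.g. Callable<Void>; null is its only value.
    static Void clothedVoid() { return null; }

    public static void main(String[] args) {
        int uninitialised;               // 3. uninitialised: exists, but...
        // System.out.println(uninitialised); // <- the compiler rejects this read
        uninitialised = 0;               // must assign before use
        System.out.println(zero + " " + nowhere + " " + clothedVoid());
        // prints: 0 null null
    }
}
```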



****************************************************************************************

We argue that we should accept the fact that we live in a qualitative world when it comes to [global] natural processes. We must rely on qualitative models that predict only direction, trends, or magnitudes of natural phenomena, and accept the possibility of being imprecise or wrong to some degree.

- Orrin Pilkey


When did we* start to view qualitative data as lesser? When we finally became able to properly collect and analyse number representations rapidly and en masse?

(There is a rearguard of Classical empiricists, who point out that science needs the qualitative because that is what the most direct, non-mediated data is made of. I'm not sure what to make of them: they never refer to biases, or calibration, or computation, or any of the things that modern science is constructed from...)


* Assuming we're all part of the logocentric analytical élite here (or at least the Bayesian Conspiracy).


****************************************************************************************

Extensive background radiation studies by IBM in the 1990s suggest that computers typically experience about one cosmic-ray-induced error per 256 megabytes of RAM per month.

- 1996 source



...[so] if you have 4 GiB of memory, you have [a] 96% chance of getting a bit flip in three days because of cosmic rays.

- 2010 source
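The two figures can be checked against each other. A back-of-envelope sketch, assuming (my assumption, not either source's stated model) that flips arrive as a Poisson process at the 1996 rate of one error per 256 MiB per month:

```java
// Does the 1996 rate imply the 2010 probability?
public class BitFlips {
    static double pAtLeastOneFlip(double gib, double days) {
        double chunksOf256MiB = gib * 1024 / 256;             // 4 GiB -> 16 chunks
        double expectedFlips = chunksOf256MiB * days / 30.0;  // ~1.6 flips in 3 days
        return 1 - Math.exp(-expectedFlips);                  // P(at least one flip)
    }

    public static void main(String[] args) {
        System.out.printf("%.2f%n", pAtLeastOneFlip(4, 3));   // prints 0.80
    }
}
```

Under the 1996 rate this gives about 80%, not 96%; the 2010 figure implies roughly double that rate. Same order of magnitude, at least.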


In modern devices*, much data corruption is the result of cosmic rays from spaaaace. You can solve this by increasing the ‘critical charge’ of the device, so that fewer rays are strong enough to mess up the bits. Trouble is, that move consists in decreasing the power efficiency of every single operation, of which there are now billions per second. So the greener you make your device, the more vulnerable to the universe it becomes.

(While we're on trade-offs in formal matters: the above electrical trade-off is nothing compared to the mathematically inexorable choice involved in how much you let your properly formal system do: you can have generality or you can have consistency and provability.)


* Legendarily, in pre-modern devices hardware failures were just one of the hourly explosions or insects in the 'tubes.



Try to live at or below sea level, or work under a mountain à la NORAD. You might also try to live in the lower floors of high-rise buildings, but avoid being near windows.

- expert advice



****************************************************************************************

Goethe was much closer to reality than Hegel... We would rather choose Goethe as the man who knew better life in its entirety and the numerous relationships within it.


- a particularly eccentric passage in
a superficially dry book about nonlinear functions


****************************************************************************************


Dumb koans from learning to code:

  • In order to understand recursion, you must first understand recursion.

  • The threads share a heap.

  • Objects is instances is structures is clusters.

  • An abstract class can never have instances of itself.

  • Pointers are in the stack; primitives in the stack.

  • With us, recursion grows the stack.

  • The Tower of Hanoi is a Stack.

  • Always put thread sleep in a try-catch.

  • There is no escaping the loop invariant.

  • If no one knows where you are, you get deleted.

  • Make things final if you can.
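Two of the koans can be made literal. A minimal sketch (my own, not from any course):

```java
public class Koans {
    // "Always put thread sleep in a try-catch": sleep() declares
    // InterruptedException, so the compiler forces you to face interruption.
    static void nap(long millis) {
        try {
            Thread.sleep(millis);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // restore the flag, per convention
        }
    }

    // "In order to understand recursion, you must first understand recursion":
    // the definition of factorial contains factorial.
    static long factorial(long n) {
        return n <= 1 ? 1 : n * factorial(n - 1);
    }

    public static void main(String[] args) {
        nap(10);
        System.out.println(factorial(5)); // prints 120
    }
}
```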




****************************************************************************************

All that is needed for evil to triumph is for good men
to respond rationally to incentives.

- Misha Gurevich



****************************************************************************************

L just finished working as an exam invigilator. She says Comp Sci students are the most adept at filling in the front of their scripts, Economics students the absolute worst. Any thoughts, as you’re a man from both sides?

What the two fields have most in common is the very mixed blessing of sheer job-marketability, which obviously draws a certain sort. (Though also pragmatists like the EAs.)

There’s weak evidence that economists are already kinda scummy when they enrol. Nothing to suggest that studying it changes one, except our prejudices. There’s a perception that this slime is intellectual as well as moral, but I couldn't say; all my tutorials were about as hollow, demoralising and eristic as each other, whether the subject was philosophy, economics, history, or English.

A slim majority of CS students are this kind of opportunist, but 1) at least they have the integrity to try and succeed via the manipulation of objects rather than people; 2) there are also many real engineer-philosophers with really new solutions to really profound questions. I have learned so much here.



[Figure: Ben Shneiderman (2011), treemap of World Bank new-business data]

