anti-résumé, 2015

(Not a portrait of me by Frederic Leighton.)

Name: Not Clive James, Not Robin Hanson

Age: Missed most of the Twentieth Century

Address: Almost nowhere, really.

Nationality: Not a great deal. I don't participate in Scottish culture any more than I have to by merit of enculturation.

Languages: I can't speak Gaelic, Spanish, Cornish, Saxon, nor many, many others...

Non-interests: Sport, war, conlangs, Tarski, collecting anything, scale-modelling, Dr Who, surfing, bell-ringing, spelunking.

Education lapses:
At Secondary level:
No economics, no philosophy, no gender, no business studies, no psychology, no politics, no French, no Chinese, no grammar (properly). I'm also quite bad at geography.

Tertiary: No law, no anthropology, no engineering, no geology, no medicals, no German philosophy to speak of.

Code: I don't know any functional languages (unless you count the pariah JavaScript). I've never used a static code analyser, or rigorous optimisation without one. I have never proved my program. I have not yet contributed to the mainline of FOSS.

I've made definitive surveys of no field - nor any phenomenon, physical, cultural, or other. I know nothing of Nussbaum, I am ignorant of Avenarius, I haven't a clue about Conway. I've contributed nothing to either the mainstream or the dissenting schools of economic thought.

I've never read Hemingway, Goethe, Brookner, Isherwood - nor anything from the Harlem Renaissance (nor much of the Euro Renaissance) - nor Houellebecq, Duras, Vargas Llosa, Thackeray, Musil, Bainbridge, Naipaul, that Girl With the Dragon stuff, Zola, Behn, Updike, Wolfe, Richardson, Barth, Byatt, Bellow, Brecht, Kazantzakis, Paz - almost no Classical stuff - Smollett, Wharton, Trollope, Nin, neither Amis, Eco, Roth, Coetzee, Tóibín, no Christie or Hammett, nothing I could afterwards identify as "chicklit" (except Austen?) and, despite heroic efforts, I have not yet successfully climbed a Pynchon.

There are also a vast number of things I do not know that I don't know about. (I imagine.)

References who don't know me at all: Nassim Nicholas Taleb, Bruce Campbell, Johann Hari, Deirdre McCloskey, Abdul Fattah Younis al Abidi, John Worrall, Guo Qian, James Corden.


When we compete for jobs, or lovers, or whatever, there's pressure on us to distort upwards, to portray ourselves as positively as possible (if not more so). We are unable to disclose everything about ourselves - it'd take too long - so selection is obviously not wrong in itself. But there's a kind of anxiety that our approach to our CVs is mirrored in our approach to life.

It becomes the job of rivals to point out our limitations and lacks; being clear about negative facts about oneself - a prerequisite for avoiding delusion - is seen as unnecessary modesty.

The scepticism that might save us gets eroded by all kinds of things: by the market operation of selling oneself, by our epistemic frailty, and above all by psychological foibles. Nassim Taleb has spent his life pointing out that this leads to a dangerous fragility in our theories and lives; all the way down to our metaphysics, all the way back to the MRCA.
It is useful and impressive to be able to list one's strengths. To have an eye on one's weaknesses is noble.


oh god the data miscellany

The reason to use quantitative methods wherever possible* is not that numbers are generally better representations — they're not, for psychological or social or art phenomena — but because of what they do to your method: first, they minimise the space that our raging biases get to act in; and, more, because they force the enquirer to think clearly about the Thing.

("What about this Thing can be counted?" implies the prior questions "What are the distinct features of the Thing?" and "From which of its features arise which features?")

* And they are possibly always possible.


The quantitative omits most of the lived world; the qualitative includes all of its bullshit. (The former is leashed to one rich dimension, numbers; the latter is leashed to an insensitive kludge, human perception and language.)


In the course of learning to code, one faces up to metaphysics; even trivial programs require a solid grasp of the differences between type, token, sub-type, condition, participation, inherence and all that good stuff.

I admit that not many students would benefit from having object-oriented ideas rendered in old conceptual dress. But I really would've, so:



Alisdair wouldn’t stay after that... there was nothing more inhospitable than leaving your television on in front of a visitor.

- Iain Crichton Smith, who died before smartphones


Even in Java – which is not an explicitly philosophical programming language * – there are five different sorts of nothingness:
  1. zero (that is, a variable of type int with value 0),
  2. nulls (that is, an address that points nowhere),
  3. uninitialised^ (that is, a variable with spooky undetermined contents),
  4. 'void' (that is, a placeholder telling you to expect no results)
  5. and Void (some clothing for the small-v void).

* Compare Haskell or Lisp if you can, which I can’t.

^ Only local variables are ever left uninitialised in Java, and compilers don't let them through without a value. But still the nought is there.
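The five sorts can be made to sit in one file. A minimal sketch (class and variable names are mine, not canon):

```java
// Java's five sorts of nothingness in one place. Names are my own invention.
public class Nothings {
    static int zero = 0;              // 1. zero: an ordinary int value, not an absence at all

    static String nowhere = null;     // 2. null: a reference that points nowhere

    static void noResult() {          // 4. void: a promise to return nothing
        int uninitialised;            // 3. uninitialised: spooky undetermined contents --
        // System.out.println(uninitialised); // the compiler refuses to let you look
    }

    static Void bigVoid() {           // 5. Void: clothing for the small-v void;
        return null;                  //    its only inhabitant is null
    }

    public static void main(String[] args) {
        System.out.println(zero);       // prints 0
        System.out.println(nowhere);    // prints null
        noResult();
        System.out.println(bigVoid());  // prints null
    }
}
```

Note that sorts 2 and 5 collapse at runtime: the only value a `Void` variable can hold is `null`, which is the point of the joke.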


We argue that we should accept the fact that we live in a qualitative world when it comes to [global] natural processes. We must rely on qualitative models that predict only direction, trends, or magnitudes of natural phenomena, and accept the possibility of being imprecise or wrong to some degree.

- Orrin Pilkey

When did we* start to view qualitative data as lesser? When we finally became able to properly collect and analyse number representations rapidly and en masse?

(There is a rearguard of Classical empiricists, who point out that science needs the qualitative because that is what the most direct, non-mediated data is made of. While I'm not sure what to make of them, they never refer to biases, or calibration, or computation, or any of these things that modern science is totally constructed from...)

* Assuming we're all part of the logocentric analytical élite here (or at least the Bayesian Conspiracy).


Extensive background radiation studies by IBM in the 1990s suggest that computers typically experience about one cosmic-ray-induced error per 256 megabytes of RAM per month.

- 1996 source

...[so] if you have 4 GiB of memory, you have [a] 96% chance of getting a bit flip in three days because of cosmic rays.

- 2010 source

In modern devices*, much data corruption is the result of cosmic rays from spaaaace. You can solve this by increasing the ‘critical charge’ of the device, so that fewer rays are strong enough to mess up the bits. Trouble is, that move consists in decreasing the power efficiency of every single operation, of which there are now billions per second. So the greener you make your device, the more vulnerable to the universe it becomes.
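The two quoted rates don't quite agree, by my arithmetic. A back-of-envelope Poisson check, under my own assumptions (the 1996 rate of one flip per 256 MB per month, a 30-day month, independent events):

```java
// Poisson sketch: probability of at least one bit flip, from the 1996 IBM rate.
// The 30-day month and the independence assumption are mine.
public class BitFlips {
    public static double pAtLeastOneFlip(double gib, double days) {
        double flipsPerMonth = (gib * 1024.0) / 256.0;   // 1 flip / 256 MB / month
        double lambda = flipsPerMonth * (days / 30.0);   // expected flips in the window
        return 1.0 - Math.exp(-lambda);                  // P(at least one event)
    }

    public static void main(String[] args) {
        System.out.println(pAtLeastOneFlip(4.0, 3.0));   // ~0.80, not the quoted 96%
    }
}
```

4 GiB is sixteen 256 MB blocks, so λ = 1.6 flips over three days and the chance of at least one is about 80%; the 2010 source evidently assumed a higher per-byte rate than the 1996 one.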

(While we're on trade-offs in formal matters: the above electrical trade-off is nothing compared to the mathematically inexorable choice involved in how much you let your properly formal system do: you can have generality or you can have consistency and provability.)

* Legendarily, in pre-modern devices hardware failures were just one of the hourly explosions or insects in the 'tubes.

Try to live at or below sea level, or work under a mountain à la NORAD. You might also try to live in the lower floors of high-rise buildings, but avoid being near windows.

- expert advice


Goethe was much closer to reality than Hegel... We would rather choose
Goethe as the man who knew better life in its entirety and the numerous
relationships within it.

- a particularly eccentric passage in
a superficially dry book about nonlinear functions


Dumb koans from learning to code:

  • In order to understand recursion, you must first understand recursion.

  • The threads share a heap.

  • Objects is instances is structures is clusters.

  • An abstract class can never have instances of itself.

  • Pointers are in the stack; primitives in the stack.

  • With us, recursion grows the stack.

  • The Tower of Hanoi is a Stack.

  • Always put thread sleep in a try-catch.

  • There is no escaping the loop invariant.

  • If no one knows where you are, you get deleted.

  • Make things final if you can.


All that is needed for evil to triumph is for good men
to respond rationally to incentives.

- Misha Gurevich


L just finished working as an exam invigilator. She says Comp Sci students are the most adept at filling in the front of their scripts, Economics students the absolute worst. Any thoughts, as you’re a man from both sides?

What the two fields have most in common is the very mixed blessing of sheer job-marketability, which obviously draws a certain sort. (Though also pragmatists like the EAs.)

There’s weak evidence that economists are already kinda scummy when they enrol. Nothing to suggest that studying it changes one, except our prejudices. There’s a perception that this slime is intellectual as well as moral, but I couldn't say; all my tutorials were about as hollow, demoralising and eristic as each other, whether the subject was philosophy, economics, history, or English.

CS students are a slim majority of this kind of opportunist, but 1) at least they have the integrity to try and succeed via the manipulation of objects rather than people; 2) there are also many real engineer-philosophers with really new solutions to really profound questions. I have learned so much here.

Ben Shneiderman (2011),
treemap of World Bank new-business data


Table of Conceptual conversions

in converting between maths, metaphysics, and object-oriented code:


Metaphysics | Maths ^ | Code | Example | Gloss
Type * | Set ** | Class | Apples | A description, sufficiently generalised to fit every member of a set.
Token | Element | Object | An apple | An instance that satisfies that description.
Subtype | Subset | Subclass | Granny | A more specialised description, covering only some of the first set.
Complex idea | Family | Inner class | Apple with a teratoma? | A set which includes another set as member.
Relation | Relation | Association | e.g. between an apple and Newton's lying mouth | Logical connection of any sort.
Essence | Condition | Interface implementor | Apples implement Sine qua non | ?
? | Domain and ? | Type | ? | Spec of valid input and output values.
Noumenon | ? | Model | Der Apfel-an-Sich. | The entities as they are themselves, beyond our view. (Sometimes primary properties though.)
'Structure' | FORM | Controller | Eh. | The formal shape and relations of a system - what the noumenal and phenomenal share.
Phenomenon | Particular expression? | View | The cuticle? No; the secondary properties. | What is perceived of an entity.
? | Function | Pure function | "Apples can be mapped to oranges via the following CRISPR program..." | A machine for pairing inputs with outputs.
Universal | Abstraction | Abstraction | "The apple: a noble fruit." | Zoomed-out view of some collection; defining properties shown.
Concept | Boolean function | Classifier | "Is the star apple an apple? No; not of the genus malum." | A process for identifying kinds of things.
Supervenience | Asymmetric ? | ? | An apple supervenes on the electrostatic forces of its constituents.^^ | ?
Property | Variable | Attribute | Apples can be clones. | Something about a thing.
Inherence | Predication | Attribution *** | This apple is a ? | Changing the nature of a thing (or only making an assertion about it (...))
Individuation | Evaluation | Assignment | Billionth clone of the Orange Pippin cultivar (UK NFC 2000-008). | To give a definite meaning to some ?
"A mere ?" | ? | Aggregation | The apple has ? | Relation such that one thing (apple) contingently 'has' a set of other things (carbon, etc).
? | ? | Composition | Apple has an endocarp, mesocarp and exocarp. | Strongest aggregation: parts are fully contingent on the whole.
Primitive | Primitive | Primitive | "Why? Why? Why? Why?" | Thing which can't be defined in terms of other things. Or isn't.
Stage theory | ? | 'Immutable' | ? | Nothing survives change; all changed things are wholly new objects.
^ Should probably have a separate column for Logic but bleh.

* Or Platonic Form, or Class, or Kind, or 'Universal' for various subtle gradations of the thought not relevant here. I should also relate or conflate the two rows beginning "Type" and "Universal" but I tire of my own hubris.

** This should maybe be "Class" too. But I haven't seen anyone worrying about Russell's Paradox in Java, so frankly whatevs.

*** I mean "adding an attribute", giving a class a field it didn't have before...

^^ and so ultimately supervenes on Wolfgang Pauli.

Wanted to include mass expressions in here somewhere because it's a cool thought, but.
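A few rows of the table rendered literally, as promised above. A sketch only: all the names are mine, and each comment flags which row it is playing at.

```java
// Some rows of the conversion table, acted out in Java. All names invented here.
interface SineQuaNon {}                       // Essence as interface

abstract class Fruit {                        // Universal as abstract class:
    abstract String describe();               //   it can never have instances of itself
}

class Apple extends Fruit implements SineQuaNon {  // Type as class
    boolean isClone = false;                  // Property as attribute
    class Teratoma {}                         // Complex idea as inner class
    String describe() { return "a noble fruit"; }
}

class GrannySmith extends Apple {             // Subtype as subclass
    String describe() { return "a more specialised description"; }
}

public class Orchard {
    public static void main(String[] args) {
        Fruit token = new GrannySmith();      // Token as instance; Individuation as assignment
        System.out.println(token.describe());
        System.out.println(token instanceof Apple);  // Concept as classifier: prints true
    }
}
```

The `instanceof` line is the Concept row in miniature: a Boolean test for membership of a kind, no genus malum required.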