17/04/2014

new start miscellany

'Adrift' (c) Andrew Wyeth (1982)


Talk is cheap; that's what's good about it.


**********************************************************


A friend recently told me he's never going to work in the private sector. Interesting thing to say - he's not so political, and was never before so programmatic. Apparently it's the temp-contract, high-pressure ugly bullshit of it that motivates his boycott more than the world-historical extractive-destructive part. There is a strong argument against such chastity, but I didn't press it too strongly, this time.


**********************************************************


Why in general is it harder to do things well than badly?

h0: Because we define 'well' and 'badly' by how hard things are. Or, time-intensive at least. (Boo! Constructivist cop-out boo!)
h1: Because good things tend to occupy lower-entropy states and, by the second law, so require more Work to create and maintain.
h2: Because we are tuned for satisficing ('good enough'), not optimisation ('good as possible'). On an evolutionary scale, maybe, we didn't have time to optimise anything, so hasty mediocrity is our default state.
h3: Or another evolutionary argument: maybe prehistoric 'savannah' tasks admit of fewer grades of quality than those stipulated by audiophiles or wine buffs today, so, again, our quality organs are underdeveloped. (The impala was either dead enough or not dead enough?)
h4: Brute probability. Maybe the number of quality states is just much smaller than the number of bad states of things. If you've spent much time among C20th modernism, this will seem plausible to you. 

*************************************************


What does 'radical' mean? Half the people I read use it as a synonym of 'good'; others use it for 'exaggerated', 'unjustified'. OK: what do the claims that "Insanity is a sane response to an insane world"* and "the end goal of feminist revolution must be... not just the elimination of male privilege but of the sex distinction itself" and "There is no such thing as society" have in common?

'Radical' is from the Latin radix: root, basis, origin. If you'll allow etymology - really only the study of dead social contexts - to guide us a moment, then a radical's just someone who goes to the root of the problem (or thinks they do). In this sense all philosophers are radicals, even if they end up defending common sense from more interesting theories, as they very often do. (A somewhat dishonest way of putting this might be 'fundamentalist'.)

That radix stuff's neat, but it doesn't cover common usage, which is something like "strongly and actively opposed to a current norm." After reading a bunch in the fields that motivate radicalism most (economics** and sociology) I'm dubious that we have the capacities it would take to be radical properly, to find the real root of our problems (rather than spotting macro patterns that are forever hard to particularise right - which we do seem to manage). But attempts are sometimes salutary. Anyway neither kind of radicalism is going away: the sweeping reductions of the radix kind let us feel wise for little intellectual investment, an irresistible proposition to cognitive bottlenecks like us; and the social cachet gained via screaming opposition's unlikely to dry up even after everything we consider oppression is gone.


* I wanted to attribute this to hippy wildcard RD Laing, but there's no actual instance in his writing.

** Actually economics goes both ways: the facts of our world economy - vast intractable poverty, exponential inequality growth, and the worsening of the situation since 2009 - should radicalise you, but study of the history of large-scale economic interventions should then give you pause.


*********************************************** 


I was much vexed as a child by the thought that, since we are biochemical, thought must use up energy. Did this mean stupid people needed less food? Was meditation an ingenious latent response to a high famine risk? Was maths an underused dieting technique? Was I eating enough?

My infant reading of the biochem mind misconstrued a lot, but it's still a cool research programme indeed. Much later I learned that about a fifth of resting energy use is down to our 12W brain. Two large distinctions here, though: there's a difference between being used by the brain and being used in mental activity. ("The brain continuously slurps up huge amounts of energy for an organ of its size, regardless of whether we are tackling integral calculus or clicking through the week's top 10 LOLcats. Although firing neurons summon extra blood, oxygen and glucose, any local increases in energy consumption are tiny compared with the brain's gluttonous baseline intake.") And food in the gut is not like voltage in the circuit; it's more like coal in the boiler - a requirement for sustaining an equilibrium rather than a linear boost. The research is young and inconsistent, so no-one sensible is putting a kcal number on an hour of maths.
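
The baseline intake is easy to put a rough number on, though. A back-of-envelope sketch using the 12W figure above (the wattage and the joules-per-kcal conversion are the only inputs; the answer is only as good as they are):

```python
# Back-of-envelope: what does a ~12 W brain cost per day in food terms?
BRAIN_WATTS = 12      # continuous power draw quoted above, joules per second
J_PER_KCAL = 4184     # joules in one food calorie (kcal)

joules_per_day = BRAIN_WATTS * 24 * 60 * 60   # ~1.04 MJ per day
kcal_per_day = joules_per_day / J_PER_KCAL
print(round(kcal_per_day))   # ≈ 248 kcal/day - about one bagel, maths or no maths
```

So the whole organ runs on roughly a bagel a day, which is why the marginal cost of the calculus barely registers against it.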

The caloric answer isn't so satisfying, anyway. What about going via information theory, with the brain as processor? The merest, indivisible unit of information is of course the bit - one 'yes' or 'no'. What's the energy requirement on a bit-flip? How much less efficient than this is the brain? That approach turns out to be a bit of a dead-end too, since people have turned away from the idea of there even being a fundamental cost of information processing. (The fundamental cost supposedly comes when info is erased, not necessarily at manipulation.)
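
For what the erasure view is worth, the bound is concrete: Landauer's limit puts the minimum cost of erasing one bit at k_B·T·ln 2. A rough sketch comparing it against the brain wattage above (body temperature and the 12W figure are assumptions for illustration):

```python
import math

# Landauer's bound: erasing one bit costs at least k_B * T * ln 2 joules.
K_B = 1.380649e-23   # Boltzmann constant, J/K
T = 310.0            # rough body temperature, kelvin
bit_cost = K_B * T * math.log(2)   # joules per bit erased, ~3e-21 J

BRAIN_WATTS = 12     # the figure quoted above
erasures_per_sec = BRAIN_WATTS / bit_cost
print(f"{bit_cost:.1e} J per erased bit; "
      f"~{erasures_per_sec:.0e} erasures/s possible at the bound")
```

The gap of many orders of magnitude between that ceiling and any plausible estimate of actual neural information processing is the usual way of saying the brain is nowhere near the thermodynamic limit.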


**************************************************


Free jazz, abstract expressionism, anti-comedy: these things only convey their power if the audience knows what is not being done. Odd idea: enjoyment that needs knowledge, that is doomed to niche acceptance by that alone.


**************************************************


The sad bit of the internet is the gap between what might have happened when most of the world got more or less unlimited free information - a grand flowering of culture, intellect, solidarity - and what did happen: the flowering of the cruelty of porn, vast disinformation, new kinds of mob, new ease with which to become a famous pariah (15 minutes of shame). We learned that the problem is not information, it's us.


***************************************************************


Consider collective identity a neurotoxin we take as medicine against a worse pathogen, institutionalised bigotry.


****************************************** 

 

Occasions on which science has ruined my pet philosophies:


  • Empathising-Systemising theory, vs my metaethics. In metaethics I incline to sentimentalism, which holds that our moral behaviour is much better explained by our emotional dispositions than by the information we have, or the rational processes we use on it. (Its prediction: more empathy, more morals.) However, I also endorse a kind of consequentialism, which stipulates: if more consequentialist, more (ultimately) moral. Simon Baron-Cohen's contested work on autism-spectrum people suggests that this conjunction of views is dogged by tension, since consequentialism is found more in people with low / nonstandard emotional attachments to others. That is:
    1.  Moral behaviour is a function of empathy.
    2.  The morality of behaviour is constituted by consequences.
    3.  However, there's some evidence for a trade-off between one's capacity for empathy and capacity for systematic thought.
    4.  And systematisers are much more likely to be consistent consequentialists.
    5.  So (1) is less true to the extent that (2) is true, and vice versa. Give up one.
    Is there any hope? Well, a number of psychologists are dead set against Baron-Cohen on all points, so maybe it'll fall down. But even if, as seems likely, he is made to drop the naff sex essentialism, the meddling trade-off could easily remain for all the sexes. And quite apart from tanking sentimentalist utilitarianism, this leads eventually to the idea of multiple brain systems reflecting multiple metaethics in each person: balls.



  • The adrenal nature of memory, vs resisting adolescent melodrama. It's a bit of a trek from the data to the philosophy here, but bear with me. All sorts of people share a tendency to seek out extreme experiences and melodramatic relationships. One thing underlying this behaviour is what I call the epiphany theory of identity: the idea that you are shaped and defined by a small number of really big life events - which also happen to be the ones you remember clearly. I hate it: it justifies all kinds of tasteless, myopic behaviour ("rockism"). But it actually has some direct biological backing. Argument:
    1.  We are constituted in large part by our memories.
    2.  Adrenaline is produced in response to extreme experiences (bad and good).
    3.  Adrenaline is involved in the strongest form of long-term potentiation; it lends itself to vivid and lasting memories.
    4.  So extreme experiences will have disproportionate weight in recall. (2&3)
    5.  So extreme experiences will disproportionately constitute the self. (1&4)
    C.  Therefore epiphany theory might well describe humans' actual experience of themselves (the degree being a matter for experiment).


  • Special relativity, vs temporal presentism. I once argued for the deeply flawed metaphysical position Presentism, for a laugh. You have to give up a great deal to make it work, but things get serious when you realise it makes you challenge (the most sensible reading of) bloody Relativity:
    1. The [Lorentzian interpretation of the] Theory of Special Relativity is true.
    2. The 'present time' relative to an event x on the worldline of an object O is the sum of all events that share a plane of simultaneity with x in O’s frame of reference. 
    3. There is at least one event E that both exists at the present time in my frame of reference, & is on the worldline of an object in motion relative to me.
    4. Therefore, there is at least one event E such that the present time relative to E is not the same present time relative to me [by 1&2&3] 
    5. Presentism is true iff there is a unique present time. 
    6. Therefore presentism is false [from 4 & 5]
    There are four simple ways out of this, each of approximately equal scientific horror: deny premise 1, interpreting fundamental physics to suit your prejudices; deny premise 2 and try and force an old Newtonian absolutism to fit modern physics; deny premise 3, implying I don't even know, weird solipsism; or deny premise 5, emptying presentism of its central tenet. To dig my essay out, I made the classic dishonest metaphysician's gambit, venturing that the unfalsifiability of my pet theory was a virtue ("SR is not strictly a theory of time at all, being instead a theory of the measurement of relations between physical events. Concededly, rendering light as absolute precludes the observation of absolute presentness, but, read as a scientific theory, SR does not strictly bear on whether the phenomenon ‘absolute simultaneity’ exists. If there were only the relation of simultaneity and if we are within that system, then there is small hope of divining its fundamental nature empirically. The massive intuitive cost of this route is in detaching metaphysics from regulation by science."). But there's no question this was pyrrhic and ugly.
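
For what it's worth, the frame-dependence doing the damage in premises 2-4 is a single line of Lorentz algebra (the standard textbook sketch, not anything specific to the essay quoted above):

```latex
% Take two events A, B simultaneous in frame S (t_A = t_B) but spatially
% separated (x_A \neq x_B). In a frame S' moving at speed v relative to S:
t' = \gamma\!\left(t - \frac{v x}{c^{2}}\right),
\qquad \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}},
\quad\text{so}\quad
t'_A - t'_B = \gamma\,\frac{v\,(x_B - x_A)}{c^{2}} \neq 0.
% A plane of simultaneity in S is not one in S':
% there is no frame-independent 'present', which is what premise 4 records.
```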

******************************************


What are we to call selfish bravery? 'Recklessness'? 'Avarice'? 'Entrepreneurship'?


*********************************************


Christian salvation involves the deformation of the self. (Slow clap: well done you.) But it deforms for simpler reasons than Nietzsche or Feuerbach's old grumpy individualism: far before one undertakes to flay oneself and reform, the basic shape of Heaven does for us. The Church has a standing retcon to explain just how nasty scheming bored creatures like us ever could get in: they distinguish the corrupt body from the real and shiny soul. But this clumsy medieval patch fails to understand just how much of us is bound up with our bodies and what happens to them. So, the eternal 'survival' offered has to be an attenuated kind. We'd stink up the place if we entered heaven without having changed considerably. Or, I would.


'Spring' (c) Andrew Wyeth (1978)



