"All right action flows from the breath"
- Hajakujo


Wednesday, November 26, 2008

Yes, yes, very nice...braaaains



Ok, so it's an interview for the insufferably smug, self-satisfied Edge club, but it's very nice all the same.

[ALVA NOË:] "The central thing that I think about is our nature, our human-animal nature, our being in this world. What is a person? What is a human being? What is consciousness? There is a tremendous amount of enthusiasm at the moment about these questions.

They are usually framed as questions about the brain, about how the brain makes consciousness happen, how the brain constitutes who we are, what we are, what we want—our behavior. The thing I find so striking is that, at the present time, we actually can't give any satisfactory explanations about the nature of human experience in terms of the functioning of the brain.

What explains this is really quite simple. You are not your brain. You have a brain, yes. But you are a living being that is connected to an environment; you are embodied, and dynamically interacting with the world. We can't explain consciousness in terms of the brain alone because consciousness doesn't happen in the brain alone."

Thought experiment inspired by this article - what can we see occurring in the brain/cognition when the body/action is the main focus, but that focus is turned inward, on performing some task not involving environmental interaction? My example is a kata, where the entire action is conceptually, not interactively, based, and all thought is turned to the action. The person is then highly active but not interactive, and cognitively engaged but focused, not distracted. Kind of like Flow.


Wednesday, September 24, 2008

4 months

Tomorrow marks the four-month anniversary of the last blog post I made, and so I've decided to break my silence this once, just to prevent that momentous date from actually occurring. Very quickly after writing this, I will go back to the grind of trying to write a thesis in the face of massive and insolent procrastination and excessive alternative commitments of an unnecessary nature.

In other news the poll that I've had up for all that time, measuring the comprehensibility of my pieces, has returned a massive 10 results! Unless that was me, answering from multiple separate machines over the last 4 months and not remembering, this represents a huge jump in readership figures. Since there is nothing to read, I'll assume they've all gone away again.

Whaa!?
Does anybody ever know what I'm talking about?
    Fiendishly over the top: 50% (5 votes)
    I can't read, I just like the pictures: 50% (5 votes)
    Perfectly pitched: 0% (0 votes)
    Actually, a bit simple: 0% (0 votes)
Total votes: 10

The results clearly show that nobody knows what I'm talking about, unless they can infer a lot from the mostly randomly chosen pictures. On second thought, all those who chose the 2nd option probably represent the increase in subscribers, and thus don't increase my 'readership' at all.

Sunday, May 25, 2008

A Little Something for the Weekend

"Enjoying all the loneliness at home"
Is it a question of being alone?
Employing silence to fill up a hole
the loud sound of thoughts all unknown
and buzzing memories to be owned.

The mind is I and the mind it is whole
or so it seems, so I feel I know,
but does cyclical confirmation hold?
Who holds me to be me, myself, my own?
Only one, the one that is all I hold.

Thursday, May 15, 2008

Dream-beings


Did I dream this belief,
or did I believe this dream?
- Peter Gabriel

"In the further adventures of Achilles, he again crosses paths with the Tortoise just as he is pondering certain imponderables thrown up by an unfortunate accident involving excessive physical activity and mild concussion.

Achilles: Tortoise! I'll have a word with you, since you're going nowhere fast.

Tortoise: By all means. Nothing would give me more pleasure!
[at this point, authorial licence gives way to reality...]

Achilles: I have been thinking about when I went unconscious. When I was dreaming, do you think that was some function of the brain?

Tortoise: I do.

Achilles: What do you think is the evolutionary purpose of it?

Tortoise: My 'official' reaction would be to say that dreaming is probably an accidental by-product of analytical thinking

Achilles: hmmm, not very convincing

Tortoise: I'll elucidate then. Let us consider that we have this function to think analytically and not just instinctively like most other animals

Achilles: So animals don't dream?

Tortoise: Clearly they do! But their dream is a shadow of their particular form of thought process, as ours is a shadow of how we think. I wouldn't like to begin to address what I have not experienced, when explaining what I have experienced is such a challenge.
So we were considering the analytical thought process, a function that doesn't go away when we sleep. Working solely with memory data, as opposed to sense data, the analytical thought process is on shaky ground. It confabulates fantastical visions for the 'experiential self' by performing its normal function, which is to recognise, assign nominal or symbolic value and classify. That is, it reads memory, performs some processing (which was designed for waking operation), and writes back to memory.
That is my 'official' line.

I would stand by that (because it's not really very deep at all, so it's reasonably safe).

However, if I was going out on a limb, I might say that there is more scope for interaction with unusual sensory perception when asleep simply because our analytical minds are shut down.
I might go so far as to say that this extra-ordinary interaction causally precedes our evolution - i.e. we dream because the dream state is there to be had, in the same way that we see because sight is possible.
Now, I wouldn't defend that view since, by all standards of reason, a defence is impossible!

Achilles: What do you think of aboriginal concepts of Dream time?

Tortoise: I know of it, it's a mythology built around the common idea of the mythic oneness. I wouldn't gainsay it - we know time to be entirely subjective, so I have no problem with Dream time as a metaphor for reality, outside of a single frame of reference.

Achilles: Is it possible, do you think, that the evolutionary consciousness rises out of dream time, and is used to pick and choose its subjective linear experience

Tortoise: Evolutionary consciousness? Explain.

Achilles: Well, consciousness as an evolving construct - rising out of the primordial soup as it were.

Tortoise: Consciousness associated with what? A single human? The collective unconscious? Gaia?

Achilles: A single human - but at different levels it drops further into a larger "dreaming"
I just feel that what I experienced was something akin to a dream time experience, and that it was my evolutionary consciousness, that values my linear experiences, that pulled me out and kept me alive...although the human experience also helped me, as I was woken up by another human being.

Tortoise: To my mind, it's a double question really
a) does consciousness come from before, and last after, the human body?
b) is there really a single great consciousness so that a single personality is just an illusion, one that we can 'awake' from (whether that be in dreams or in death)?

At first reaction, I honestly can't answer either question. These are truly imponderables.

Achilles: Well, if b) is true, is it possible that the dream isn't a shadow of the waking self, as you seemed to be suggesting?
Rather, that the dream is a fundamental state of being that we can all share and partake of, and where we all become incredibly similar...but in which the identity becomes increasingly malleable.

Tortoise: My own experience of dreams leads me to believe that most dream experiences are akin to normal conscious experience, and all the stuff about identity being malleable results from the faculty of imagination - modelling!
BUT...I cannot deny there is possibly something more happening.

Achilles: Surely though, what we bring back from dreams is influenced by our own waking self

Tortoise: I think I see what you mean - that we reinterpret the dream experience on waking.

Achilles: yeah

Tortoise: We 'remember' the dream, based on a model that is built by our conscious mind?

Achilles: yup

Tortoise: Well, I agree that may affect remembrance...but it doesn't actually imply anything different is happening when we dream.
I am open to the idea that there is something extra happening, but I can't really approach that idea of a greater consciousness...because the more I think about it, the more clear it is that any rational answer will only approximate a model of a level of consciousness which I cannot understand or even discuss rationally!
For instance, imagine a Gaia consciousness that persists based on a substrate of electrical energy within the world [or solar system or galaxy, why not?]. We have a concept there, but what more can we say about it? the why, how, what...none of our experience can begin to help us form answers.

Achilles: I think our analytical brain does influence the dream a lot, even during sleep...yet what was strange for me was how little influence it had during unconsciousness, and then its sudden attempt to reassert itself. It felt like the evolutionary process speeded up in a second, pulling random impulses into a fully formed human.

Tortoise: Ok, so I have a concept of some rather unapproachable level of consciousness, and you have your unconsciousness/dream concept, which brings us back to your recent personal experience...and I feel I can say this much:

Consider the analytical, self-oriented left-brain part as 'You' for a moment. 'You've' basically got a will to live, or a will to die - stay in the human body, or exit it. You can think of dying as the end of things, or as reunion with the One, enlightenment. Yet if there is this greater consciousness, then it doesn't matter what you think, because you can't approximate the truth with thought...and you can't escape it.
If it's true, then when you die you join it.
If not, when you die you rot.
Seems like I just made a good rational argument for not worrying about spirituality too much!

Achilles: I think it's the "you" part that I am having difficulty coming to terms with!

Tortoise: Is there anything more that we can say regarding the transcendence of dreams? What is the 'I' if all we need to do to wash it away, is to fall asleep? Are we remembering a true state of experience when we sleep or when we wake?
A good question, Achilles, and I thank you!"

Wednesday, May 07, 2008

Noise-beings



What I have been talking about in the Doubt-beings and Love-beings posts is probably best thought of as a metaphysics of cognition, a subject which needs a metaphysical treatment only because of our relative ignorance about how we produce thought. However, the field is advancing all the time, and one of the most productive areas is the investigation of noise in neuronal processing, or of what looks like noise because we don't quite understand its role yet.

Did a quick google on noise in neuronal processing, and discovered enough interesting studies to last a long time. [Possibly a career's worth. We'll see!]. See these [1] [2] [3] [4] [5] [6] [7]
Still, for now, a quick recap and see how this bears on my previous two posts.

In the late '80s Roger Penrose, in his book The Emperor's New Mind, attacked the stance of strong AI (which claims that consciousness is algorithmic and so can be executed on a UTM), saying that cognition was essentially a non-algorithmic process characterised by [what I'll call, for want of better phrasing] Gödelian relationships. He also made the bold claim that perhaps the resolution of quantum linear superpositions occurs as neurons act, therefore making thought a practically non-deterministic process. This view was downplayed by the mainstream, as it was thought that neuronal activity happens at too large a scale to be affected by quantum phenomena. The claim has been neither verified nor disproven, but there is some evidence to suggest that quantum phenomena do play a role for the average neuron.
The implications, were this hypothesis to be proven true, are staggering in scope - and they really do bear heavily on any metaphysical look at consciousness. We'll come back to this toward the end, after a few more research perspectives.

One view of the brain is as a rather rattletrap contraption, riddled with signal-delivery noise and therefore stacked with signal-processing redundancy.
There's a nice article here about how noise is inherent in wetware but is compensated for. Essentially, from a reductionist perspective the picture is that neurons are signal carriers that can lose signal strength, distort the encoding or drop it entirely. Redundancy of processing in brain areas effectively combats noise - neuron groups and signal trains are used, as in this example:
"When we hear a sound, hair-like structures on neurons in our ears wiggle. Their wiggling creates a pattern of voltage spikes, which the neuron then passes on to 10 to 30 other neurons. All of those neurons then carry the same signal toward the brain, where they can be compared. Each neuron degrades the signal in a uniquely random way, and by averaging all of their signals together, the brain can cancel out some of the noise."

This perspective comes from looking at the action of single neurons, and then extrapolating that behaviour up to the next level at which hard science is possible - single event response mapping. Using functional Magnetic Resonance Imaging (fMRI), they can examine the brain as it acts (almost in real time, now). But the entire picture is far too big, busy and chaotic to treat scientifically, so they have to prime the brain to respond to a single event, like a sound, and map the response in the fMRI results.
The problem with this, valuable work though it is, is that it gives a picture of the brain that is too functionally modular. The brain is modular, sure, but it is also interconnected. The whole brain is switched on (though perhaps not acting) at the same time, and thinking in terms of one part at a time obscures that. In complex emergent systems, it is the high-level behaviour that embodies the most powerful and beautiful results. Ant algorithms are very simple at the individual-ant scale, but the entire culture of an ant colony is a staggering construct for such tiny creatures.

So another perspective on noise in the brain keeps the higher order in mind - that noise encodes decision possibilities until resolution and so the brain represents information probabilistically, by coding and computing with probability density functions or approximations to probability density functions. This implies that the brain is actually a Bayesian probability calculator.
On the face of it, that's not so different to the ear example above - lots of signals are sent, the averaged sum of probabilities gives an approximately correct answer. The differences in the technical details may be larger, but one non-technical difference in particular strikes me - in this latter view, the noise is not really noise at all. It is a precursor to the system - in a way, it is the principle around which the system is built. In other words, our brains evolved the way they did because the biological substrate they evolved within has to have noise. If a noise-free system were possible, we wouldn't think the way we do at all (well, we as us wouldn't exist, but for the sake of argument...).
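
To make the 'Bayesian probability calculator' idea a little more tangible, here is a minimal sketch (my own illustration, not a model from the cited work): two noisy Gaussian estimates of the same quantity, say a visual and an auditory estimate of where a sound came from, are fused by weighting each with its precision, which is the Bayes-optimal answer under Gaussian noise. The fused estimate lands closer to the more reliable cue and is more certain than either alone.

    def fuse_gaussians(mu1, var1, mu2, var2):
        # precision-weighted fusion: the posterior mean/variance when two independent
        # noisy measurements of the same quantity both carry Gaussian noise
        w1, w2 = 1.0 / var1, 1.0 / var2
        var_post = 1.0 / (w1 + w2)
        mu_post = var_post * (w1 * mu1 + w2 * mu2)
        return mu_post, var_post

    # hypothetical numbers: vision says 10 degrees (fairly precise), hearing says 20 (noisier)
    print(fuse_gaussians(10.0, 4.0, 20.0, 16.0))   # -> (12.0, 3.2)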

So there we have a few different fresh perspectives on how the brain is processing and decision-making. Can we relate this in some way to the level of discussion of the earlier posts, Doubt and Love?
I'm loath to start drawing definite inferences, since I'm working on the basis of intuition myself. Yet I think that with the most open of minds, we could imagine a brain that operates from the quantum level toward resolution of probabilistic predictor functions. This type of brain could operate as we know it does, and yet also operate within the undifferentiated, relative and probabilistic reality that we suspect* exists independent of our conscious experience of it. In other words, we exist in touch with the beautiful everythingness of reality, and yet filter it down to a point of focus that permits self-aware pro-active consciousness. I may be jumping crazily about waving my hands in the air, but I believe that I have just summarised my Doubt and Love posts with reference to hypothetical operative descriptions of the brain.

Furthermore, all this to my mind, presents a picture not so much of dichotomy but of layering of relational activity. There is not just one view of the world or the other (as I have presented in the previous two posts referenced above), but a system that requires both views to exist simultaneously and harmoniously, and therefore produce conscious thought.

Note: In all this I am kind of taking the stance that conscious thought is somehow a desirable end product of the setup of our brains - a final** and valuable capstone of the system. Another stance might claim that consciousness is just an accidental by-product, that the 'zombie in the brain'*** is what's really in control, and the whole apparatus only operates in order to enable the 'selfish gene'. I just find the latter view a mite shortsighted and pessimistic, though I don't claim I know better.

* I am using an uncertain form in order to admit the solipsist outlook.
** Or possibly not final! But that's another day's discussion.
*** Look up V.S. Ramachandran 'Phantoms in the Brain'

Sunday, April 27, 2008

Phantoms in the Brain: skim-over


In thinking about an upcoming post, I came back to a really eye-opening book I read many years ago called Phantoms in the Brain: Probing the Mysteries of the Human Mind, by V.S. Ramachandran and Sandra Blakeslee (I also see that it is the book's tenth anniversary, so this is a well-timed remembering). I thoroughly enjoyed the book for its wit as well as wisdom (well, it is a tour of some research findings, so perhaps wisdom is the wrong word). I found a list of quotes from the book (I particularly like the last one), so without the time to write a nice synopsis or review, I'll just bang them straight out with a recommendation to read the whole thing.

p. xi:

"In any field, find the strangest thing and then explore it."
- John Archibald Wheeler.

p. xv:

I'd also like to say a word about speculation, a term that has acquired a pejorative connotation among some scientists. Describing someone's idea as "mere speculation" is often considered insulting. This is unfortunate. As the English biologist Peter Medawar has noted, "An imaginative conception of what might be true is the starting point of all great discoveries in science." Ironically, this is sometimes true even when the speculation turns out to be wrong. Listen to Charles Darwin: "False facts are highly injurious to the progress of science for they often endure long; but false hypotheses do little harm, as everyone takes a salutary pleasure in proving their falseness; and when this is done, one path toward error is closed and the road to truth is often at the same time opened." Every scientist knows that the best research emerges from a dialectic between speculation and healthy skepticism.

p. 1:

For in and out, above, about, below,
'Tis nothing but a Magic Shadow-show
Play'd in a Box whose Candle is the Sun
Round which we Phantom Figures come and go
- The Rubáiyát of Omar Khayyám

p. 35:

The completely static picture of [cortical maps] that you get from looking at textbook diagrams is highly misleading and we need to rethink the meaning of brain maps completely.

p. 35:

You never identify yourself with the shadows cast by your body, or with its reflection, or with the body you see in a dream or in your imagination. Therefore you should not identify yourself with this living body either.
- Shankara (A.D. 788-820) Viveka Chudamani (Vedic scriptures)

p. 61:

For your entire life you've been walking around assuming that your "self" is anchored to a single body that remains stable and permanent at least until death. Indeed, the "loyalty" of your self to your own body is so axiomatic that you never even pause to think about it, let alone to question it. Yet these experiments suggest the exact opposite - that your body image, despite all its appearance of durability, is an entirely transitory internal construct that can be profoundly modified with just a few simple tricks.

p. 81:

in science one is often forced to choose between providing precise answers to piffling questions (how many cones are there in a human eye) or vague answers to big questions (what is the self), but every now and then you come up with a precise answer to a big question (such as the link between deoxyribonucleic acid [DNA] and heredity) and you hit the jackpot. It appears that vision is one of the areas in neuroscience where sooner or later we will have precise answers to big questions.

p. 93:

People often assume that science is serious business, that it is always "theory driven", that you generate lofty conjectures based on what you already know and then proceed to design experiments specifically to test these conjectures. Actually real science is more like a fishing expedition than most of my colleagues would care to admit. (Of course I would never say this in a National Institutes of Health [NIH] grant proposal, for most funding agencies still cling to the naive belief that science is all about hypothesis testing and then carefully dotting the "i's" and crossing the "t's". God forbid that you should just try to do something entirely new that's just based on a hunch!)

p. 110:

the primary visual cortex, far from being a mere sorting office for information coming in from the retina, is more like a war room where information is constantly being sent back from scouts, enacting all sorts of scenarios, and then information is sent back up again to those same higher areas where the scouts are working. There's a dynamic interplay between the brain's so-called early visual areas and the higher visual centers, culminating in a sort of virtual reality simulation.

p. 152:

"What we call rational grounds for our beliefs are often extremely irrational attempts to justify our instincts."
- Thomas Henry Huxley

p. 156:

[Sigmund Freud] had discerned the single common denominator of all great scientific revolutions: Rather surprisingly, all of them humiliate or dethrone "man" as the central figure in the cosmos.

p. 157:

If you think you're something special in this world, engaging in lofty inspection of the cosmos from a unique vantage point, your annihilation becomes unacceptable. But if you're really part of the great cosmic dance of Shiva, rather than a mere spectator, then your inevitable death should be seen as a joyous reunion with nature rather than as a tragedy.

p. 180:

Some of these [temporal lobe personality] patients are sticky in conversation, argumentative, pedantic and egocentric (although less so than many of my scientific colleagues)...

p. 183:

Higamous hogamous
Women are monogamous
Hogamous higamous
Men are polygamous

p. 185:

Just because religiosity has a neurological basis, does not in itself deny the existence of God, just as the neurophysiological basis of color vision does not deny the existence of color.

p. 204:

jokes have much in common with scientific creativity, with what Thomas Kuhn calls a "paradigm shift" in response to a single "anomaly" ... the joke is "funny" only if the listener gets the punch line by seeing in a flash of insight how a completely new interpretation of the same set of facts can incorporate the anomalous ending.

p. 206:

Freud's explanation [of humor as the relief of tension] belongs to a class of explanations that Peter Medawar has called "analgesics" that "dull the ache of incomprehension without removing the cause"

p. 222:

There's much truth to Sir Arthur Eddington's famously paradoxical remark "Don't believe the result of experiments until they're confirmed by theory."

p. 227:

[According to Hindu tradition] the self - the "I" within me that is aloof from the universe and engages in a lofty inspection of the world around me - is an illusion, a veil called maya

p. 227:

Everything I have learned [from neurology] points to an unsettling notion: that you create your own "reality" from mere fragments of information, that what you "see" is a reliable - but not always accurate - representation of what exists out in the world, that you are completely unaware of the vast majority of events going on in your brain. Indeed, most of your actions are carried out by a host of unconscious zombies who exist in peaceful harmony along with you (the "person") inside your body!

p. 228:

"Consciousness is a fascinating but elusive phenomenon: it is impossible to specify what it is, what it does, or why it evolved. Nothing worth reading has been written on it."
- Stuart Sutherland

p. 229:

[The] need to reconcile the first-person and third-person accounts of the universe ... is the single most important unsolved problem in science. Dissolve this barrier, say the Indian mystics and sages, and you will see that the separation between self and nonself is an illusion - that you are really One with the cosmos.

p. 235:

[The zombie argument] is based on the fallacy that because you can imagine something to be logically possible, therefore it is actually possible. ... even though you can imagine an unconscious zombie doing everything you can do, there may be some deep natural cause that prevents the existence of such a being!

p. 256:

It seems somehow disconcerting to be told that your life, all your hopes, triumphs, and aspirations simply arise from the activity of neurons in your brain. But far from being humiliating, this idea is ennobling, I think. ... Once you realize that far from being a spectator, you are in fact part of the eternal ebb and flow of events in the cosmos, this realization is very liberating.

Wednesday, April 09, 2008

Love-beings


This is a companion piece, so best to read the post Doubt-beings beforehand!

I've said I'd use this term love to describe what I'll talk about below, so even though the doubt piece proved that adapting language to new uses can be counter-productive, I'm pressing on. Bear with me!

There is the sense of the word love that follows everyday use, and this is emotional and interpersonal. People mostly love other people - I'll come back to why I think this happens, toward the end. One can talk about loving things, concepts and so forth, but most people recognise that such emotion is inherently different to what people feel about other people.

Although perhaps in one sense, it is not so different. Love is a term that is used in other contexts - I'm thinking of teachings on enlightenment. Love is the Buddha...all is One, and One is love. At least, that's what they tell me. Now this is not emotion, because emotion is a product of the self, it's constrained and relational. And as mentioned, here love is ALL.

Yet it relates to the humble emotion we call love in the day to day. Perhaps what we call love is a snapshot of this great constant Oneness, or better a flash of light through an iris opened by the reduction of the obsession with reflection on the self. True love feels like a very selfless thing. Is it just a play on words to suggest that this is because true love involves the exact same opening of being outward beyond the self, as enlightenment does?

I can't say that without addressing why everyone is not enlightened - most of my readers will know more than I, but anyway...it takes awareness to be enlightened, one must be aware that the self is not real, is an illusion that needs to dissipate - and that awareness is very hard to hold as well, because it's a scary thing at first. (This is what I've read, at least. No Buddha am I!)

With regard to the snapshot idea, I am playing with this concept in respect of my own life, trying to see how love can occasionally and spontaneously explode for people and things I have no relationship with, for variable lengths of time and no apparent reason. Or how it can last long past the end of a relationship, although that relationship may have ended acrimoniously. Or how it arises for nothing, just because my state of mind relaxes, my concerns drop away for a moment, and the world around me looks very beautiful. This often happens when travelling - walking or on trains mainly. And a curious thing accompanies it - very often I will start to notice a lot more detail about the world, like how the trees beside the path on the way in from the train station to my office are all curved the same way at the base, suggesting when they were saplings the prevailing wind was nor'westerly. Of course, such musings invariably become recursive, and I start to think about my own thought process, and the spell is broken. The self is back. Is this a familiar experience?

This kind of makes me think of what else may be coming through with the emotion of love (as I am describing it. Keep in mind as you read that your own experience - and thus idea - of love is going to be different, so it's just a word). If the emotion itself is a window on a constant, does this suggest that the relationship between the self and the emotion is like the opening of a valve? OK, and what does the constant represent? I'm thinking of it like a pure recognition by the right brain of the sublime quality of reality. The whole thing is pretty amazingly put together, I think our best science supports that indubitably (see what I did there?), and art has known it forever. If we consciously get a glimpse of that, it's a sublime feeling. Could it be that there is a substantive recognition of intrinsic quality in the Oneness of reality that leaks through as we experience the world without the processing mechanisms of our left-brain filtering system? This idea is not that far from Kant's ideas on aesthetics, as I understand them. It is not a world away from Robert Pirsig's Chautauqua on Quality.

Maybe it gives us another channel from the narrow self-oriented conscious left-brain to the wide-open undifferentiated unconscious right-brain...love as the non-filter, a time-division multiplexed interface with the beauty of reality where doubt (or discrimination, or whatever you want to call it) is frequency-division multiplexed.

...

Coming back around again to the common idea of interpersonal love. A lot is at stake here, as we hardly want to relegate this important facet of our lives to a mere mechanistic working of cognitive functions. So all I'll say is that if the emotion of love is a reflection of this Oneness constant, then falling in love, or loving your family, could be (in operation) a lowering of defenses and a reduction of concerns about the self. You enter willingly into a vulnerable place because you trust the other person you love, and that starts the process of stripping away the illusory trappings of the self and opens you to feeling the reflection of Oneness. You're not becoming enlightened (probably the opposite >:D ), but you actively feel bliss.

But the awareness that this process is beginning isn't present as it is when enlightenment is being sought, and the relationship with the other comes with its own cares, and so the self quickly reasserts itself. And so we get this flash of pure bliss, which is over quickly but cascades into associated positive emotions, and the whole thing is labelled by our categorising left-brain as 'falling in love'. And it's great! But it has less to do with this other person, and more to do with our own state, than we think.

...

Final thought - can the self-oriented, analytical left-brain experience be trained toward the state of mind I have been talking about here? Could Flow be a left-brain version of this sort of openness of being? If we had a rigorous knowledge of either one type of experience or the other, we might have a better idea, but as I say (too much) in my work - that is an issue for future work*!

*the academic equivalent phrase to 'that would be an ecumenical matter' :D

Disclaimer: There is NOTHING about thinking this through that bears on the actual experiences involved - no clarity is gained with loved ones, no steps toward enlightenment achieved, no knowing what the next moment of a new relationship might bring. It's only words - but I enjoyed setting them down :)

Sunday, April 06, 2008

Poll Results

Post-oil warring into a new stone age: 30.8% (4 votes)
Historical precedent says everything will stay much the same, only more so: 30.8% (4 votes)
Super-intelligent borganisms: 15.4% (2 votes)
Whaa!?: 15.4% (2 votes)
An unforeseen utopia of free energy and human kindness: 7.7% (1 vote)

Wow, 13 whole votes!
Looks like the trend of belief is either down or level, which, given that nothing ever really stays the same, looks like a majority vote for pessimism. And only one true optimist! And I can't even remember if it was me!

Dr. Seuss Movie Adaptations


"On the fourteenth of March, in towns nationwide,
In every cinema, multiplex, on every barnside,
Gleamed another adapting of one of my books,
CGI-ed and digitized by another sly crook.

Horton, my favorite—look how he's been treated!
Stuffed with tinsels and tassels and promptly excreted!
The puns! And the filler! The script fees you must save!
While I tumble and grum-humble around in my grave.

Did you learn all but squat from The Cat In The Hat?
Please tell me you fired the prick who made that.
I would have stopped writing, maybe sold Goodyear tires.
If I knew one dark day I'd costar with Mike Myers.

And Oh!
Oh, dear! Oh!
My poor Grinch, what they've done!
They crammed in live-action and snuffed out all the fun!

It's icky, it's tacky, it's awkward, it's wrong.
The Whos look like ferrets, it's an hour too long.
What a rotten idea to spend millions destroying
This masterful tale kids spent decades enjoying!
But still you keep making them!
Just how do you dare?
Sell my life's work off piecemeal
To every Tom, Dick, and Har'.

Why it's simply an outrage—a crime, you must judge!—
To crap on my books with this big-budget sludge.
My books are for children to learn ones and twos in,
Not commercialous slop for Jim Carrey to ruin.

Have you no respect for the gems of your youth?
To pervert them on screen from Taiwan to Duluth.
Even after you drag my last word through the dirt,
I know you, you pirates,
You'd cut out my heart for a "Thing 1" T-shirt.
For eighty-some years I held you vultures at bay,
knowing just how you'd franchise my good name some day.
Not yet cold in my grave before you starting shooting
the first of my classics you'd acquired for looting.

Mrs. Seuss, that old stoofus, began selling more rights
to Dreamworks, Universal—any hack in her sights.
First The Cat In The Hat and then this, that and Seussical
without a thought to be picky, selectish, or choosical.

So to Audrey, you whore, you sad sack of a wife:
Listen close. Pay attention, for once in your life.
You give Fox In Sox to those sharks who made Elf
And so help me, I'll rise up and kill you myself.

No Sneetches by Sony—
No One Fish: On Ice
Burn that Hop On Pop II script not one time but twice.
Don't sex up my prose with Alyssa Milano…
And no Green Eggs And Ham with that one-note Romano!

This must stop! This must end! Don't you see what you're doing?
You're defiling the work I spent ages accruing.
And when it's dried up and you've sucked out your pay
There'll be no going back to a simpler day,

When your mom would give Horton a voice extra deep,
And turn the last page as you drifted to sleep.
Instead you'll have boxed sets, shit movies, and… well,
You'll have plenty to watch while you're burning in hell."

Stolen, without kind permission,
from the swell guys at theonion
and please before harsh objection
note the flattery, in my selection

Tuesday, April 01, 2008

Doubt-beings



Doubt is rather a constant thread of thought, occasionally featuring operationally, occasionally topically. Recently the issue of true certainty has been occupying me on a personal level, and almost inevitably the corollary has arisen in conversation with peers (notably here). I think that before reading on, it would be valuable, though not essential, to read that and watch this.

I begin with a premise (framed as a question) leading from those pieces - I wonder if there are doubts in the realm of thought which is supposed to belong to the right brain, the immediate and total awareness of sensory perception without reference to the self or identity?

It kind of implies that doubt is itself a construct of (the evolutionary trait of) identity. An evolutionary psychologist might therefore say that we doubt because certainty is self-defeating as a fitness function in a natural selection competition. And I suppose that this is pretty tautological, when you think about it. The certain are slower to adapt, to bend to outside forces and shape their habits to changing needs. It's intuitive, anyway: you don't need to posit evolutionary reasons to see that doubt (and fear) are useful in the day to day.

Now how far can we push doubt, and does it serve any purpose to do so, other than fueling madness and occasionally allowing wisdom to be obtained therein? Could doubt be a mechanism by which the brain circumvents its own filtering of right-side sensory overload?

The consciousness has massive input, because the senses are quite wide-band. But the left brain, the 'me', has limited attention, because to store and sort everything would take too long. Maybe doubt prevents the left brain from filtering too predictably, from cutting the same type of data out of the sensory input every time. In other words, if there was no doubt, we would never see anything novel at all.

Terry Pratchett and Douglas Adams both played with this idea of sensory editing. In Pratchett's Reaper Man, Death takes a holiday, but instead of seizing a recently deceased body, he just arrives in the rural getaway of choice looking like himself, dressed in overalls. Because they cannot sufficiently doubt their own concept of reality, nobody can see him as he really is, they simply see a rather tall, gaunt man. Only a small child can see him as he really is. Only one without certainty knows that Death is among them - is this a Pratchett version of a moral tale?

Perhaps it's worthwhile considering the experience of taking hallucinogens. So many of the commenters on the video I've linked to above claimed to have experienced a similar left-brain disconnect, when they took LSD. I wouldn't call it exactly the same experience (although everyone has different experiences) but there are similarities. The mind becomes far more localised, open, sponge-like and undiscriminating. The connection with the self is attenuated. Some people have had a complete out-of-body experience, though I don't think I have (memory is hazy with these things :D). What can we say about the larger implications of moving away from the discriminant faculties of the left-brain?

For one thing, this is a helpless beast. A person on hallucinogens for the first time is like a baby, needing a totally unthreatening environment and possibly care and guidance. Bill Hicks said anyone who thinks they can fly when on drugs and then jumps out a 10 storey window is a moron - baby birds don't do it that way! And yet nobody who wasn't on drugs ever failed to heed their doubts that they could, in fact, fly (except in Douglas Adams books, where it seems completely logical to throw oneself at the ground and miss).

For people that are habitual hallucinogen users, personality may not change at all, but if it does it often seems to involve an erosion of healthy doubt. Belief in 12 foot lizards abounds. An increase in absurd doubts may also result, as the fabric of both objective and inter-personal reality comes under question. I haven't studied the long term effects of drugs objectively, so I must say this comes completely from personal observation.

For another thing, the left-brain, the identity vector or the self is not quiescent. It can jump into the trip at any time, noting the thoughts of the consciousness and trying to relate, categorise and classify - to understand, in fact, which is its natural task. If you happen to notice this happening while you're tripping, a recursive self-recognition cycle can build up, as the brain watches itself think about itself think about itself think...It can get to be a bad trip! Another personal observation, this one from inside the experience.

What purpose these postcards from the edge? Just to note that the functioning of the [input->filter->process->store] pipeline of consciousness is a powerful part of being conscious, and I don't think it appreciates being derailed.

Perhaps this is because, as the consciousness lifts outward and settles into the moment, the identity of the self comes face to face with itself - it's forced to try to grasp what it is, in totality and separate from any simple definitions or concepts of a personal nature (anyone who's gone or going off the rails in a solipsist existential sense may feel recognition). The problem with this is that the self is a construct designed for defining, creating relational concepts and so on. Can identity really grasp itself, can the tool of understanding act upon the tool?

What is doubt? Could it be the action of the self, which separates itself from everything else, and thus cannot truly know anything else? If we exist as processes rather than fixed entities, then it may not be unreasonable to think of the active element of our selves as a process too - the process of doubt, uncertainty, the knowing of things and the letting go of this knowledge.

Disclaimer: I feel it's important to note how unsatisfactory I find my own blog-mounted theorising - like moulding a diamond out of clay instead of cutting it from a rock [if the analogy makes no sense, Kant has been described as a master diamond cutter]. Now, my mother is a potter, and I'm not putting down clay - but if you mould a solid lump of it, chances are when it is fired it will explode. That's the source of some unease.

Wednesday, March 19, 2008

Excess Revelry - too much fun


"What sour unkind movement stirs the heart of me?
Some dour fecund fruit from excess revelry,
love's labours lost, staunchless gush and - puuaaaggg*!"

*calumny


Sometimes real work is just not worth it, when the toys are so much fun.

Sunday, March 16, 2008

Games as Information Systems Q&A



Since I have had no time to post in a month, and there have been no replies to my last post, there is a happy opportunity to segue straight into a little Q&A on the topic of games as information systems, derived from some response to the topic over at onlyagame.

So, onto the questions:

- How would one measure the bits of information in a game situation (and is it worth even trying)?

Answering the second part first - why measure the bits of information? - I think it is clear that no matter the graphical fidelity, games are still component systems designed from the top down. Therefore a reductionist approach to analysis can still work. And reductionism is a powerful tool. I think the idea is not that we want to break down gameplay to the point where we can say: the player has just interacted with information bit x, and is being presented with bit y.
I think rather that we want to be able to say that the stream of information coming in to the player has X profile at time T.

The how is quite difficult. The first step is to reduce the dimensionality of the measurement to the 2 dimensions that the player sees. But you can't lose the relational information between what is in the viewport frame and what may be around it (and the viewport frame is itself information-bearing, especially in games where it represents the player's view). That's the really hard part.
I think that I would have to hand-annotate a game with information before I could answer this in the general case - i.e. familiarise myself with my own proposal!
But to start with, everything in the game has a relationship with the player and a novelty to the player. Under these two headings, one could imagine a framework for assigning bits to in-game elements based on their relatability and novelty. I often think of FPS games like Battlefield here - there is so much detail in one of those worlds, but a player only assigns a little attention to terrain, because no matter its appearance it all behaves the same way. On the other hand, other players require a deal of attention, because despite a certain uniformity of appearance* they can all do quite different things (to kill you!). Roughly, the idea is: how can we measure the (potential) attention budget of the player?

*Which would be a personal gripe with Battlefield games - if they skinned them in Warhammer 40K designs, then you'd have a game :D
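
To make that 'attention budget' idea slightly more concrete, here is a toy sketch in Python. Everything in it is an invented placeholder (the element names, the relevance/novelty numbers and the surprise-style scoring), so treat it as the shape of a framework rather than the framework:

    import math

    # hypothetical per-element annotation at time T: (behavioural relevance, novelty), both in (0, 1]
    # terrain behaves uniformly -> low relevance; other players can do many different things -> high
    frame_at_T = {
        "terrain_tile":   (0.05, 0.1),
        "own_vehicle":    (0.60, 0.2),
        "enemy_player_1": (0.90, 0.8),
        "enemy_player_2": (0.90, 0.3),   # seen before, so less novel
    }

    def element_bits(relevance, novelty):
        # crude surprise-style score: more novel and more behaviourally relevant elements cost more bits
        return relevance * -math.log2(1.0 - 0.99 * novelty)

    for name, (r, n) in frame_at_T.items():
        print(f"{name:15s} {element_bits(r, n):5.2f} bits")
    print("approx. attention budget at T:",
          round(sum(element_bits(r, n) for r, n in frame_at_T.values()), 2), "bits")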


- Can game information be considered comparable? Consider the analogue information in the state of a snowboarding game versus, say, the positions of the players on the pitch in a sports game.

Very pertinent - possibly the hardest problem with this approach. Could this be gotten around by considering the possibility space of the game, and comparing on that basis? In this sense, the possibility space of a sports-like game is more dispersed (softer) than that of an analogue game. But the essential nature of the information is still the same - probability-weighted relatability and novelty. It's just that that tree down the slope at time T has a much higher probability of still being down the slope but closer at time T+1, than that player down the pitch has of still being in the same spot a second later (but consider the beautiful game - the goalkeeper has a pretty high probability of being in roughly the same spot! It's all degrees).

- Are information and time the only factors influencing game difficulty?

Well, novelty is very important, and is kind of assumed in the information approach. But novelty can only be judged by the known play history of the player. Who can only be identified on a profile sign-in basis. Which system can only be trusted to be valid, not known. So that's a problem.

In fact, it's the same old problem with player modelling again - after a certain point, without biometrics we really can't be sure that the person playing is the same one we've been modelling all along. The fields of concept drift and concept shift have methods for dealing with this, but again it's all a matter of probabilities.
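
As a very rough illustration of the kind of check that drift-handling methods formalise (this is a hand-rolled windowed comparison, not an algorithm lifted from the concept drift literature), one could flag when a player's recent behaviour stops matching the long-run profile attached to the sign-in:

    from collections import deque
    import statistics

    class DriftMonitor:
        """Flags when the recent window of some behaviour metric (say, reaction time)
        drifts from the long-run profile by more than k standard errors."""

        def __init__(self, window=50, k=3.0):
            self.history = []                 # long-run profile for this sign-in
            self.recent = deque(maxlen=window)
            self.k = k

        def update(self, value):
            self.recent.append(value)
            self.history.append(value)
            if len(self.history) < 2 * self.recent.maxlen:
                return False                  # not enough data to judge yet
            mu = statistics.mean(self.history)
            sd = statistics.pstdev(self.history) or 1e-9
            recent_mu = statistics.mean(self.recent)
            return abs(recent_mu - mu) > self.k * sd / (len(self.recent) ** 0.5)

    # feed per-event values; True suggests the profile no longer matches - maybe a
    # different person at the keyboard, maybe just an off day. Probabilities again.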

Friday, February 15, 2008

Games as Information Systems


I'd like to share a prototype idea that may never get development time, and so is only gathering dust on my magnetics. The doc below was written up to pitch to my supervisors, so the language is both of the area of CS and of my own work, and there may be unexplained assumptions or glossing over. If you spot anything of the sort, feel free to point it out. For instance, I am talking about my own interpretation of Pacman, where Ghosts move probabilistically and not under rules. So without further ado:

Information Processing as the Challenge & Skill Metrics in Pacman

Ben Cowley

1. Introduction

The aim of this approach is to represent the player’s point of view programmatically, by breaking down the game’s aesthetic presentation into component units of game-relevant information, which correspond to the basic elements of the game that the player observes and manipulates in the process of play.

(Certain classes of) Games are about processing Uncertainty, making meaningful choices, consuming patterns. This is all related to prediction. What is provided to guide the player in their predictions is information. Insofar as information is related to the elements of play, then as it increases that must correspond either to increasing numbers of active elements to keep track of (which generate Uncertainty about future states), or to more inactive elements which must be filtered out. Either way, cognitive load seems to be increasing, and so must difficulty.
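
[A toy illustration of the 'more active elements, more Uncertainty' point, added here for the blog rather than taken from the original pitch: if each ghost independently picks one of its currently open directions, the Shannon entropy of the joint next move grows with both the number of ghosts and the number of directions open to each.]

    import math

    def ghost_move_entropy_bits(open_directions_per_ghost):
        # entropy (bits) of the joint next move, assuming each ghost picks uniformly
        # and independently among its currently unblocked directions
        return sum(math.log2(d) for d in open_directions_per_ghost if d > 0)

    print(ghost_move_entropy_bits([2, 2]))        # 2 ghosts in corridors   -> 2.0 bits
    print(ghost_move_entropy_bits([3, 3, 3, 3]))  # 4 ghosts at junctions   -> ~6.34 bits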

Ideally, the value (as predictors) of such a set of units of gameplay (as expressed by their assigned information value) would be determined by testing against logs of real games played. In other words, as subjective difficulty increases players will make more errors and the general trend of quantity of information to be processed would be correlated against this.

On the other hand, we have a defined difficulty progression in the game as it stands. The ghosts get faster, they hunt the Pacman more aggressively and the Pills make them vulnerable for less time, as the levels advance. Alternatively, if speed stayed the same, we could increase difficulty by having more ghosts. This suggests a relationship as follows.

Increasing Difficulty
    = Increasing Information with Constant Time to Process
    = Constant Information with Decreasing Time to Process



If we can accept a relationship similar to that above, we can say two things. Firstly, we can say that there is a self-consistent way to measure challenge, so that one type of challenge can be compared to another, if both are given an information value using the same framework of judgement. Secondly, if this can be seen to work reliably and self-consistently, then we don’t really need a reference to an outside measurement, like a correlation to real game logs.

2. Framework

Information would be recorded from anything that varies in information content over time.
Game units of Pacman {with their informational attributes}:
  • Pacman: { x,y | possible vectors }
  • Ghosts: { x,y | possible vectors }
  • Goal-map: { Points for Dots | Points for Pills | Consequences of collisions }
The player will have three classes of information-based attributes (with generic description):
  • Vectors of movement (Opportunities for action)
  • Relative distance to ghosts (relation to dynamic obstacles)
  • Goal map (relation to goals)
These attributes (for a given agent) can be defined as:
  • Vectors: unblocked directions of movement from the agent. New vectors would spawn from old vectors at intersections (junctions in the map) and open alternate directions if multistep movement were to be considered.
  • Relative positions/distances: use A* over short distances, and my heuristic distance over longer ones. The cut-off between short and long would need to be decided.
  • Goal-map: scoring function over the local area, or area of local actions. So, possible points of action could be made available by following the vectors defined above.
3. Clarification

Some figures below try to clarify the definitions of a unit’s information attributes (above).


Fig.1 Vectors for Pacman & Ghosts at a single iteration (i.e. no subsequent vectors representing change of direction). More iterations could be included depending on computational cost.


Fig.2 Goal map for Pacman across a local area, on the right. The actual state is on the left.
The ghosts are assumed to move two squares in each direction – this is just for illustrative purposes. In practice we can assume the ghosts will be within a certain limited area and not worry too much about where – we are trying to approximate the player’s point of view.
Goal map – evaluate information for mechanics along vectors of opportunity, decreasing weights as we go. This can reflect gameplay because the game uses probabilistic mechanics for the ghosts, and so the actual course of the game has a similar fuzzy nature to the predictive capacity of the player.
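
[Again as a blog-side aside rather than part of the pitch: a minimal sketch of the vector and goal-map attributes described above. The grid, point values, look-ahead depth and decay factor are all placeholder assumptions.]

    WALL, DOT, PILL = "#", ".", "o"
    POINTS = {DOT: 10, PILL: 50}
    DIRS = {"up": (0, -1), "down": (0, 1), "left": (-1, 0), "right": (1, 0)}

    def open_vectors(grid, x, y):
        # unblocked directions of movement from (x, y) - the agent's vector attribute
        return [d for d, (dx, dy) in DIRS.items() if grid[y + dy][x + dx] != WALL]

    def goal_map(grid, x, y, depth=4, decay=0.5):
        # score each open vector by the points reachable along it, weighted down with
        # distance - an approximation of the player's fuzzy look-ahead, as in Fig.2
        scores = {}
        for d in open_vectors(grid, x, y):
            dx, dy = DIRS[d]
            total, weight = 0.0, 1.0
            for step in range(1, depth + 1):
                cell = grid[y + dy * step][x + dx * step]
                if cell == WALL:
                    break
                total += weight * POINTS.get(cell, 0)
                weight *= decay
            scores[d] = total
        return scores

    grid = ["#######",
            "#..o..#",
            "#.###.#",
            "#.....#",
            "#######"]
    print(open_vectors(grid, 1, 3))   # -> ['up', 'right']
    print(goal_map(grid, 1, 3))       # -> {'up': 15.0, 'right': 18.75}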

Thursday, February 14, 2008

Dawkins, Only Begotten Son of Science


What if Dawkins is a prophet?
Maybe that's the way it works! I mean, forget about all the incarnation stuff, it's just beyond the pale of reasonable discussion (i.e. metaphysics baby). Let's just call Jesus/Buddha/Mohammed and so on, some smart guy. Perhaps a person (all male in recent history, boo urns) just happens along. They are there at the right time and the right place, and they have learned some things along the way, possibly from other wise guys, possibly from sitting under a tree. They may stand on the shoulders of giants. But they are the ones that are listened to, that draw a crowd. Then the crowd goes off and tells others, and you have a movement. Over time, morphology generationally produces a religion. People follow that, for a while.
Then a man comes along. At the right time, and the right place. With some other message.
I wonder, is Dick Dawkins a prophet?

What a downer for him if he is!

Wednesday, January 23, 2008

Music in its new Age


Read a rather interesting interview between David Byrne and Thom Yorke, discussing In Rainbows and the revolution in the business of music. Digital downloads!

The phenomenon of file sharing [the ethics of which are trawled through here] has rocked the industry, mostly because it was such a cosy, locked-in business model that the bright and the overpaid panicked, rather than because file sharing was offering a viable alternative source of the full range of music products. This full range is where people, with resources to exploit, need to be looking to grow their service portfolio...let's not forget, all they do is provide a service. Music is an auditory experience, packaging and distribution is a service. Record companies need to let go of the idea that they actually own music. That was never their function until they overstepped their mark. If file sharing sank all the old industry dinosaurs and evolved a new breed, it wouldn't be too soon.
But partisanship aside, let's be constructive and try to think of some ways the music industry can save itself, without crucifying everyone else in the world [the rough population of those who will eventually become file sharers].

How about a model where fans go to a concert and download the whole experience to their mobile wireless digital storage devices, whatever form they may take? The performance itself becomes a digital saleable commodity, one that can be traded online afterward. The concert DVD already does this, of course, but it seems very exciting and attractive to me that it would be instantly available and personalised (concert DVDs are usually only filmed at one location on a tour; this would be every location, and automatically the location the fan attended).
The ticket cost pays for the music, so ticket costs would have to go up (costs of putting on the show have to be covered).
However, maybe if the tickets were paid for by account debit or credit (or credit card), then they could form a binding contract on the buyer, and songs from the show recorded by the buyer could be digitally watermarked to allow detection of widespread distribution. I dunno, this is getting a bit like DRM...
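
Purely to illustrate the watermark part of that thought (a least-significant-bit trick is the textbook toy version; real audio watermarking is far more robust, and this is absolutely not a scheme anyone should ship), a buyer ID could be tucked into the low bits of the recording's PCM samples:

    import numpy as np

    def embed_buyer_id(samples, buyer_id, id_bits=32):
        # hide buyer_id in the least significant bits of the first id_bits int16 samples;
        # inaudible in practice, but trivially stripped - illustration only, not real DRM
        out = samples.copy()
        bits = np.array([(buyer_id >> i) & 1 for i in range(id_bits)], dtype=samples.dtype)
        out[:id_bits] = (out[:id_bits] & ~1) | bits
        return out

    def extract_buyer_id(samples, id_bits=32):
        return sum(int(samples[i] & 1) << i for i in range(id_bits))

    pcm = np.random.default_rng(1).integers(-2000, 2000, 44100).astype(np.int16)
    marked = embed_buyer_id(pcm, buyer_id=0xC0FFEE)
    print(hex(extract_buyer_id(marked)))   # -> 0xc0ffee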

Nevertheless, this is an inventive scheme, and it's invention that is needed as we have a real necessity to reinvent or replace the dinosaurs and bridge the gap between affordable music and sufficient revenue to produce it.

Let the music out - that's what's really needed.

Sunday, January 20, 2008

Where's the Cheese?




"Hellgate: London gets content update.

...The Stonehenge Chronicles update adds open, outdoor wilderness areas intended to stand in contrast to the main game's setting in the streets and sewers of London.

The update is comprised of three different sections: The Caste Caves, Moloch's Lair and The Wild. The Caste Caves unlocks four dungeons for each enemy caste, and each tasks players with defeating a spectral overlord. "

I took this from Gamasutra, which could be seen to deflate the point I'm about to make, but I'll press on regardless (oh, and maybe see the last post also).

Caste Caves. Moloch and his Lair. Spectral overlords. Sewers. In point of fact, Hellgate!

What on earth would be wrong with an MMO set in some variant of a cool real world location, as London undoubtedly is, that didn't involve either a) the cast of ghouls and ghosts, or b) the cast of the Lord of the Rings (book not film)?

Is it that hard to let go of a glorious past and set out for new horizons? I'm sure somebody in the games industry could come up with a compelling scenario to cover the possibility space for play in an MMO mechanic, that occurs in the real world, a place many people who don't care two figs for Moloch's Lair are quite attached to!!

It's kind of sad, really.

To be fair, I'm not trying to attack Hellgate: London, which I know almost nothing about. I was just kicked off by reading the Gamasutra piece, which raised the old bugbear. Why is there so little attempt to work the tropes of familiar, everyday life into games? Not every movie or book is about a land reached only through a wardrobe - some people like to read about themselves. Dogs and ponies and invasive surgery are massively popular on a certain handheld, so why doesn't anyone leverage the cinematic visuals and power of the top-end consoles or PC's to tell a little tale about modern urban life?

Or, instead of invoking daemonic ingress, even an MMO about how to survive the world's end we've cooked up for ourselves in the real world!

Wednesday, January 16, 2008

Can Games ever become Art?

Received this synopsis of an Irish Times article recently, edited by the sender but essentially verbatim. It's not an uncommon theme for the specialist press, but as we know (and as Hegarty comments) games rarely get coverage in the mainstream press, so I thought this worth posting, because he is quite an astute social commentator. My own quick reaction follows...

by Shane Hegarty, Irish Times Weekend Review, January 12
'This column will be about computer games. Please don't turn away. I mention it only because the subject appears to be regarded by newspapers as an effective reader repellent. Millions play computer games, but it seems that few want to read about them. It is a thriving, multi-billion-euro cultural behemoth, but there are more interesting multi-billion-euro behemoths elsewhere.
...I've been playing (on) an Xbox 360... Halo 3... in which the player (in practice) must ignore the story and just shoot lots of things to survive and reach the next level. (He goes on to complain that games really haven't developed/matured with their players; film's evolution was so much more impressive - from the Lumière brothers to Fritz Lang over a similar 35-year time-frame.) 'What have games given us? Pac-Man, Mario, Lara Croft and Sonic the Hedgehog.
"Games boast ever richer and more realistic graphics, but this has actually inhibited their artistic growth," argued Daniel Radosh in the New York Times in September, after three days of eye-blurring play with Halo 3. "The ability to convincingly render any scene or environment has seduced game designers into thinking of visual features as the essence of the gaming experience." Worse, he complained, the genre can't break free of another medium it has pretensions to supersede. "Many games now aspire to be 'cinematic' above all else." Not so, claimed Slate.com's gamer. "Reviewing the game on the merits of its single-player campaign is like judging a deck of cards on how fun your last game of solitaire was."
He argued that a game such as Halo 3 should instead be lauded for the way in which it offers open-ended artificial environments, which the player can reshape and jump into alongside players from all over the world.
This debate is seldom picked up in a wider media that tracks every trend in music or movies, and which frets constantly over standards in each. Games are confined mainly to the business or technology pages or, pejoratively, when discussing the obesity crisis. Titles are reviewed in some publications, but not with anything like the same attention given to movies or music.
There are some obvious reasons for this. Games are predictable. For all the bluff put into the story on the back of computer game boxes, many of them actually require players to do only one thing: ignore the story and just shoot lots of things to reach the next level.
Game design is also too collaborative to throw up great individuals [my emphasis].
This week, Irish-based company Havok won an Emmy. No one seemed to be able to explain exactly what it was for. "They add to the realism and interactivity of games" was the standard line, although one paper just went with "Game Geeks Win Award".
Cinema and music offer collective experiences, while gaming is still seen as pretty anti-social. Games offer collective experiences too - with the new generation of consoles tapping into social networking - but it's not the same as getting several hundred, or tens of thousands, of people in the same space to enjoy the same event.
Meanwhile, cinema has personality, unpredictability, and the possibility of a great performance. The only great performance in computer games comes from the player, and nobody else cares.
Listen to this games expert on Slate.com talking about his personal highlights from 2007, and see how many syllables you get through before losing consciousness. "So there I was, minding my own business, flying my Rupture-class cruiser in a low-security star system called Klogori. All of a sudden, a Thorax blastership flown by a pilot from the then-powerful RISE alliance appears on my heads-up display..." Which reminds you that, in 35 years, the genre has yet to throw up a great critic either.
So, for the moment, this cultural giant - which increasingly influences cinema, drives technology onwards, generates huge revenue, and occupies millions of people - remains somewhat in the shadows. It seems it still has a little way to go before it overcomes its enemies and gets to the next level.'
ENDS
comments to: www.ireland.com/blogs/presenttense

Hegarty is pretty much on the money here. The reason is that, up until recently anyway, games development has usually attracted two types - men who are recidivist adolescents, with the accompanying juvenile power fantasies (I've got my hand up :D), and money-grubbing bastards.

And there are 'grown-up' games out there, but nobody's really interested in talking about them, not even the games press (who are themselves even more useless than the developers, being so deep in the pockets of the big publishers).

Sunday, January 06, 2008

Ideas and thinking



It's funny how what we think about can totally fool our own faculties of logical and aesthetic discrimination. Beautiful ideas can creep up on you, pop out with the least effort, and be lauded and praised while you're still wondering why anyone would read them... or you can slave away on a body of work that ties in years of reading and careful concept building, and nobody gets it.

Maybe that's why some people believe in the Platonic reality.

What I've described above is quite exaggerated with respect to myself, but I'm sure of the truth of it even so. Something in the way cognition and conceptualisation work rings true to this phenomenon. How much can we truly say our minds are arbitrary, chaotic creations of fuzzily specified hardware systems? Isn't there some thread of structure in thought that is inherent to us all?

Many have claimed there is.
It was thought to be language-based, maybe recursion, but that is by no means established. A rather famous (in anthropology circles) Amazon tribe, the Pirahã, seems to exist entirely without recursive speech, and all it takes is one black sock.
It may hide somewhere in the little-understood processes of memory formation and recall.
The signalling system of the biological neural network is hardly measurable, and very much not understood. Allow me here a hackneyed and misplaced analogy with computers, to spell things out... If our neural nets are the physical data transfer layer, then the signalling between neurons is the logic gate design* (a toy sketch of this layering follows below). If we do not have a full grasp of even this level of algorithmic operation, how can we divine the instructions being passed, or the language that they underwrite, or the semantics being expressed?
How can we hope to reason about why thinking works in such peculiar ways? I'm afraid that for now, the engineering of cognition is a way up the slope**, and we are stuck with thinking about thinking.
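To labour the computing side of that analogy with an entirely toy example of my own (nothing to do with real neural signalling): below is a half-adder built from nothing but NAND gates. Stared at from the 'physical layer' - bare 0s and 1s on wires - nothing announces that addition is going on one level up, never mind what program, or what meaning, sits above that.

    def nand(a, b):
        """The 'gate design' layer: everything below is built from this."""
        return 0 if (a and b) else 1

    def half_adder(a, b):
        """One abstraction level up: 1-bit addition from NANDs alone.
        Returns (sum_bit, carry_bit)."""
        n1 = nand(a, b)
        s = nand(nand(a, n1), nand(b, n1))   # XOR composed from four NANDs
        c = nand(n1, n1)                      # AND composed from two NANDs
        return s, c

    # The 'semantics' (that this is addition) is invisible at the wire level:
    for a in (0, 1):
        for b in (0, 1):
            print(a, "+", b, "->", half_adder(a, b))

Knowing that the gates exist is not the same as knowing what computation they carry, which is roughly the gap the paragraph above is pointing at.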

The beauty of it is, thinking about thinking can be far more fun than knowing the answer :)


*We could also say 'the Turing machine specification', but that gives the false impression that the classical Turing machine has not been totally superseded, in practice, by von Neumann machines built from logic gates (Tesla's early logic gates are rather beside the point, occurring too soon). Also, this analogy only holds in the static case; the dynamic case is too much to go into here - see Holland's Emergence.


**The slope of the acceleration of human knowledge. I'm not going to say anything about how that relates to linear time here. That would be presumptuous.

Friday, January 04, 2008

The Man...



This was going to be a follow-up to, though not too promptly on the heels of, my previous oil post (it may be wise to read that first, if you haven't). However, the point I am circling around is two-fold, and this part has run quite long enough, so I'm posting it standalone. I may continue back around to oil, or I may diverge to the environment. We'll see!

One thing I think is often misunderstood is the exact nature of 'Them'. The Man, whoever he is. Some like to believe that there are shadowy bodies of power-brokers, overseeing great conspiracies to rule the world. Others believe that it's just ordinary people at the top, with ordinary motivations of varying shades of morality.

One thing that's not often posited is that the truth is probably somewhere in between - there are shadowy bodies of unrepresentative individuals overseeing loosely collaborative agreements on how to exercise their vast power to run the parts of the world that concern them. I doubt, however, that their motivations are larger than their capacity for vision. This is key, because I strongly believe that you don't really get to positions of great power if you are the kind of person who sees very far beyond yourself.

Vision, in entrepreneurial terms, means discovering opportunities for indefinite expansion. Expansion, as I've said before, is the sine qua non of any economy. The very idea of value is predicated on a positive prediction of economic expansion. Empires rise as they master forms of expansion, and fall as their strategies for expansion are outdated by the consequences of that same expansion. Populations of all sorts of creatures follow the same pattern - boom and bust (a toy illustration of the pattern follows below). Managing to avoid the bust is indeed an admirable skill, and rewards its holders very highly.

It's not really visionary though.
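As an aside on the boom-and-bust pattern mentioned above, here is a toy simulation of my own devising - a cartoon, not a claim about any real economy or ecosystem - in which a population grows while a finite, slowly regenerating resource lasts, overshoots, and then crashes back down. All the parameter values are arbitrary.

    def boom_and_bust(steps=60, pop=10.0, resource=1000.0,
                      growth=0.15, need_per_head=0.5, regen=5.0):
        """Grow while the harvest covers the population's needs;
        shrink once the resource stock runs dry."""
        history = []
        for _ in range(steps):
            needed = pop * need_per_head
            harvest = min(resource, needed)
            resource = resource - harvest + regen
            sufficiency = harvest / needed if needed else 1.0
            pop = max(pop * (1 + growth * (2 * sufficiency - 1)), 0.0)
            history.append((pop, resource))
        return history

    for t, (p, r) in enumerate(boom_and_bust()):
        if t % 10 == 0:
            print(f"t={t:2d}  population={p:7.1f}  resource={r:7.1f}")

The population climbs happily until the stock is spent, then falls back toward whatever the trickle of regeneration can support - the mastered-expansion-then-outdated-by-its-consequences arc in miniature.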

Great, world-shaking thinkers and entrepreneurs have been distinct sets of people, I believe.

To come back to the point, then: where the shadowy-power-brokers theory falls down is that nobody who would be put in such a position of power would also see a way to use it for truly extrinsic achievement. Read a goodly spread of sci-fi authors, and the chances are good that someone will have posited a really quite plausible set of steps, starting from the present day, for escaping the closed loop of our existence. And that is only the most obvious example of vision.
And where is such vision to be seen among those who wield power? They threw away the plans for the Saturn rockets that got us to the moon, for Heaven's sake! They used a planetary lifetime's worth of free carbon-based energy to power an economy less than two centuries old and doomed to crash by the very nature of its design!

Still, it is little wonder, after all. 'Waste and want' are the watchwords of our world, so it is no surprise that those at the top of the food chain direct things in the same mode. Any attempt to paddle in the other direction usually costs the exemplar everything, in personal terms. So great is the flow of humanity toward oblivion that any attempt to signal for change requires the complete dedication of the signaller's life. A high price.

So it's easy to say 'lions for lambs', but harder to step into those shoes and be anything other than the same as those being led, who turn out to be merely more sheep with the appetites of lions.