Friday, December 08, 2006

Polya's Inner Neanderthal

I remember reading, years ago, the excellent book "The Psychology of Invention in the Mathematical Field" by the mathematician Jacques Hadamard...

He surveyed a bunch of mathematicians, intending to find out how they think internally. Many, it turned out, thought visually; some thought in terms of sounds, some purely abstractly.

But George Polya was the only mathematician surveyed who claimed to think internally in terms of grunts and groans like "aaah", "urrghhh", "hmtphghhghggg"....

At the time I read this, I thought it was very odd.

However, I have just read Mithen's book ("The Singing Neanderthals," discussed in another recent blog post of mine), which claims that the language of Neanderthals and early Cro-Magnons was like that: no words, just lengthy, semi-musical grunts and groans with varying intonation patterns....

So maybe Polya was just old-fashioned.... ;-)

Anyone else out there think in terms of grunts and groans and so forth? If so, please contact me....

Wednesday, December 06, 2006

Updating Kazantzakis

I saw this quote in a friend's email signature...

"What a strange machine man is! You fill him with bread, wine, fish, and radishes, and out comes sighs, laughter, and dreams."
-- Nikos Kazantzakis (1883-1957), Greek novelist.


To which my immediate mental response was:

OK, fine -- but it's what happens when you feed him hallucinogenic mushrooms, amphetamines, ginger beer, Viagra, and stir-fried snails that really fascinates me!!

Saturday, December 02, 2006

Zebulon's Favorite Place

My son Zebulon (age 13) recently had to write a brief essay for school on "My Favorite Place," as part of a national essay competition. His essay was chosen to represent his school in the county-wide competition. His theme was an amusing one which may resonate with many readers of this blog -- the third paragraph particularly reminds me of myself on some of my more productive days! (But Zeb is not working on AI; he's working on animations -- see zebradillo.com.)





My Favorite Place
Zebulon Goertzel



I work my way past all the furniture in my cramped room and sit down at my chair. I see a computer, a laptop. On its screen are pixels. Tiny, stabbing rays of color that drill into my eyes and let me enjoy my computer to no end despite its hideous flaws. The monitor is marked and scarred due to various past and unknown misuses. The dull keyboard, with its regular layout, is usable but without an S key. I look at the front disk drive and recall being told not to remove it.

Beside my laptop is my tablet. In its middle-left side is the pen, a gnawed-on, well-used device that is often lost and found in my pocket. The tablet cover is not without scratches, some deep, some light. Each scratch is from a scribble or drawing or line somebody drew. A bright wire links my tablet to the sloppy tangle of wires, connectors and cables which is usually behind my laptop.

My computer’s fan consistently buzz-whirs with high pitch. I am hypnotized as I slowly lean forward, as I grip my tablet pen with sore, almost numb fingers, as I click and click and click. My back is hunched and my neck is out. I work. My eyes ache, but I hardly notice. My stomach is empty, but I try to ignore it. I decide to be done. I get up, stretch, and go to care for myself. My favorite place is my computer, or my desk, because there are no limits to what a computer can do, and my computer fascinates me to no end.

The Cognitive Significance of Radiohead (aka, The Historical and Possibly Current Significance in the Human Mind of Patterns of Tonal Variation)

In one of those pleasant synchronicities, a couple days ago PJ Manney started a conversation with me about music and the scientific mind, at the same time as I received in the mail a book I had ordered a couple weeks ago, "The Singing Neanderthals," about the cognitive origins of music.

So, here I'll start with some personal notes and musings in the musicaloidal direction, and finally wander around to tying them in with cognitive theory...

I had told PJ I was a spare-time semi-amateur musician (improvising and composing on the electronic keyboard -- yeah, one of these days I'll put some recordings online; I keep meaning to but other priorities intervene) and she was curious about whether this had had any effect on my AI and other scientific work.

I mentioned to her that I often remember how Nietzsche considered his music improvisation necessary to his work as a philosopher. He kept promising himself to stop spending so much time on it, and once said something like "From now on, I will pursue music only insofar as it is domestically necessary to me as a philosopher."

This is a sentiment I have expressed to myself many times (my music keyboard being a tempting 10 feet away from my work desk...). Like Nietzsche, I have found a certain degree of musicological obsession "domestically necessary" to myself as a creative thinker.... The reasons for this are interesting to explore, although one can't draw definite conclusions based on available evidence....

When I get "stuck" thinking about something really hard, I often improvise on the piano. That way one of two things happens: either

1) my mind "loosens up" and I solve the problem

or

2) I fail to solve the problem, but then instead of being frustrated about it, I abandon the attempt for a while and enjoy myself playing music ;-)

Improvising allows one's music to follow one's patterns of thought, so the music one plays can sorta reflect the structure of the intellectual problem one is struggling with....

I drew on my experiences composing/improvising music when theorizing about creativity and its role in intelligence, and cooking up the aspects of the Novamente AGI design that pertain to flexible creativity....

As well as composing and improvising, I also listen to music a lot -- basically every kind of music except pop-crap and country -- most prototypically, various species of rock while in the car, and instrumental jazz/jazz-fusion when at home working ... [I like music with lyrics, but I can't listen to it while working; it's too distracting... brings me back too much to the **human** world, away from the world of data structures and algorithms and numbers!! ... the nice thing with instrumental music is how it captures abstract patterns of flow and change and interaction, so that even if the composer was thinking about his girlfriend's titties when he wrote the song, the abstract structures (including abstract **emotional** structures) in the music may feel (and genuinely **be**) applicable to something in the abstract theory of cognition ;-) ] ... but more important than that is the almost continual unconsciously-improvised "soundtrack" inside my head. It's as though I'm thinking to music about 40% of the time, but the music is generated by my brain as some kind of interpretation of the thoughts going on.... Yet when I try to take this internal music and turn it into **real music** at the keyboard, the translation process is of course difficult, and I find that much of the internal music must exist in some kind of "abstract sound space" and could never be fully realized by any actual sounds.... (These perverted human brains we are stuck with!!!)

Now, on to Mithen's book "The Singing Neanderthals," which makes a fascinating argument for the centrality of music in the evolution of human cognition.... (His book "The Prehistory of Mind" is really good as well, and probably more of an important work overall, though not as pertinent to this discussion...)

In brief, he understands music as an instantiation and complexification of an archaic system of communication that was based not on words but on patterns of vocal tonal variation.

(This is not hard to hear in Radiohead, but in Bach it's a bit more sublimated ;-)

This ties in with the hypothesis of Sue Savage-Rumbaugh (who works with the genius bonobo Kanzi) that language likely emerged originally from protolanguages composed of **systems of tonal variation**.

Linguist Alison Wray has made related hypotheses: that protolanguage utterances were holistic, and got partitioned into words only later on. What Savage-Rumbaugh adds is that before protolanguage was partitioned into words, it was probably possessed of a deep, complex semantics of tonal variation. She argues this is why we don't recognize most of the existing language of animals: it's not discrete-word language but continuous-tonal-variation language.

(Funny that both these famous theorists of language-as-tonal-variation are women! I have sometimes been frustrated by my mom or wife judging my statements not by their contents but by the "tone" of delivery ;-)

This suggests that a nonhuman AI without a very humanlike body is never going to experience language anywhere near the same way as a human. Even written language is full of games of implied tonal-variation patterns; and in linguistic terms, this is probably key to how we select among the many possible parses of a complex sentence.

[Side note to computational linguists and pragmatic AI people: I agree the parse selection problem can potentially be solved via statistics, like Dekang Lin does in MiniPar; or via pure semantic understanding, as we do when reading Kant in translation, or anything else highly intellectual and non-tonal in nature.... But it is interesting to note that humans probably solve parse selection in significant part thru tonal pattern recognition....]
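For the computationally inclined, here is a minimal, purely illustrative sketch of the statistical route -- this is not MiniPar's actual algorithm, and has nothing to do with Novamente's internals; the corpus counts, arc names and function names below are invented for the example. Each candidate parse of an ambiguous sentence is scored by the smoothed log-probabilities of its dependency arcs, and the highest-scoring parse is selected:

```python
# Toy sketch of statistical parse selection (all counts and names are made up).
import math

# Pretend corpus counts of (head, relation, dependent) dependency triples.
ARC_COUNTS = {
    ("saw", "with", "telescope"): 40,   # "saw ... with the telescope" (instrument reading)
    ("man", "with", "telescope"): 10,   # "man with the telescope" (modifier reading)
    ("saw", "obj", "man"): 200,
}
TOTAL = sum(ARC_COUNTS.values())

def arc_logprob(arc, smoothing=1.0):
    """Smoothed log-probability of a single dependency arc."""
    count = ARC_COUNTS.get(arc, 0) + smoothing
    return math.log(count / (TOTAL + smoothing * len(ARC_COUNTS)))

def parse_score(parse):
    """Score a parse (a list of arcs) as the sum of its arcs' log-probabilities."""
    return sum(arc_logprob(arc) for arc in parse)

# Two candidate parses of "I saw the man with the telescope."
parse_instrument = [("saw", "obj", "man"), ("saw", "with", "telescope")]
parse_modifier   = [("saw", "obj", "man"), ("man", "with", "telescope")]

best = max([parse_instrument, parse_modifier], key=parse_score)
print("Preferred parse:", best)
```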

Regarding AI and language acquisition, this line of thinking is just a further justification for taking a somewhat nonhumanlike approach to protolanguage learning; for if this sort of theory is right, the humanlike approach is currently waaay inaccessible to AIs, even ones embodied in real or simulated robots... It will be quite a while until robot bodies support deep cognitive/emotional/social experience of tonal variation patterns in the manner that we humans are capable of.... The approach to early language learning I propose for Novamente is a subtle combination of humanlike and nonhumanlike aspects.

More speculatively, there may be a cognitive flow-through from "tonal pattern recognition" to the way we partition up the overall stream of perceived/enacted data into events -- the latter is a hard cognitive/perceptual problem, which is guided by language, and may also on a lower level be guided by subtle tonal/musical communicative/introspective intuitions. (Again, from an AI perspective, this is justification in favor of a nonhumanlike route ... one of the subtler aspects of high-level AI design, I have found, is knowing how to combine human-neurocognition inspiration with computer-science inspiration... but that is a topic for another blog post some other day...)
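To make the "partitioning a stream into events" problem concrete, here is a toy sketch -- purely illustrative, not part of Novamente, with all names, thresholds and data invented -- that segments a continuous tonal feature (say, a pitch contour) into events by marking a boundary wherever its local average shifts abruptly:

```python
# Toy event segmentation of a continuous "tonal" stream via crude change detection.
def segment_stream(signal, window=5, threshold=2.0):
    """Return indices where the local mean of `signal` shifts sharply."""
    boundaries = []
    for i in range(window, len(signal) - window):
        before = sum(signal[i - window:i]) / window
        after = sum(signal[i:i + window]) / window
        if abs(after - before) > threshold:
            # Avoid marking several adjacent points for the same shift.
            if not boundaries or i - boundaries[-1] > window:
                boundaries.append(i)
    return boundaries

# A fake "pitch contour": flat, then higher, then lower again.
stream = [1.0] * 20 + [5.0] * 20 + [2.0] * 20
print(segment_stream(stream))   # boundary indices near 20 and 40
```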

I am also reminded of the phenomenon of the mantra -- which is a pattern of tonal variation that is found to have some particular psychospiritual effect on humans. I have never liked mantras much personally, being more driven to the spare purity of Zen meditation (in those rare moments these days when emptying the intellectual/emotional mind and seeking altered states of purer awareness seems the thing to do...); but in the context of these other ideas on music, tones and psychology, I can see that if we have built-in brain-wiring for responding to tonal variation patterns, mantras may lock into that wiring in an interesting way.

I won't try to describe for you the surreal flourish of brass-instrument sounds that I hear in my mind at this moment -- a celebratory "harmony of dissonance" tune/anti-tune apropos of the completion of this blog post, and the resumption of the software-code-debugging I was involved with before I decided to distract myself briefly via blogging...