The theme of today’s conference sessions was attention, of which William James famously said, “Everyone knows what attention is.” (I never want to hear that phrase again. Ever. I heard it enough today.)
I wasn’t too enamoured with the first three talks today, which were arguably given by the big-hitters of the conference. I didn’t think any of them were particularly compelling speakers, and they assumed quite a lot of knowledge on the part of the audience, which was mostly composed of graduates, many of whom didn’t even specialise in neuroscience or experimental psychology.
Kia Nobre, from Oxford, gave an interesting talk about imaging the system that controls attention in the brain, but alas my brain isn’t working properly and I can’t recall what she said. Clearly some sort of cue is in order…
John Marshall, also of Oxford, talked about spatial cognition and whether it’s a right-hemisphere specialisation. Well, that was the title of the talk, anyway – in actual fact, he ended up talking about perceptual neglect, which is an interesting enough subject, especially when you consider Bisiach’s imaginal neglect, but I don’t think he said anything particularly new or interesting. Actually, that’s not quite true – he did say one interesting thing, though it didn’t have much to do with his talk.
He said that rather than spending our time trying to figure out the anatomical specialisation of different parts of the brain (e.g., insisting that Broca’s area of the brain is only about language), we should instead think that one region in the brain might have a number of specialised functions depending on what neural circuits are active at that time. A useful insight.
Next was Prof. Stephanie Clarke of CHUV, Switzerland. She has this interesting idea that, much like the dorsal and ventral streams of processing in the visual system, there are ‘where’ and ‘what’ streams in the auditory system. She presented some histological evidence for this, which was a bit refreshing to see at a conference overburdened with psychophysical experiments, although of course she had her own psychophysical experiments as well to prove the functional point. These experiments basically showed a double dissociation between auditory recognition and localisation. Very intriguing stuff.
The last talk was by Geraint Rees, of UCL. This was quite controversial. Rees is an excellent speaker, and he talked about his experiments aiming to prove that awareness (and thus consciousness) resides in the parietal cortex (or at least, in the cerebral cortex) via a clever and peculiar fMRI experiment involving subjects ‘merging’ two different images and involuntarily switching between them.
The general consensus is that while it was a very clever experiment, of the sort that Nature likes to publish, his conclusions were a bit too ambitious and he had a few too many assumptions…
At the end of the day’s sessions, there was a brief discussion about how genes might determine brain function. One of the speakers speculated that we might have to consider that genes statistically alter the growth of various neuronal regions and bias them towards being able to learn and specialise in certain areas, e.g. facial recognition.
It was at this time, about 5:20pm, that I came up with an idea. I was a bit tired and thinking that I might like to leave the lecture room but couldn’t really because I was sitting in the middle of a row and someone was talking. I then thought, wouldn’t it be wonderful if I could just transfer my consciousness into multiple locations so I could keep an eye on different events – a bit like being on the Internet and participating in several IRC chats at the same time. Not a new idea, I know, but it was quite vivid at the time.
Tomorrow is the last day of the conference, but it won’t give me much respite because I have a symposium to attend on Friday. Five straight days of having to wake up at 8am… I honestly don’t know how I’ll still be alive by this weekend.