Wednesday, July 15, 2009

Stirring things up: Human Learning Paradox

I recently started a fascinating discussion with Brian Hennessy (more on this soon, see here) on the matter of how much the brain influences our ability to innovate. In other words, how much of the time is the brain actually trying to find patterns that match what it can already model, as opposed to creating completely new constructs? Let's explore this.

I bet Jeff Hawkins (see numenta.com) would say that the brain's ability to model is quite substantial, as this organ is nothing but a classifier. Hold that thought: I argue there is a limit to what our classifier can do.

An example of what I mean by the brain trying to model what it already knows may help here. Take the iPod, and let's oversimplify things for the sake of this discussion. Let's say that one or more brains belonging to employees at Apple realized that there was a path between MP3 technology, hand-held devices, and music. That is, we can relatively safely argue that the iPod as an abstract concept is the result of a brain connecting the dots of what was already out there. That leads me to quote William Gibson: "the future is already here, it is just unevenly distributed".

Perhaps what William should have said is that the future of humans is already here, it is just unevenly distributed. That is, humans seem to innovate by connecting dots. A fascinating question is how non-humans would innovate, but until we build something capable of passing the Turing test, or we meet other intelligent life forms, that is a rather fruitless discussion.

Back to us. A researcher in the natural sciences may disagree with my assertion that innovation = connecting dots, but let's face it, all forms of research use modelling (mathematics, anyone?) and prior knowledge to build upon. So even innovations like nuclear fusion or lithium-polymer batteries were built from the past. Hence let's just agree that there is something about the notion of "connecting the dots" (notwithstanding how complex those dots may be) that strongly relates to our ability to innovate. The further apart the dots, the bigger the leap in innovation (a toy sketch of this idea follows below). Fair enough?
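To make the "dots" metaphor a bit more concrete, here is a minimal Python sketch of my own, not anything Brian or Jeff proposed. It treats prior knowledge as a hypothetical graph of concepts that were already related, and scores a candidate innovation by how far apart the concepts it connects were beforehand. The concept names and edges are entirely made up for illustration.

```python
# Toy illustration (hypothetical): prior knowledge as a graph of concepts,
# and an "innovation" scored by how far apart the dots it connects were
# before someone connected them.
from itertools import combinations

# Hypothetical edges linking concepts that were already related.
edges = {
    ("mp3_codec", "portable_player"),
    ("portable_player", "music_catalog"),
    ("music_catalog", "online_store"),
}

def hops(a, b, edges):
    """Breadth-first search: number of edges between two concepts."""
    graph = {}
    for u, v in edges:
        graph.setdefault(u, set()).add(v)
        graph.setdefault(v, set()).add(u)
    frontier, seen, steps = {a}, {a}, 0
    while frontier:
        if b in frontier:
            return steps
        frontier = {n for c in frontier for n in graph.get(c, ()) if n not in seen}
        seen |= frontier
        steps += 1
    return float("inf")  # the dots cannot be connected yet

def innovation_leap(concepts, edges):
    """The leap is as big as the widest gap among the dots being connected."""
    return max(hops(a, b, edges) for a, b in combinations(concepts, 2))

# The "iPod" as connecting MP3 codecs, hand-held hardware and a music catalog:
print(innovation_leap(["mp3_codec", "portable_player", "music_catalog"], edges))  # 2
```

In this toy picture, connecting concepts that already sit one hop apart is incremental, while bridging concepts with no short path between them is the kind of leap the iPod example describes.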

This all seems to suggest that Brian may be on to something. The brain is indeed influencing how we innovate; the question is how much. An even more interesting question is: what are we missing in terms of innovation because we are constrained by our brain? Sadly, that is a question without an answer, since our brain is all we have that demonstrates the ability to abstract and, yes Jeff, classify.

What does that imply for humans? It seems to imply that every cognitive process we undertake is influenced in some form by the brain's physiological and morphological characteristics. Brian goes so far as to ponder whether things like SaaS and cloud computing are so successful because they resemble a neocortex at the morphological level. What he is suggesting is that we feel, consciously or unconsciously, comfortable with the idea of cloud computing because that is, quite literally, what our brain is. Wild theory for sure, but how far off is he? I say not much. Let me stretch the vision even more and propose the

Human Learning Paradox

"We will never fill the gap between our understanding at the cellular level and our understanding at the functional level with respect to the neocortex since the neocortex itself is what we are using to attempt this cognitive exercise. This is a paradox in that the neocortex would need to be capable of representing itself"

This is bad news if you make a living in computational neuroscience (or perhaps it is job security?), but it is good news for innovation. Why? Because if we embrace the fact that our brain influences how we think, and thus how we innovate, then we have a chance to help that process occur more often, faster, and better. Now that is good news indeed.