Just before I went on holiday recently, I was asked how human learning has changed with the advent, penetration and increasing ubiquity of computing technology.
My answer was simple – it hasn’t.
Human learning hasn’t fundamentally changed over the last fifty years. Our ability to learn is something honed over several hundred millennia; it’s what set us apart from the other primates in the first place. Our ability to store and transfer extra-genetic information has remained much the same, bound by language itself and by the biological boundaries imposed by our physiology. We can only ‘learn’ in ‘this’ way and only ‘so’ much. Hold that thought, and consider that technology is on the cusp of providing an ‘infinite’ amount of information. While we currently use personal computing devices (the PC, laptop, phone, media player, game console, etc.), research into neural interface devices is progressing at a rapid, perhaps even alarming rate.
It’s only a matter of time before we accept technology-based artificial implants as a natural way to enhance our biologically limited perception and cognition. While I won’t debate the ethics of this, which is another matter altogether, what we’re seeing is a revolution in the way humans view computing and, just as importantly, the way computing views humans; this shift is truly and fundamentally life-changing. So if the human brain will take thousands of years to evolve physiologically to match the demands of ‘infinite’ information, it’s only natural that intelligent humans seek to address that through technology in some form or other.
We’ve always used technology to escape the bounds of human physiological limits. Take writing itself as a technology, invented because your voice can only travel so far, or the wheel, which fundamentally changed transport because human legs can only go so far, and so on. Each major technological step has supplemented human potential; it could not have been otherwise. However efficient and however widely adopted, these technologies supplemented human physical capabilities, not the perception-cognition complex itself. The much-touted ‘cloud’ computing paradigm, coupled with mobile devices and the huge information source that is the internet, is technology that can change that.
The evolution and growth of the internet and personal computing devices is a massive technology shift because, as I mentioned before, I feel it will eventually supplement the human perception-cognition complex. It has already done that to some extent; see how often in a day we run to Google for little bits of information needed to perform tasks ranging from the mundane to highly creative, synthetic activities. Computing is invading our lives in ways unimaginable just a few years ago. We are dreaming up an internet of things, we want wearable computing and synthetic but highly believable computing experiences; we expect much from what is still rudimentary technology.
What that could ultimately mean is that there is no need to learn whatsoever. Imagine a world where each individual digitally narrates their activities and experiences, perhaps automatically, and where that captured experience can be shared freely by whoever wishes. For a learning designer like myself, that’s the ULTIMATE solution: technology that enables the sharing of experience itself, better than any simulation or temporal/spatial presentation of content ever could. Yes, perhaps it’s futuristic, and it may even sound scary considering the gamut of human emotion.
Such technology promises to fundamentally change human civilization, just as writing or the invention of the wheel did. But learning itself? That’s hardly changing; we are only adding and co-opting tools and media to assist learning, at a phenomenal rate.