There appears to be a growing fear in society of the group labelled “digital natives.” The fear stems from the belief that these young people are somehow superior in, and most knowledgeable about, computer technology and social media simply because they grew up with it.
A digital native, as defined by Marc Prensky [PDF], who coined the phrase, is a “native speaker of the digital language of computers, video games, and the internet.” The rest of us are digital immigrants, as we have “become fascinated by and adopted many or most aspects of the new technology.”
The problem we face is the interpretation of what “native speaker” means. It is assumed that because the digital natives grew up with technology around them, they understand all technology, and have the necessary skills and maturity to function within, and implement, complex systems.
The reality is that just because you speak your native language doesn’t mean you can write the next great novel in it; just because you can use a technology doesn’t mean you understand the implications of its power or how to use it effectively.
Forbes magazine reports overheard comments such as “‘Hire a digital native to do it,’ – CMO Fortune 500 Company, re: staffing for social media.”
This belief in an inherent ability threatens to become a workplace divide between the young and the still young, just not ‘native’. Unrealistic expectations are placed on the digital-native crowd to perform, while the more experienced set fear their own obsolescence or are disappointed in their new hires.
JISC InfoNet researcher Doug Belshaw describes digital literacy as, “understanding how the web works, understanding how ideas spread through networks and the ability to use digital tools to work purposefully towards a pre-specified goal.”
In all things, understanding comes through lessons combined with experience. The question then becomes: have the first group of digital natives been taught sufficiently, and do they have enough experience, to wear the all-encompassing label being placed on them?
The Columbia Spectator, Columbia University’s student newspaper, ran an article by a student titled “Digital natives stuck in the Stone Age,” with the subtitle, “Too many Columbians browse websites without knowing how to make them.” The author, Alex Collazo, describes an ignorance he perceives among students of the basics of internet use.
He writes, “Watching a Columbian student use a computer can be a painful experience and makes one question whether simply growing up on a computer is sufficient to instruct a person in its use.” Mr. Collazo sees his fellow students unable to build a simple website, browsing without antivirus software and unable to clear a browsing history of naughty websites, all of which should be basic tasks for a “digital native.”
This spring, the Guardian launched a Digital Literacy campaign to “upgrade computer science and IT in schools.” The campaign focused on having coding and ICT taught properly in schools. Basic skills (surfing the internet, playing a game, word processing) are learned easily, but the higher-order skills needed to understand how a computer works are necessary to be competitive in the economies of the future. These things must be taught.
The first group of digital natives, now entering the workforce, is expected to function at a very high level around technology, especially social media; a level of functioning that involves learned skill but is currently assumed to be inherent.
I often hear parents or grandparents say, “The kids are better at this than I am. They pick it up so fast.” While children can learn to operate technology very quickly, it is dangerous to believe they are “better” at it than an adult who understands the implications of using these technologies.
For example, at one and a half years old my baby figured out how to switch between programs on the family iPad using the ‘four finger swipe’. We had to Google the gesture to figure out how this was happening. The baby knows how to mash buttons but doesn’t understand the power of this technology, how to apply limits to its use, or the content being consumed, especially if accidentally viewing something inappropriate.

The same applies to school-age children and teenagers. The implications of technology must be taught at an ever younger age to help prepare children for adulthood, especially those who have never known an existence without a computer in the home and instant connectivity to all parts of the globe. Monitoring, teaching and discussion are the only ways to make these digital natives good digital citizens.
The divide between the natives and the rest of us will only grow deeper and more destructive unless we abandon the belief that these young people possess magical knowledge of technology. Inaccurate and inappropriate pressure will not help them reach their potential in school, the workplace or life; it only frustrates them for failing to meet expectations. On the other side, teachers and employers will be disappointed with the “results” they receive from these same individuals.
Appropriate use of technology is not a natural-born ability, unless you’re watching Star Trek. Yes, stranger things have happened, but until my consciousness is downloaded into a new body, I’ll be accepting the technological abilities of my local “digital native” based on the person, not a label. Parents and employers would be served well to do the same.
Later this week, check out the flip side in Why Digital Immigrants Need to Get Over the Labels.