Is the Generational Divide in Technology Widening?
By Alex Daley, Casey Research
My son doesn’t know how to use a mouse.
He doesn’t even know what one is. As far as he’s concerned, it’s a furry animal he’s only seen in books and running around the floor of the Newark airport.
While I’ve known this for some time, it recently moved from the back of my mind to front and center following a brief car trip a few days ago. From the back seat, my eldest son – who for some inexplicable reason loves to watch the instructions tick by on the screen of the GPS unit sitting on the dashboard – requested that I program the unit to give us directions home. I politely declined, pointing out that I couldn’t be messing around with the screen as I was already driving. He followed up with that well-known youthful naïveté that borders on soul-piercing in its effectiveness at pointing out the shortcomings in ourselves and our world, asking:
“Why can’t you just tell it where you want to go? Like the Xbox.”
“I don’t know, son…”
Unhappy with the answer he’d received, he steered the conversation toward endless questions about computers versus video games versus the car’s GPS. In all the hubbub of explaining to my eldest the differences between them – especially how we interact with them – my youngest, despite spending many an hour on particularly snowy Vermont days upstairs in the home office playing Curious George games on the computer, piped in abruptly:
“Dad! Why would your computer have a mouse in it?! You’re just making that up!”
Digital Natives
A lot has been made over the past few years of so-called “digital natives” – children who were born and raised in the age of the computer. Kids like Mark Zuckerberg, who was born in 1984, eight years after the release of Apple’s first computer. For all of his life (and mine; if I’m being honest, I’m not that much older than Zuck), computers have been part of human existence.
We were both part of the first digital generation. But still, even then the computer was something distinct from everyday life. It was culturally defining. It was epochal, some might even say. But it was by no means universally prevalent.
One of my fondest childhood memories is of the arrival of a fresh, new Tandy 1000 SL one Christmas morning. I remember it well. 50-odd lb., 13-inch CRT monitor. Big honking base “CPU.” Keyboard… and no mouse. That came later, with the next-generation model.
The arrival of the Tandy was the moment we went from a family without a computer to a family with one. I look back at it, and I like to think it compares to the arrival of the first television set in many households in the 1940s and 1950s. The family gathered around the set to watch Ed Sullivan, neighbors aglow with jealousy.
We were by no means the first on the block to own a computer. Still, at that stage many a family didn’t yet have one. Maybe Dad or Mom used one at work, and of course, our small suburban school had a little “lab” of them, used mainly to teach typing. So I’d used them before, on occasion, but the power of the device – and my borderline addiction to it – was not apparent until it invaded the home. Or at least my home (there is probably a good “nature vs. nurture” debate in the evolution of the computer geek).
But it wasn’t much like the arrival of the television at all. I only found out years later that my mother had to work hard to convince my father that it was more than his choice description – “a $3,000 pet rock.” We never huddled around the phosphorescent glow of the screen as a family. My brother ignored it nearly entirely (maybe that was only because I hogged it?). The same was true for the majority of children in the original group of digital natives: the computer was part of, but mostly peripheral to, the average child’s life. It was only a select set who really made it a part of their everyday lives.