Feature: Is It Just Us, Or Are Kids Getting Really Stupid?

They don’t read. They can’t spell. They spend all their time playing computer games and texting and hanging out with one another on Facebook. But the problem is much worse than you think, because the way your kids live now is rewiring their brains.

By Sandy Hingston

Two autumns ago, in my son Jake’s junior year of high school, he took an AP English course. Junior year was bad for him and me -- we never seemed to have anything nice to say to one another. But Jake did like to read, and it occurred to me at some point that perhaps I could use his AP English course to connect with him. Surely I’d read the same books he was reading, since the high-school reading list was carved in stone sometime in the early 1950s. So I asked him: What are you reading in AP English?

“The Great Gatsby,” he said.

“Do you . . . like it?” I asked delicately, thrilled to be having what was almost a conversation with my teenage son.

“I don’t really like the actor who plays Gatsby,” he said. “He’s got these weird bumps on his face that keep distracting me.”

“The actor?”

“We’re not actually reading the book,” Jake informed me. “We haven’t read a book all semester. We watch the movies instead.”

It sort of made sense, once I calmed down and thought about it. It was hard to get kids to read back when I was in high school; what must it be like now, when there are iPods and iPhones and the Internet and cable TV? Better to have seen Robert Redford pretend to be Gatsby than never to have known Gatsby at all.

Just the same, I was glad when, for his senior year, Jake proposed taking an English course at the local community college. Come September, he and a buddy drove to the college every Monday night and sat for three hours in English 101 -- where they never once read a book. They watched movies instead. Jake got an A- in the course.

We live in interesting times. In the past decade, the number of college grads who can interpret a food label has fallen from 40 percent to 30 percent. An American child is six times more likely to know who won American Idol than the name of the Speaker of the House. (For more bad news, see the sidebar on page 59.) Reading and writing scores both fell on the 2008 SATs. Not long ago, a high-school teacher in California handed out an assignment that required students to use a ruler -- and discovered not a single one of them knew how.

What in the world is going on with our kids? Bring the subject up in any group of parents around Philadelphia, and you’ll hear the same thing: Children today seem, well, dumber than they used to. They don’t know the most basic stuff: who fought against whom in World War II, how many pints are in a quart, and in Jake’s case, the days of the week. (He’s shaky on the months, too.) They may be taking every AP and Honors course their schools offer, but they can’t tell you who invented pasteurization. (They do know who invented Facebook, because they saw the movie The Social Network.)

They spend an average of eight and a half hours a day in front of screens -- computer screens, TV screens, iPhone screens. Add in eight hours of sleep and seven of school, and that leaves half an hour when their senses aren’t under siege -- just enough time for a shower.

Technology was supposed to set us free, to liberate us from mundane, time-consuming tasks so we could do great things, think great thoughts, solve humanity’s most pressing problems.
Instead, our kids have been liberated to perform even more mundane, time-consuming tasks (including the average 3,339 text messages they send and receive each month -- or more than a hundred per day). They have an average of 440 friends apiece on Facebook. Do you know how long it takes to check in on 440 friends?

THE INTERNET WORLD is big and brash and colorful and infinitely engaging, and books are black and white and just sit there. If you were a kid, which would you rather hang out with? And yet the history of human civilization, our history, has been a process of learning to tune out precisely what attracts us to the Internet.

Watch a bird at a feeder, and you’ll see what nature instills in living creatures: a constant awareness of any flicker of change that might spell doom. Scientists call this “bottom-up attention,” according to Anjan Chatterjee, a neurologist at Penn’s Center for Cognitive Neuroscience. “Everyone has had it happen,” he says. “You’re crossing the street, you see something in your peripheral vision, and you orient toward it before you’re even conscious that it’s happened. That’s important for survival.” What humans have developed that differentiates us is “top-down attention,” where we actively choose what we’ll pay attention to. Both kinds of attention are needed, and we naturally switch back and forth between them.

But Western civilization is built on literacy, which is a top-down model. The shift from oral tradition to writing that took place sometime in the fourth millennium B.C. literally changed the way we think. Over the centuries, reading evolved from a group activity -- heroic tales shared around the hearth on cold nights -- to a solitary one, and writing from a practical reckoning of laws and items of trade to an exploration of the human condition. As Maggie Jackson explains in Distracted, her treatise on our current “erosion of attention,” writing created a “pause button” for the avalanche of spoken language: When we could take time to study the words on a page -- when we could read deeply -- we began to think more deeply.

Reading is highly unnatural in that it requires us to filter out distractions and focus our attention on a single task. “If the brain had an unlimited capacity to process information,” says Chatterjee, “you wouldn’t need an attentional system.” But a study at Stanford last year led by professor Clifford Nass, who specializes in human/computer interaction, showed that heavy users of multi-media “have a very hard time filtering out distracting information,” Chatterjee says. “The phone rings, and their behavior is driven by that distraction.” When heavy users have to consciously decide what to devote their top-down attention to, they can’t. They’re in thrall to their machines.

Kids today are assailed by such a constant stream of input that they can’t even remember what they see. Viewers of TV screens crowded with crawls and graphics are significantly less able to recall the facts of news stories than viewers of simpler screens. “The larger the cognitive load, the harder it is to process information to any depth,” says Chatterjee. Our brains need time to mull over what’s presented to us, to decide what’s worth shifting from short-term memory into long-term storage. We may have an infinite amount of information at our fingertips, Chatterjee says, but “we don’t actually ingest that information in such a way that it gets deeply encoded.”

And that explains why my son doesn’t know the days of the week.
Call me old-fashioned, but the fact that he doesn’t concerns me. There are certain things my kid -- any kid -- should know by the time he’s a high-school grad -- that Wednesday follows Tuesday, and his nine-times tables, say. That Jake can use his cell phone to retrieve this information -- can use it, for that matter, to learn how to refine weapons-grade plutonium -- is beside the point. I’d like some basic knowledge to be inside his head.

ELLIOT WEINBAUM, a professor at Penn’s Graduate School of Education, thinks I’m worrying unnecessarily. “Is your son’s school on a six-day calendar?” he asks, as we sit at a table in a conference room with a handful of his colleagues, munching sandwiches and chips. I look at him in surprise. As a matter of fact, it is. Weinbaum shrugs. “He doesn’t know the days of the week because he doesn’t need to know them,” he says matter-of-factly. “What he needs to know is, is it day two or day three?”

Weinbaum and his colleagues are, when it comes to education, the deciders -- the men and women who study how and what schoolchildren learn, how teachers teach, and how both can do better. And as far as they’re concerned, the kids are all right. They acknowledge that there are differences in how kids learn these days, but . . . well, let professor Janine Remillard explain. “Take literacy,” she says. “There’s not really less reading. Kids are just reading in smaller chunks. They’re not digging deeply into texts, but they’re reading from a lot of different sources.” Besides, these educators say, we don’t have solid data to tell us what kids really did know 30 or 40 years ago, not to mention that the American education system is struggling to decide exactly what should be taught now, given the ever-increasing possibilities. The Penn profs see the world changing, not our kids.

So I challenge them: What is Jake learning when he spends six hours a day on his computer, playing World of Warcraft? Everyone turns to Yasmin Kafai, who, it turns out, has devoted extensive research to computer games. “Over so many hours,” she says, “he’s learned how to master an incredibly complex system. These multi-person games that involve intra-functional teams -- ‘guilds,’ they call them -- organize their entrants the way some workplaces do. These are skills that corporate employers are very interested in.” And, she says, he’s learning perseverance: “Kids invest hundreds of hours in gameplay.”

Still, it seems obvious that kids like Jake -- meaning most kids -- are spending way too much time at one thing instead of learning all they could. Isn’t it self-evident that my son would be a better student, better future employee, better human being, if he spent six hours a day reading Tolstoy and listening to Bach instead of playing WoW?

“Adults have always been afraid of their culture being lost,” Penn’s Sharon Ravitch declares. “Okay, so classical music may be lost, but what about the broader array of music I’m exposed to? We fear we’re losing the moral base. But the postmodern view is that the moral base didn’t resonate with a lot of kids. We have this mythological notion of what people used to know, but that’s male, white, Western-based knowledge. What is teaching? What is learning? What is the political basis of schools?”

Maybe I’m just crotchety because I had to read dead white men’s books instead of playing games. Maybe kids aren’t stupider at all; maybe the new ways of learning really are just different, not inherently worse. Maybe -- oh, God -- I should be on Facebook.
I need to talk to more kids Jake’s age before I can decide.

ON A RAINY NIGHT in September, the concession stand outside the fence surrounding Cheltenham High’s football field is a bright oasis of light. The game, against Quakertown, is tied. Beneath the stand’s sloped roof, Ally Gardiner, blond and blue-eyed and scrubbed-face pretty, is scooping hot dogs out of a vat of boiling water and laying them atop the rollers of an electric grill. “Two more hot dogs!” her friend Dana sings out. Ally and Dana and a clutch of other student-council members are handing candy bars and Cokes to customers at the window and collecting cash in return.

Ally ran for this year’s president of student council and won. Her platform, she explains, with the grill rollers once more full, was all about inclusion. “I really care about the school,” she says, hair curling in tendrils from the hot-dog water. “I want to bring people together in spirit. More of a variety . . . a lot of . . .” Her voice, usually clear and assertive, trails off for a moment while she regroups. “Everybody has an opinion about how the school should run. But people don’t feel comfortable coming forward. So we need to hand opportunities to them -- to a wide, diverse range of students.”

Ally’s taking this on while taking AP Psychology, AP Statistics, AP Calculus, Anatomy, English Honors, gym -- “It’s required” -- Sports Leadership and Economics. She’s also a three-sport athlete -- cross-country plus winter and spring track. She ran for president because she’s passionate about student government. That’s the only reason she does all she does, she says. She gets to school at seven in the morning and sometimes doesn’t get home till 10. She spends an hour or two a day online, maybe three if she’s doing research. Half an hour or so goes to Facebook: “I don’t use the computer often for recreational things.”

She wishes more Cheltenham students would come to activities like football games. “They give you a chance to be more social, to get to know everyone,” she says wistfully. “Right now there are small groups of friends, but they don’t really intermingle.” She’s working on a plan to have student-council members sit and talk with the loners in the cafeteria, the kids who sit by themselves with their hoodies pulled over their faces. Ally is all about old-fashioned one-on-one connection. Her dream school, the one she’s trying to make, would be “like a family.”

Right now, inside the stand, the representatives of Cheltenham’s student council really are like a family -- a modern-day family. Danielle’s on her cell phone. Quadirah is tapping out a text; Eva’s checking if she has one. So is Kareek. They’re all operating under the watchful gazes of Ian Haines, a special-education teacher who oversees the student council, and Dean Rosencranz, a math teacher, who helps.

Haines has only been teaching for six years, but he’s seen a difference in kids in just that time. “It’s not only that their attention span is shorter,” he says. “The feedback span is shorter, too. If you ask a question and give time for the class to answer, they get restless. They’re used to getting everything at a click.” He watches Kareek work his phone. “You can’t have conversations with kids anymore. They don’t have conversations. If they have something important to say, they text-message it.”

Eva, also curly-haired -- black, not blond -- is Cheltenham’s sophomore class president. “I’m really into leadership, and so is my family,” she says. “I love this school so much!
From the first day, I loved it!” She spends three or four hours a day on the computer: “Just Facebook for me. MySpace is totally dead.”

“Facebook is going to die,” Ally warns.

“Somebody is going to come up with something better,” says Eva. “There’s a lot of hatred on Facebook.” She brightens. “It was a good way to campaign for sophomore class president, though.”

“What kinds of Skittles do you have?” a customer asks Danielle.

“We have the blue ones, red ones, purple ones . . .” Not to mention Nestlé Crunch, Almond Joy, Reese’s Cups, Twizzlers, Kit Kats, Dots, three kinds of M&Ms . . .

For these kids, life is like their candy counter, full of infinite choices. It’s pretty clear that Ally, with her vision of school as a place where people might, say, hang out at lunch and talk to each other, is waging an uphill fight against all that distraction.

It’s this candy-counter variety that makes technology today so powerful and, critics say, so dangerous. A raft of recent books with titles like The Shallows, You Are Not a Gadget and, yes, The Dumbest Generation argue that too many options and choices are stripping kids of their ability to process information and creating a massive, generation-wide case of attention deficit disorder. Children frantically multitasking among YouTube and FarmVille and Facebook become “suckers for irrelevancy,” according to Stanford’s Clifford Nass, whose newest book is The Man Who Lied to His Laptop. “Everything distracts them.”

I see it in myself. I’m trying to write this article, but at the foot of my computer screen, the AOL icon is bouncing up and down. I know -- I can be 99 percent sure -- that whatever has popped into my inbox is useless spam. (Hey, I’m still on AOL.) And I’m trying my damnedest to ignore the bouncing symbol, to get my important work done . . . and I. Just. Can’t. I have to click. I have to see. I have to bite the apple.

Eva knows what I mean. “It’s like Facebook is an addiction,” she says. “If I’m trying to write an essay, I’ll just unhook my Internet connection and turn off my cell phone. It’s been working, actually.”

Eva’s on to something, surely. If kids would tune out the white noise of the virtual world, they could plow through Moby-Dick in no time. They just need to buckle down and put their minds to it. I mean -- it’s not as though they’re birds at a bird feeder, right?

IN SEPTEMBER, the New York Times published an article on a young couple, Taylor Bemis and Andrea Lieberg, who serve as caretakers for Ralph Waldo Emerson’s former home in Concord, Massachusetts. In part, it read:

To connect with their long-gone host and his philosophy of individualism, freedom and self-reliance, the couple tried to read his essays and to listen to his work on audiotape, but it was only after watching a DVD about Emerson that they began to understand him. “I felt like he was the first person, or one of the first people, to start thinking outside the box with his whole Transcendentalism and, like, God and nature and all that,” Ms. Lieberg said. “So we were like, okay, he’s cool, nonconformist. And we like that.”

The Times is clearly poking fun at Bemis and Lieberg: Ha-ha, the caretakers who, like, couldn’t even read Emerson’s work! But there’s another way to interpret the couple’s experience, and it goes to what’s happening with kids today. It’s not that Bemis and Lieberg were too stupid to read Emerson. It’s that their brains no longer function like that.
They quite literally couldn’t understand Emerson’s philosophy until it was presented to them in a form that engaged them differently than just words on a page.

I know what you’re thinking -- it’s like reading The Great Gatsby vs. watching the movie. The movie has to be an inferior intellectual enterprise. But is that true, or has our culture just taught us to think that way?

Marshall McLuhan wasn’t the first to observe that how we garner information, or share it, inevitably affects the content. In an Atlantic article called “Is Google Making Us Stupid?” Nicholas Carr relates that in 1882, Friedrich Nietzsche invested in a typewriter after problems with his vision made using a pen difficult. Now he could write with his eyes closed! What he didn’t anticipate was how the substance of his thoughts, as transmitted to the page, would change, moving “from arguments to aphorisms, from thoughts to puns, from rhetoric to telegram style.” His experience wasn’t unique. Toward the end of the 19th century, as new modes of communication and transport -- the telegraph, the telephone, moving pictures, airplanes -- were invented, artists turned to more fragmented styles of expression, using Cubism, pointillism, Dadaism, the minute-to-minute immediacy of James Joyce’s Ulysses, to try to capture how the world had changed. Even further back in time, Socrates fretted over how the newfangled written word would affect civilization, fearing it would cause practitioners of oral tradition to forsake memory, mother to the Muses. The written word could only ever be a lesser image, he declared, of “the living word of knowledge which has a soul.”

Yet what’s going on with kids today can’t be reduced to a simple generational tug-of-war over media and message. While the profs at Penn’s Graduate School of Education perceive technology as a great equalizer, it may, in fact, be the great leveler. The Nielsen Norman Group, a consulting firm whose founder was deemed by the New York Times to be “the guru of Web page ‘usability,’” has done extensive research into what makes websites successful. Its advice to clients? Nothing higher than a sixth-grade reading level on the home page, and eighth-grade on subsequent pages. One idea per paragraph. More “scannability” -- highlighting, color-coding, bullet points. Teens, Nielsen Norman has found, are actually less equipped to make sense of the Internet world than their elders: They don’t have the reading ability, patience or research skills to successfully complete what they set out to do online.

What they do have, in abundance, is self-esteem -- a faith in their competence on- and off-line that’s way out of proportion to their actual abilities. On the Narcissistic Personality Inventory, which asks whether you agree, for example, with the statement “I will be a success,” my son’s generation scores significantly higher than previous ones. Why not? Gerry Hartey, a longtime English teacher at La Salle College High School, observes, “Everything is easy for these kids. It’s right there at the click of a button. They don’t even have to go to the library.”

Here’s the thing, though, as we fret about our kids’ online lives: It’s already their world, not ours. Young people have always rebelled against their elders, whether they were wearing zoot suits or listening to grunge. But a hallmark of civilization was that eventually, the kids gave up their rebel ways and folded, more or less quietly, into the adult world.
That’s not going to happen with our kids, because their superior technical skills mean they’re already in charge. We’re being forced to adapt. We’re the followers; they’re the leaders. And it’s hard to imagine where they’re leading us, because they’re unlike us on such a fundamental level: Their brains are different from ours.

THIS IS HOW WE LEARN: A sensory perception causes a synapse in our brain to become chemically excited. That synapse fires and excites another, and so on through the brain. The stimulation strengthens the pathway from synapse to synapse, making it more likely to be traveled again. Repetition of the perception wears the pathway deeper. So does emotion or shock. In the Middle Ages, children chosen as official recollectors of historic events were tossed into a river afterward, to more firmly etch the experience into their brains. (Scientists are working on a “day-after” pill for trauma survivors that would ameliorate this etching effect.) When you sat at a school desk and recited your times tables over and over, when you wrote out the periodic table of elements, when you practiced cursive penmanship, you were reinforcing memories, creating familiar paths for synapses, literally rewiring your brain for top-down attention.

Your children’s neural networks are very different. Thanks to their Internet exposure, in place of steady repetition, they’re confronted, daily, by a barrage of novelty. There’s no pattern, no order, in either the input or the pathways it carves. “You have kids today who start on computers at three, four, five,” says Penn’s Chatterjee. “The younger you’re exposed, the more influence that has on the final configuration of the brain.”

So the tech stuff isn’t benign, though kids think it is. And it’s been deliberately developed to make it hard for them to turn away. “The nature of addiction,” says Chatterjee, “is little rewards doled out in unpredictable fashion. The information kids are getting from texting or tweeting has that unpredictable quality. They don’t know what they’re going to get, and what they do get, they really like. It’s a set-up for addictive behavior.” What kids don’t realize -- what even we forget, as we hook up with old high-school buddies on Facebook -- is that none of this is accidental. Big thinkers at big corporations dream this stuff up, test it, tweak it, perfect it, not to make it easier for us to find old friends, but to gather information about our behavior and make money off of it.

LAST JANUARY, a young Florida mother was trying to play FarmVille on Facebook, but her three-month-old son kept crying. So she shook him to death.

Stupidity is one thing. Inhumanity is another. Jake can look at his cell phone to see what day it is, but where can he go online to find out what being human means? The hours he spends on his computer result in less time studying with friends, or playing pickup basketball, or hanging at the football game. And online contact is fundamentally different from being with other people. We do things online -- insult those we disagree with, bully the weak, mock the bereaved -- we would never do in person. Kathleen Taylor, author of a book called Cruelty: Human Evil and the Human Brain, told the New York Times this spring: “We’re evolved to be face-to-face creatures. We developed to have constant feedback from others, telling us if it was okay to be saying what we’re saying. On the Internet, you get nothing, no body language, no gesture.” Without that feedback, we’re all starring in Lord of the Flies.
But we’re not just failing to engage with one another; we’re less and less willing to engage the world at large. Witness the Blue State/Red State fault lines: When we’re able to pick and choose our sources of information, when we subsist on a steady diet of what-I-already-believe, we don’t ever have to examine or alter our constructs. “If you want to have an educated citizenry,” educational assessment expert Norbert Elliot has said, “you’ve got to wrestle with complex ideas, or you will end up with people who will only do the shallowest things.” The erosion of top-down thinking, the inability to pay attention and stay focused, will affect how our children make crucial decisions. Their world may, in effect, be made flat again.

I want to believe my kids aren’t like this -- that I’ve taught them how to be discerning, how to stand up for themselves and think critically. I need to believe they won’t be swept into a Kool-Aid-colored whirlpool of insubstantiality and meaninglessness -- that they’ll know in their hearts what’s wrong and right, what matters and what doesn’t. But any parent is a tenuous anchor against this tide. Though my husband’s a professional musician, my daughter snags songs off the Internet without a qualm. Why should she pay when she can get them for free? Kids today are less and less able to inscribe ownership boundaries; they hand in papers that are pastiches of plagiarism, steal artwork, words, ideas. Part of this is because they grew up with a different notion of intellectual property: When Jay-Z samples the Chi-Lites, that’s not stealing; that’s giving props! But recent research shows that developing a conscience requires paying -- here it is -- attention to the small voice within that says, “That doesn’t belong to you.” And who can hear that small voice amid the Internet’s din?

Over time, we’ve winnowed down, from our culture’s vast banquet, what we deem worth preserving. Tradition, Mark Bauerlein writes in The Dumbest Generation, “serves a crucial moral and intellectual function. . . . People who read Thucydides and Caesar on war, and Seneca and Ovid on love, are less inclined to construe passing fads as durable outlooks, to fall into the maelstrom of celebrity culture, to presume that the circumstances of their own life are worth a Web page.” Rather than learn from the past, our kids just click the mouse and start the game over.

What does that mean for their chances of forming lasting friendships, or marriages? With Facebook, their cell phones, their laptops, our kids don’t ever have to be alone . . . and yet they’re always alone. The more they use the Internet to connect, research has shown, the more vulnerable they are to depression, whose incidence has doubled in the past decade. A quarter of all Americans report not having even one person they can confide in. More than half have no close friends outside their immediate family. Yet we’re wired so deeply, so irremediably, for social interaction that we leap at any glimmer of it in our machines: We choose a pleasing, soothing voice for the GPS and construct a personality for it. We lie to our laptops. If they had hair, we’d put ribbons in it.

Some kids, like Ally Gardiner, struggling to create a feeling of family at Cheltenham High, sense that this isn’t leading to a good place. Adults do, too. “It’s a new world,” says math teacher Dean Rosencranz.
“But we need to be careful we don’t give up something that can’t be replaced.”

When kids use symbols to stand in for emotions, sprinkling their texts and e-mails with sad and happy faces, are they diminishing their ability to experience the real thing? That would explain the rabid popularity of “reality” TV shows in which people screech at one another like ramped-up, Red Bull-saturated harpies: Kids watch and say, Oh. So that’s what feeling something is like.

In our rush to respond to the chime, the chirp, the bouncing icon, in our eagerness to prove ourselves multitaskers par excellence, in our willingness to sit alone at home and count our “friends,” ironically enough, we’re overlooking solitude’s real advantage: the opportunity it provides to develop what essayist Sven Birkerts describes in The Gutenberg Elegies as “our inwardness, our self-reflectiveness, our orientation to the unknown.” In other words: a soul.