Fear of young people has an official term: ephebiphobia. The coinage comes courtesy of Kirk Astroth, a professor of family and consumer sciences at the University of Arizona, who borrowed the root from the ancient Greeks. An ephebe was a teenage boy who was isolated from his community at 17 to learn to hunt, fight and survive on his own. After his exile, he returned ready to protect the city, which was kept safe from both foreign hordes and teenage boys. This was a theme in the ancient world. “Our sires’ age was worse than our grandsires’,” wrote Horace. “We, their sons, are more worthless than they.”
But Astroth says that ancient and modern ephebiphobia need to be understood as different phenomena. Modern ephebiphobia is the reactive fight-or-flight response to a pack of hoodie-wearing high schoolers let out at 3:00 p.m. and the assumption that adolescents are full of hormones, devoid of inhibitions, and inherently dangerous, whereas ancient ephebiphobia was the hyperbole that treated youthful exuberance as a fatal contagion. He says it all changed in the late 1980s and early 1990s, when American children suddenly became terrifying to American adults.
“There was a real movement from a lot of parts of our society to demonize young people and make them a part of the problems,” Astroth tells Fatherly. This was true, he’s quick to add, at least in part because of a real spike in juvenile crime, which peaked in 1994. But the operative word here is “peaked.” Juvenile crime rates have plummeted over the last two decades. There’s no longer much reason to fear teenagers. Americans seem to be doing so largely out of habit.
As the research arm of the National Council of Juvenile and Family Court Judges, the National Center for Juvenile Justice has been researching trends in juvenile delinquency since 1973. Its Violent Crime Index shows clearly that in 1994, 500 of every 10,000 people arrested for violent offenses were juveniles between the ages of 10 and 17. By 2012, the most recent year for which data is available, that number had dropped by more than half, to below 200 arrests per 10,000. And according to the center’s April 2017 report on Juvenile Court Statistics, delinquency as measured by interactions with the court system has been on a steady decline as well. The number of delinquency cases fell 42 percent between 2005 and 2015. Moreover, the placement of juvenile offenders in residential facilities has dropped precipitously: recent data shows a 50 percent decline since 1999.
But despite all this good news about teens and crime, American ephebiphobia remains at a fever pitch. So why would grunge and JNCO jeans fade with the ’90s but not the perception that kids are dangerous?
Blame the media. But don’t just blame the media. American teens have always been dogged by awful representations of their overwhelming zeal for life. In 1693, Cotton Mather wrote of the Salem witch trials: “The children of New England have secretly done many things that have been pleasing to the Devil.” When teenagers began to develop a youth culture in the twenties, papers decried the dancing and jazz. In the fifties, The Blackboard Jungle and Rebel Without a Cause framed kids as dangerous ruffians. In the nineties, Harmony Korine’s film Kids depicted urban youths as dope-smoking, oversexed monsters passing AIDS around like a joint. More recently, MTV’s 2011 American version of the British TV show Skins featured “frank” depictions of teenage life as defined by sex and drugs. But that’s only a specific genre of teenage life. Stats indicate that a truly frank depiction of the average teenager’s life would be pretty dull (and probably unenjoyable to watch).
“There really is no justifiable reason to fear young people,” Astroth says. “A lot of this isn’t based on anything real.” But people believe it’s real. Because the terrible teens aren’t just getting naked and high on MTV. They’re doing it on the news every night.
Unfortunately, the upward slope of the nineties juvenile crime peak coincided with the rise of the 24-hour news cycle pioneered by Ted Turner, who launched CNN, the first 24-hour news network, in 1980. Having a 24-hour news network necessitated having 24 hours of news to cover. And the mid-eighties delivered. The emerging crack epidemic gave newscasters and pundits something meaty to talk about around the clock, and a big part of that meat was the involvement of kids. According to Sarah Hockenberry, a research associate for the NCJJ, as crack spread, drug dealers began using kids as lookouts, knowing they were easily replaceable and unlikely to do hard time if they were caught.
“The peak of violence corresponded with this peak of news, and especially bad news,” Hockenberry says. “That really warped people’s perceptions of the world, the bigger cities around them and of teenagers.”
Soon political scientists like John J. DiIulio, who coined the term “super-predator,” were sowing fear over the airwaves and in the media. “On the horizon, therefore, are tens of thousands of severely morally impoverished juvenile super-predators,” DiIulio wrote in a November 1995 article for The Weekly Standard. “They are perfectly capable of committing the most heinous acts of physical violence for the most trivial reasons (for example, a perception of slight disrespect or the accident of being in their path) … So for as long as their youthful energies hold out, they will do what comes ‘naturally’: murder, rape, rob, assault, burglarize, deal deadly drugs, and get high.”
The idea of the superpredator was so terrifying that it stuck fast to the American psyche. Hillary Clinton famously used it, and ended up having to apologize to the black community for doing so. It was, after all, a racially coded idea. It’s the idea of the superpredator that made the hooded figure of Trayvon Martin so incredibly frightening. But the data failed then, and fails now, to support the basis of ephebiphobia. By late 1995, when DiIulio coined the term, the youth crime rate was already in decline.
A study published in the American Journal of Public Health examined data from the National Survey on Drug Use and Health for trends in youth violent behavior from 2002 to 2014. The violent acts measured included group fighting, one-on-one fighting, and harmful attacks. Results showed a 29 percent relative decrease in violence among teens aged 12 to 17. What’s more, the downward trend was significant for all racial and ethnic groups.
Researchers for the study point out that the findings are consistent with trends in risky behavior found in similar studies. They note that colleagues’ findings include decreasing rates of drinking and marijuana use, the latter in spite of spreading decriminalization.
Science has done little to dispel the fear. In fact, popular neuroscience argues that kids’ brains make it impossible for them not to be dangerous. The book The Teenage Brain: A Neuroscientist’s Survival Guide to Raising Adolescents and Young Adults, by Frances Jensen, a mother and neurologist, is pretty typical of the argument. “When we think of ourselves as civilized, intelligent adults, we really have the frontal and prefrontal parts of the cortex to thank,” she writes. But “teens are not quite firing on all cylinders when it comes to the frontal lobes.”
Structurally, there’s truth to the statement. The frontal lobe is the seat of reason, judgment and emotion, and it is well documented that it’s not fully developed until the early 20s. But the evidence suggesting that the developing frontal lobe leads to erratic and tragic teen years is likely overblown.
“Teen brains aren’t as developed as adult brains, but that’s still no reason to fear them,” says Astroth. “What we need to do is show some empathy.”
There is actually data to justify this prescription. A 2005 United Nations Children’s Fund report surveyed research showing that the idea of the turbulent teens is not consistent across cultures. In fact, young people around the world offer plenty of examples of responsibility and advanced decision-making at young ages. The report also highlights research from Dr. Daniel Offer and Dr. Kimberly A. Schonert-Reichl suggesting that children are far more competent than adults give them credit for; in fact, 80 percent make the transition through adolescence without incident. “In practice, the more opportunities for decision-making that children are given, the better they are able to exercise informed choices,” the authors note.
The report goes on to point out that limiting adolescent autonomy, which ephebiphobia ultimately succeeds in doing by influencing policy and media, leads to a self-fulfilling cycle: kids are seen as incompetent precisely because they aren’t allowed to practice competence. Kids learn helplessness, understand their decisions can easily be overruled by adults, become reluctant to make decisions at all, and finally lash out in frustration. “These behaviors are then used to affirm the view that adolescents are inconsistent, irrational and emotional,” the authors conclude.
For her part, Hockenberry says she understands that no data will likely move adults toward reason. But she wonders why they don’t realize their views of teens reflect terribly on them as parents. “What that fear is saying is that every generation of adults thinks they’re the worst parents ever,” she says.
If you ask the guy who coined the term for fear of adolescents, he’ll say it’s probably time to knock it off lest kids take matters into their own hands. “I think young people will be the next civil rights movement in our country,” Astroth says. “They won’t take it anymore and there’s going to be a massive cultural movement for young people to have a voice.”