Wednesday, February 4, 2026

The Politics of Thoreauvian Punk

Politics is complicated ... and pretty much always has been.

And in my latest project, The Punk on Walden Pond, I am intrigued by the issue of politics in relation to Henry Thoreau and punk rock. Was Thoreau a political writer and theorist? Is punk a political art form? At times Henry Thoreau argued he was not political, and many might say the punk on Walden Pond is above politics. Similarly, while the music and bands of punk rock are certainly anti-establishment and a challenge to the status quo, some musicologists argue that the large majority of punk songs are not political, and that the bands have no clear political agenda. I'd imagine Joe Strummer, Jello Biafra, and bands like Propagandhi have some thoughts on that.

In my Walden Punk Project, I have a piece-in-progress titled "In the Mosh Pit: the Politics of Thoreauvian Punk." Here are some thoughts from that work.

Chapter 4 of Jane Bennett’s Thoreau's Nature: Ethics, Politics, and the Wild (2002) is titled “Why Thoreau Hates Politics." Thoreau may have hated politics, but that doesn’t mean he wasn’t political. In fact, Bob Pepperman Taylor makes a strong case in two books, America’s Bachelor Uncle: Thoreau and the Polity (1996) and Lessons from Walden: Thoreau & the Crisis of American Democracy (2024), for seeing Thoreau primarily as a political writer. He believes that even the supposed “nature writings” such as Walden, "Walking," and "Wild Apples" actually stake out political positions, specifically in how they criticize and challenge America to be what it claims to be. And that is as punk as it gets, in my opinion. For Thoreau is in many ways the first true contemporary critic to challenge the national narrative, to call out the American Dream, and to pull back the curtain on the ruse that had been perpetuated against the people.

As far as punk is concerned, it's worth noting that much punk is simply about frustrations with daily life, as opposed to large political manifestos. As Legs McNeil says in his comprehensive history Please Kill Me, “... the great thing about punk was that it had no political agenda. It was about real freedom, personal freedom." In the study Rebel Rock, a review of lyrics suggests only 25% of songs are distinctly political. However, a counterargument is that for the music of a counterculture, even when the songs aren’t explicitly political, they are.

In viewing Thoreau as combative and political, and punk as a political movement – even when it’s not trying to be – the key elements are personal conscience and a sense of social justice. The goal of Walden is to promote a kind of personal responsibility because, for Thoreau, the fear is that people will succumb to a less interesting and morally deadening utilitarianism. Thoreau insists that we submit to principles which will make us nonconformists in an unjust world. He urges readers to be rebellious, to break with tradition, to be civilly disobedient.



Tuesday, February 3, 2026

Permanent Olympic Sites


In 1972, via a statewide referendum, the people of Colorado rejected funding for the 1976 Olympic Games, making Denver the only city ever awarded the Games to turn down the chance to host. While that decision shocked the rest of the country, as well as many around the world, it wasn't a surprising move for anyone who knows the taxpayers of the Rocky Mountain state. In fact, knowing what we know now about the structural challenges and fiscal nightmares the Games can be for some cities and countries, it was a surprisingly prescient and prudent move.

Hosting the Olympic Games is an incredible honor and opportunity for a country to shine on the international stage, but it’s also a significant financial and structural investment saddled with huge risks. The Olympics generally cost tens of billions of dollars to stage while providing only a fraction of that in revenue. Host countries must invest heavily in building a vast infrastructure of sites to hold the events, housing for the teams and guests, and transportation and security systems to manage the crowds. While these projects can certainly upgrade a city, they are rarely needed after the Games and often fall into disuse and decay.

Additionally, any benefit from the event is often overshadowed by the corrupt history of the bidding process at the International Olympic Committee and the potential for bloated budgets prior to the event followed by blight afterwards. The scandals plaguing the entire hosting process are extensive, ranging from bribes and extortion to graft and highly orchestrated doping programs which have tainted vast numbers of events and athletes. It often seems the Olympic Games, an international institution intended to honor the individual pursuit of excellence, are more trouble than they’re worth. But it doesn't have to be that way.

Instead, the international community should establish permanent locations for the Olympics, where all countries contribute to maintaining the sites as the premier athletic facilities in the world. The fields and tracks and stadiums could serve as hosts for an endless number of world championships at all levels, and they could also serve as training grounds and research locations to serve all manner of individuals and organizations committed to honoring and promoting the highest levels of athletic achievement.

Choosing permanent locations would obviously be a significant challenge, though certainly not more problematic than the current bidding process. It’s reasonable to have host cities across multiple geographic regions, and it makes sense to consider places which held successful games and maintained some of the original infrastructure. Athens is the obvious choice for one permanent summer location, while Barcelona, Seoul, and Sydney are solid choices as well. Salt Lake City and Lillehammer are good bets for the Winter Olympics, though a strong case can be made for both Vancouver and Turin. Obviously the city and host country must want the honor and responsibility and be willing to trust the rest of the world to support the plan.

This idea is not new, having been discussed for years among commentators, athletic groups, and political leaders. In fact, at the end of the 1896 Games, which launched the modern era, King George I of Greece called for Athens to be the permanent “peaceful meeting place of all nations,” and many delegations signed a letter endorsing the idea. Currently, host cities are already established through 2028, when Los Angeles will host its third Olympic Games. And perhaps that’s enough. Before any more bidding happens and planning begins, the public should discuss the idea of permanent host cities. Once the idea is floated to athletes and voters, political and business leaders should take the discussion to the IOC and make it happen. With many future games already assigned and planned, there is plenty of time to develop and implement this logical change to the Games.




Monday, February 2, 2026

Groundhog Day - a time for reflection and renewal

Groundhog Day was once about the annual folk tradition of looking to a small animal for predictions about the weather. But since 1993, the day - or at least the phrase - has become part of the lexicon, describing the boring, monotonous nature of everyday life. The Harold Ramis film that established that idea is actually about the opposite. Here's an essay I wrote a few years ago: 

It’s not about monotony — it’s about re-birth.

Twenty-six years ago, an unassuming little film about a cantankerous weatherman on the most random of holidays became a pop culture phenomenon that ingrained itself in our consciousness. The title became a metaphor for reluctantly acknowledging the dailiness of life. With the silly story of Phil Connors waking up every day in Punxsutawney, PA, with Sonny and Cher singing “I’ve Got You Babe” on an endless string of February seconds, Groundhog Day entered the lexicon as a way to describe the drudgery and repetition of daily life. But the movie was never simply about the mundane nature of existence. It was always about self-awareness and second chances and reinvention and hope.

Let’s face it, by February 2 the New Year’s resolutions are fading, the fitness centers are back to the regulars, and we’re all bogged down in the drudgery of winter. These moments are ripe for a bit of pop culture existentialism, and the quirky film from Harold Ramis and Danny Rubin puts that long cold winter, the odd little holiday, and the repetitiveness of daily life in perspective. Watching the story of a disgruntled weatherman pondering the absurdity of a weather-forecasting rodent provides a second chance at mid-winter self-reflection and re-invention. The conceit of the film is not only the ridiculous holiday but also the inexplicable weirdness of Phil Connors’ predicament.

The film Groundhog Day is actually a wonderful primer for the wisdom of existentialism, and when I taught the philosophy in my college literature class, I would often lead or conclude with a viewing of Bill Murray’s brilliant portrayal of a man trying to bring some sense of meaning to a life that seems nothing short of absurd. Clearly, the idea of living the same day over and over again in an unfulfilling, dull, mundane place and repeating the seemingly mindless tasks of a pointless job is portrayed as a curse and a cruel joke, and that realization is at the heart of existentialism. Life makes no sense. Phil spends many years in disgruntled fashion viewing his life as exactly that, a cruel meaningless joke of an existence.

Read the rest of the essay here ...

Sunday, February 1, 2026

It's the Housing, Stupid

It all comes down to housing, doesn't it?

I am so glad that I moved to Greenwood Village, CO, when I did in 2003, and that I also sold my house and moved out in 2024. GV is one of the toniest suburbs of Denver, with an average home price of well over $1 million. And that's completely out of range for working middle-class people like teachers and police officers. Fortunately, at one time, the area allowed a fair number of townhouses and duplexes, which is where I was able to buy, directly adjacent to the high school. But those days are over. Several years ago, the millionaires freaked out about the possibility of multi-family housing coming to their little hamlet, and a (hysterical) group called "Save Our Village" got on city council, where it effectively outlawed the construction of anything less than single-family homes on quarter-acre lots.

I've been thinking about that recently after reading an interesting Substack essay called "The Housing Theory of Everything" and a recent column from Nicholas Kristof in the NY Times about how to save the American Dream through programs like Hope VI.

We need some good news now, and here’s some out of left field: An important new study suggests that there’s a highly effective way to overcome one of the most intractable problems in 21st-century America — intergenerational poverty. We like to think of ourselves as a land of opportunity, but researchers find that today the American dream of upward mobility is actually more alive in other advanced countries.

The new study highlights a powerful way to boost opportunity. It doesn’t involve handing out money, and it appears to pretty much pay for itself. It works by harnessing the greatest influence there is on kids — other kids. The study, just released, is the latest landmark finding from Raj Chetty, a Harvard economist, and his Opportunity Insights group, along with other scholars.

The team dug into the long-term effects of a huge neighborhood revitalization program called Hope VI. Beginning in 1993, Hope VI invested $17 billion to replace 262 high-poverty public housing projects around America.


Saturday, January 31, 2026

Row Houses & the "vanishing" starter home

I love living in and driving around Old Town Fort Collins, Colorado. The quaint tree-lined streets are filled with a diverse selection of houses that represent the best of how American towns used to be developed. Nearly every street in the area is filled with cozy, comfortable one- and two-bedroom cottages and bungalows, located right alongside beautiful mid-size Craftsman and Colonial homes. Those same streets have a pleasant smattering of gorgeous large Victorians and estate-style houses. That sort of mixed-market neighborhood creates a solid community, one which offers a vanishing relic - the starter home.

People tend to have their own understanding, but starter homes are typically perceived as being on the smaller side, in need of renovation, or both. Buyers often go in expecting to stay a few years to build equity, then trade up for something bigger and generally better. But the concept is antiquated given current prices and big floor plans, a dynamic that’s icing out many entry-level shoppers.

Builders have been constructing bigger and bigger homes during the past half-century. Homes with four or more bedrooms made up nearly half of all new construction in 2022, according to Census Bureau data. That compares with 1 in 5 in the 1970s.

More rooms and more upgrades mean more costs. The U.S. median home price is $410,800, up nearly $100,000 since 2019, federal data shows. Layers of local regulations, as well as market dynamics, have pushed builders to go big, rather than catering to first-time buyers with less to spend. “You have zoning requirements that have encouraged large lot sizes,” said Dennis Shea, a housing expert at the Bipartisan Policy Center. “Home builders, particularly in the wake of the Great Recession, where they were very negatively impacted, find it easier to build larger homes that have higher profit margins.”

The key to the newly coined "affordability crisis" obviously starts with housing. And finding a way back to the starter home, or to a society more accepting of townhouse and rowhouse construction, could be vital to the national economy in the new era.

Friday, January 30, 2026

David Brooks leaving NY Times for The Atlantic

Even though I taught English for three decades, I still tell people that when I grow up, I want to be David Brooks of the New York Times. Or, I add, perhaps an author like Michael Lewis. As the son of an editor and feature writer, I have been a fan and voracious reader of the "newspaper column" for as long as I can remember. Readers of this blog might recall that I've written about the high esteem I had -- still have -- for the work of legendary Chicago columnist Mike Royko. I feel the same about George Will.

And, of course, I have always been a fan of David Brooks ... or at least since the late 1990s when I read his work for the Weekly Standard, Wall Street Journal, and eventually the New York Times, where he has been a regular columnist for more than two decades. And, I truly enjoyed and began to follow his writing more regularly after he published a very cool book of pop culture criticism called Bobos in Paradise: The New Upper Class and How It Got There. Granted, Brooks definitely has his critics and detractors, and he's by no means perfect, but as an erudite and insightful scholar and cultural commentator, I've always found Brooks to be worth the read.

And, Brooks isn't going away, but he is making a change. This morning Brooks and the New York Times announced that he is leaving his longtime home for a new position at The Atlantic, along with a new podcast and a position at Yale University. Brooks bid farewell to Times readers with an extended column, "Time to Say Goodbye."

It’s been the honor of a lifetime to work here, surrounded by so many astounding journalists. But after 22 wonderful years, I’ve decided to take the exciting and terrifying step of leaving in order to try to build something new. When I came to The Times, I set out to promote a moderate conservative political philosophy informed by thinkers like Edmund Burke and Alexander Hamilton.

When I think about how the world has changed since I joined The Times, the master trend has been Americans’ collective loss of faith — not only religious faith but many other kinds. In 2003, we were still relatively fresh from our victory in the Cold War, and there was more faith that democracy was sweeping the globe, more faith in America’s goodness, more faith in technology and more in one another. As late as 2008, Barack Obama could run a presidential campaign soaring with hopeful idealism.

The post-Cold War world has been a disappointment. The Iraq war shattered America’s confidence in its own power. The financial crisis shattered Americans’ faith that capitalism when left alone would produce broad and stable prosperity. The internet did not usher in an era of deep connection but rather an era of growing depression, enmity and loneliness. Collapsing levels of social trust revealed a comprehensive loss of faith in our neighbors. The rise of China and everything about Donald Trump shattered our serene assumptions about America’s role in the world.

We have become a sadder, meaner and more pessimistic country. One recent historical study of American newspapers finds that public discourse is more negative now than at any time since the 1850s. Large majorities say our country is in decline, that experts are not to be trusted, that elites don’t care about regular people. Only 13 percent of young adults believe America is heading in the right direction. Sixty-nine percent of Americans say they do not believe in the American dream.

Brooks' column is definitely worth reading for the perspective and insight he brings to "what's happened to America" in the twenty-first century. If nothing else, he is a well-educated and broadly talented writer with enough background knowledge to offer a thoughtful long view on the past, present, and "hopeful" future.

Godspeed, David. Looking forward to what comes next.

Thursday, January 29, 2026

Screen Time for Young Kids

Michael Coren, an advice columnist for the Washington Post, believes he has "cracked the code on toddler screen time." I'm a bit suspicious, believing "toddler" and "screen time" really don't go together at all. I'm basing my position about screen time and children on the standards and recommendations from the American Academy of Pediatrics. Basically, best practice in raising your kids around "screens" - and that goes back to the old standby of 'television' for the past fifty years or so - is that children under eighteen months should have essentially zero screen time (aside from video chatting), and for toddlers up to age five, it should be limited to about an hour a day of high-quality programming.

How to structure good screen time for toddlers and avoid parental guilt - The Washington Post 

Wednesday, January 28, 2026

Peer Grading in K12 Education is Unnecessary and Wrong

"Ok, now trade papers with a classmate and take out your red pen."

It's a sentence everyone who has ever gone to school has heard. For as long as teachers have been saying it, a number of kids in every class have always cringed. And it's not always the kids who struggle. In fact, as a coordinator of gifted education for many years, I know the highest-achieving students -- the ones who supposedly have "nothing to worry about" from peers seeing their work -- are often the ones who dread the practice the most. They may simply be anxious about their work: classic perfectionism. Or, more likely, they're ironically embarrassed by their success and don't want to be mocked or even criticized by students who didn't do as well.

I've never really liked or approved of the practice of peer grading, as a student or a teacher, and I never used it in any class I taught. That puts me at odds with many, if not most, educators. And while the legality of the issue was resolved back in 2002 by the Supreme Court, after a family sued over the practice as a violation of family and student privacy, it still comes up from time to time as schools and departments debate and discuss their standards for "best practice" in the classroom. Though I've retired from teaching, I am still around schools and teachers, and I still engage in the debate, arguing that the practice is wrong.

Basically, it comes down to this -- Some kids are mean, some kids are insecure and anxious, and outside of those concerns, the practice is simply unnecessary in the learning process. Granted, the world - especially school - is an imperfect place where some people will always be mean and others will always be insecure or anxious or meek or just modest. Teachers can't solve all problems, ease all burdens, and smooth all bumpy paths. Everyone needs to learn resilience, and school plays a big role in that. However, there are some choices we can make that diminish the risk of problems, and peer grading is one of them. 

Now, many teachers over the years have countered that the practice works well as long as teachers set clear expectations, model the appropriate behaviors, and deal with any violations of the protocols. But I feel that view is rather naive. Of course we want to set expectations, model good behavior, and deal with the problems. But does a teacher always know when it’s happening? When kids are inappropriate and cruel? Hardly. Any educator or parent knows most bullying happens in the dark, and victims rarely complain, out of fear.

And what about kids who do favors for their friends by going easy on their papers? How does the teacher know? It happens all the time, as we know from having been students ourselves. With that in mind, I'd argue the practice and the grades it produces are inauthentic at best.

So, regardless of whether a teacher could effectively manage it all the time, the practice is quite simply unnecessary. If it’s just homework or daily practice, then kids can grade themselves. They know if they’re being honest. And if it’s a true assessment then kids have no business seeing another kid's grade anyway.

Tuesday, January 27, 2026

Colorado Band Velvet Daydream battles AI version of itself

Well, this was bound to happen, right? It's like cloning meets Tron ... or something like that.

With the rise of AI and a social media world filled with deep fakes, it's becoming harder to maintain your own identity and simply exist as yourself with no competition from ... other artificially created versions of you. The publishing world has seen this with authors discovering fake AI-generated writing in their names, even using their own intellectual property. And now in the music world, a Colorado band is battling an AI version of a band with the same name.

Velvet Daydream, an excellent retro-hard rock band I wrote about a year ago, recently learned of another eerily similar and entirely fake band using the same name. 

“A few months ago, someone sent us a message and showed us there was an AI band with our name,” says Ryder King, the vocalist/guitarist of the actual, human band (which makes incredible rock music, by the way). “I found out later that they were actually directly correlated to Velvet Sundown. … I went to the Velvet Sundown Spotify, and it said they appeared on the [AI] Velvet Daydream’s album. It doesn’t say that anymore, but I have a screenshot of it.”

If you visit The Velvet Daydream’s Spotify page, you’ll find a truly haunting image of its poreless AI members, who appear very similar to the Velvet Sundown. And unless rock bands have started using FaceApp like a Real Housewife, I don’t see how anyone could believe these are real people. Listening to the music is even worse; it’s also just as comical. “Somewhere in Europe” is a madlib of indie sadboi tropes: “Smoking cigarettes / Making plans for my life…” Meanwhile, “I Heard the City Breathing In Its Sleep” describes how “every window glowed like a holy womb.” The effect is singularly uncomfortable — almost as much as Taylor Swift’s song “Wood.”

Monday, January 26, 2026

The Wonderful, Weird, and Waning World of Harry Potter

It's hard to believe, but it has been nearly twenty years since the release of Harry Potter & the Deathly Hallows, the seventh and final book in a "children's fantasy" series that is undoubtedly the publishing phenomenon of the past half-century. The literary and publishing world - and the world in general - has never seen anything like Harry Potter, and likely won't ever see something like it again.

As a reader, as an English teacher, and as a parent whose children were born in 2002 and 2005, I was completely immersed in and fascinated by the world of Harry Potter from the moment I picked up the first book, roughly a year before the first movie came out. I was so taken with Harry Potter & the Sorcerer's Stone that I read the book out loud to all my high school classes, every day, a couple of pages at a time. And I was stunned by how captivating it was for even junior-level high school boys who were not readers in any sense of the word.

This book is something special, I thought, something altogether different.

For an avid reader and pop culture fan who came of age at the dawning of the Star Wars franchise phenomenon, the Harry Potter books represented all the magic that storytelling can be. But, that said, it's equally fascinating how the books, the films, the stories, the franchise ... dare I say, "the magic" ... have faded into history far more quickly than I would have thought. And that's the subject of a fascinating piece of cultural commentary from British writer Louise Perry, who suggests, "Millennials, It's Time to Leave Hogwarts."

It’s been almost 20 years since the final Harry Potter book was released. The wizarding world is still generating interest — book sales remain strong, and the 2023 video game Hogwarts Legacy, topped 40 million sales. HBO is working on a TV adaptation of the books, set to be released next year.

But the relevance of the franchise is waning. “We’ve seen our audience age up,” conceded a Warner Bros. executive of the recent spinoff films. When the first of these, “Fantastic Beasts and Where to Find Them,” premiered in 2016, just 18 percent of cinema goers were actual children.

Sunday, January 25, 2026

How does anyone pursue American Studies anymore?

How does one study America?

As one of the modern world's relatively young major nations begins to celebrate its 250th birthday, the study of the American concept, the American Dream, American history, and American culture is being pulled in so many directions that it's difficult to find a defining narrative for something that for most of our lives seemed pretty obvious.

In the past couple of weeks, the Wall Street Journal has published two pieces of commentary that begin to dig into the complicated question asked for most of American history, the one J. Hector St. John de Crèvecoeur posed in Letters from an American Farmer: "What then is the American, this new man?" It was a question later explored by the French political thinker Alexis de Tocqueville in the esteemed Democracy in America.

Ben Fritz, an entertainment reporter for the WSJ, delves into the concept of American "culture," specifically the vanishing idea of a singular national monoculture in arts and entertainment, with a piece on "The Rise and Fall of the American Monoculture," asking the important question: For most of the 20th century, pop culture was the glue that held the U.S. together. But what will it mean now that everything has splintered?

“I Love Lucy.” “Star Wars.” “Thriller.” It doesn’t get more American than that.

All nations are held together by culture, but the U.S. is unique for the power of its pop culture. Our music, television shows and movies are a multitrillion-dollar business and the first way that billions of people around the world get to know us.

For most of the 20th century, they were also the glue that held the country together. In a sprawling nation founded on the precept of individual liberty and populated primarily by immigrants from around the world, there was hardly one American experience. Maids in Boston, factory workers in Chicago and farmers in California lived much different lives despite being part of the same country.

Cinemas, radios, television sets and records changed all that. Americans might do different things during the workday, but at night and on weekends, we were watching and listening to the same things—things made in America, primarily for Americans, by the first modern celebrities.

It was the birth of the monoculture—a word that captures the historically unique power of American entertainment in the 20th century. An estimated 200 million tickets were sold for “Gone With the Wind,” which came out in 1939, when the population of the U.S. was 130 million. The “Amos ’n’ Andy” radio show was so popular that movie theaters scheduled around it and piped the audio in on their speakers. In 1983, more than 100 million people watched the finale of “M*A*S*H.”

And, on the opinion page of the WSJ this week, policy analysts Richard D. Kahlenberg and Lief Lin discuss the complicated issue in the world of cultural criticism that "American Studies Can't Stand Its Subject."

The 250th anniversary of America’s founding provides an opportunity to reflect on—and fight over—the country’s extraordinary story. Unfortunately, many of the serious scholars who study America—its history, literature and culture—fail to provide a balanced and nuanced account of the country’s complex tale.

On the one hand, America’s is a story of greatness: The U.S. is the wealthiest and most powerful nation on the planet. Its founders created what is now the world’s longest-lasting liberal democratic constitution. The Declaration of Independence put forth revolutionary ideas about human freedom and equality that ushered in a new era for the world. At the same time, the American experience is complicated. Our history includes the mistreatment of Native Americans, slavery and Jim Crow, and high levels of economic inequality that persist to this day.

Yet we found only one part of this narrative presented in most of almost 100 articles we examined from over a three-year period in American Quarterly, the flagship journal of the American Studies Association. Published by Johns Hopkins University, it’s widely considered the country’s premier journal of American studies.

Saturday, January 24, 2026

An NFL Family like none other

The NFL is filled with siblings ... and sibling rivalries. The Kelce brothers are a prominent duo, as are the Manning boys, and then there's the Watt family trio. Literally hundreds of sibling pairs have played in the National Football League, which might seem surprising given the elite talent required, though perhaps understandable for that very reason. However, a family that sends four young men to the pros is something special, and the Elliss family, whose sons Jonah (Broncos) and Christian (Patriots) face off in the AFC Championship Game between Denver and New England this weekend, is an incredible and heartwarming story like none you've heard before.

On Sunday, the Elliss crew will count names and invade Denver, with Broncos outside linebacker Jonah and Patriots ILB Christian facing off in the AFC Championship Game. The two, both biological, are as close as any within the group of 12 kids. Christian told New England reporters this week that he and Jonah have a “side bet” on the outcome Sunday, and Luther’s excited to watch his boys “bump heads” on special teams.

A former team chaplain for the Broncos during their 2015 Super Bowl season, Luther’s life has always been deeply rooted in faith. He figured he and Rebecca would have a large family; four to six kids, maybe. They never looked to adopt. But Rebecca called Luther one day after a workout class, talking about a baby a friend had told her about that was in transitional care in Utah. Not a week later, they brought home Isaiah Elliss.

“It just kinda spiraled pretty quickly,” Rebecca said.

Along the way, the Ellisses became one of the most prominent football families in the modern world. Jonah, Christian and Kaden are biological children and have grown into standout NFL linebackers, as is Elijah Elliss, a current Utah linebacker. Noah Elliss was adopted.

A family that sends four of its children to the professional sports level is truly an incredible story. But the Elliss family narrative is so much more than that, a story of a couple who chose unconditional love above all else.