Humans have been around for a couple of hundred thousand years, by most calculations. When you consider modern, Westernized humans in that context, we come off as pretty strange in our social interactions. The ways we relate to one another reflect a very different social mindset from that of our ancestors. We jettison connections that were essential in the past.

For nearly all of human history, people lived in small hunter-gatherer bands, with some flow of individuals between neighboring bands that regularly encountered one another. In other words, your life was spent among people you knew.

But as has been explored by sociologists and psychologists, once humans developed proto-cities with thousands of inhabitants, starting around 9,000 years ago, something unprecedented occurred: We began to spend a lot of time around people we didn’t know very well at all.

And soon this progressed to regularly encountering complete strangers, and on to our current existence where we can hop a plane to, say, Outer Mongolia, and be among not only strangers but people we are unlikely to ever see again. Stand in line in a downtown Starbucks, and you’ll be amid more strangers than a hunter-gatherer would meet in a lifetime.

How odd is this? Our brains evolved to navigate social groups. There’s a part of the brain near the visual cortex, the fusiform gyrus, that specializes in facial recognition; across primate species, the larger the typical social group, the larger the relative size of the frontal cortex, a brain region central to social intelligence — and humans have the largest by miles. Our brains are stupendously attuned to rapidly and automatically distinguishing between “us” and “them” (and with tremendous malleability as to what category someone falls in). And we are strongly predisposed to have “It’s them!” alarms go off when encountering a stranger.

In a similar vein, economic game theory tells us that generosity and cooperation — good things in human terms — plummet when interactions are one-time-only and anonymous. Game theorists show that the surest way to boost cooperation is “repeated-round, open-book” play: You interact with the same array of individuals multiple times (a known social group), and all of them can access your play history. In other words, the more we’re known, the less we act like jerks.

Given the advantages of sociability, we’ve come up with cultural mechanisms to promote it amid our anonymity. As documented by the psychologist Ara Norenzayan of the University of British Columbia, it is only when societies get large enough that people in them regularly encounter strangers that “Big Gods” emerge — deities who are concerned with human morality and who punish our transgressions. The gods of hunter-gatherers generally couldn’t care less whether we’ve been naughty or nice.

We secularized Western moderns have invented our own Big Gods, omnipotent eyes in the sky that counter anonymity in a more literal way: this is our world of drones, of ubiquitous security cameras, of recording devices in everyone’s pockets that can document the deeds and misdeeds of police and passersby alike.

There are plenty of more or less unavoidable circumstances in which people pick up and leave everyone they know to go live among strangers. For example, you marry (or are forcibly married off to) an outsider and go and live in a distant village. Or for political or economic reasons, you bid your loved ones farewell, cross an ocean to a new world and never see your homeland again. And there are certainly disastrous times when entire communities are destroyed. This is our legacy of conquest and enslavement, of scattering refugees across the world, of our invention of diasporas.

But given our ancestors’ habits, our brains’ alarm at “thems,” and our better behavior among known groups, it is truly strange and relatively recent that we would voluntarily dissolve intact social groups. We arduously establish stable communities where people know one another and function effectively, where there are complex social networks, where there are high degrees of cooperation, connection, maybe even love. We dwell in them for years and then one day in effect say: “It’s over. Next Monday, let’s all pick up and wander off in different directions. Most of us will never see each other again, and we’ll definitely never exist as a community again. Who’s up for this?”

This is crazy by primate standards, or by the standards of the millenniums of people living in small agricultural communities. The oddity of voluntary dissolution, and the storm of strange emotions that it can evoke even many decades later, are on my mind now because I just watched my son graduate from high school. Few of his classmates will merely settle down in their hometown metropolis. Instead, they will scatter to distant universities and help form new self-dissolving communities. Four years of social complexity, then a spate of denial about what’s really happening: “We’ll be in touch.”

From the adorable faux pomp and circumstance of a preschool graduation, through the last day of summer camp, and on to the graduations of late adolescence and early adulthood, we are trained to live in communities in which, eventually, everyone will walk away and barely look back.

It certainly has its advantages; that same process trains us to seek out opportunities, to sample different ways of living, to explore and venture, to reinvent ourselves. Maybe it even helps prepare us for each of our ultimate departures. But it also leaves us with a peculiar sense of loss that few of our ancestors ever felt.

Robert M. Sapolsky is a professor of biology at Stanford University and of neurology and neurosurgery at Stanford’s medical school.