Stargazing as Therapy

18/12/2021

Oh stars! oh eyes, that see me, wheresoe’er I roam: serene, intent, inscrutable for aye, tell me Sybils, what I am.— Wondrous worlds on worlds! Lo, round and round me, shining, awful spells: all glorious, vivid constellations, God’s diadem ye are! To you, ye stars, man owes his subtlest raptures, thoughts unspeakable, yet full of faith.

– Taji in Herman Melville’s Mardi: and a Voyage Thither

Two generalizations about ancient civilizations: they were interested in gods and stars.

 

In a world where everything is mysterious and potentially perilous, no firm demarcation stands between reality and mythology. Our Paleolithic hunter-gatherer ancestors lived in a world of inscrutable natural phenomena — the visible and secret lives of animals, the sun and the moon, fire, the weather, birth and death, etc. — about which they might come to a rudimentary understanding through harried observation, speculation, and orally transmitted glosses of the conclusions at which others had previously arrived. Language itself is the primeval medium of animism: lightning leaps, it darts, it strikes, it dances, it roars. By simply communicating its behavior in these terms, we make it come alive. When lightning appears to strike certain targets intentionally, or if a sudden flash in the sky and crack of thunder coincide with an event of significance within one’s social group, the superstitious person is apt to attribute intelligence to it, or to infer that it’s an extension or effect of a supernatural entity.

 

Over time, the descendants of the mythologizing hunter-gatherers developed agriculture, architecture, metallurgy, mathematics, writing, and the other practical arts — but none of these in and of themselves led to alternative descriptions of how the world worked. What grounds had a skeptic of his national religion to stand on? If a volitional agent isn’t responsible for thunder and lightning, then what is their cause? If the earth and sky weren’t fashioned by (or from) empyrean beings, then what were their origins? If the separation between human beings and unspeaking animals wasn’t the result of divine descent or intervention, then how could it be explained?

 

But prescientific cultures were not stupid. After all, they managed to survive and eke out conditions of long-term stability for themselves without electricity, refrigeration, calculators, hydraulic machinery, or the methods of industrial mass production and distribution. (Imagine the violent chaos into which a modern state would be flung if all these things vanished overnight!) A society could not have achieved this were it not generally practical-minded. Where we read accounts of ancient superstitious rites performed for the purpose of calming a storm or ensuring a good harvest, we ought to balance our derision of the celebrants’ ignorance with the understanding that their actions were informed by what seemed like a perfectly consistent idea of how the world in which they lived operated. Petitioning the gods for good weather or fair seas was an effort to control natural events in a way that made sense, given the available facts, and which must have correlated with favorable outcomes in the past.

 

The complex and frequently unobservable factors that determined the yield of an annual harvest, fair or foul weather, and the spread of a plague might be understood as the unimpeachable judgment of a deity or the cryptic whims of the god or spirit in whose bailiwick the phenomena in question resided. The aleatory character of events on terra firma strongly suggested that the gods and spirits overseeing worldly affairs were prone to inattentiveness, to playing favorites, or to mercurial temperaments — and mortals had little alternative but to accept events as they transpired under divine sanction, and go on keeping the faith.

 

But the ancients — at least those living beneath unobstructed skies — observed with fascination the one part of their world which behaved with an inexplicable and precise consistency: the motion of the stars. Like the seas, the weather, disease, and the mysteries of birth and death, objects beyond the clouds were integrated into the explanatory narratives of myth. The Sumerians associated the stars of the modern constellation Draco with their supreme deity Anu. The star named Thuban by medieval Arabic astronomers, and called Alpha Draconis by their modern-day successors, was located at the north celestial pole between roughly 3900 and 1800 BC. (Due to the precession of the Earth’s axis, today’s polestar is the familiar Polaris. A millennium or two from now, it will be Errai of the constellation Cepheus.) To the people of Uruk, city of the legendary King Gilgamesh, the asterism to which Thuban belonged would have seemed the apex and axis of the cosmos itself, and the only seat from which the supreme Anu could conceivably reign. For centuries, Egypt also placed especial significance on Thuban: it was the passage through which a deceased Pharaoh hoped to rise and reunite with his divine brethren. At times, and in certain eschatological texts, the Egyptians believed the realm of the dead was located around the modern constellation Orion, associated with the psychopomp god Osiris. The Hindus, looking northward from Orion (identified with Vishnu in the Vedas), saw in the seven stars of the Big Dipper the representation of the Saptarishi, the seven sages who transmitted to humankind the knowledge of yoga they received from Shiva. The upturned eyes of ancient China saw an ordered celestial city reflecting or prefiguring the earthly reason of empire and metropolis, with the Emperor at the celestial north pole, surrounded by concentric rings of walls, districts, and mansions. The planetary conjunctions coinciding with the founding of the Xia, Shang, and Zhou Dynasties were widely interpreted as signifying the passage of the Mandate of Heaven to the new rulers, legitimizing the ousting of the previous regime. And we could go on.

Developed independently by different cultures at different times, astronomy was the first science. It had to be: science endeavors to discover the systematic logic and relations of nature, and the mathematical order of the firmament was its only viable starting point. Whether the study of the stars, moon, and planets was originally spurred by religious purposes or utilitarian concerns is probably impossible to know, and possibly varied from place to place. But whatever the impetus for cultivating it, knowledge which could accurately predict the turning of the wheels in the night sky and correlate the positions of the stars to cyclical occurrences on Earth was invariably valuable. A culture that observed festivals on certain significant days in the solar or lunar cycle could not do so without the use of a calendar, the creation of which necessitated the prolonged observation and meticulous charting of the fixed stars. The folk knowledge of an agrarian society would instruct farmers to plough their fields and plant and harvest certain crops at times corresponding to the appearance of particular stars at dusk or dawn. In an age before magnetic compasses or GPS devices, mariners embarking beyond the sight of land could only rely on the sun and stars for guidance. A traveler at night couldn’t glance at his wrist to know the hour or the time remaining until dawn, but might make an accurate guess by looking up.

 

The modern affluent world isn’t very interested in either gods or stars.


Charting the historical decline of religiosity across the developed world is beyond the scope of this piece, though we can confidently place the inflection point at the Scientific Revolution. Isaac Newton and his disciple Samuel Clarke vehemently denied that the new model of a clockwork universe left no room for an active, interventionist deity; the Western intellectual caste of the 18th century wasn’t so sure, and given the choice between keeping the traditional faith in an involved God and accepting Newtonian physics, many were more inclined to trust the idea that could mathematically substantiate its claims. The passage of two centuries and an indefatigable interrogation of the mechanisms of nature revealed the mundane and reliably quantitative workings of phenomena previously attributed to supernatural agencies, gradually shunting the omnipotent deity of Abrahamic doctrine into the periphery. By the late 19th century, polemicists across Europe could unflinchingly assert that religious belief was an outmoded and extraneous element in a conception of the world necessarily adapting itself to the material conditions of modernity.

 

Not independently of the rise of scientific materialism, burgeoning secular institutions slowly drew the axis of society away from religious organization. Through successive stages of liberalization, the universities of Europe and North America transitioned from serving as formal organs of particular Christian sects to adopting the freely inquisitive ethos of secular humanism. The maturing culture industry, equipped with electric media, arrogated to itself social functions that previously belonged to the pulpit, consummated the displacement of participatory rite by public spectacle, and constellated the public consciousness with a new pantheon of saints and idols. Through the 20th century, the role of the art museum became that of a secular shrine, visited by multitudes of pilgrims crowding not for a glimpse of a (purportedly) corporeal testament to the life and miracles of a martyr, but to stand in hushed reflection before visible instantiations of human genius. In nations where communist factions waged successful revolutionary campaigns, the victors typically swept aside religious practice and tradition as aggressively as they did any other vestige of an old regime.

 

I am an atheist, but I don’t think there’s any contradiction between my own metaphysical beliefs and my apprehensions regarding the decline of religiosity across the developed world. Years ago, when the New Atheism movement was beginning to pick up steam, I was no less enthusiastic than its public proponents in condemning organized religion for its archaic (and often cruel) doctrines regarding abortion and homosexuality, its dangerous strains of textual literalism, and its timeworn strategies of social control through promising eternal rewards to the obedient and threatening dissidents with damnation. But we were all naïve if we expected that the withdrawal of religion from public and private life would lead to a flowering of enlightenment and the proliferation of rational and dignified humanist values.

 

The multifarious influences a strong religious institution brings to bear on society could be analyzed ad infinitum, but for the sake of simplicity we’ll condense them under two main headings: social cohesion and world-narrative.

 

A religion’s annual festivals and feasts, its encouragement of adherence to custom, the common parlance of myth and doctrine, routine participation in liturgical service or private observance, and opportunities for involvement in activities adjacent to the ecclesiastical body (typified in the United States by the church bake sale) — these are all social emulsifiers, bolstering community ties and actively affirming the individual’s belonging to an in-group. The composition of such practices will be informed by the world-narrative of that religion, sometimes intersecting with contemporary political affairs, but always arching over them. One whose understanding of the world is comprehensively and mythologically mapped can place himself and his family, the life of his community, and the naked realities of the earth, oceans, and sky within an intelligible network of mutual and commensurate significance. Whatever his station, he can understand his life as part of a narrative of cosmic meaning and purpose.

The atheist’s criticisms of these states of affairs still stand, of course. A community united by an ideology that penetrates every aspect of its members’ lives is susceptible to intolerance of nonconformity, to prejudice toward outsiders, and to manipulation by leaders who can interpret signs and scripture to further their personal interests; attempts by religious leaders and researchers to force scientific knowledge into compliance with a “revealed” cosmology are usually embarrassing at best and dangerous at worst, as demonstrated by the “Young Earth” theorists in the United States and Europe, and by the strange noises from orthodox Hindu science conferences.

 

But the alternatives that secular institutions have offered in lieu of religion’s historical roles as a social binding agent and cohesive world-narrative have left much to be desired.

 

Much of the unrest which characterizes the first half of the 20th century reads as an object lesson in the subsumption of church by state, and of religious doctrine by secular ideology. Declaring themselves and their parties pontifices maximi, figures like Adolf Hitler, Mao Zedong, and the patriarchs of North Korea’s Kim Dynasty made the monumental discovery that earlier justifications of the social contract between rulers and the ruled (the Divine Right of Kings, the Mandate of Heaven) could be altogether dispensed with. A universal narrative of dialectical materialism or national destiny served the same function, doubled as a condition for social unity and purpose — and contained a built-in directive to purge dissidents. As it turns out, secularity is no proof against national-scale cult behavior.

 

The pitched political tribalism of the internet age is a signal that ideological spasms no longer require the orchestration of a demagogue or political party. While digital media is an active agent in the culture wars, providing the venue and reinforcing the dynamics, it is not the cause. The conditions of modernity encapsulated by the term “late capitalism” — alienation from community, the debasement of work, the heteronomy of culture, the widening wealth gap, the sense of ineffectiveness ingrained by isolation and the breakdown of civic life, exacerbated by the “global village” — are a hotbed of anomie and Kierkegaardian despair. Grounding oneself through membership in an identitarian tribe — into which politics on both the left and right are devolving — has become an increasingly popular palliative. On the left, the “woke” movement is on its way to becoming the basis of a puritanical secular religion and America’s next great cultural export. In its distorted reflection shimmer the right-wing nationalist movements gaining traction across the world, and, most disturbingly (and fascinatingly), QAnon, a global coterie of believers in a quasi-mythological and ever-evolving narrative of a secret conspiracy of pedophiles and Satan-worshippers at the highest levels of power, destined to be brought to justice by Donald Trump the Redeemer. Given the apparently high proportion of professed Christians in the ranks of the QAnon faithful, we might fairly call it a Christian sect veering towards secularity, one that emphasizes Christian identity rather than doctrine or practice.

 

Or, if the millennial or Zoomer-aged wage-earner isn’t enthusiastic about politics or inviting the added stress of culture-war paranoia and pessimism into his life, he can just go about the business of living, day in, day out. Working for a company he knows is exploiting him, which literally views him as moveable capital — as a “human resource.” Spending his commute, his evenings, and weekends watching sports, drinking beer, playing video games, absorbing ragebait “journalism,” admiring influencers, smoking weed alone, editing his dating app profile as though it were a resume, bingeing on television dramas, identifying himself with a fandom dedicated to a Hollywood film or video game franchise, collecting things he knows he doesn’t need but seems to desperately want, and scrolling, scrolling, scrolling. Listening to podcasts because his classmates and coworkers aren’t as interesting as the disembodied voices of articulate strangers who come to him and leave him alone whenever he wishes; jacking off to porn because it conveniently approximates the physical pleasure of sex but demands none of the emotional investment or compromise of a human relationship, asks him to please nobody but himself, and gives him a greater variety of (simulated) sexual experience than he knows he could realistically achieve; fussing over Twitter, Instagram, and TikTok because attention and engagement mediated by a software platform is more expediently gratifying and more obviously structured than forming and maintaining social bonds in the material world; planning and saving for vacations to take him to rustic or exotic places that make excellent backgrounds for selfies, and returning from his week in a paradise tailored for tourists with a sense of tired resignation to a home he doesn’t own, a job he knows is bullshit, and a routine that seems like a life in the abstract.

 

In some corners of the internet, people respond to the polemics or complaints of the terminally online with the simple admonition to “touch grass”: to unplug, to go outside, to step out of a frenzied media ecosystem and into contact with an actual one for a little while.

 

I would suggest a similar remedy: “look at stars.”

 

The obscuring of the night sky and general decline in uranological knowledge followed directly from an epistemological revolution catalyzed by the study of planetary motion against the fixed stars — a process which deserves a high spot in the index of history’s monumental ironies. Newton’s work inaugurated the modern age of science, without which the industrial revolution and an epoch of sustained technological improvement could not have happened. In most of the developed world, paying attention to the stars no longer serves any commonplace practical purpose: farmers rely on meteorological data and instruments for measuring soil moisture and temperature to determine the optimal time for planting a crop; sailors, hikers, and every other sort of traveler use GPS; just about all of us own digital timepieces; few of us are in the habit of glancing up at night and expecting to see any omens — if we anticipate seeing anything at all. As more of the world’s population clusters in cities, and as the provinces of developing nations undergo electrification, only a small fraction of us sees the night sky in its primordial clarity.

By blotting out the stars, we’ve made our world of common experience into a cordoned habitat with an opaque roof — an environment which keeps us from recognizing that the exigencies of work and social commitments, the 24/7 welter of digital media, the acrimony and cynicism of politics, and the spiral of consumption do not constitute our entire situation.


While the specter of communism haunted Europe in the 19th century and possessed Asia in the 20th, while fascism was breaking out like a wildfire and being tenuously extinguished, while capitalism became entrenched across the globe and sundry revolutionary and reactionary tendencies advanced and ebbed, the unnoticed but most pervasive revolution in human thought has been the proliferation of anthropocentrism. It is not a political or philosophical doctrine, but an invisible ideology: the implicit belief that humanity, its works, and its concerns for itself are the only things that warrant real attention or psychological investment. It is invisible because its adherents rarely announce themselves as such; it is evinced in patterns of thought and behavior, and by the inconspicuous absence of reference in speech or writing to anything outside the spheres of politics, sociology, entertainment, aesthetics, economics, language, the particulars of one’s job and social life, self-analysis, and so on.

 

In a sense, what’s real to any of us is what we arrange our habits around, or what we make time for. For many of us, the natural world — except when exotic and photogenic images of it confront us in print and on our screens — fails to register. It’s merely an academic fact, more discursive than material, and more or less inconsequential (except as a delivery mechanism for the effects of climate change).

 

Anthropocentrism is a symptom of modernity metastasized: the flattening of all values into means and ends. In the affluent world, an ambient malaise hangs over the masses — especially the younger generations. We have access to more information about the world than any people before us, but many of us are acutely convinced of our inability to change it. Most of us are estranged from the food we eat by several degrees of separation; the things we buy appear on store shelves and on our doorsteps as if by magic, as alien objects in spite of their familiarity. We don’t know our neighbors, we are used to being anonymous and invisible in crowds, and the land the asphalt covers is strange to us. Depression was on the rise in many nations even before the COVID-19 lockdowns began in 2020, as were reports of anxiety and loneliness epidemics. When one of us asks: What am I doing? What is this for? What does it mean? the anthropocentric consciousness must seek answers in a closed system of reference. If careerism, the accumulation of goods, social recognition or the pursuit of pleasure seem frivolous, arbitrary, or otherwise unsatisfactory reasons for existing — or if one fails to strike it rich, to trend on Twitter, or to live up to one’s aspirations as an artist, writer, video game speedrunner, etc. — the only alternative is to conclude that there is no reason, no meaning, no intrinsic significance to any of it, and that would be intolerable. Better to retreat deeper into the opiate haze of media addiction, or to seek purpose and belonging in a political faction with a zero-sum view of its relations with the rest of society.

 

Stargazing is no cure for the sicknesses of modernity, and it answers none of the questions prompted by the conditions of life it produces — but it might help you find your footing in the volatile psychological landscape of the 21st century. Here are some tips to get started.


The sun is the nearest star. Become conscious of the gradual shift — from north to south, and then from south to north — in the orientation of the path the sun inscribes in the sky throughout the year. Most of us notice this indirectly by way of the lengthening and shortening of the days from winter to summer and from summer to winter, but try to keep track of the sun’s location when it rises in the east and sets in the west, and its position between the northern and southern horizons when it reaches its peak each day. Get a feel for the tempo and the steps in the waltz our planet and our star have been dancing for eons.
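
For anyone who wants to see the numbers behind that waltz, here is a minimal sketch, mine rather than anything from this essay, assuming the Python astropy library and a hypothetical observing site at 45° N: it prints the sun's approximate altitude at local noon on the equinoxes and solstices, the very swing the paragraph above asks you to notice with your own eyes.

```python
# Illustrative only: the sun's noon altitude swings north and south over the year.
# Assumes astropy is installed; the site below is a hypothetical observer at 45 N.
import astropy.units as u
from astropy.time import Time
from astropy.coordinates import EarthLocation, AltAz, get_sun

site = EarthLocation(lat=45 * u.deg, lon=-75 * u.deg)   # hypothetical location

for date in ("2021-03-20", "2021-06-21", "2021-09-22", "2021-12-21"):
    t = Time(f"{date} 17:00:00")                        # roughly local solar noon, in UTC
    sun = get_sun(t).transform_to(AltAz(obstime=t, location=site))
    print(f"{date}   noon altitude ~ {sun.alt.deg:5.1f} deg")
```

At a mid-northern latitude the printout runs from roughly twenty degrees in December to nearly seventy in June: the whole seasonal drama compressed into four lines of output.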

Get acquainted with whatever stars are visible at night where you are. You needn’t make a point of traveling far in search of darker skies. Even in an urban neighborhood, at least a few celestial “landmarks” will punch through the electric blaze and smog on most nights. Learn their names. Take note of the first time you see them each year. For the “imperishable” stars about the celestial pole, observe where they are at a set time each night, once a week, or once a month. If you’re especially attentive, you might learn to guess the hour your timepiece indicates by glancing upwards on a given night.
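
If you are curious about the arithmetic underneath that trick, here is a rough sketch, my own illustration rather than the author's method, assuming astropy is installed and that the star-name lookup can reach the internet: a circumpolar star's hour angle advances about fifteen degrees per hour of clock time (and creeps about a degree per day against the calendar), which is what makes the northern sky readable as a clock face once you know the date.

```python
# Illustrative only: the hour angle of a circumpolar star is the sky's clock hand.
# Assumes astropy is installed; SkyCoord.from_name queries an online catalog.
import astropy.units as u
from astropy.time import Time
from astropy.coordinates import Angle, SkyCoord, EarthLocation

site = EarthLocation(lat=45 * u.deg, lon=-75 * u.deg)   # hypothetical observer
dubhe = SkyCoord.from_name("Dubhe")                      # a Big Dipper pointer star

for utc in ("2021-12-19 01:00", "2021-12-19 03:00", "2021-12-19 05:00"):
    t = Time(utc)
    lst = t.sidereal_time("apparent", longitude=site.lon)
    ha = Angle(lst - dubhe.ra).wrap_at(180 * u.deg)      # the "clock hand" angle
    print(f"{utc} UTC   Dubhe hour angle: {ha.to(u.deg):.1f}")
```

The printout moves through about thirty degrees for every two hours on the clock; the point of the exercise, of course, is to learn to read that motion without a computer.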

 

Over time, you may come to associate memories and expectations with the stars you see rising and setting at different parts of the year — your personal timeline then stands in relation to a cosmic touchstone. For the intents and purposes of human life, only the motions of the sun and fixed stars keep true time. All other clocks and calendars are arbitrary; even the SI definition of an “atomic” second was formulated to compensate for the irregularities of a synthetic mean time.

 

Learn to recognize the interloping planets. If you’ve got a handle on the configuration of the fixed stars, identifying a wanderer among them is simple.
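
The wandering itself is easy to verify numerically, if you want the reassurance. Another small sketch of my own, under the same assumptions as the ones above, compares Jupiter's position with that of a bright fixed star over a few weeks:

```python
# Illustrative only: a planet drifts against the fixed stars; the stars do not.
# Assumes astropy is installed; SkyCoord.from_name queries an online catalog.
import astropy.units as u
from astropy.time import Time
from astropy.coordinates import SkyCoord, get_body

aldebaran = SkyCoord.from_name("Aldebaran")          # a bright fixed star
for days in (0, 7, 14, 28):
    t = Time("2021-12-18 00:00:00") + days * u.day
    jupiter = get_body("jupiter", t)                 # geocentric position
    print(f"+{days:2d} d   Jupiter RA {jupiter.ra.deg:7.2f} deg   "
          f"Aldebaran RA {aldebaran.ra.deg:7.2f} deg")
```

Jupiter's right ascension shifts noticeably over the month; Aldebaran's does not budge. The eye can learn to see the same thing, given a few clear nights and a little patience.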

 

No doubt the photographs from the various planetary flybys and landings are impressive to look at in glossy books or on a high-resolution screen — but learn to look at our solar neighbors with your own eyes. You don’t necessarily need a telescope; even in a city, Venus, Jupiter, and Saturn are hard to miss. Viewing their reproduced images, we understand these objects merely as ideas and abstractions. In the night sky, they are minutely but palpably present, these sister satellites of our home star, these other worlds.

 

In an anthropocentric and tacitly atheistic society, pausing to recognize and contemplate the stars is the closest thing to communing with a deity through prayer. The wealth of material facts that have chased the gods and angels from the heavens subtracts nothing from the astonishment that the naked fact of their existence must inspire. Intellectually, we can get a handle on what we see — the relic light of gargantuan nuclear orbs separated from us by gulfs of trillions of miles — but we can scarcely conceive of even a single star in its totality. The scale and power of our own sun exceeds all earthly reference. For most of the time it is over our heads, it seems to punish us for even presuming to look upon it.

 

Remember Carl Sagan’s famous remark about “star stuff”: with the exception of hydrogen and most of the helium, all of the matter comprising our world was produced in the cores of long-dead stars, under conditions which the mind reels to contemplate. Glancing at the points of light in the night sky, we see the likeness of the originators of all we know on Earth. Again: it is one thing to read about this and accept it as a barren academic point, and another to consider it while the visible emissions from these astonishing engines of creation and destruction excite our optic nerves.

It’s inexplicable, isn’t it? What is all of this? Why should it exist at all, and in this way? Anyone who tells you that scientific materialism can answer these questions is a fool or a liar. The Big Bang is a description of a theoretical event (one which indeed comports with the facts we have), but it can’t account for itself. The methods of science discern the character of fundamental forces and their interactions, but it is beyond them to discern why the rules of reality are what they are. At the substratum of all our relations, our jobs, our games, our identities and habits, our aspirations and grudges, everything we are and everything we know, lies a mystery as enigmatic as the Tao, as sublime as the Trinity, as ineffable as the Tawhid, deeper than the Vedas.

 

Though the material conditions of modernity weren’t engineered to preclude our recognition of being bound up in an insoluble cosmic marvel, they do so anyway, and with dismal efficacy. We cannot with any intellectual honesty treat the universe as a background fact: it is with us here in our world. We are no more separate from it than the waters of the Pacific Ocean are from those of the Atlantic. The disregard for the context in which human life exists goes beyond mere oversight: it’s delusional.

 

To deliberately spend some time outside, under the sky, away from the noise, and to concentrate on natural objects in attentive silence is probably a healthy exercise in and of itself — but looking at the stars invites moments where the closed system of reference opens up, admitting the existence of something beyond this life. To recognize something inexplicable — something in our world that we cannot touch, cannot reach, and are utterly powerless to influence or manipulate — can prompt existential moments of a different kind than those experienced in the narrow confines of anthropocentric consciousness, in which “eternity” is a word without meaning, and human life is ultimately trivial for all its self-aggrandizement. Communing with the austere and immense cosmos under the night sky, one is led to the conclusion that life is precious because of its triviality on a galactic scale, and the question of what one ought to make of the tiny event of one’s life gains in urgency.

 

The stars give us no answers: they offer uncertainty rather than affirmation. But in any case, it’s always useful to have one’s total situation put in perspective. None of us chose the world or the moment in which we were born, or the particularities of the lives we came into. Life can be disappointing and painful. None of us knows where we’re going, and we’ll each of us eventually die and be forgotten. But in the face of eternity, what can we call any of it but miraculous? How can we deny intimations of a cosmic wholeness in which each of us is a small and temporary figment, and ought to be treated with compassion? A cosmic perspective, even if it can only be maintained for interspersed moments, admonishes us to choose more deliberately how we should define ourselves through what we do with our days on Earth. It prevails on us to abandon battles that don’t need to be fought and to be resolute in the ones that must be, and promotes the circumspection that helps one to know the difference. It almost invariably fosters a sense of gratitude for the privilege of living to see and feel and recognize the lights of the firmament — and, to borrow another line from Dr Sagan, to be a way for the cosmos to know itself.

 

Shall we not also say that these are some of the noblest sentiments inspired by the old beliefs in the gods?