I learned something new! I love it when that happens.
In the last few months I moved from just below the Fall Line to deep in the Piedmont. As a result the collection of counties I am responsible for has gotten hillier, wetter, and slightly cooler. I’m seeing a different suite of flora and fauna than before, including some things that haven’t previously been on my radar.
A bowshot from National Forest land in the high hills of North Georgia, something caught my eye beneath the white pines, oaks, beeches, and maples. Small white stalks rose from the leaf litter, curling over at the top with a petaled capsule like a lamp post for gnomes. I snapped some photos so I could look up what kind of fungus this was when I got home.
Monotropa uniflora is known by several common names, including ghost plant, Indian pipe, ghost pipe, and corpse plant. It was much more interesting than I gave it credit for. For starters, it isn’t a fungus; it’s a plant in the heather family. Instead of spraying spores to reproduce, ghost pipe requires native bees to visit its flower and carry its pollen. Unlike heather (and most plants), ghost pipe doesn’t produce chlorophyll, so it can’t use sunlight to make energy. So how does it live?
That’s the next interesting bit. There are many fungi that engage in mutualistic associations with plants. These fungi, known as mycorrhizal fungi, colonize the roots of plants. They assist the plant in collecting water or nutrients, and collect carbohydrates created in the plant’s chlorophyll factories above. The ghost pipe is a parasite of certain mycorrhizal fungi. This is a switch; usually a fungus parasitizes plants. But here we have a plant stealing sugar from a fungus that said fungus “traded” from a tree that produced that energy in the sunlit canopy 80 feet above. Thus, it can grow in the densest, darkest forests. Ghost pipe pops up quickly after a rain, and flowers from early summer to autumn; I was fortunate to have wandered the woods at just the right time to see them.
Curiosity is an important trait for any aspiring naturalist. Knowledge can be gained by seeking answers, but curiosity is the driver.
I’d just finished a meeting on the northern edge of the state, and I wasn’t entirely ready to head to lower lands. So I found a pull-off where I could park the truck and ambled down next to the river. Here the water was deep enough to roll on fairly smoothly, but only lightly clouded compared to the brown waters a few counties south. I sat here for a few minutes, trying to be present – my mind pushing away the people I’d dealt with earlier and the long drive ahead of me. Off in the distance rose the high, thin whine of Brood X cicadas. By my knee, an adult dobsonfly, another short-lived but far quieter insect, twined around a grass stalk.
It’s not hard to lose track of time beside running water, but home called so I didn’t tarry too long. Back up to the roadside, I paused to take one more look at the river. I don’t recall hearing anything noteworthy, but my eyes flashed down to the ground. There, beside my boot, a shrew about the size of my thumb lunged out from under a leaf to grab an earthworm by the middle. The tiny insectivore backed into the leaf litter with its thrashing prey. I knelt and flipped over the leaf, revealing a tunnel of sorts in the humus; the shrew was already away to eat and resume its neverending hunt.
My meetings with shrews are rare enough to make this close encounter notable. I wouldn’t want to handle a shrew, of course, as they are among the few venomous mammals. A nip from one of these fellers would cause localized pain and swelling for a few days, but their toxic saliva renders worms, insects, and even mice paralyzed and comatose. Shrews store their captured prey in caches for later use. Given their high metabolic rate – starving in a matter of hours if not fed – storing ready food is not a bad strategy.
This is a common theme with me, but one which I hope will sink in. Take the time to stop, to be still, to look and listen. How can Nature grace you with a moment if you aren’t there to appreciate it?
Much is being said about the removal of statues honoring figures of the Confederacy. In the summer of 2017, a rally to preserve a statue of R.E. Lee brought out the very worst people, shocking many who didn’t seem to realize that Nazis and racial purists and their ilk were so prevalent or so willing to “Make America 1930s Germany.”
Why on earth am I writing this now, with Trump gone but his cult fighting on, and other fronts of the culture wars claiming lives and dividing the country? Actually, I wrote this in August 2017. And after some thought, I quietly shelved it. It just seemed too divisive a topic to broach – a discussion about statues, history and perspective could be seen as throwing gas on social media’s bonfire. Maybe there’s enough space for people to gauge my intention without assumptions.
The Lens of History
People and events of the past should be viewed through two lenses: the contemporary and the modern. You look at a subject in the context of the times, through the eyes of those who were there, ideally through multiple points of view. Then you look again through the filter of your own time.
A man can be a hero to his friends and a villain to his foes, and somewhere in between depending on the point of view. A moderate human being can look cruel across the centuries; the past’s extremist becomes our visionary.
Ain’t About You, Jayne
Statues say more about the people raising them than about the folks whose likenesses are fixed in bronze or marble. Joss Whedon said it well (through the voice of Malcolm Reynolds): “It’s my estimation that every man ever got a statue made of him was one kind of a sumbitch or ‘nother. Ain’t about you, Jayne. It’s about what they need.”
That’s very true. Take the epicenter of the Charlottesville flashpoint: Robert E. Lee. He was a man. He thought things and did things as men do. He was more famous than many. Generations of people made his memory into what they wanted: a wise general, honorable to a fault, devoted to duty, beloved by his men and quick to set the example for reconciliation. To those whose elders recounted tales of privation, sacrifice, subjugation and the upturning of the social order, he was an icon, a secular saint in the pantheon of the Lost Cause. Now, a century and a half after the war, a new generation is reframing him with a modern lens, to suit their modern needs: a traitor, a racist, a demented graybeard and patron saint of slavery. These iconoclasts would have him reviled and then forgotten. Neither vision truly represents the man, though he may resemble elements of both.
Application of Power
I spent 20 years in a small town where over 40% of the residents live in poverty; in a park, a group of citizens erected a bronze statue of Old Fella, a stray dog. Why? Because the dog’s story touched their hearts, and because they had the money and influence to do it. The money might have gone to help other strays, or the poor folks of the town. But plucky Old Fella meant more to animal-lovers around the nation than children with growling stomachs. I care more for my dogs than I do for most people, yet that use of funds didn’t sit well with me.
Although many modern Americans would cringe at my choice of words, I see it as a form of worship – ancestor worship, hero worship, or the exaltation of ideals. It is also a visible application of power. A monument goes up with the assent of those in power, and is removed when that power wanes.
The act changes little – one group feels a moment’s satisfaction, another is rankled, but in all my years, I’ve never seen changing a flag or playing musical statuary bring about economic or social reform. Not once.
The Myths They Carried
We all carry myths with us, even if deep down we realize they aren’t true. In the same way that Ivanhoe and King Arthur celebrate nobility rather than wallow in the brutality of the age, or how honor, clean living and good diction ensured that the Lone Ranger always triumphed in those thrilling days of yesteryear, many southern elders found comfort in the notion of moral, noble ancestors who were honorable in peace and valorous in war. All these are myths, only dimly resembling the folk of the time, who carried virtue and flaw in individual measure. But what are myths but stories that showcase the virtues that a culture aspires to?
It is proper – indeed, imperative – to revile the Nazis, the alt-right and their ilk, and it is perhaps inevitable that sepia images of the past are swept away to make room for whatever passes for the current “how things really were” that gleeful iconoclasts foist on us. In between bouts of two-minute hates, take a moment to see the small consequences. Look beyond the shrill hate-mongers and see the quiet elders who watch with befuddlement and sadness as today’s society strips away their cherished myths. Search your own beliefs for myths on which you have built your worldview, and delve deeper into their origins.
Many moons ago, I was on a van with my college Soils class, riding past farmland south of Athens. One of the students hailed from the Midwest. Surveying newly-tilled, rusty-red earth, she asked the professor, “Do you have any soil here?” All the southerners laughed, but it’s a fair question, and one with a depressing answer.
Most soil profiles have several distinct layers, or horizons. The O (organic) layer and A (surface or topsoil) layer contain most of the accessible minerals, organic material, water-storing pores, microbial life…everything that makes soil valuable for plant life. The Georgia red clay, synonymous with southern dirt, is the B or subsoil layer; it contains very little organic material and is stingier both with nutrients and water. In much of the south, and especially in the Piedmont, we simply have little to no topsoil. But this was not always the case.
Think back to the first half of the 19th century. The native peoples were swept out, and settlers opened the forests and savanna. The supply of land seemed inexhaustible, so it was deemed more important to wrest what one could from the soil than to tend it carefully. Cotton was the primary cash crop, and straight furrows were easier than rows that curved to follow the contour of the land. Rains came and washed away centuries’ worth of topsoil, hastening the decline of fertility and forcing farmers to abandon exhausted farms for new ground. By mid-century, many upland farms had lost their topsoil completely and farmers were trying to coax crops from subsoil. The problems compounded as the lost soil washed into waterways. Millponds filled with silt; rivers clogged up with earth, increasing flooding. There were some efforts to control erosion – contour farming and planting pines, for example – but war put those efforts on the back burner.
Subsequent decades follow a pattern of waning agriculture, a rise of forests, a clearing of forests, and a new rise of agriculture. Poor farmers trying to coax crops from even poorer land was the recipe for a cycle of poverty and destitution. In my lifetime I have seen the cycle more or less continue, with eroding fields planted in pines or left to grow wild; some of those stands have since been cleared again for agriculture or other developments. Erosion is still a pervasive issue.
Take a look at this creek in the Georgia Piedmont. See the broad slabs of stones and smaller rocks in the creek? That’s likely the natural base level of the creek, about where it would have been two centuries ago. See the 10’+, nearly-vertical banks? Those are largely composed of earth that has eroded from the surrounding hills due to clearing, development and poor farming practices. Steep banks of silt such as these are unfortunately a common feature of creeks and rivers in much of Georgia.
I’ve been told that one reason for the University of Georgia’s location at the turn of the 19th century was the clear waters and abundant fishing on the Oconee River. Current students may be forgiven for assuming the steep banks and opaque brown water are the natural state of Piedmont rivers. Not that many years ago, crews dredged behind the dam at Whitehall Forest; 20 feet down into the muck, they came upon a sawn tree stump!
I drive past so much land with stubby, slow-growing pines or thickets of spindly sweetgums and scrawny oaks, with little on the ground beyond stubborn tufts of grass or reindeer moss – a lichen that even as a kid I learned to associate with poor soil. Across the land, gullies – some bandaged by leaves and brush, others bare gashes that continue to hemorrhage soil – remind us of the natural wealth lost through ignorance, indifference, or greed.
But I haven’t visited the most famous monument to this human-enabled catastrophe: Providence Canyon. What started in the mid-1800s as a series of gullies a few feet deep has become a fan-shaped series of canyons spanning hundreds of acres and plunging up to 150 feet below ground level. The super-gullies continue to gnaw at the land around them, extending further every year. It is sometimes called the “Little Grand Canyon,” as if its existence were a point of pride and not the most visible scar of a curse our ancestors bequeathed to us.
I see turn-of-the-century farm houses, mobile homes, and dilapidated ranch style dwellings lining the backroads I travel. Around these homes are either pine stands or clearcuts, for the hard, gullied subsoil is neither fertile enough for seasonal crops nor (in some cases) level enough to run a tractor. The preponderance of political banners touting modern lost causes suggests to me where these folks stand on the subject of climate change. They are willfully ignorant of the effects that human irresponsibility will have on their descendants; yet I suspect that they are innocently ignorant of the impoverishment bequeathed to them by their own forebears.
Over five millennia ago, a medical practitioner related how to use the pharmacological tools of his trade – resins, bark, leaves – in the healing of his people. His specialized knowledge of healing lore may have seemed magical to the folk of Mesopotamia. His use of advanced technology to record his knowledge, using a stylus on a hand-sized tablet of wet clay, may have seemed equally magical; however, it no longer inspires awe today, for such technology is grasped by toddlers in our own time.
To say the written word is taken for granted is an understatement of the highest magnitude. You are reading this now, or perhaps you skim to get the gist before moving on to some other article or blog or meme poster. This conceptual breakthrough should not be forgotten, for almost every tool and device you use, your food, your water, your clothes, your vehicles – almost everything you have or use or know is only available because someone wrote down instructions or information of some sort. We’re talking about the deceptively simple process of converting concepts into language, and then representing the sounds of language with written symbols. Then – and this is the real magic – the process is done in reverse, by other people, perhaps in other times.
Writing appeared independently in several places around the globe, but the proto-writing of Mesopotamia was arguably the first. The writing system evolved over the centuries in the cities between the Tigris and Euphrates Rivers. Simple drawings representing simple concepts – “ten sheep” for instance – appeared around the mid-4th millennium BCE. These representations became increasingly abstract until the symbols became tied to sounds in the Sumerian language rather than ideas. As Sumerian power waned and other peoples rose to take their place, the writing system was adapted to work with the completely unrelated Akkadian language (imagine the Japanese katakana syllabary being reworked to represent German). Cuneiform tablets written in the Elamite, Hurrian, and Hittite languages show the flexibility of the technology, or perhaps the limited desire for innovation in the face of an adequate method of recording information.
Time passed, the fortunes of kingdoms and empires shifted, and the languages written in cuneiform gradually slipped from power and thus popularity. The last known cuneiform tablet was imprinted in the 1st century BCE, by which time the Library of Alexandria, hundreds of miles from the lands of the Sumerians and Akkadians, was already in decline. From pictographs to the highly stylized scripts, this earliest of writing systems was in use for over three thousand years. By comparison, the common writing system for much of the Old World and most of the New World evolved from the script first used by the Phoenicians some three thousand years ago. Before rag paper, before parchment made from animal skins, the people of Mesopotamia used a stylus to draw patterns of wedges on soft clay tablets. Some of the tablets were baked on purpose, and others survived to this day through calamity, as fires in the libraries hardened the tablets. A combination of durable material, dry climate, and lack of disturbance allowed far more written material to survive from the Bronze and Iron Age Middle East than survives from Europe or the Mediterranean in all the time before the early Middle Ages.
For this year’s gift-giving season, I had a very short list. Apart from some books, there wasn’t much in the way of stuff that I desired to add to the general clutter. Knowing that I would be pestered by the relatives for more, I decided to ask for something quite unusual – a replica of a cuneiform tablet.
This one, composed of lines in three columns on the front and back, was written in the ballpark of the 24th century BCE. During a subsequent period of neglect, this tablet and tens of thousands of others were buried in rubble, where they remained until excavated at Nippur in the late 19th century. The face of it is badly damaged, but the reverse is legible. I plan on giving this replica relic a place of prominence in my library, to remind myself and all visitors that the impulse to gather and preserve knowledge has been around a long, long time.
I will end here with a quote from Carl Sagan, who eloquently expressed his thoughts on the topic some forty years ago:
“What an astonishing thing a book is. It’s a flat object made from a tree with flexible parts on which are imprinted lots of funny dark squiggles. But one glance at it and you’re inside the mind of another person, maybe somebody dead for thousands of years. Across the millennia, an author is speaking clearly and silently inside your head, directly to you. Writing is perhaps the greatest of human inventions, binding together people who never knew each other, citizens of distant epochs. Books break the shackles of time. A book is proof that humans are capable of working magic.”
Fires in North America, South America, Australia. Earthquakes in the Middle East. Tornado outbreaks, and a wall of wind across the Midwest. The Atlantic hurricane season that wouldn’t quit. Locusts in Africa. Final numbers aren’t in, but this may be the hottest year on record – the other contender being the previous election year.
Ah, yes, politics. The fear that if the other side wins, they’ll do unto us what we’ve been doing unto them. A UK that wants to leave Europe but keep their room. A Middle East that boils, bleeds, and literally explodes. A US where protests against violence are met with violence, a wannabe dictator rails against the rules of order when they don’t work for him, and a face mask (or lack thereof) is seen as ideological gang colors.
And the big story is the greatest pandemic in a century. Aided by a populace trained to distrust any inconvenient recommendations from scientists, this particularly virulent virus has killed over a million and a half people around the globe – including some 313,000 in the United States since February. I work with people who think the “China Virus” is a political stunt, and refuse to wear masks. I know hospital workers who have been in crisis mode for months, trying to keep the careless, the unlucky and the nonbelieving from drowning in their own fluids.
While all the humans strut and fret our hour upon the stage, fighting grand battles for the hyperbolic cause du jour, nature progresses on its own cycles. Foliage colors and falls. Frost rims the dried leaves. And the sun’s daily arc dips a little closer to the southern horizon with each pass, unregarded by most. In a couple of days, the sun’s apparent southward journey halts, as it has at this point in the planet’s revolution around the sun since before there was anything alive on this rock to notice. There’s no way to know when our ancestors first noted the daystar’s annual wander north-to-south and back again, or what significance they attached to that drifting. There are archeological clues of stone-age observances on several continents, but what it meant to the people when the sun finally stood still (solstice being the Latin term meaning “sun standing”) may never be revealed.
As natural cycles go, the winter solstice is as good a time as any to mark the turning of the year. The cultural end of the year is marked on the 31st. The 10 days (give or take, depending on the year) in between are a liminal time – a threshold from an end to a new beginning, and not quite “normal” time. Everyone is poised for Christmas, then having Christmas; they stay buoyed until New Year’s Eve, and start coming down on the first of January. January 2nd is a time of forgetting resolutions, paying holiday bills, writing the wrong dates on documents, and looking towards a distant spring. I consider that whole ten-day span of time as the ending of the year, like a held breath before the plunge.
Take some time during this drawn-out space between years. See how the natural world deals with this time of year; notice how people acknowledge, or work around, the state of the environment at this time.
If I became governor of Hell, I would reserve a special room for Eugene Schieffelin and his minions in the American Acclimatization Society, the idiots who thought it would be nice for all the birds mentioned in Shakespeare’s works to be represented in America. In the 1890s he released around 60 starlings into Central Park. Because of one single line in Henry IV, an estimated 200 million of the aggressive little bastards currently occupy North America, wreaking havoc on native bird populations.
Why such antipathy for these morons? Why not put in Etienne Trouvelot (who introduced Gypsy Moths) or whoever shipped the nursery stock that carried the fungus that annihilated the American Chestnut? Because Schieffelin and his cronies went out of their way to perpetrate their crime, and for a silly reason.
Humans have been bringing pests from one land to another since they first commenced to roam, and many native species and a few ecosystems have paid the price. Many introductions are completely unintentional, from fire ants to zebra mussels. Some seemed like a good idea at the time, like kudzu or cogon grass. But Schieffelin’s crowd were whimsically Anglophilic.
Dreamers are fine. But sometimes their dreams can become nightmares.
Watch when the tide comes in. Waves rush up the shore, then fall back. Every advance is a little higher, but then there is the ebb and momentary lull before the next advance.
The latter end of spring is like that, forward two steps and back one. Temperatures have been, by and large, nice – 60s F, occasionally 70s. Last week, there were a couple of 80s. Yesterday’s high broke 90, and now it is dipping down to 60 again. I love the cool and know that the hot will be here sooner than I’d like. The trick is appreciating the good weather whenever I can and as much as possible.
My fallow field and unmowed roadside are flecked with purple now. Verbena rigida is known by many common names, including slender vervain, tuberous vervain, and sandpaper verbena. It is a South American plant, more tropical than temperate, but it seems pretty happy in South Georgia, blooming from spring to fall. It is considered to be invasive in some locales. However, with pollinators in need, I’m not going to begrudge its presence on my property.
What is a shrine? Whether a box, an alcove, or a demarcated spot, it is a sacred site dedicated to a person, deity, idea, or something else worth veneration or remembrance. It could be a saint, or a town’s war dead; in a less formal sense, it may be a shrine to a beloved actor’s career or a sweetheart. Those crosses on roadsides marking sites of tragedy qualify as shrines.
Shrines may be simple and plain, or cluttered with memorabilia or objects of significance. What they all have in common is that they are sacred space, in the sense that they are set apart from the normal (“setting apart” or “dedicating to” being among the translations of the Latin sacrare). In this sense, even a non-religious or civic shrine is made sacred, for it is set apart and dedicated for some form of awe, respect, remembrance, or veneration.
What sort of shrine would a naturalist have? I’ll tell you what this one has.
A section of valuable bookshelf space has been set aside (literally made sacred) – not for worship, but as a space to make me aware of greater things; to remind me of moments I spent in wild places, and to center myself.
The large, flat stone that makes the central “altar” of my shrine is from a massive wall on my family farm – a structure I consider sacred in its own right. A century or two ago it was a bridge or perhaps a mill dam. The lichen-spotted rock rests on a bed of leaves taken from the same location. A number of objects lie on or about the stone, each with its own history.
First is a small dumbbell-shaped white rock, whose chipped surface reveals a dark gray interior. This is a flint nodule of a sort common to southern England. This particular one was collected on a hill overlooking an early Saxon settlement of the 6th or 7th century. Both the flint and the stone on which it sits likely felt a horse’s hoof at some point, for that form of powered transport changed little in the intervening 11 or so centuries. That only began to change in my Grandfather’s lifetime.
Also atop the altar are four projectile points, all found by me personally in my Coastal Plain wanderings. My archeologist friend reckons the newest is from the Late Archaic cultural period – maybe 3,500 years old, give or take. At that time, across the ocean, Egypt was pulling back together its fractured remains into a New Kingdom, while the Bronze Age Britons were putting their finishing touches on Stonehenge.
But wait, the oldest of these points is from the Early Archaic, which means it was shaped by a hunter-gatherer some 9,000 years ago. To view the sea, the maker would have had a longer walk in his day than I in mine, for the sea levels were a fair bit lower. In Europe, Mesolithic hunters roamed lands that have long since drowned in what is now the North Sea. The first farms were already established in the Fertile Crescent, the Nile, and the Indus Valley; the peoples of Mesoamerica had begun cultivating squash, but domestication of maize was still a millennium or so in the future.
The most powerful technology humanity has ever produced did not exist in the Early Archaic. Before alphabet, before ideogram, before any known proto-writing, a young hunter learned the vital skill of knapping points at the knee of an elder. In time, the youth would gain finesse and wisdom at this craft and would in turn teach others with few alterations in form.
Take a moment to think about the span of time. Four artifacts, variants of the same technology, collected in a radius of 50 miles, and spanning half a dozen millennia. Why, the device I write this on largely functions on technology younger than I am, powered by technology that was only commercially available in my great grandfather’s time. For that matter, the earliest known writing was developed some 6,000 years ago – the timespan covered by those four spear points.
But let’s move on to the next temporal talisman: a molar from a cave bear, gifted to me by a friend. The provenance is unknown beyond Europe or Asia, but the species probably died out around 24,000 years ago. Perhaps the owner of this tooth cast a wary eye at early humans, or Neanderthals; it is possible it never scented a two-legged rival in all its life.
Across the stone lies a sliver of mammoth tusk, from the dwindling permafrost of the northern tundra of what is now Alaska. These ancient elephants faltered and faded away as the ice drew back around 12,000 years ago; whether their demise was solely from a changing world or whether humans helped them along is a point of contention.
Now we take an accelerated leap back in time – several epochs, each measured in many millions of years. I won’t ask the reader to imagine such a span of time as a million years; you truly can’t. Instead, I will say to imagine that you can imagine, and that will have to suffice. Leaning against the base stone is a permineralized bone, the fossilized rib of a Triceratops; this fragment of what would have been a 30-foot-long beast was also a gift to me. Though this second-most-famous of dinosaurs lived in the final days of the dinosaurs, those “final days” were long indeed: the genus arose some 2 million years before it (and most other dinosaurians) died out.
And then there’s the spiraled shell of an ammonite. This is the only one of the various items thus far that I bought (at a fundraiser, in my defense). The varied orders of these cephalopods appeared on the scene in the Devonian Period and persisted until the general extinction at the end of the Cretaceous Period – a span of over 300 million years, half again as long as mammals have existed.
Remember the orders of magnitude between the flint and the spearheads, or between the mammoth and the dinosaur? Here we go again, as close as I can track life to the earth’s beginning. Four large, roughly rounded marbles with brown and black striations represent those most ancient of times. These are stromatolites, the fossilized remains of the silty structures left by colonies of cyanobacteria. These microbes arose as far back as… are you ready?… 3.5 BILLION years ago. Their peak was 1.25 billion years ago, after which they declined as the world grew more hospitable to things that would graze on cyanobacteria. It should astound you to realize that these microbes persist in hostile environments today, so you can find active stromatolites in Australia and in South and Central America. I would have loved to have found these specimens in my wanderings, but I bought them among some rock collections in Cambridge, Massachusetts.
There are other history lessons in this shrine beyond the vastness of the span of life. Here is a leaf from an American chestnut – a young sapling ensconced on a University campus and not in imminent danger of blight. But this pressed leaf is a reminder of the billions of chestnut trees which succumbed, in less than half a century, to a fungus blight transported carelessly to our shores in nursery stock. It represents only one of many species we have negatively impacted in our time in this land and on this earth.
Two more palm-sized stones are worth noting. One was smoothed by the milky waters of the Hoh River on the Olympic Peninsula of Washington State. The other gained its rounded shape rubbing against a billion of its brothers under the waves on Jasper Beach, Maine. With a stone in each hand I can bookend the Lower Forty-Eight States.
This small sacred space is a place to remember moments spent in wild places, and the goodwill of friends. It is a place to remind myself of the impermanence of lives but the tenacity and adaptability of Life. It is a focus for contemplation and meditation, reminding me to set myself apart from the modern world for a few moments to think about the larger world, both what is tangible and what is beyond human comprehension.
Consider what you have enshrined in your life, and why.
As I write this, white blossoms are popping out behind my house — an old nemesis sneers at me. The Rogue Pear.
The callery pear (Pyrus calleryana) is an Asian native brought to the US in the 1960s. Why? The pears are small, round, and inedible. The flowers fill the air with a sickly-sweet odor. The trees are densely limby and prone to splitting in bad weather. Like many fast-growing trees, the callery pear is short-lived, lasting only a couple of decades or so. So why has it turned up in every doctor’s office park and subdivision? Three things: beautiful foliage – deep green in summer, turning blood-red or wine-dark in the autumn; an explosion of flowers late in winter; and quick growth to a bushy, symmetrical silhouette.
Horticulturalists created a number of cultivars, the best-known being the ‘Bradford’. We removed the thorns, we straightened the forms. As a useful byproduct, these cultivars couldn’t reproduce. You plant Bradfords, and they stay where you put them.
But, in our great and unmatched wisdom, we kept tinkering. Each cultivar had slightly different properties. Different colors, stronger limbs. And then it happens. One cultivar is used to landscape a new strip mall, and a different one dresses up an office park down the street. Some local bees visit one and then the other. It turns out that different cultivars can fertilize each other. And these new seedlings exhibit the attributes of the original, wild pear: able to grow on a wide range of soils, able to seed prolifically, and armed with thorns that can punch through a truck tire. I call them rogue pears, when I don’t use stronger adjectives for them.
It’s a contagion the scope of which you aren’t likely to notice until late winter. From February until April, these innocuous green trees suddenly blaze white in floral profusion. My corner of the county is pretty well infested; a mere 10 years after going fallow, the neighbor’s field is a young forest, with 8 out of 10 trees being pears. But I didn’t have any inkling of how widespread the problem was until I was a couple of hours away, driving on a highway skirting the Fall Line. In the pine plantations on either side, the midstory was packed with pears bedecked with their white blossoms. Some quick research showed the rogue pear has popped up in most states east of the Mississippi, and it has a foothold in several western states as well.
Everyone knows about kudzu. You may have heard of Chinaberry or privet or tree-of-heaven. Now that pear is on your radar, maybe you’ll start seeing it come the end of winter.
Maybe, hopefully, you’ll choose native trees for your next landscaping project.