Who was St. Patrick? The answer involves pirates, snakes, and revelatory dreams.
St. Patrick is one of the most widely known Christian saints and the patron saint of Ireland. But not much is definitively known about his life. He was not the one who introduced Christianity to Ireland, as is sometimes claimed. Nor does he have much to do with luck or leprechauns. And in fact, he wasn’t even Irish!
Patrick was born in Roman Britain near the end of the fourth century. His family was either of indigenous Celtic descent or from Rome. He signed his name Patricius in Latin, but according to some accounts his birth name was Maewyn Succat. Although Patrick’s family wasn’t particularly religious, his father became a Christian deacon in order to benefit from tax incentives.
When Patrick was sixteen, he was abducted by Irish pirates who attacked and raided his family’s estate. They took him to Ireland and imprisoned him for six years in County Mayo near Killala, where he worked as a shepherd. Alone and separated from his family and his country, Patrick turned inward and found solace in his religion. He began to dream of converting the Irish people to Christianity. He continued to receive revelation through dreams and, after six years as a prisoner, heard God’s voice telling him it was time to leave Ireland. He walked 200 miles to the Irish coast and escaped to Britain.
Later, he had another revelation in a dream, in which he saw a figure called Victoricus who offered him a letter titled “The Voice of the Irish.” In the dream, Patrick was deeply moved by a company of Irish people imploring him to return, and he felt called to go back to Ireland as a missionary. He began training to become a priest and spent 15 years in religious study before he was ordained. Despite those long years of preparation, he still felt inadequate for the task that lay ahead of him. But once he had embarked for Ireland, he was filled with confidence and determination for his cause—to minister to Christians already there and to convert the rest of the Irish. (Though he did not introduce Christianity to Ireland, he played the largest role in converting the Irish people to it.) Patrick had great success and baptized and confirmed many people into the Christian faith.
He dealt carefully with local authorities and non-Christians, but he was cast into prison at least once and was in constant peril of martyrdom. Those he converted, too, were at times in danger. In his Letter to Coroticus, Patrick denounces the mistreatment of Irish Christians at British hands and bids farewell to those who had died. In his autobiography, Confessio, he humbly pours forth thanks to his Maker for his success in helping the Irish become “people of God.” One scholar stated that “The moral and spiritual greatness of the man shines through every stumbling sentence of his ‘rustic’ Latin.”
Patrick was able to reach so many people in large part due to his familiarity with Irish language and culture. Instead of attempting to wipe out traditional Irish beliefs, Patrick instead incorporated elements of Irish ritual into Christianity. The Irish celebrated their gods with fire, so Easter celebrations included a bonfire. The Irish used the symbol of the sun in their native worship practices, so Patrick superimposed an image of the sun onto the cross, creating what is known as the Celtic cross. This helped the Irish converts understand the underlying similarities between the light of the sun and the light of Christ and incorporate Christianity more naturally into their lives. According to legend, Patrick used the three-leafed shamrock to explain the concept of the Trinity—and though this story is not based in fact, it illustrates St. Patrick’s creativity, passion, and understanding of the Irish people.
In a country with a rich tradition of oral storytelling, Patrick’s life and mission were expanded and exaggerated over the centuries. Other legends hold that Patrick drove all the snakes in Ireland into the sea to their utter destruction, and that is why there are no snakes on the Emerald Isle. One story reports that Patrick prayed for food for a starving group of sailors traveling over land, and a herd of pigs miraculously appeared to sustain them.
In the later years of his life, Patrick retired to Saul, the site of his first church, and died on March 17, 460 AD. He was never formally canonized by the Catholic Church simply because there was no canonization process during the first millennium. Rather, he was almost by default proclaimed a saint due to his widespread popularity. He is widely known as the patron saint of Ireland, and Irish Catholics seek his protection and intercession.
The Feast of Saint Patrick is held on March 17 as a religious celebration. It was traditionally a solemn affair in Ireland, a day of holy obligation and silent prayer rather than rollicking and revelry. Until the late twentieth century, St. Patrick’s Day was actually more widely celebrated in the Irish diaspora than in Ireland itself—most prominently in North America and Australia, where there are large numbers of people with Irish heritage. Starting in the 1700s, Irish immigrants in America began to use St. Patrick’s Day to celebrate Irish culture and heritage in general. Parades and parties, music and dancing came to characterize a celebration of Irish pride, and outside Ireland, St. Patrick’s Day has focused mainly on the more secular aspects of Irish culture. Many people take this as a day to wear as much green as humanly possible, hunt for four-leaf clovers, and drink green beer. Ireland itself only began to incorporate these types of festivities in the 1970s.
St. Patrick has become nearly synonymous with Ireland. As we celebrate St. Patrick’s Day and honor Irish heritage, let’s also incorporate his humility and deep respect for Irish culture without resorting to stereotypes.
We celebrate Valentine’s Day in honor of Saint Valentine every year on February 14th—but who is the saint behind the holiday?
The answer is uncertain, really, but it involves miraculous healing, secret marriage ceremonies, and beekeeping.
The Many Saint Valentines
Various accounts of Saint Valentine seem to overlap and intersect with one another. There are about 500 recorded stories about a saint named Valentine. There were likely two Christian martyrs who made the name famous, and the legendary stories associated with them were merged into one. Historians acknowledge that some of these accounts may have little actual historical basis, while others may be exaggerated versions of real events.
And what’s more, there were actually about twelve saints with the name Valentine. Valentinus, the Latin word for worthy, strong, or powerful, was a common name between the second and eighth centuries.
The Roman Priest Who Secretly Performed Marriages
In one account, Valentine was a priest in third-century Rome during the reign of Emperor Claudius II. Claudius outlawed marriage for young men, reasoning that men with no wives or children would be a better asset for the Roman military. Valentine saw the injustice of this decree and began to perform marriages for young Christian couples in secret. These clandestine ceremonies meant that the married men could not be pressed into military service, thereby depleting Claudius’s potential forces. Was Valentine trying to show the emperor that not even the military can conquer love—or is this the story of how Valentine aided in draft-dodging to weaken an empire that was hostile to his people?
Christians were a small and persecuted minority in the Roman Empire during this time. Valentine also aided many of his fellow believers in escaping the harsh sentences they faced in Roman prisons, where they were tortured and beaten.
These were serious crimes against the Roman Empire, and when Claudius found out, he ordered that Valentine be put to death. Claudius offered one way out: Valentine could renounce his faith. The priest refused, choosing rather to suffer his own torture and imprisonment, and he was beheaded on February 14. (The year is uncertain, but it was somewhere around 269–280.) Valentine’s compassion, integrity, and heroic actions in the face of persecution sealed his fate as a Christian martyr.
Before he was killed, though, he fell in love with a young woman, possibly the jailor’s daughter, who visited him during his confinement. He sent her a note signed, “From your Valentine”—meaning that the first valentine greeting was from the saint himself.
The Bishop of Terni Who Converted His Captor and Healed the Blind
Another account from the same period places Valentine as the Bishop of Terni. Valentine was placed under arrest in the house of Judge Asterius for the crime of attempting to convert people to Christianity. The judge and the bishop engaged in debate over religion to pass the time. As Valentine continued to profess his faith to his captor, Asterius decided to put him to the test. He presented Valentine with his daughter, who was blind, and, with guarded hope and desperation, asked him to heal her. Valentine placed his hands on her eyes and restored her sight. Judge Asterius, in awe, broke all the idols in his house and fasted for three days, and then he and his entire 44-person household were baptized as Christians. He also ordered the release of all Christian prisoners nearby.
Valentine was later imprisoned again, and this time, he was sentenced to death by beheading on February 14. Before he died, he sent a letter to Asterius’s daughter whom he had healed, signed “Your Valentine.” (Or perhaps she was the one who sent him the letter.) Sound familiar?
Both Valentines are said to have been buried at Ponte Milvio along the Via Flaminia, and since the year 496, a feast has been held in honor of Saint Valentine on February 14. However, the Roman Catholic Church removed Saint Valentine from its calendar in 1969 due to the lack of reliable information on his life.
Today, Valentine is still honored as the patron saint of love, engaged couples, and happy marriages . . . as well as epilepsy, beekeeping, fainting, and the plague.
It’s not clear how all these associations came about. But let’s talk about beekeeping for a moment. One of Saint Valentine’s duties in the afterlife is to maintain the sweetness of honey and to ensure that bee colonies flourish. Bees have long been associated with goddesses of love, like Aphrodite, Venus, or Gwen. Myths from ancient Greece and Rome portray bees as having romantic effects on humans. Honey has been used as an aphrodisiac and medicinal agent for thousands of years, and its sweet taste represents love and healing. Beekeepers, then, can represent the guardians of love in family and marital relationships.
All this is to say that Valentine watches over the protectors of love. But also the plague.
So Why Do We Celebrate Valentine’s Day?
Our friend Geoffrey Chaucer, author of The Canterbury Tales, was the first to associate the feast day of Saint Valentine with romantic love.
In his poem “Parlement of Foules,” Chaucer wrote,
For this was on seynt Volantynys day
Whan euery foul comyth there to chese his make.
Those lovebirds were looking for a match on Valentine’s Day. Chaucer was perhaps referring to the feast day of St. Valentine of Genoa on May 3, which is when birds are more likely to mate. But people took the 14th of February and ran with it.
Medieval folks went crazy for stories of courtly love, especially stories of forbidden or secret relationships. Young lovers now called upon Saint Valentine for a blessing on their relationships, finding romance and excitement in the tales of secret marriages performed under his hand. In France, February 14th became a day of feasting, singing, and dancing in honor of romantic love. The French and the English penned love letters to their “valentines,” and Shakespeare’s Ophelia pines,
To-morrow is Saint Valentine’s day,
All in the morning betime,
And I a maid at your window,
To be your Valentine.
The legendary name of Valentine is now shorthand for romantic love and all the excitement and disappointment, butterflies and heartbreak that come with it. But perhaps we can remember a different type of Valentine—the type who is willing to sacrifice his own life out of a deep commitment to his faith and love for others, the type who courageously performs acts of compassion during times of oppression.
Why do we say “once upon a time” when beginning a story? The answer involves great time, great distance, and great imagination.
“Once upon a time” is a formulaic beginning that primes the listener for a story and frames the narrative to come. It is vague and imprecise on purpose—it’s a signal that the story is fictional and an invitation to readers or listeners to open their imagination. But if you really think about it, this common phrase doesn’t make a whole lot of sense. “Once” means something happened, but how can that something be “upon” a time? Well, until fairly recently, “upon” was often attached to time-related phrases where we might use “at” or “on” today.
History of “Once Upon a Time”
Many different variants of this archaic-sounding phrase are believed to have been part of English storytelling since before Chaucer’s day. By his time, lines like “once on a time” or “once upon a day” had already become conventional, as evidenced by their inclusion in enduring literary works.
The first recorded use of such a phrase, in this case “Onys uppon a day,” was in 1380 in the story Sir Ferumbras. This was a Middle English romantic poem that was part of a collection of literature on the history of Charlemagne’s France.
Geoffrey Chaucer also used the similar phrase “once on a time” in The Canterbury Tales, published in 1385.
Yet another variation is found in Miles Coverdale’s 1535 translation of the book of Job: “Now upon a time . . . the servants of God came and stood before the Lord.”
Finally, the fully formed phrase “once upon a time” has been around in oral and written form since about 1600. It’s also likely that storytelling conventions from French, German, and Scandinavian fairy tales carried over into English translations. These tales often have similar first lines in their respective languages, such as “there was once” or “once” or “there was a time.”
Time and Space: The Power of Story
“Once upon a time” conveys distance and time far removed from here and now. The power of story is that it allows us to imagine and process things outside the constraints of our everyday lives. Maria Konnikova explains that stories allow us to reflect on the world in a nonthreatening way through psychological distancing. The distancing is actually what helps us connect to the story, discern patterns, and weave together pieces of ourselves that would otherwise remain without tangible expression. At a distance, we have the freedom to engage in fantasy and reflection to comprehend more about reality than we could when we are in the thick of real life. We can learn to empathize, play, imagine, and abstract. We then translate the truth of a story into the language of our reality to overcome real problems.
Think about Star Wars—a long time ago in a galaxy far, far away is a great place to explore the idea of having both light and dark within oneself, the larger battle of good vs. evil, and the mystery of a higher power or force greater than humanity.
Back When Tigers Smoked: Beginning a Story in Other Languages
Other languages, too, have phrases that signal the beginning of a story and create space in terms of time and distance that can be filled in through imagination. Let’s take a look at how stories begin around the world.
Many opening phrases simply mean “a long time ago” or “it’s an old story.” Romance and Germanic languages typically use some variation of “there was once” or “once it happened”—creating a distance in time.
In some languages, such as Russian, the phrase used to begin a story translates to “in some kingdom” or “in some land.” Czech uses an interesting phrase that means “Beyond seven mountain ranges, beyond seven rivers.” Some other Eastern European languages have similar phrases that convey great distance, like Lithuanian: “Beyond nine seas, beyond nine lagoons.”
Some simply signal that we are indeed about to hear a story. In Hausa, a West African language, a narrative begins, “A story, a story. Let it go, let it come.”
Some openers draw on the value of oral traditions, like the Iraqw opening line spoken in Tanzania and Kenya: “I remember something that our father told me and that is this.” Likewise, in Chile, one classic formula is “Listen to tell it, and tell it to teach it.”
And some openings are designed to draw listeners in by engaging a fantastical, mythical, far-off world. A traditional Turkish opening phrase translates to “Once there was, and once there wasn’t. In the long-distant days of yore, when haystacks winnowed sieves, when genies played jereed in the old bathhouse, [when] fleas were barbers, [when] camels were town criers, [and when] I softly rocked my baby grandmother to sleep in her creaking cradle, there was/lived, in an exotic land, far, far away, a/an . . .”
In Korean, one way to introduce a story is to say, “Back when tigers used to smoke [tobacco] . . .” This one is worth digging into. The tiger is a defining symbol of Korea and features in the Korean origin myth as an animal that had the potential to become human. A tigress and a she-bear who lived together in a cave both wanted to become human. The two animals were promised by a heavenly prince that if they stayed in their cave for 100 days and ate only mugwort and garlic, they would emerge as humans. The tiger lost patience and ran out into the forest, while the bear persisted and became a woman who gave birth to Tangun, the legendary founding king of Korea.
As a prominent figure in Korean folklore, the tiger is a symbol of strength, power, and protection but is also often the one who is outwitted or who becomes the butt of a joke. The dual symbol of the tiger is both revered and lovingly ridiculed.
(As an aside, smoking has been widely practiced in Korea since the introduction of tobacco in the early 1600s. Everyone, old and young, male and female, rich and poor, smoked tobacco, until the late 1800s when it fell out of fashion for women. Growing tobacco boosted the economy, tobacco was seen as medicinal, and smoking became a social activity. Smoking has decreased in South Korea in recent decades but is still more prevalent than in other parts of the world.)
But why, in the phrase used to begin a story, are tigers the ones smoking? No one knows exactly why. It may simply be an indication of a highly fantastical story, given that tigers don’t and can’t smoke (at least not of their own volition). It’s a signal that a story is about to take place, drawing on the rich representation of the tiger in stories that people are already familiar with.
What are brownie points? The answer involves military slang, wartime food rationing, and the Girl Scouts.
“You might earn some brownie points if you shovel snow for your next-door neighbor!”
“Turning in your report early will get you some brownie points for sure.”
No, this doesn’t mean that someone will bake you a gooey chocolate confection if you rack up enough points. Nor does it mean that a Scottish household fairy will magically milk your cows and sweep your barn during the night.
The Oxford English Dictionary defines a brownie point as “a notional credit for an achievement; favour in the eyes of another, esp. gained by sycophantic or servile behaviour.” This definition emphasizes the negative connotation of pandering to someone to win their favor, but “brownie points” can also simply mean imaginary credit for doing a good deed or praise for performing a service for someone.
The OED indicates that the term “brownie point” may be most closely tied to the term “brown nose,” which similarly means to ingratiate oneself with someone by being excessively attentive or eager to help. “Brown nosing” originated as military slang and is documented as early as 1938. The general notion was similar to many other obscenities—the brown noser was kissing someone’s backside. You can guess where the brown came from.
J.E. Lighter’s Random House Historical Dictionary of American Slang states that “brownie points” originated from a rewards system used by the Brownies tier of the Girl Guides (UK)/Girl Scouts (USA) program. While “brown nose” is a more likely precursor to “brownie points,” the term surely owes some of its popularity to the founder of international scouting.
In 1914, Lord Robert Baden-Powell organized the youngest age group of girls in the Girl Guides program and called them the Rosebuds. The group was run first by his sister, Agnes, and later by his wife, Olave. After hearing that the 8-to-11-year-olds in the Rosebuds group disliked the name, Lord Baden-Powell renamed them Brownies after an 1870 story called “The Brownies” by Juliana Horatia Ewing.
Brownies were popular in children’s literature at the time the Brownie Girl Guides were founded. In “The Brownies,” two children named Tommy and Betty learn that it’s better to be helpful and hardworking like brownies rather than lazy like boggarts. Ewing’s tale draws upon brownies in English and Scottish folklore, where they are described as a kind of fairy that dwelt in homes and awoke at night to clean, do chores, and sometimes pull light-hearted pranks on lazy servants. They were rewarded with a bit of cream and bread. Boggarts, on the other hand, were malevolent household spirits that stirred up mischief, made things disappear, and caused milk to sour. A brownie could turn into a boggart if the household occupants offended it.
True to their namesake, Girl Guides/Scouts in the Brownies section (now encompassing ages 7–11) are encouraged to be kind and helpful in their communities. Brownies receive badges as a reward for achieving various interest-related and scouting-related tasks or for doing good deeds. The term “Brownie points” with a capital B has been associated with Girl Scout achievement badges. However, there has never necessarily been a “point system” to earn badges in Brownies or to advance up the ranks in the Girl Scout organization, and the term is not used within the organization.
The Brownie Exchange
Another theory dates to World War II, when citizens were given ration points in various colors that could be exchanged for food, based on availability. Red and brown points were used to buy meat and fats, while blue points purchased canned and bottled foods. It’s easy to see how “brown points” could have morphed into “brownie points.”
Other theories as to the origin of “brownie points” abound:
In the 1930s, brown vouchers called “brownies” were awarded to delivery boys who carried the Saturday Evening Post, Ladies’ Home Journal, and Country Gentleman; the boys could exchange the vouchers for items from a catalog.
In the late 1800s and early 1900s, G. R. Brown, general superintendent of the Fall Brook Railway in New York and Pennsylvania, gave “brownie points” as a system of merits and demerits. This was copied by other railways, but it was mainly used as a term for negative actions by employees.
A camera club started in 1900 taught children how to use the Brownie box camera—but with no system of points in sight, this seems like an unlikely origin.
With the most likely origin stemming from a reference to excrement, it’s no wonder that we’ve come up with many other explanations for “brownie points”—but a variation on “brown-nosing” does seem to make the most sense.
Given all of this background, it’s surprising that the term “brownie points” didn’t come into existence earlier in the twentieth century.
A 1944 issue of American Speech recorded “brownie points” as a schoolyard taunt for goody-two-shoes students who answered all the teacher’s questions.
The first specific documented use was in 1951 in the Los Angeles Times. In an article called “Brownie Points—The New Measure of a Husband,” Marvin Miles wrote of brownie points as a means of earning favor with his wife. He goes to great lengths to explain the unfamiliar new term:
I first heard about them [brownie points] when the chap standing next to me in the elevator pulled a letter from his pocket, looked at it in dismay and muttered “More lost brownie points.”
Figuring him for an eccentric, I forgot about them until that evening when one of the boys looked soulfully into the foam brimming his glass and said solemnly:
“I should have been home two hours ago . . . I’ll never catch up on my brownie points.”
Brownie points! What esoteric cult was this that immersed men in pixie mathematics?
“You don’t know about brownie points? It’s a way of figuring where you stand with the little woman – favor or disfavor. Started way back in the days of the leprechauns, I suppose, long before there were any doghouses.”
Miles didn’t exactly know where it came from, either, but his article captured an unfortunate transactional dynamic between wives and husbands. A respectable husband in Miles’s day was supposed to remember birthdays and anniversaries, remember to complete tasks his wife had asked him to do, and get home on time—and it seems like his friend could never get ahead.
Since then, brownie points have typically been used in a more positive light, as an imaginary reward for good deeds rather than a dauntingly high bar to achieve or a pejorative term. Probably because they remind us of chocolate.
Miles, Marvin. “Brownie Points—The New Measure of a Husband.” Los Angeles Times. March 15, 1951, p. 41.
O Little Town of Bethlehem

Many Christmas songs are written by those who imagine what Bethlehem might have been like on the night Christ was born. “O Little Town of Bethlehem” was inspired by a Christmas Eve service actually held in Bethlehem.
Phillips Brooks was highly esteemed in matters of both faith and intellect, as an Episcopalian preacher who had earned a doctorate of divinity from Oxford, taught at Yale, and publicly advocated against slavery during the Civil War. He was known to be quite reserved and found an outlet to express his feelings through writing verse. Hymns were a major part of his spiritual upbringing, as his parents had each child in the family learn a new hymn each Sunday and recite it for the family. Wrote one biographer, “These hymns Phillips carried in his mind as so much mental and spiritual furniture, or as germs of thought; they often reappeared in his sermons, as he became aware of some deeper meaning in the old familiar lines.” The biographer also noted that “the language of sacred hymns learned in childhood and forever ringing in his ears” was a means through which “he had felt the touch of Christ.”
In 1865, Brooks embarked on a pilgrimage to the Holy Land. On Christmas Eve, he rode on horseback from Jerusalem to Bethlehem and visited the fields where, tradition holds, the shepherds saw the star. He participated in a five-hour-long Christmas Eve service at the Church of the Nativity and was profoundly moved by the experience.
In 1868, with the memory of Christmas in Bethlehem still “singing in [his soul],” Brooks channeled his feelings about the experience into a song for the Christmas service at the Church of the Holy Trinity in Philadelphia, where he was the rector. He asked organist Lewis Redner to set the words to music. Redner struggled to come up with just the right tune. On the night before the Christmas service at their church, Redner wrote, “I was roused from sleep late in the night hearing an angel-strain whispering in my ear, and seizing a piece of music paper I jotted down the treble of the tune as we now have it, and on Sunday morning before going to church I filled in the harmony.” The children’s choir performed the song, and Brooks and Redner thought that would be the end of it.
But the owner of a bookstore a few streets down began to print the carol on leaflets for sale. In 1874, a Reverend Huntington of All Saints’ Church in Massachusetts asked for permission to reprint the song in a Sunday school hymn book and named the tune “St. Louis.” The song gradually gained recognition and made its way into official hymn books of many different denominations.
Jingle Bells

The town of Medford, Massachusetts, had an annual sleigh race around Thanksgiving. In 1850, James Lord Pierpont wrote a song to commemorate the Thanksgiving tradition and published it in 1857 as “One Horse Open Sleigh.” Reportedly, he penned the song in a tavern that was home to the one piano in town and may have lifted some lines from a minstrel song by Stephen Foster.
It actually has two additional verses that you may not know, telling the story of a young couple on a sleigh ride who tip their sleigh into a snowdrift (and they’re still “laughing all the way!”).
The song was not widely popular at first and remained a local phenomenon, but the phonograph record changed that. The song was first recorded on an Edison cylinder in 1898. Drawing upon the cold weather imagery, musicians and choirs began to incorporate it into their rotations of Christmas songs, and it eventually became closely tied to the Christmas season rather than Thanksgiving. Finally, the radio propelled “Jingle Bells” to be consistently one of the most popular Christmas songs in the country.
Rudolph the Red-Nosed Reindeer

In 1939, the Great Depression was beginning to fade, but World War II was lurking on the horizon. The Chicago-based Montgomery Ward department store was looking to cut costs for its annual holiday giveaway, which featured children’s coloring books, while still spreading Christmas cheer in an uncertain environment. The marketing department was tasked to come up with something, and employee Robert L. May wrote an original Christmas storybook with his four-year-old daughter in mind.
According to History.com, “As he peered out at the thick fog that had drifted off Lake Michigan, May came up with the idea of a misfit reindeer ostracized because of his luminescent nose, who used his physical abnormality to guide Santa’s sleigh and save Christmas.”
May’s brother-in-law, Johnny Marks, was a professional songwriter and later wrote hits like “Rockin’ Around the Christmas Tree” and “Holly Jolly Christmas.” Nearly ten years after May wrote his storybook, Marks adapted the story into a Christmas song. Gene Autry, better known as the “Singing Cowboy,” picked up the song in 1949, and his recording sold over 2 million units in a year. This made it the second most successful Christmas record ever, just after Bing Crosby’s “White Christmas.”
Since May had written the story while on the job at Montgomery Ward, the department store owned the copyright to all things Rudolph. May’s wife had died of cancer in 1939, leaving him a single parent with crippling medical debt. In 1947, the president of Montgomery Ward signed over 100% of the “Rudolph” copyright to May, and with the royalties, he was able to pay off his debt and live comfortably for the rest of his life.
Once in Royal David’s City
Cecil Frances Humphreys Alexander was a prolific hymnwriter from a young age. By the time she was 22, several of her texts had been published in the hymnbook of the Church of Ireland. She was born in Dublin in 1818, and her influence spread all over Ireland as she accompanied her husband, who was a bishop of the Church of Ireland and later an archbishop, on his travels. Alexander took every opportunity to engage in the ministry of the church and work with children. She also gave much of her life to charity work and social causes.
Her poem “Once in Royal David’s City” first appeared in the collection Hymns for Little Children in 1848. The goal of this book was to explain the Apostles’ Creed in cheerful, simple terms for the benefit of children. “Once in Royal David’s City” elaborates on the line of the Apostles’ Creed “born of the Virgin Mary.” Two other well-known hymns were also published in this book: “All Things Bright and Beautiful” (to explain “creator of heaven and earth”) and “There Is a Green Hill Far Away” (to illuminate “was crucified, died, and was buried”). The proceeds from the hymnal were used to build the Derry and Raphoe Diocesan Institution for the Deaf and Dumb.
Henry John Gauntlett, an organist at several churches in London, composed the music for the best-known version of the song in 1849. The song caught on quickly and became part of the Christmas liturgical sections of hymnbooks in most Christian denominations.
Later verses of the song have sparked controversy over their portrayal of the baby Jesus as “weak” and “helpless.” Critics have suggested that Alexander was writing in a Victorian-era context in which a patronizing tone was taken toward children, and children were to be seen and not heard. Her lyrics may well have reflected Victorian child-rearing principles rather than providing a scripturally based account of the Nativity scene—but you’d be hard-pressed to find a Christmas song that does get everything right, and the Son of God in the flesh surely did experience what it was like to have human feelings and experiences.
On Christmas Eve, English speakers around the world tune in to the Festival of Nine Lessons and Carols, broadcast from King’s College in Cambridge. This tradition began in 1918, was first broadcast in 1928, and is now heard by millions around the world. In 1919, the second year of the festival, the organist composed an arrangement of “Once in Royal David’s City” as a processional hymn. The first verse is an a cappella solo by a boy chorister, with the organ and choir joining in for the rest of the song. It is a high honor to be chosen for the solo—to be the voice that rings in the spirit of Christmas in the hearts of people all around the world.
Silent Night
“Stille Nacht! Heilige Nacht!” was written by the Catholic priest Joseph Mohr while he was stationed at a pilgrim church in Mariapfarr, Austria. He was inspired to write the lyrics in 1816 when he went on a walk and looked out at a peaceful, snow-laden town, still and silent on a winter night. Two years later, Mohr asked his friend Franz Gruber, an organist and schoolmaster, to compose music to accompany the words for the Christmas Eve service at St. Nicholas Church in Oberndorf. The song was performed by a guitar and a choir.
Some accounts of the story say that the organ in the church was broken, and it could not be fixed until the spring when the snow had melted. Mohr was still determined to provide sacred music for Christmas Eve, and Gruber pulled through by composing a simple, peaceful melody on the guitar. Some versions blame a flood for the organ’s downfall, and others have a mouse chewing a hole in the leather of the organ bellows. But the truth is that guitar accompaniment was actually fairly common in Germany and Austria at this time, regardless of whether and why the organ was out of commission.
The song began to spread when an organ builder and repairman working at the church (maybe he was fixing the organ?) took a copy of the song to his home village. Two families of traveling folk singers picked it up and began to perform the song around northern Europe. The Strasser family performed “Stille Nacht” for the King of Prussia in 1834. The Rainer family of singers debuted the carol in America in 1839 outside Trinity Church in New York City.
The Episcopal priest John Freeman Young, who was assigned to Trinity Church, had taken up the hobby of translating hymns into English. He translated “Stille Nacht” into “Silent Night,” and his words are now sung by millions of people in English-speaking nations. The song has been translated into over 300 languages, and it is one of the most popular Christmas songs worldwide.
The Twelve Days of Christmas
The Christian twelve days of Christmas take place between Christmas on December 25, celebrating the birth of Jesus, and Epiphany on January 6, commemorating the adoration of the Magi. The “Twelve Days of Christmas” song was likely French in origin. The earliest version of an English poem about the twelve days of Christmas is found in Mirth Without Mischief, a children’s book from 1780. It was written in the style of a “memory and forfeits” game, in which players tested how well they remembered the lyrics as each new verse was added and had to forfeit something to their opponents if they made a mistake.
In the song, the twelve days of Christmas promise the following gifts:
Day 1: a partridge in a pear tree
Day 2: two turtle doves
Day 3: three French hens
Day 4: four calling birds
Day 5: five gold rings
Day 6: six geese a-laying
Day 7: seven swans a-swimming
Day 8: eight maids a-milking
Day 9: nine ladies dancing
Day 10: ten lords a-leaping
Day 11: eleven pipers piping
Day 12: twelve drummers drumming
There have been many variations to the lyrics over the years. Some carolers sang of “bears a-baiting,” “hares a-running,” or “ships a-sailing,” and no one can ever seem to remember the order of the last four gifts. Sometimes “my mother gave” me the gifts, and other times “my true love sent” them to me.
Additionally, some variations came from mishearing the lyrics. The fourth day of Christmas came with “four colly birds” originally, using a regional English expression meaning “coal-black.” Those not familiar with the regional dialect substituted other words that made more sense to them—canary, colored, and curley birds are found in different versions of the song. Frederic Austin, the English composer who set the words to music in 1909, used “calling birds” in his version, which became the most popular variation.
Finally, some historians hypothesize that the five golden rings originally referred to the markings of a gold-necked pheasant. This would also make the golden rings gift consistent with the gifts for the rest of days 1–7, which are all birds.
And speaking of gifts, each year, PNC financial services calculates the Christmas Price Index (CPI) by totaling the price of one set of all the gifts given in the song. In 2021, the CPI was $41,205.58. (If you include all the repetitions, the total cost of the gifts from the singer’s true love comes to $179,454.19.) The CPI was up a whopping 5.7% from the previous year and 5.9% from the year before that due to the impact of the COVID pandemic on the cost of purchasing each item—who knew that the price of birds increased so much year-over-year? The nine ladies dancing, eleven pipers piping, and twelve drummers drumming were out of the running for the 2020 CPI, since the pandemic led to the cancellation of most live performances, so it’s good to see they’re back this year!
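If you’re curious how the “one set” and “all repetitions” totals relate, the gift counts are easy to verify: gift k is received again on every day from day k through day 12, so it appears 13 − k times in all. A minimal sketch of the arithmetic:

```python
# Gift counts in "The Twelve Days of Christmas".
# One set = one of each verse's gifts (1 partridge + 2 doves + ... + 12 drummers).
# With repetitions, gift k (first given on day k) recurs on days k..12,
# i.e., 13 - k times in total.

one_of_each = sum(range(1, 13))                       # 1 + 2 + ... + 12
with_repeats = sum(k * (13 - k) for k in range(1, 13))

print(one_of_each)   # 78 gifts in a single set
print(with_repeats)  # 364 gifts counting every repetition
```

So the $179,454.19 “true cost” figure prices out 364 gifts rather than the 78 in the basic index.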
In the 1990s, an email chain began to circulate that claimed “The Twelve Days of Christmas” encoded messages about important articles of Christian faith (i.e., the partridge in a pear tree was Jesus, the two turtledoves were the Old and New Testaments, and so on). The story went that, beginning in the 1500s, Catholics in England were prohibited from practicing their religion in public or private when the English crown established itself as the supreme head of the church in England. Catholics in hiding thus sang “The Twelve Days of Christmas” to teach children a catechism of their faith during a time of persecution.
This myth has been thoroughly debunked, given that there is no evidence to support it, and that it first surfaced seemingly out of nowhere in the 1990s. Further, nearly all the religious tenets supposedly “hidden” in the song were basic beliefs shared by Anglicans and Catholics alike, as well as many other Christian denominations at the time. There is no reason they would need to be secretly encoded in a Christmas song. It’s also not clear how believers would have remembered what each gift stood for, since there is not a clear relationship between the gift and the idea that it supposedly represented (how do eight maids a-milking help people remember the eight beatitudes?).
“The Twelve Days of Christmas” is a largely secular song that celebrates the Christmas season with one’s “true love” and with gifts, dancing, and music—which is not a bad thing, considering that the made-up stories about it hark back to a time when one group of Christians decided to torture and kill another group of Christians.
Hark! The Herald Angels Sing
The tune we now know as “Hark! The Herald Angels Sing” was written by none other than Felix Mendelssohn in 1840 as a tribute to Johannes Gutenberg. It was performed as the second movement of Festgesang, or the Gutenberg Cantata, during the unveiling of a statue of Gutenberg at the Leipzig Gutenberg Festival, which commemorated the 400th anniversary of the invention of the printing press.
A men’s chorus sang:
“Gutenberg, du wackrer Mann, du stehst glorreich auf dem Plan!”
“Gutenberg, you valiant man, you stand glorious on the square!”
Mendelssohn wanted to publish the music with English lyrics but couldn’t find quite the right text. He wrote, “If the right [words] are hit at, I am sure that the piece will be liked very much by singers and hearers, but it will never do to sacred words.”
In the 1850s, William Cummings, who as a choirboy had sung in Mendelssohn’s oratorio Elijah, hit on just the right text: “Hymn for Christmas-Day,” from Hymns and Sacred Poems (1739) by Charles Wesley, one of the founding ministers of Methodism. Wesley was a prolific hymnwriter who wrote over 6,500 hymns.
The original tune for “Hymn for Christmas-Day” was slower and more solemn. Wesley intended it to be sung to the same tune as “Christ the Lord Is Ris’n Today.” The first line as written by Wesley was a little different, too:
“Hark! How all the welkin rings / Glory to the King of Kings.”
“Welkin” was an obscure word (even then) meaning the sky or heaven. In 1758, George Whitefield, a fellow founder of Methodism, changed Wesley’s text to the more familiar first line:
“Hark! The Herald Angels sing / Glory to the new-born King!”
Deck the Hall
“Deck the Hall” comes from a sixteenth-century Welsh New Year’s Eve song with the same tune but completely different lyrics. A literal translation of “Nos Galan,” as it was called in Welsh, goes something like this:
The best pleasure on New Year’s Eve, —Fa, la, &c.
Is house and fire and a pleasant family, —Fa, la, &c.
A pure heart and brown ale, —Fa, la, &c.
A gentle song and the voice of the harp, —Fa, la, &c.
The “fa la la la la” is found in the original Welsh, and it was likely passed down from much earlier medieval ballads.
The song gained greater popularity when it was published in John Thomas’s Welsh Melodies in 1862, with the traditional text provided by the Welsh poet Talhaiarn side by side with English lyrics by the Scottish musician Thomas Oliphant. Oliphant’s lyrics were not a translation of the lyrics but a new poem entirely. His version may not have been the “ancient Christmas carol” it promises to be, since it came from a New Year’s Eve song, but it did draw on the ancient Christmas and Yuletide traditions of the British Isles:
Deck the hall with boughs of holly,
’Tis the season to be jolly,
Fill the mead cup, drain the barrel, (later: Don we now our gay apparel,)
Troll the ancient Christmas carol.

See the flowing bowl before us, (later: See the blazing yule before us,)
Strike the harp and join the chorus.
Follow me in merry measure,
While I sing of beauty’s treasure. (later: While I tell of Christmas treasure.)

Fast away the old year passes,
Hail the new, ye lads and lasses!
Laughing, quaffing all together, (later: Sing we joyous all together,)
Heedless of the wind and weather.
These verses have a decidedly Scottish flavor—“lads and lasses” and copious references to alcohol cement the song’s origins in the British Isles. Notice that the earlier version was very merry indeed; certain lines were later changed to remove references to drinking.
The druids of the pre-Roman British Isles saw holly as a sacred tree that retained the light of the sun the whole year round, since it was an evergreen that remained green through the winter. They decorated their homes with the leaves and berries of the holly tree as a symbol of life—a practice that has perpetuated throughout time. The very word “holly” may even be a variation of the word “holy.” Later on, Christians in Europe saw the red holly berries as representing the blood of Christ and the leaves representing His crown of thorns. Writings from 1598 state that “every man’s house, the parish churches, the corners of streets, and marketplaces in London” were decorated with English holly (Ilex aquifolium) during the Christmas season. Peoples all over the world, including Romans, Greeks, Native Americans, and Chinese, have used holly for centuries as an aspect of winter and new year celebrations.
And the “blazing yule” before us is the log burnt in the home during the twelve days of Christmas. This tradition comes from a Scandinavian winter solstice ritual in which the log is an “emblem of divine light,” a symbol of the return of the sun following the darkest day of the year. The Yule log was lit from the remains of last year’s log, which was carefully preserved for this purpose. It provided light and warmth as family gathered around the fire and told, yes, ghost stories, and predicted their future for the coming year. The Yuletide fire cleansed the remnants of the old year and was hoped to bring forth a fruitful spring.
Some other things you may have wondered:
To “troll” means to sing the song in a round.
“Quaffing” means drinking (alcohol).
From Austrian church services to Welsh New Year’s Eve festivities, the origins of our favorite Christmas carols are fascinating. What song would you like to know the history behind?
Where does pie come from, and why do we eat it on Thanksgiving? The answer involves a certain bird known for collecting miscellaneous objects, Queen Elizabeth I, and elaborate dinnertime entertainment.
Tracing the origins of pie takes us back to the ancient Egyptians, who ate a crusty cake (similar to a modern galette) made from oats and barley with a honey filling. These early pies may have also included fruit or nuts. Drawings on the tomb walls of Ramses II and III depict spiral-shaped pastries that resemble galettes. A tablet from before 2000 BCE even includes a recipe for a chicken pot pie.
In the fifth century BCE, the Greeks invented pie pastry that was used as the crust, as mentioned in the plays of Aristophanes. People began to work as pastry chefs as well, a separate occupation from bakers. Greek pies had mainly meat fillings.
Roman pies were likely adopted from the Greeks. Romans made pastry crust out of flour, oil, and water, but it was more of a carrying and storage container. As the pie cooked, the crust held in the juices of the pie filling, which typically included meat or vegetables but could also be sweet. A recipe for placenta (it may seem like an interesting name, but it literally means “flat cake”) in the second-century BCE cookbook De Agri Cultura by Cato the Elder may be one of the earliest recipes for a closed pie as well as an early recipe for cheesecake. It was made by encasing a sweet, thick filling of goat cheese, honey, and layers of pastry dough in crust.
From Egypt to Greece and then to Rome, the early pie had already been adapted in various ways. From there, Roman roads spread pies through Europe. As a greatly flexible format for baking any number of ingredients found in different environments, pies found expression in all parts of Europe.
Four and Twenty Blackbirds
By 1300 CE, pie had entered the English language—but through a rather interesting route. The word piehus (pie + house, meaning a pie bakery) is attested from the late 1100s, meaning that pie was likely used earlier but not written down in any surviving texts. Pies were also known as bake-metes (bake + meat).
The word pie or pye came to Middle English via Middle French from the Medieval Latin word pia, meaning “meat or fish enclosed in pastry.” It’s likely that this word is connected to pica, meaning “magpie”—the reason being that magpies have a habit of collecting miscellaneous objects, and a pie is a collection of various ingredients baked together in a pastry crust “nest.”
Pies became very common in England by the mid-1300s, and in 1378, King Richard II found it necessary to issue a law controlling pie prices in London. Pie is even mentioned in Geoffrey Chaucer’s Canterbury Tales, as a specialty of the disreputable cook who traveled with the pilgrims to Canterbury:
And he could roast and seethe and broil and fry And make a good thick soup, and bake a pie
A pie in this era had many ingredients, including meat and vegetables seasoned with pepper, currants, or dates, whereas a pastry had only one filling. Similar to Roman pies, the crust of a pie was not necessarily for eating, since it was hard and several inches thick, but rather for preserving the filling on long journeys and sea voyages. The crust was called a “coffyn” (coffin), which at that time typically referred to a container or chest where valuables were kept. When the filling of a pie was fowl, the bird came with legs still attached, dangling over the sides as handles to make the pie easier to eat. Other fillings included tortoise, beef, mutton, offal, and fish with spices like cinnamon, pepper, and orange peel.
Besides being a convenient and durable food at sea, pies soon became the centerpiece of exquisite banquets. Cooks baked increasingly elaborate pies with creative fillings to impress royalty as the pastry lid was removed to reveal what was inside.
Sometimes, cooked and redressed birds were placed on top of the pie to identify their contents.
You’ve probably heard the nursery rhyme “Sing a Song of Sixpence”:
Sing a song of sixpence A pocket full of rye Four and twenty blackbirds Baked in a pie When the pie was opened the birds began to sing Wasn’t that a dainty dish to set before the king
This is not just a silly rhyme—it literally happened. Royal cooks baked pies and then carefully concealed live birds in the crust just before serving. When the pie was cut open, the birds flew out. Italian and English cookbooks from the 1500s–1600s contain recipes for “to make pies so that birds may be alive in them and fly out when it is cut up” and “Live Birds in a Pie” (which also contained live frogs).
Some (very, very large) pies were even said to contain an actor reciting poetry or a whole band of musicians! A banquet thrown for the Duke of Burgundy in the 1500s featured a large pie with a young woman inside, which was the beginning of the “surprise pie” and “girl in the pie” tropes.
This was a form of entremet, a term that once referred to a set of small dishes served in between courses of a banquet or as dessert. But by the Middle Ages, entremet had become a form of entertainment through edible or nonedible ornaments and live performances. The four-and-twenty blackbird pie was a dish meant not just to be eaten but also to entertain and dazzle the royal taste.
She’s My Cherry Pie
Fruit pies entered the scene in the 1500s. English tradition attributes the first cherry pies to Queen Elizabeth I, who was known for her fondness for fruit pie. Forget the defeat of the Spanish Armada, the relative political stability during her reign, and a golden age of English literature and drama—the reign of Queen Elizabeth brought sweet, sweet dessert pies to the world.
Speaking of English literature, Shakespeare used pie as a dramatic device (as if it weren’t already dramatic enough) in Titus Andronicus. Titus has two villains baked into a pie because they attacked his daughter, and then serves the pie to the victims’ mother.
As American as Apple Pie
Though pie featured in other European cuisines, it became an English specialty unmatched anywhere else. But as English settlers brought their beloved pies with them as they colonized North America, recipes diverged somewhat, and pie became as American as—well, apple pie. (Apple pie actually came from our friend Geoffrey Chaucer, who recorded a recipe in the fourteenth century that included figs, raisins, and pears along with apples—but no sugar, as this would not have been widely available until the seventeenth century.)
In the 1600s and 1700s in America, pie was served at every meal and became a staple at social gatherings and celebrations. New England was even known as the “pie belt” due to the popularity of the pastry. The early colonists used long, narrow coffins to encase a variety of fillings. They eventually switched to a round shape, and during the Revolutionary Era, people finally began to eat the pie crust.
Shepherd’s pie and cottage pie were most popular at first, in true British fashion. Soon, though, colonists began to bake pies with fruits, berries, and other ingredients that grew in the New World. As colonists moved west, they continued to adapt their pies to use local ingredients. Over time, both America and Great Britain had increasingly greater access to sweeteners like maple syrup, molasses, and honey, as well as cane sugar due to the exploitation of enslaved people on sugar plantations in the Caribbean islands. It took a couple of centuries, but sweet pies eventually won over in America, while savory pies retained their hold in Great Britain.
Pie may have been served at early “thanksgiving” celebrations in 1618–1621, but pumpkin pie was most definitely not on the menu. The earliest recipe for pumpkin pie comes from a 1675 English cookbook and used a British preparation of spiced and boiled squash as a pie filling. This pie was savory rather than sweet and did not make its way to the United States until the 1800s. Squash was an import from the New World to the Old World, but pumpkin pie was an export from Great Britain to America.
Our national mythology surrounding Thanksgiving comes mainly from much later in our country’s history. Sarah Josepha Hale, who advocated for decades to celebrate Thanksgiving as a unifying national holiday (a campaign that succeeded when Lincoln proclaimed it in 1863, during the Civil War), described the perfect Thanksgiving dinner as including fried turkey with gravy, ham, wheat bread, cranberry sauce, and pumpkin pie.
In the South, sweet potato pie typically took the place of pumpkin pie due to greater availability, and pecan and walnut trees lent themselves to nut pies as well. Though the sweet potato is a New World crop, enslaved Africans adapted their yam-based cooking traditions to it, so the sweet potato dishes that became a regional favorite in the southern United States have roots in the slave trade. In the North, pumpkin pie remained popular along with apple. The Midwest developed a liking for cream and cheese pies, while the plains inherited Swedish tart berry pies.
Pie has a long and storied past, much of it fascinating from the perspective of history, linguistics, and literature. But like many aspects of European history, pie was influenced by extravagant and indulgent displays of wealth in service of royalty. Like many aspects of American history, pie was influenced by colonialism and slavery. Going forward, may our celebrations and traditions and the food we associate with them connect us to the past while reaching toward a more equitable future.
Halloween comes from the Celtic festival of Samhain (pronounced “SAH-win”). It is observed on October 31–November 1, the midpoint between the fall equinox and the winter solstice. It marks the end of the harvest season and the beginning of the “dark half” of the year. Samhain was a liminal celebration, a time of transition between summer and winter, light and darkness—a time when “the normal order of the universe is suspended” (Rogers, 2002). The liminality during Samhain meant that the lines between the spirit world and the physical world began to dissolve for a night. Monsters, gods, spirits, fairies called Sidhs, and ancestors might cross over from the Otherworld into the human world. Spirits and fairies played tricks on mortals, and the night was one of supernatural intensity.
The holiday included such festivities as feasting, guising or mumming, divination, sacrifices, and a bonfire. Rituals mediated the passage between the supernatural and natural worlds while also displaying the values of hospitality, caring for the poor, and celebrating the cycle of death and rebirth.
During Samhain, people disguised themselves from the spirits that roamed the night to avoid becoming the target of their tricks. The best way to do that was to wear masks and dress in animal skins to blend in with the supernatural beings, to become a ghost or a fairy for the night.
Guising or mumming was a precursor to modern trick-or-treating. In this tradition, young people dressed in disguises visited houses in their village and played tricks, danced, and performed until the occupant guessed their identity and gave them food. As a type of ritualized hospitality, guising appeased the homeowners’ ancestors and blessed the house to be free from the mischief of the real spirits that were thought to roam the night.
On Samhain, households allowed their fireplaces to burn out while the harvest was gathered. Then, Druid priests started a community bonfire at the top of a hill using a wheel to cause friction and spark flames. The round wheel and the resulting light of the bonfire represented the sun, which was now retreating in the shorter days of winter. The pillar of smoke wafting up from the fire represented the axis mundi, the world pillar that connects heaven, earth, and the underworld. The fire itself protected the village from sinister spirits and Sidhs. Each person took a flame from the communal bonfire to re-light the hearth in their home, bringing the light and protection and warmth of community back to their own dwellings.
Sacrifices were also an important aspect of Samhain. People sacrificed animals that would not survive the cold months ahead to satisfy the spirits and also laid food as an offering at their ancestors’ graves. The poor in the community, who represented these ancestors, would gather in the cemetery and eat the offering.
Various divination practices and games provided both entertainment and somber predictions about death, marriage, and life. Some sought out wise women to prophesy about the year ahead. Some placed stones around the bonfire that represented people; those people ran around in a circle with torches, and in the morning, if a person’s stone was out of place, it signified imminent death. Other divination rituals involved using food like apples, hazelnuts, or oatmeal to predict one’s future or even the name of one’s true love. One divination trick involved hiding items in a cake, and a person’s future was signified by whatever they found in their portion of cake, such as a coin for wealth or a ring for marriage.
When Rome conquered the Celts in the first century CE, the Romans introduced their own traditions into Samhain. These included Feralia, a public festival honoring the dead, and the feast of Pomona, which celebrated the first apple harvest of the year in honor of the goddess of fruit and orchards.
Starting in the fifth century CE, as Christianity began to grow in areas that were once pagan, church leaders began to reframe Samhain as a Christian celebration, in a display of cultural adaptation we’ve also seen in Christmas and Easter traditions.
In the seventh century, Pope Boniface cast it as a day to celebrate Christian saints and martyrs and moved the date to May 13. This didn’t stop anyone from continuing to build communal bonfires in the fall.
A century later, Pope Gregory moved the date of the celebration back to the fall. The night of October 31 became known as All Hallows Eve (“hallow” referring to a saint or holy person). Hallowe’en (“a holy evening”), as it was later called, became an evening vigil in which families visit the graves of loved ones to pray and leave flowers and candles. Some also bring with them a feast, including their dearly departed ones’ favorite foods. Gregory also declared that November 1 would now be a feast day called All Saints Day. This day is an opportunity to remember all the known and unknown saints and martyrs throughout Christian history. In the tenth century, Abbot Odilo of the Cluny Monastery designated November 2 as All Souls Day to honor not just saints but all Christians who had passed on. Catholics and Anglicans today consider All Hallows Eve, All Saints Day, and All Souls Day to be holy days to remind themselves to live as the saints and then to ask for God’s mercy for all souls. Throughout November, a Book of the Dead is placed near the altar in the church for parishioners to write the names of the dead they wish to be remembered.
Even though Christian celebrations began to take hold, Pope Gregory’s declaration didn’t stop pagan traditions—people continued to celebrate the harvest, the seasons, the supernatural encounters, the sharing of light during the beginning of the darkest time of the year. By the end of the Middle Ages, the merging of the sacred and the secular produced a richly textured mix of meanings and traditions, all centered around the connection between the mortal world and the world of spirits.
Hundreds of years later, the Irish spread these traditions to other countries in Europe and brought them across the Atlantic to America. The Reformation in Europe had led to the prohibition of All Hallows Eve among Protestants, but Halloween persisted as a secular holiday. The Puritan tendencies of early America prohibited Halloween there as well. But the influx of Irish immigrants in the nineteenth century brought the widespread celebration of Halloween in conjunction with existing harvest celebrations and fall festivities. The cultural amalgamation of Celtic, Roman, Christian, American, and likely other traditions has produced the Halloween we know today—a night full of costumes, mischief, tricks and treats, apple bobbing, and fall festivities, all of which have ties back to the rituals of Samhain.
Why do we change our clocks twice a year for Daylight Saving Time? The answer involves Benjamin Franklin’s trusty almanac, bug hunting, and coal-powered warfare (notice that farmers are not on the list).
“Spring forward, fall back.” Every second Sunday in March, groans echo throughout 75 countries in the world as everyone gets up an hour earlier than their body is used to. And every first Sunday in November, those same people rejoice when they get to sleep in for an extra hour. The idea is to maximize sunlight during waking hours in the Northern Hemisphere’s spring and summer by shifting our clocks to add an hour of sunlight to the end of the day. We’re not actually losing or gaining any time; we’re simply robbing an hour from March and giving it to November to “save” daylight. But who is the Robin Hood responsible for such theft?
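The clock shift is easiest to see in a time zone’s UTC offset on either side of the March transition. A minimal sketch using Python’s standard-library zoneinfo module, with America/New_York as an example zone (any DST-observing zone would behave similarly; the 2021 spring-forward date in the US was Sunday, March 14):

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

tz = ZoneInfo("America/New_York")

# Noon on the days just before and just after the 2021 spring-forward
# transition. The wall clock jumps ahead, so the UTC offset shrinks by an hour.
before = datetime(2021, 3, 13, 12, 0, tzinfo=tz)  # standard time, UTC-5
after = datetime(2021, 3, 15, 12, 0, tzinfo=tz)   # daylight time, UTC-4

print(before.utcoffset() == timedelta(hours=-5))  # True
print(after.utcoffset() == timedelta(hours=-4))   # True
```

In other words, no hour is created or destroyed; the zone’s offset from UTC simply changes, which is why 2:30 a.m. never occurs on the spring-forward night.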
The Origin of Daylight Saving Time
You’ve probably heard that Daylight Saving Time (DST) was proposed to benefit farmers who wanted extra daylight to work in their fields later in the evening, but this is a myth.
In 1784, Benjamin Franklin published a satirical letter in the Journal de Paris, lamenting that most Parisians slept until noon (at least, he did) even when the sun rose at 6:00 a.m. According to his almanac, which listed the hour of the sunrise and sunset each day, they were missing out on six whole hours of natural sunlight but burning candles late into the night. Though Franklin didn’t suggest a shift in clocks, he suggested a shift in schedules to align life more fully with the rise and set of the sun, who “gives light as soon as he rises.” He calculated that, by doing so, the country could save the modern equivalent of $200 million by “the economy of using sunshine instead of candles.”
As Franklin’s letter hints, the primary policy rationale behind DST is actually energy conservation, though society was burning coal more than candles by the time it was proposed.
In 1895, a New Zealand entomologist named George Hudson suggested a two-hour time shift to allow for more light in the evening hours to go bug-hunting. In the early 1900s, William Willett independently came up with the idea to help Great Britain avoid wasting daylight and proposed it in Parliament, backed by Winston Churchill and Sir Arthur Conan Doyle—but to no avail.
Finally, in 1916—two years into World War I—Germany took notice of Willett’s idea of moving the clock forward and adopted it as a way to conserve energy during the war effort. Almost every other country involved in the war soon passed daylight saving laws. Because industrialized nations were primarily using coal power, the time shift actually did save energy and contribute to the war effort during this era.
(And in World War I, coal was power. As Germany faced international blockades and domestic shortages of necessary resources, the Allies’ control of the coal supply became one of the decisive, war-ending assets: coal fueled the British blockade that weakened Germany and the other Central Powers to the point of defeat.)
Though DST was mainly a way to save fuel, another economic objective behind it after the war was to encourage people to use the extended daylight hours in the evening to shop, attend sporting and recreational events, and spend more time outdoors.
Daylight Saving Time in America
The United States formally adopted Daylight Saving Time in 1918. The dates of the time change have shifted over the years, and the governing legislation is the Uniform Time Act of 1966, which regulates time zones and the observance of DST across the country.
Currently, all states but Hawaii and Arizona observe Daylight Saving Time. Hawaii abandoned Daylight Saving Time in 1967 because it is close enough to the equator that the sun generally rises and sets around the same time each day, regardless of the time of year. (Likewise, most tropical nations and territories do not observe Daylight Saving Time because variations in day length are negligible.) Since 1968, Arizona has permanently been on Mountain Standard Time, with the exception of the Navajo Nation, which does observe Daylight Saving Time. The Hopi Reservation, which is surrounded by the Navajo Nation, follows the rest of Arizona and does not. (This means that if you drive east on Arizona State Route 264 while DST is in place, your clock will change six times in less than 100 miles!) Due to its location, there is plenty of daylight in Arizona year-round, and residents benefit more from cooler temperatures in the evening than from extra sunlight.
In 2021 alone, thirty-three states introduced legislation addressing the issue of DST. In the last four years, nineteen states have passed legislation or resolutions to enact DST year-round, though they still need the approval of Congress for these measures to take effect. Critics of the biannual time change argue that permanently turning our clocks ahead an hour would not only eliminate the nuisance of the change itself but, more importantly, alleviate some of the health consequences of the spring shift while maintaining quality of life in the winter. States vary in their preference for remaining permanently on daylight time or standard time, but, as noted by the National Conference of State Legislatures, “the actual March and November time changes are almost universally reviled because of all the accompanying adjustments we must make, like coming home from work in the dark and the slower-than-expected resetting of our internal time clocks” (NCSL, 2021).
The Pros and Cons
We know we can’t create more daylight, even if we tried. The earth will continue its rotation and revolution, unhindered by our puny human efforts. But we can manipulate the way we think about daylight by altering our construction of time. One benefit of DST is that it provides longer evenings in the spring and summer: because we wake an hour earlier relative to the sun, an hour of daylight is shifted to the end of the day. That extra hour of evening light provides time for outdoor activities, encouraging a more active lifestyle and increased spending in the tourism and recreation industries.
Another potential benefit of DST is increased safety. Some studies have found that DST contributes to a reduction in pedestrian fatalities and in crimes such as robbery during the evening hours, simply because it stays light later. However, other studies have found that fatal car crashes increase by 6% in the week after we “spring forward,” especially in the morning hours, due to a disruption in sleep cycles. Sleep deprivation causes more drowsy-driving incidents during this period and contributes to an uptick in heart attacks, strokes, and workplace injuries. These interruptions to our circadian rhythms seem to do more harm than good. One researcher commented, “It would be better for sleep, the body clock, and overall health to have more morning light and less evening light, as is the case under permanent standard time” (Ries, 2020). It is worth noting, though, that these disruptions are temporary, often lasting just a few days. For example, the incidence of heart attacks rises 25% on the Monday following the March change to DST, but the incidence across that whole week is average compared with the rest of the year.
The most often cited benefit of DST is that our daily routines coincide more closely with the hours of natural daylight, reducing the need for artificial light and yielding energy savings, albeit very modest ones—a meta-analysis found average energy savings of just 0.34% due to DST. Energy savings are largest farther from the equator, while subtropical areas actually increase energy use under DST. Another study found that even when electricity usage for lighting goes down due to DST, energy usage for heating and cooling goes up, rendering the overall effect neutral. The researchers concluded that “the effects of daylight saving time on energy consumption are too small to justify the biannual time-shifting” (Havranek, Herman, and Irsova, 2016, p. 26).
Research shows that only 33% of Americans are in favor of continuing Daylight Saving Time. Most see it as an annoyance, and many of its proposed “benefits” turn out to be drawbacks on closer investigation. More than 140 countries have adopted DST at some point, but about half have since abolished it. Will the United States be next?
Coate, Douglas, and Sara Markowitz. “The Effects of Daylight and Daylight Saving Time on US Pedestrian Fatalities and Motor Vehicle Occupant Fatalities.” Accident Analysis & Prevention, vol. 36, no. 3 (May 2004): 351–357. https://doi.org/10.1016/S0001-4575(03)00015-0.
Fritz, Joseph, Trang VoPham, Kenneth P. Wright Jr., and Céline Vetter. “A Chronobiological Evaluation of the Acute Effects of Daylight Saving Time on Traffic Accident Risk.” Current Biology, vol. 30, no. 4 (January 2020): 729–735.e2. https://doi.org/10.1016/j.cub.2019.12.045.
Kotchen, Matthew J., and Laura E. Grant. “Does Daylight Saving Time Save Energy? Evidence from a Natural Experiment in Indiana.” NBER Working Paper 14429, October 2008. https://www.nber.org/papers/w14429.
Why do we take something uncertain “with a grain of salt”? The answer involves a universal antidote to poison, Bible commentary, and some questionable photos of Ireland.
To take something with a grain of salt means to understand that something may not be completely accurate, to interpret something skeptically because it may be unverified or uncertain. For example, if you were relating an interesting fact about panda bears that you heard from a tourist at the zoo, you could tell your friends to “take it with a grain of salt” since you aren’t sure whether the source of information is trustworthy.
Outside the United States, other English-speaking countries use the phrase “take it with a pinch of salt” to mean the same thing.
But why a grain or pinch of salt? Why not a twist of lime or a drizzle of olive oil?
The Roman Cure
King Mithridates VI (135–63 BCE), ruler of the Hellenistic Kingdom of Pontus, was continually in conflict with the Roman Republic for decades. His relentless attempts to build an empire made him one of Rome’s most formidable opponents and one of the most celebrated rulers of Pontus. In addition to his military endeavors, he has gone down in history as “The Poison King.”
Mithridates was obsessed with toxicology and paranoid that his enemies were planning to poison him. His fear over real and imagined assassination attempts led him to research all known toxins and their remedies, experimenting on prisoners of war to understand the effects of various substances. He attempted to make himself immune to poison, Princess Bride-style, by ingesting small doses and gradually increasing the amount to build up tolerance. Later scholars including Pliny the Elder (23–79 CE) claimed that Mithridates developed and regularly ingested a universal antidote for all known poisons, known as mithridate or mithridatium. Pliny wrote that Mithridates’ panacea contained over 50 different ingredients, including small amounts of various poisons, that were ground into powder and mixed with honey. The original recipe, however, has been lost to history. Historians today believe that Mithridates likely did not actually have such an antidote but continued to fund research while publicly bragging that he already had it, to fend off potential assassination attempts.
Pliny, a Roman author and natural philosopher, amassed a great body of knowledge from studying and investigating natural and geographic phenomena. He wrote the Naturalis Historia, which claimed to cover all ancient knowledge and became an editorial model for later encyclopedias.
Pliny wrote in the Naturalis Historia that after the Roman general Pompey (106–48 BCE) defeated Mithridates, he found in Mithridates’ private cabinet the following recipe for an antidote in Mithridates’ own handwriting:
Take two dried walnuts, two figs and twenty leaves of rue; pound them all together, with the addition of a grain of salt; if a person takes this mixture fasting, he will be proof against all poisons for that day.
The Latin phrase addito salis grano literally means “after having added a grain of salt,” but it was translated as “with a grain of salt” (cum grano salis in Latin) to more closely match the grammar of modern Romance languages. The idea here is that a poison or an unsavory medical cure is more easily swallowed with a small amount of salt.
The Modern Medicine
The implication that a grain of salt can mitigate the effect of poison did not take on a metaphorical slant until much later, influenced by scholarly study of classical Latin texts. In 1647, the English religious commentator John Trapp wrote, “This is to be taken with a grain of salt.” No one is exactly sure what he meant, and it’s possible that this expression did not convey the same meaning it holds today. Perhaps a particular piece of commentary on the Bible was a little hard to swallow, for whatever reason.
The phrase didn’t really gain traction until the early twentieth century, surfacing again in the August 1908 edition of The Athenæum, a British literary journal, which included this text:
Our reasons for not accepting the author’s pictures of early Ireland without many grains of salt . . .
Apparently, the author’s pictures of early Ireland did not meet the journal’s editorial standards. By this time, it seems that the metaphor was already common enough that readers understood the meaning even when it was slightly altered for rhetorical effect.
From here, the saying “with a grain of salt”—based on the idea of using salt to make something unpalatable easier to swallow—began to catch on as a metaphor for adding a little skepticism when consuming potentially doubtful information.
The UK caught on later in the century. The earliest printed citation comes from the 1948 book Cicero & the Roman Republic:
A more critical spirit slowly developed, so that Cicero and his friends took more than the proverbial pinch of salt before swallowing everything written by these earlier authors.
This quote itself provides a good lesson on studying etymology and language change—use good judgment, vet your sources, and take things with a grain of salt when it seems that there are gaps in the historical narrative.
Cowell, F. R. Cicero & the Roman Republic. Pelican Books, 1948.