Where Does Halloween Come From?

Where does Halloween come from? The answer involves community bonfires, ritualized hospitality, and visits from the Otherworld.

We’ve talked about trick-or-treating and jack-o’-lanterns, so now let’s dive into the origins of the holiday itself!

Celtic Samhain

Halloween comes from the Celtic festival of Samhain (pronounced “SAH-win”), observed on October 31–November 1, the midpoint between the fall equinox and the winter solstice. It marks the end of the harvest season and the beginning of the “dark half” of the year. Samhain was a liminal celebration, a time of transition between summer and winter, light and darkness—a time when “the normal order of the universe is suspended” (Rogers, 2002). That liminality meant that the lines between the spirit world and the physical world dissolved for a night. Monsters, gods, spirits, fairies known as the sídhe, and ancestors might cross over from the Otherworld into the human world. Spirits and fairies played tricks on mortals, and the night was one of supernatural intensity.

The holiday included such festivities as feasting, guising or mumming, divination, sacrifices, and a bonfire. These rituals mediated between the supernatural and natural worlds while also expressing the values of hospitality, care for the poor, and the celebration of the cycle of death and rebirth.

During Samhain, people disguised themselves from the spirits that roamed the night to avoid becoming the target of their tricks. The best way to do that was to wear masks and dress in animal skins to blend in with the supernatural beings, to become a ghost or a fairy for the night.

Guising or mumming was a precursor to modern trick-or-treating. In this tradition, young people dressed in disguises visited houses in their village and played tricks, danced, and performed until the occupant guessed their identity and gave them food. As a type of ritualized hospitality, guising appeased the homeowners’ ancestors and blessed the house to be free from the mischief of the real spirits that were thought to roam the night.

On Samhain, households allowed their fireplaces to burn out while the harvest was gathered. Then, Druid priests started a community bonfire at the top of a hill using a wheel to cause friction and spark flames. The round wheel and the resulting light of the bonfire represented the sun, which was now retreating in the shorter days of winter. The pillar of smoke wafting up from the fire represented the axis mundi, the world pillar that connects heaven, earth, and the underworld. The fire itself protected the village from sinister spirits and the sídhe. Each person took a flame from the communal bonfire to relight the hearth in their home, bringing the light and protection and warmth of community back to their own dwellings.

Sacrifices were also an important aspect of Samhain. People sacrificed animals that would not survive the cold months ahead to satisfy the spirits and also laid food as an offering at their ancestors’ graves. The poor in the community, who represented these ancestors, would gather in the cemetery and eat the offering.

Various divination practices and games provided both entertainment and somber predictions about death, marriage, and life. Some sought out wise women to prophesy about the year ahead. Some placed stones around the bonfire that represented people; those people ran around in a circle with torches, and in the morning, if a person’s stone was out of place, it signified imminent death. Other divination rituals involved using food like apples, hazelnuts, or oatmeal to predict one’s future or even the name of one’s true love. One divination trick involved hiding items in a cake, and a person’s future was signified by whatever they found in their portion of cake, such as a coin for wealth or a ring for marriage.

Snap-Apple Night shows a Halloween party where people are bobbing for apples.
Snap-Apple Night shows people feasting, playing divination games, and bobbing for apples in a Halloween party in Ireland. These traditions have their roots in the festival of Samhain.
By Daniel Maclise, 1833.

Roman Samhain

When the Romans conquered the Celts in the first century CE, they introduced their own traditions into Samhain. These included Feralia, a public festival honoring the dead, and a feast celebrating the apple harvest in honor of Pomona, the goddess of orchards and fruit trees.

A tapestry showing Pomona gathering apples
Pomona, Roman goddess of orchards, fruit, and the harvest.
By Francis Helminski, Wikimedia Commons.

Christian Samhain

Starting in the fifth century CE, as Christianity began to grow in areas that were once pagan, church leaders began to reframe Samhain as a Christian celebration, in a display of cultural adaptation we’ve also seen in Christmas and Easter traditions.

In the seventh century, Pope Boniface IV established a day to celebrate Christian saints and martyrs, setting its date as May 13. This didn’t stop anyone from continuing to build communal bonfires in the fall.

A century later, Pope Gregory III moved the celebration back to the fall. The night of October 31 became known as All Hallows Eve (“hallow” referring to a saint or holy person). Hallowe’en (“a holy evening”), as it was later called, became an evening vigil when families visit the graves of loved ones to pray and leave flowers and candles. Some also bring a feast that includes their dearly departed’s favorite foods. Gregory also declared that November 1 would be a feast day called All Saints Day, an opportunity to remember all the known and unknown saints and martyrs throughout Christian history. In the tenth century, Abbot Odilo of the Cluny Monastery designated November 2 as All Souls Day to honor not just saints but all Christians who had passed on. Catholics and Anglicans today consider All Hallows Eve, All Saints Day, and All Souls Day to be holy days that remind them to live as the saints did and to ask for God’s mercy on all souls. Throughout November, a Book of the Dead is placed near the altar in the church for parishioners to write the names of the dead they wish to be remembered.

Christians pray and place candles and flowers on the graves of their loved ones on All Hallows Eve.
On All Hallows Eve, Christians visit the graves of their loved ones to pray and leave flowers and candles.
Photo by Kaj Bjurman, 2007, from Wikimedia Commons.

Even though Christian celebrations began to take hold, Pope Gregory’s declaration didn’t stop pagan traditions—people continued to celebrate the harvest, the seasons, the supernatural encounters, the sharing of light during the beginning of the darkest time of the year. By the end of the Middle Ages, the merging of the sacred and the secular produced a richly textured mix of meanings and traditions, all centered around the connection between the mortal world and the world of spirits.

Hundreds of years later, the Irish spread these traditions to other countries in Europe and brought them across the Atlantic to America. The Reformation in Europe had led to the prohibition of All Hallows Eve among Protestants, but Halloween persisted as a secular holiday. The Puritan tendencies of early America suppressed Halloween there as well. But the influx of Irish immigrants in the nineteenth century brought widespread celebration of Halloween in conjunction with existing harvest celebrations and fall festivities. The cultural amalgamation of Celtic, Roman, Christian, American, and likely other traditions has produced the Halloween we know today—a night full of costumes, mischief, tricks and treats, apple bobbing, and fall festivities, all of which have ties back to the rituals of Samhain.

Sources

@_soul_stice, Instagram post, October 19, 2021.

Crain, Alex. “All Saints’ Day – The Meaning and History Behind the November 1st Holiday.” Christianity, October 29, 2021. https://www.christianity.com/church/church-history/all-saints-day-november-1.html.

Joukowsky Institute for Archaeology & the Ancient World. “Origins in Samhain.” Brown University. https://www.brown.edu/Departments/Joukowsky_Institute/courses/13things/7448.html.

History.com editors. “Samhain.” History.com, October 19, 2021. https://www.history.com/topics/holidays/samhain.

Rogers, Nicholas. Halloween: From Pagan Ritual to Party Night. Oxford University Press, 2002.

The Editors of Encyclopaedia Britannica. “Halloween.” Encyclopaedia Britannica. Retrieved October 30, 2021, from https://www.britannica.com/topic/Halloween.

The Editors of Encyclopaedia Britannica. “Samhain.” Encyclopaedia Britannica. Retrieved October 30, 2021, from https://www.britannica.com/topic/Samhain.

Wikipedia. “Samhain.” Retrieved October 30, 2021, from https://en.wikipedia.org/wiki/Samhain.

Why Do We STILL Do Daylight Saving Time?

Why do we change our clocks twice a year for Daylight Saving Time? The answer involves Benjamin Franklin’s trusty almanac, bug hunting, and coal-powered warfare (notice that farmers are not on the list).

“Spring forward, fall back.” Every second Sunday in March, groans echo across the United States (one of roughly 75 countries that observe DST, though the exact switch dates vary by country) as everyone gets up an hour earlier than their body is used to. And every first Sunday in November, those same people rejoice when they get to sleep in for an extra hour. The idea is to maximize sunlight during waking hours in the Northern Hemisphere’s spring and summer by shifting our clocks to add an hour of sunlight to the end of the day. We’re not actually losing or gaining any time; we’re simply robbing an hour from March and giving it to November to “save” daylight. But who is the Robin Hood responsible for such theft?

The Origin of Daylight Saving Time

You’ve probably heard that Daylight Saving Time (DST) was proposed to benefit farmers who wanted extra daylight to work in their fields later in the evening, but this is a myth.

In 1784, Benjamin Franklin published a satirical letter in the Journal de Paris, lamenting that most Parisians slept until noon (at least, he did) even when the sun rose at 6:00 a.m. According to his almanac, which listed the hour of the sunrise and sunset each day, they were missing out on six whole hours of natural sunlight but burning candles late into the night. Though Franklin didn’t suggest a shift in clocks, he suggested a shift in schedules to align life more fully with the rise and set of the sun, who “gives light as soon as he rises.” He calculated that, by doing so, the country could save the modern equivalent of $200 million by “the economy of using sunshine instead of candles.”

As Franklin’s letter hints, the primary policy rationale behind DST is actually energy conservation, though society was burning coal more than candles by the time it was proposed.

In 1895, a New Zealand entomologist named George Hudson suggested a two-hour time shift to allow for more light in the evening hours to go bug-hunting. In the early 1900s, William Willett independently came up with the idea to help Great Britain avoid wasting daylight and proposed it in Parliament, backed by Winston Churchill and Sir Arthur Conan Doyle—but to no avail.

Finally, in 1916—two years into World War I—Germany took notice of Willett’s idea of moving the clock forward and adopted it as a way to conserve energy during the war effort. Almost every other country involved in the war soon passed daylight saving laws. Because industrialized nations were primarily using coal power, the time shift actually did save energy and contribute to the war effort during this era.

(And in World War I, coal was power. As Germany faced international blockades and domestic shortages of necessary resources, the Allied forces’ control of the coal industry became one of the decisive, war-ending assets that led to the defeat of the Central Powers. Coal fueled the British blockade that weakened Germany to the point of defeat.)

Though DST was mainly a way to save fuel, another economic objective behind it after the war was to encourage people to use the extended daylight hours in the evening to shop, attend sporting and recreational events, and spend more time outdoors.

A map showing countries that observe Daylight Saving Time.
Countries that observe Daylight Saving Time.
Image by TimeZonesBoy, May 30, 2013, CC BY-SA 3.0, Wikimedia Commons.

Daylight Saving Time in America

A senator changing the time on a clock.
The U.S. Senate clock is changed for the first DST in America, 1918.
Image by U.S. Government.

The United States formally adopted Daylight Saving Time in 1918. The transition dates have shifted over the years; the foundational law is the Uniform Time Act of 1966, which regulates time zones and the observance of DST across the country, and the current March–November schedule comes from a 2005 amendment that took effect in 2007.
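Since that schedule is just a calendar rule, it is easy to compute. Here is a minimal Python sketch of the post-2007 U.S. rule described above (the function names are our own, purely for illustration):

```python
from datetime import date, timedelta

def nth_weekday(year, month, weekday, n):
    """Return the nth occurrence of a weekday (Mon=0 ... Sun=6) in a month."""
    first = date(year, month, 1)
    offset = (weekday - first.weekday()) % 7  # days until the first such weekday
    return first + timedelta(days=offset, weeks=n - 1)

def us_dst_transitions(year):
    """Current U.S. rule: DST starts the second Sunday in March
    and ends the first Sunday in November."""
    spring_forward = nth_weekday(year, 3, 6, 2)   # second Sunday in March
    fall_back = nth_weekday(year, 11, 6, 1)       # first Sunday in November
    return spring_forward, fall_back

print(us_dst_transitions(2021))  # (datetime.date(2021, 3, 14), datetime.date(2021, 11, 7))
```

For 2021, this returns March 14 and November 7, the dates on which U.S. clocks actually changed that year.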

Currently, all states but Hawaii and Arizona observe Daylight Saving Time. Hawaii abandoned Daylight Saving Time in 1967 because it is close enough to the equator that the sun rises and sets at roughly the same time each day, regardless of the time of year. (Likewise, most tropical nations and territories do not observe Daylight Saving Time because variations in day length are negligible.) Since 1968, Arizona has been permanently on Mountain Standard Time, with the exception of the Navajo Nation, which does observe Daylight Saving Time, and the Hopi Reservation, which does not. (This means that if you drive east on Arizona State Route 264 while DST is in place, the local clock time changes six times in less than 100 miles!) Due to its location, Arizona has plenty of daylight year-round, and residents benefit more from cooler evening temperatures than from extra sunlight.

In 2021 alone, thirty-three states introduced legislation addressing DST. In the last four years, nineteen states have passed legislation or resolutions to enact DST year-round. However, these measures still need the approval of Congress to take effect. Some critics of the twice-yearly switch argue that permanently turning our clocks ahead an hour would not only eliminate the nuisance of the time change but, more importantly, alleviate some of the health consequences of the spring change while maintaining quality of life in the winter. States vary in whether they would prefer to remain permanently on daylight time or on standard time, but, as the National Conference of State Legislatures notes, “the actual March and November time changes are almost universally reviled because of all the accompanying adjustments we must make, like coming home from work in the dark and the slower-than-expected resetting of our internal time clocks” (NCSL, 2021).

The Pros and Cons

We know we can’t create more daylight, even if we tried. The earth will continue its rotation and revolution, unhindered by our puny human efforts. But we can manipulate the way we think about daylight by altering our construction of time. One benefit of DST is that it provides longer, lighter evenings in the spring and summer by shifting an hour of sunlight from the early morning to the end of the day. That extra hour of evening light provides time for outdoor activities, encouraging a more active lifestyle and increased spending in the tourism and recreation industries.

A cartoon with Hercules trying to stop a clock. "You can't stop time. . . but you can turn it back one hour at 2 a.m. on Oct. 28 when daylight saving time ends and standard time begins."
A cartoon reminding people to set their clocks back. DST now ends on the first Sunday in November.
Image from U.S. Government.

A potential benefit of DST is increased safety. Some studies have found that DST contributes to a reduction in pedestrian fatalities and crimes such as robbery in the evening hours simply because it stays light later. However, other studies have found that fatal car crashes increase by 6% in the week after we “spring forward,” especially in the morning hours, due to a disruption in sleep cycles. Sleep deprivation causes more drowsy-driving incidents during this period and contributes to an uptick in heart attacks, strokes, and workplace injuries. These unwarranted interruptions in our circadian rhythms seem to do more harm than good. One researcher commented, “It would be better for sleep, the body clock, and overall health to have more morning light and less evening light, as is the case under permanent standard time” (Ries, 2020). It’s worth noting, though, that these disruptions are temporary, often lasting just a few days. For example, the incidence of heart attacks rises 25% on the Monday following the March change to DST, but the overall incidence across that week is about average compared to the rest of the year.

The most often cited benefit of DST is that our daily routines coincide more closely with the hours of natural daylight, reducing the need for artificial light and yielding energy savings, albeit very modest ones—a meta-analysis found average energy savings of 0.34% due to DST. Savings are largest farther from the equator, while subtropical areas actually use more energy under DST. Another study found that even when electricity usage for lighting goes down due to DST, energy usage for heating and cooling goes up, rendering the overall effect roughly neutral. The researchers concluded that “the effects of daylight saving time on energy consumption are too small to justify the biannual time-shifting” (Havranek, Herman, and Irsova, 2016, p. 26).

Research shows that only 33% of Americans are in favor of continuing Daylight Saving Time. Most see it as an annoyance, and most proposed “benefits” turn into downfalls with a little investigation. More than 140 countries have adopted DST at some point, but about half have abolished it since. Will the United States be next?

Sources

Blakemore, Erin. “Daylight Saving Time 2019: The Odd History of Changing Our Clocks.” National Geographic, November 1, 2019. https://www.nationalgeographic.com/science/article/daylight-savings-time-arizona-florida-spring-forward-science.

Coate, Douglas, and Sara Markowitz. “The Effects of Daylight and Daylight Saving Time on US Pedestrian Fatalities and Motor Vehicle Occupant Fatalities.” Accident Analysis & Prevention, vol. 36, no. 3 (May 2004): 351–357. https://doi.org/10.1016/S0001-4575(03)00015-0.

Ducharme, Jamie. “The Reason Some States Don’t Observe Daylight Saving Time.” Time, November 4, 2017. https://time.com/5005600/states-without-daylight-savings-time/.

Franklin, Benjamin. Letter to the Editor of the Journal of Paris. 1784. http://www.webexhibits.org/daylightsaving/franklin3.html.

Fritz, Joseph, Trang VoPham, Kenneth P. Wright Jr., and Céline Vetter. “A Chronobiological Evaluation of the Acute Effects of Daylight Saving Time on Traffic Accident Risk.” Current Biology, vol. 30, no. 4 (January 2020): 729–735.E2. https://doi.org/10.1016/j.cub.2019.12.045.

Havranek, Tomas, Dominik Herman, and Zuzana Irsova. “Does Daylight Saving Save Energy? A Meta-Analysis.” MPRA Paper No. 74518, October 12, 2016. https://mpra.ub.uni-muenchen.de/74518/1/MPRA_paper_74518.pdf.

Kotchen, Matthew J., and Laura E. Grant. “Does Daylight Saving Time Save Energy? Evidence from a Natural Experiment in Indiana.” NBER Working Paper 14429, October 2008. https://www.nber.org/papers/w14429.

National Conference of State Legislatures. “Daylight Saving Time | State Legislation.” October 8, 2021. https://www.ncsl.org/research/transportation/daylight-savings-time-state-legislation.aspx.

Ries, Julia. “The Number of Fatal Car Accidents Spikes after Daylight Saving Time.” Healthline, March 6, 2020. https://www.healthline.com/health-news/daylight-saving-can-make-driving-less-safe.

Thorsen, Steffen, and Anne Buckle. “Daylight Saving Time Statistics.” Time and Date. Retrieved October 11, 2021, from https://www.timeanddate.com/time/dst/statistics.html.

Wei-Haas, Maya. “Why Daylight Saving Time Exists—And Is So Unpopular.” National Geographic, March 12, 2021. https://www.nationalgeographic.com/science/article/daylight-saving-time.

Wikipedia. “Time in Arizona.” Retrieved October 11, 2021, from https://en.wikipedia.org/wiki/Time_in_Arizona.

Zeller, Joseph. “Coal: A Significant Factor in Germany’s Defeat in World War I.” Canadian Military History, vol. 27, no. 1 (2018).

Take This Article with a Grain of Salt

Why do we take something uncertain “with a grain of salt”? The answer involves a universal antidote to poison, Bible commentary, and some questionable photos of Ireland.

To take something with a grain of salt means to understand that something may not be completely accurate, to interpret something skeptically because it may be unverified or uncertain. For example, if you were relating an interesting fact about panda bears that you heard from a tourist at the zoo, you could tell your friends to “take it with a grain of salt” since you aren’t sure whether the source of information is trustworthy.

Outside the United States, other English-speaking countries use the phrase “take it with a pinch of salt” to mean the same thing.

But why a grain or pinch of salt? Why not a twist of lime or a drizzle of olive oil?

The Roman Cure

King Mithridates VI (135–63 BCE), ruler of the Hellenistic Kingdom of Pontus, was continually in conflict with the Roman Republic for decades. His relentless attempts to build an empire made him one of Rome’s most formidable opponents and one of the most celebrated rulers of Pontus. In addition to his military endeavors, he has gone down in history as “The Poison King.”

Portrait of the king of Pontus Mithridates VI as Heracles. Marble, Roman imperial period (1st century).
Image by Sting, CC BY-SA 2.5, via Wikimedia Commons

Mithridates was obsessed with toxicology and paranoid that his enemies were planning to poison him. His fear of real and imagined assassination attempts led him to research all known toxins and their remedies, experimenting on prisoners of war to understand the effects of various substances. He attempted to make himself immune to poison, Princess Bride-style, by ingesting small doses and gradually increasing the amount to build up tolerance. Later scholars, including Pliny the Elder (23–79 CE), claimed that Mithridates developed and regularly ingested a universal antidote for all known poisons, known as mithridate or mithridatium. Pliny wrote that Mithridates’ panacea contained over 50 different ingredients, including small amounts of various poisons, that were ground into powder and mixed with honey. The original recipe, however, has been lost to history. Historians today believe that Mithridates likely did not actually have such an antidote but continued to fund research while publicly bragging that he already had it, in order to fend off potential assassination attempts.

Three jars used to hold the semi-mythical drug mithridatum.
By Wellcome Images, CC BY-SA 4.0 via Wikimedia Commons.

Pliny, a Roman author and natural philosopher, amassed a great body of knowledge from studying and investigating natural and geographic phenomena. He wrote the Naturalis Historia, which claimed to cover all ancient knowledge and became an editorial model for later encyclopedias.

Pliny wrote in the Naturalis Historia that after the Roman general Pompey (106–48 BCE) defeated Mithridates, he found in Mithridates’ private cabinet the following recipe for an antidote, written in Mithridates’ own handwriting:

Take two dried walnuts, two figs and twenty leaves of rue; pound them all together, with the addition of a grain of salt; if a person takes this mixture fasting, he will be proof against all poisons for that day.

The Latin phrase addito salis grano literally means “after having added a grain of salt,” but it was later rendered as cum grano salis (“with a grain of salt”), a form that more closely matches the grammar of modern Romance languages. The idea here is that a poison or an unsavory medical cure is more easily swallowed with a small amount of salt.

The Modern Medicine

The implication that a grain of salt can mitigate the effect of poison did not take on a metaphorical slant until much later, influenced by scholarly study of classical Latin texts. In 1647, the English religious commentator John Trapp wrote, “This is to be taken with a grain of salt.” No one is exactly sure what he meant, and it’s possible that this expression did not convey the same meaning it holds today. Perhaps a particular piece of commentary on the Bible was a little hard to swallow, for whatever reason.

The phrase didn’t really gain traction until the early twentieth century, when it surfaced in the August 1908 edition of The Athenæum, a U.S. literary journal, which included this text:

Our reasons for not accepting the author’s pictures of early Ireland without many grains of salt . . .

Apparently, the author’s pictures of early Ireland did not meet the editorial standards of the journal. By this time, it seems that the metaphor was already common enough that readers understood the meaning even when it was slightly altered for rhetorical effect.

From here, the saying “with a grain of salt”—based on the idea of using salt to make something unpalatable easier to swallow—began to catch on as a metaphor for adding a little skepticism when consuming potentially doubtful information.

The UK caught on later in the century, preferring a “pinch” of salt rather than a grain. The earliest printed citation of that variant comes from the 1948 book Cicero & the Roman Republic:

A more critical spirit slowly developed, so that Cicero and his friends took more than the proverbial pinch of salt before swallowing everything written by these earlier authors.

This quote itself provides a good lesson on studying etymology and language change—use good judgment, vet your sources, and take things with a grain of salt when it seems that there are gaps in the historical narrative.

Salt Bae
Just ask Salt Bae, who became an internet meme in 2017.

Sources

Cowell, F. R. Cicero & the Roman Republic. Pelican Books, 1948.

Hall-Geisel, Kristen. “What Does It Mean to ‘Take It with a Grain of Salt’?” How Stuff Works, June 30, 2020. https://people.howstuffworks.com/grain-of-salt.htm.

Hyden, Mark. “Mithridates’ Poison Elixir: Fact or Fiction?” World History Encyclopedia, June 2, 2016. https://www.worldhistory.org/article/906/mithridates-poison-elixir-fact-or-fiction/.

Gutoskey, Ellen. “Why Do We Tell People to Take Something ‘With a Grain of Salt’?” Mental Floss, September 22, 2021, https://www.mentalfloss.com/article/648536/take-it-grain-salt-meaning-and-origins.

Martin, Gary. “The Meaning and Origin of the Expression: Take with a Grain of Salt.” The Phrase Finder. Retrieved October 11, 2021, from https://www.phrases.org.uk/meanings/take-with-a-grain-of-salt.html.

The Idioms. “Take with a grain of salt.” https://www.theidioms.com/take-with-a-grain-of-salt/.

Wikipedia. “Pliny the Elder.” Retrieved October 11, 2021, from https://en.wikipedia.org/wiki/Pliny_the_Elder.

Where Did S’mores Come From?

Where did s’mores come from? The answer involves the Girl Scouts, a Wikipedia hoax, and the Father of American Vegetarianism.

We know they’re called s’mores because they’re so good, you always want “some more” of them. But who actually invented everyone’s favorite campfire treat?

S’mores were most likely invented in the 1920s by Girl Scout troops at Camp Andree Clark in upstate New York. As reported in the Norwalk Hour in September 1925, the Girl Scouts shared with each other a treat called “Some-Mores,” which “consist of a graham cracker on which is placed a piece of Hershey chocolate, a toasted marshmallow, another piece of chocolate and a graham cracker.” Other sources show that the Camp Fire Girls (a girls’ organization formed two years before the Girl Scouts) began to enjoy s’mores around the same time.

Seven scouts in a semicircle roast marshmallows on sticks over a stone campfire, 1940s. Source: NHPC.

Another early source for a s’mores recipe is a 1927 official Girl Scouts guidebook called Tramping and Trailing With the Girl Scouts. The guidebook included a recipe for “Some Mores” much like the one described in the Norwalk newspaper—and s’mores haven’t changed a bit since then: a marshmallow toasted over a campfire is sandwiched with some Hershey’s chocolate between two graham crackers, allowing the heat of the marshmallow to melt the chocolate. Of this gooey, sweet snack, the guidebook remarks, “Though it tastes like ‘some more’ one is really enough.”

The identity of the recipe’s author, however, is contested. The guidebook does not list an author, but in 2009, a troop leader by the name of Loretta Scott Crew was credited with the recipe on Wikipedia. There is reason to believe the name was invented as a hoax to test the internet’s trust in Wikipedia, with the result that an erroneous attribution still circulates to this day.

Enterprising young campers were not the only ones to combine graham crackers, marshmallows, and chocolate—various mass-produced treats included all three elements before the Girl Scouts even existed. Nabisco’s Mallomar, dating from 1913, was a round graham cracker topped with a marshmallow and covered in dark chocolate, and the MoonPie of 1917 was a palm-sized graham cracker and marshmallow sandwich dipped in chocolate.

MoonPie
MoonPie. Image by Evan-Amos from Wikimedia Commons.

But before that . . .

Marshmallows

You may have heard of the marshmallow plant. It looks something like this:

Althaea officinalis. Image by H. Zeli, June 27, 2009, CC BY-SA 3.0, from Wikimedia Commons.

Marshmallow (Althaea officinalis) is an herb that has been used for thousands of years as a remedy for sore throats, indigestion, and pain. It is native to the marshes of Europe and Asia. In ancient Greece, China, and Rome, the marshmallow plant was used for food and medicine. In Egypt, marshmallow sap was combined with honey to produce a candy reserved for the gods and rulers. The root of the plant was also boiled with sugar to release its sap until the mixture thickened, then strained and cooled to make “suckets,” which were like cough drops.

It wasn’t until the mid-1800s that the French whipped up a light, airy confection using marshmallow sap, egg whites, and sugar purely for enjoyment. These were expensive to make and reserved only for the upper classes. But soon, the marshmallow sap was replaced with gelatin to produce a cheaper version that was still light and fluffy. (In case you’re wondering, modern marshmallows are made with corn syrup, sugar, gelatin, and preservatives.)

The idea of roasting marshmallows over a campfire came about in the 1890s. It became very popular in the beach towns of the Northeast, where marshmallow roasting parties were held at summer resorts. These parties were seen as a trendy, novel way to connect with people over a delicious treat that tasted like a “sublimated combination of candy and cake” (as reported in a letter from Asbury Park, New Jersey). They were also “an excellent medium for flirtation,” as when a young person ate a marshmallow off a potential suitor’s stick.

Graham Crackers

Graham crackers, on the other hand, were intended to be quite the opposite. Sylvester Graham, who invented the graham cracker in the early 1800s, was a firm supporter of the temperance movement and believed that his crackers would suppress sexual desire. He was an adherent of a school of thought that claimed minimizing pleasure and stimulation of all kinds was the surest path to good health. Along with graham crackers, he and his followers—the Grahamites—ate a steady, bland diet of graham flour and graham bread and led one of the first vegetarian movements in the United States. Graham came to be known as the Father of American Vegetarianism.

Some of this may seem laughable in the face of modern science, but Graham had many things right—he was a proponent of regular exercise, clean water, and preventive care, as well as being one of the first to suggest that stress causes disease.

Convinced that commercial bakeries were adding unhealthy ingredients to their products (and rightly so—there were few food regulations at the time), Graham promoted his crackers made with only coarse-ground wheat, oil, molasses, and salt (no sugar allowed). After he died, sugar was added to graham crackers for mass production—right around the time the marshmallow was gaining popularity.

Hershey’s Chocolate

This deserves a post of its own. Suffice it to say that “the Great American Chocolate Bar” made its debut in 1900 and has never looked back. With all the ingredients in place and available to the masses, the time was ripe to combine them into a delicious new treat.

S’mores Today

Today, thanks to the endless creativity of ordinary people and the instant sharing of ideas through the interwebz, there are many different ways to eat s’mores. Swap out plain milk chocolate for white chocolate, a Reese’s cup, or a Ghirardelli raspberry dark chocolate square. Use chocolate graham crackers or Oreos or pretzels. Make things a little saucier by adding peanut butter, caramel, lemon curd, or Nutella into the mix. Or add strawberries, banana slices, or even bacon for a unique spin on the standard s’more.

“One is really enough,” but with so many variations, how could you stop at just one?

Sources

Cronkelton, Emily, “Everything You Need to Know about Marshmallow Root.” Healthline, March 19, 2019. https://www.healthline.com/health/food-nutrition/marshmallow-root#side-effects-and-risks.

Food Timelines. Food Timeline FAQs: Candy.  https://www.foodtimeline.org/foodcandy.html#marshmallowroasts.

Gentile, Jessica. “If You Love S’mores, You Have the Girl Scouts to Thank.” Chowhound, August 10, 2020. https://www.chowhound.com/food-news/205847/the-history-of-smores/.

Kelly, Debra. “Where Does the Term S’mores Come From?” Mashed, November 8, 2016. https://www.mashed.com/30437/term-smores-come/.

“Marshmallow Roasts are the Fad.” Asbury Park, New Jersey, letter in the New York World and Chicago Daily Tribune, August 8, 1892 (p. 6). Retrieved from https://www.foodtimeline.org/foodcandy.html#marshmallowroasts.

Roberts, Anna Monette. “If You Were a Girl Scout, You’ll Be Proud of This.” Popsugar, August 10, 2015. https://www.popsugar.com/food/Who-Invented-Smores-38029222.

Rupp, Rebecca. “The Gooey Story of S’mores.” National Geographic, August 14, 2015. https://www.nationalgeographic.com/culture/article/the-gooey-story-of-smores.

Tramping and Trailing With the Girl Scouts. Girl Scouts of the United States of America, 1927, p. 63.

Wikipedia:Articles for deletion/Loretta Scott Crew. Retrieved August 29, 2021, from https://en.wikipedia.org/wiki/Wikipedia:Articles_for_deletion/Loretta_Scott_Crew.  

Wikipedia. “Graham Cracker.” Retrieved September 2, 2021, from https://en.wikipedia.org/wiki/Graham_cracker.

The Language of Love

Why do we have such strange ways of saying we’re in love—whether we’re infatuated, head over heels, or crushing on someone? The answer involves structural metaphors, semantic change, and secret diaries.

Love Is Out of Control

Sometimes it’s not enough just to say we like someone. Sometimes we’re SO passionately enamored that normal, everyday words just don’t describe it. In fact, we’ve created an entire metaphorical system to describe the way we experience love. A structural metaphor, as this is called, maps an abstract concept onto a more concrete one, and we develop layers of metaphors upon this structural foundation—a foundation we don’t even need to define or explain in order to understand. Structural metaphors are so embedded in language that it is difficult to communicate without them, and we expect others to understand them intuitively.

When it comes to love, we have a few structural metaphors that serve as the foundation for the way we talk about it: love is out of control, love is magic, and love is a journey.

Let’s take a look at love is out of control. Think about how it feels to fall in love with someone: your palms sweat, your heart races, you feel a little shaky, and you would do anything to impress the object of your affection. You do things that seem silly or dumb or out of control based on intense feelings of attraction.

You’re falling in love, a variation on the out-of-control theme that adds vertical velocity.

Maybe you’re infatuated—a word that comes from the verb infatuate, meaning “to turn something into foolishness, to make a fool of.”

Or you’re besotted—“affected with a foolish manifestation.”

You’re crazy about her, he’s driving you wild, you’re madly in love with your new beau!

Head over Heels

So why do we say that someone is “head over heels” when they’re in love? Isn’t your head . . . usually above your heels?

In the 14th century, the phrase “heels over head” came about as a way to describe being literally upside down; applied to love, it captures the feeling of being hopelessly, topsy-turvy smitten with someone. This phrase builds on the structural metaphor of love is out of control. Think about how it feels to do a somersault: you’re dizzy, the blood rushes to your head, the world suddenly looks very different.

In the 18th and 19th centuries, this phrase was gradually recast as “head over heels,” which makes a little less sense but still conveys to us a sense of doing somersaults or cartwheels. The phrase can also mean literally tumbling upside down, or it can mean running frantically.

Crush

Speaking of being head over heels, the internet wants to know—am I crazy or falling in love? (Is it really just another crush?)

The primary meaning of crush is to smash or to pound something to particles. Figuratively, it means to humiliate or demoralize, to “cause overwhelming pain to someone,” or to “suppress or overwhelm as if by pressure or by weight.”

A crush is an intense and usually transient affection, and it can definitely be overwhelming, drawing on one of the meanings of the word crush. Emotions are running wild. It can be humiliating if the crush is not reciprocated. It is the emotional equivalent—in both the positive feeling of being in love and the negative fear of rejection or humiliation—of being smashed to pieces. When we think about the structural metaphor of love is out of control, crush seems like a good way to describe this feeling!

Drax from Guardians of the Galaxy: No one has a crush on me. I am too strong to be crushed.

The first recorded use of crush as a noun meaning “a person one is infatuated with” occurred in 1884 in the diary of Isabella Maud Rittenhouse, a young woman of Cairo, Illinois, and later an outspoken leader of the women’s suffrage movement. She wrote, “Wintie is weeping because her crush is gone.”

Soon thereafter, in 1895, the word was first used as a verb meaning “to be infatuated with someone.” In a book about life at Yale University, it was recorded that “Miss Palfrey . . . consented to wear his bunch of blue violets. It was a ‘crush,’ you see, on both sides.”

The new use of crush may have been influenced by the scandalous 1856 novel Madame Bovary, whose English translation includes a passage describing an overwhelming and potentially disastrous infatuation:

But the more Emma recognized her love, the more she crushed it down that it might not be evident, that she might make it less. She would have liked Leon to guess it, and she imagined chances, catastrophes that should facilitate this.

Alternatively, Warren Clements proposes that Isabella Maud Rittenhouse’s use of crush may have been a parallel to the word mash, which had been used since the 1870s to mean one’s “sweetheart.” To be mashed was to be “flirtatious or head over heels in love.” (Another term popular around the same time was pash, a shortened form of passion: “He really has a pash for you!”)

Going back even further, Ernest Weekley’s Etymological Dictionary of Modern English postulated that since mash was “regarded as spoon-diet,” the kind of fare for one who couldn’t chew their food properly, mash may have been related to the slang term spoony. Since the 1820s, spoony had been used to mean being romantic in a goofy, sentimental way.

From spoony to the mash eaten with that spoon to crush, a synonym of mash, we sure have a way of describing our out-of-control romantic attachments.

Sources

Adams, Cecil. “Shouldn’t the Expression ‘Head over Heels’ Be ‘Heels over Head’?” The Straight Dope, May 17, 1991. https://www.straightdope.com/21341906/shouldn-t-the-expression-head-over-heels-be-heels-over-head.

Clements, Warren. “Feeding Love by the Spoonful.” The Globe and Mail, August 26, 2011. https://www.theglobeandmail.com/arts/feeding-love-by-the-spoonful/article626912/.

“Crush.” Etymology Online Dictionary. Retrieved August 14, 2021, from https://www.etymonline.com/word/crush#etymonline_v_416.

“Crush.” Merriam-Webster. Retrieved August 14, 2021, from https://www.merriam-webster.com/dictionary/crush.

“Head over Heels.” Dictionary.com. Retrieved August 14, 2021, from https://www.dictionary.com/browse/head-over-heels.

“Isabella Maud Rittenhouse Mayne.” Find a Grave. Retrieved August 14, 2021, from https://www.findagrave.com/memorial/144080841/isabella-maud-mayne.

Lakoff, George, and Mark Johnson. Metaphors We Live By. Chicago, IL: University of Chicago, 1980.

Lawler, John. “Making the Point with Metaphors—Not Just for Poets.” The Editorial Eye 28, no. 4 (April 2005): 1–3. http://websites.umich.edu/~jlawler/April05Eye.pdf.

The Archetypal Apple

Why are apples seen as the “default” fruit in Western culture? The answer involves Greek myths, Latin spelling mistakes, and English semantic narrowing.

The English word apple comes from the Old English æppel, which meant not only “apple” but “any kind of fruit” or “fruit in general.” It’s an old, old word stemming from Proto-Indo-European *ab(e)l-, meaning “apple.” In Middle English and Early Modern English, eppel or appel was mostly used as a generic term for all types of fruit, excluding berries but including nuts. Dates were fingeræppla (“finger apples”), cucumbers were eorþæppla (“earth apples”), and bananas were appels of paradis (more on that later!).

The simple answer to our question, then, is that it is a matter of semantic narrowing. Apple went from being a general term for fruit to denoting the specific fruit we know today. Languages descended from Greek and Latin went through a similar process with their words for fruit. The Greek word melon originally meant “apple,” but it was combined with other roots to form words like mēlopepon (“gourd-apple”), and it served as a generic term for any type of unknown fruit. The Latin word pomum (“fruit”) is the source of the name Pomona, the goddess of fruit trees; its French descendant, pomme, has likewise been used to refer to fruit in general or to apples specifically (the French phrase pomme de terre, literally “earth apple,” means “potato”). The semantic value of the apple lies in the fact that it is an archetype for fruit, a pattern or prototype for all other fruits.

The apple holds great meaning in many cultural traditions throughout the world. Fruit in general is often seen as a symbol of fertility due to both its form and function. The Penguin Dictionary of Symbols records that the various meanings attached to the apple are—at their core—all interconnected. The apple is seen as a key to knowledge and wisdom of some kind, whether that be knowledge of mortal life and humanity, knowledge about oneself, or intimately knowing another. Let’s take a look at some of the ways the world views the apple.

Apples in Olympus

Greek, Chinese, and Norse tradition all contain various references to and stories about apples wherein they are symbols of fertility, beauty, and eternal youth. Apples can also be a negative symbol of temptation or vanity.

In Greek mythology, Eris, the goddess of chaos and discord, threw an apple into the wedding party of Thetis and Peleus out of anger that she had not been invited. (The “apple” in the story may actually have been a now-extinct fruit grown in the Balkans that was similar to a pomegranate.) She had inscribed the apple with the word kallisti (To the Prettiest One). The goddesses Hera, Athena, and Aphrodite all claimed this apple of discord, and Zeus appointed Paris of Troy to decide which of the three it should belong to. Aphrodite, goddess of love and fertility, persuaded Paris to give her the apple by promising that she would make Helen—her half-sister and the most beautiful woman in the world—fall in love with him. The resulting relationship between Helen and Paris precipitated the Trojan War. Thus, an “apple of discord” is the kernel of a small argument that leads to a much bigger dispute!

Golden Apple of Discord by Jacob Jordaens, 1633. Museo del Prado.

Based on this story, the apple became a sacred relic of Aphrodite (or Venus, in Roman tradition). Throwing an apple at someone was the ancient Greek version of a marriage proposal or declaration of love, and to catch the apple was to accept. Today, newlyweds share an apple on their wedding night to ensure a “fruitful” union.

Apples in the Garden

The fig tree, native to the Mediterranean and Middle East, has held symbolic meanings similar to those of the apple, as evidenced in ancient religious texts. The fig was one of the earliest domesticated fruits in the world, along with the olive and grape—all of which have their origins in the Fertile Crescent, one of several cradles of human civilization.

The Garden of Eden, the location of the creation narrative in Abrahamic religions, contained an abundant variety of trees and plants. Adam and Eve, the first human beings, are commanded by God to eat freely from the garden except from the Tree of Knowledge of Good and Evil.

In the Hebrew Bible, the Tree of Knowledge is not identified with a particular type of fruit. Eve is tempted by a serpent to eat the fruit of the Tree of Knowledge, and she and Adam both choose to do so. Their eyes are opened as they receive knowledge, and they cover themselves with fig leaves when they realize they are naked. In Hebrew tradition, the Tree of Knowledge itself is considered to be a fig tree, though this is not stated in the text.

In Islamic tradition, the Tree of Immortality, as it is known, is often portrayed as a fig or olive tree.

Manafi al-Hayawan (The Useful Animals) by Abu Said Ubayd Allah Ibn Bakhtishu.
1294–99 CE. Maragha, Iran.

Buddhist tradition sees the fig as a symbol of enlightenment. The Buddha reached enlightenment under a Bodhi tree, a species of fig tree. This symbolism from another area of the world corroborates the metaphor used in the Garden of Eden account—the fruit of knowledge is enlightenment.

In addition to representing knowledge, the fig is strongly associated with fertility and abundant life in many cultures. It is a symbol of male and female joining together—its plump shape is a metaphor for female fertility, and the sap of the tree represents male fertility.

In Christian tradition, the fruit of the Tree of Knowledge is often portrayed as an apple, though it is variously seen as a fig, pear, or pomegranate—all richly evocative of the ideas of fertility, the cycle of life, and desire due to their resemblance to human sexual anatomy. One fruit contains many seeds, each with the potential to produce a tree, which will then produce more fruit, and the cycle continues forever. The dual meanings of temptation and fertility are thus strongly associated with fruit in general, and the apple and fig in particular. Some Christians hold that one of the results of Adam and Eve’s choice to eat the fruit is the physical condition needed for procreation, as well as the knowledge needed to navigate mortality.

La tentation d’Adam et Ève, thirteenth century.

So how did the fig get turned into an apple?

The idea that the fruit of the Tree of Knowledge was an apple comes from a mix-up of the Latin words mălum, meaning “evil,” and mālum, meaning “apple.” The fruit of the Tree of Knowledge of Good and Evil was turned into the fruit of the Tree of Knowledge of Good and Apples!

Later literature that drew from the Bible, such as John Milton’s Paradise Lost (1667), continued to cast the fruit as an apple, which only reinforced this mistranslation—but also reinforced the existing link between apples and knowledge. Renaissance art often featured the apple as the “forbidden fruit” in the Garden of Eden story.

Another matter to consider is that in contrast to figs, “apples were historically among the most difficult fruit trees to cultivate and among the last major ones to be domesticated in Eurasia, because their propagation requires the difficult technique of grafting” (Diamond, 1997, p. 150). Perhaps the cultural and dietary significance of apples was greater for Latin speakers at the time they were interpreting the Bible, while figs were more prominent for Hebrew speakers of an earlier era—though this is no more than conjecture.

The apple is referenced elsewhere in the Old Testament as well—readers are instructed to keep God and God’s commandments as “the apple of thine eye” (Proverbs 7:2, see also Deuteronomy 32:10, Psalms 17:8). The “apple” of one’s eye refers to the pupil, which resides in the very center of one’s eye and is fixed upon the thing one desires. In Hebrew, the word used for “apple” in these verses literally means “dark part of the eye.” The word “apple” was substituted in English translations of the Bible, using an idiom that first appeared in Old English around the ninth century. The phrasing in the English translation indicates that, to an English speaker, an apple represents the thing most desired or cherished above all others.

In the Song of Solomon, the apple is likewise a metaphor for beauty and desire: “As the apple tree among the trees of the wood, so is my beloved among the sons. I sat down under his shadow with great delight, and his fruit was sweet to my taste” (Song of Solomon 2:3). The word “apple” here is variously translated as “orange” or “citron”—the idea being that fruit of any kind is a symbol of desire and sweetness, among other meanings.

If an apple represents a thing that is most desirable, it makes sense for English speakers to cast the tree in the Garden of Eden as an apple tree, which Eve saw was “to be desired to make one wise” (Genesis 3:6, emphasis added).

Adam’s Apple

The Adam’s apple, a laryngeal protuberance formed by cartilage, is present in all humans, but is much more pronounced in men.

You may have heard the folk etymology behind the Adam’s apple as something like this: After Eve ate the fruit (“apple”) of the Tree of Knowledge, she convinced Adam to taste it, and a chunk of it got stuck in his throat as a reminder of his transgression. He then passed this on to all of his posterity in the form of a protuberance in the throat. It was later given the name “Adam’s apple” as a reminder of the Garden of Eden account.

Though this seems like plausible reasoning in a society that held the Bible in high regard, it was not really the inspiration for the name. The English term as applied to human anatomy has been in use since 1625. The French pomme d’Adam and the German Adamsapfel both refer to the same thing. From the medieval period until the 1700s, a term meaning “Adam’s apple” was also used in various languages to describe literal fruits—pomelos, citrons, and plantains, for example, were all called “Adam’s apple” at one point. A Mediterranean variety of lime with indentations resembling the mark of a person’s teeth was a particularly vivid reminder of Adam biting into the fruit in the Garden of Eden.

Medieval Latin texts use the term pomum Adami as a name for several different fruits, including the pomegranate. This name implied that these were among the “fruits of paradise” enjoyed in the Garden of Eden. Around the same time, medieval Arab medical scholars were cataloging the anatomy of the throat and settled on a word meaning “pomegranate” as the name for the laryngeal protuberance. We don’t know exactly why they chose this metaphor, but the pomegranate, too, is highly symbolic in Islamic religious tradition and beyond. European writers adopted the Latin translation, pomum granatum, and then applied the synonym already in existence: pomum Adami.

And there you have it—the apple features prominently in mythology and religious thought, while etymologically capturing the essence of fruit itself. It is a symbol of temptation and knowledge, desire and abundant life.

The various meanings of the apple show up elsewhere in everyday life and popular culture. Snow White in her naivety was tempted to eat a poison apple that put her under a curse that only a prince could break (which strongly parallels Christian themes, if you think about it). Johnny Appleseed went down in American folk history as the sower of both apple seeds and religious ideals, spreading fruit and wisdom in service of nature and his fellow humans everywhere he went. Your laptop and phone most likely have an Apple logo on them. A student presents an apple to the teacher as a gift of knowledge.

How do you like them apples?

Sources

“Adam’s Apple.” Online Etymology Dictionary. Retrieved July 31, 2021, from https://www.etymonline.com/word/Adam’s%20apple#etymonline_v_40638.

“Apple.” Online Etymology Dictionary. Retrieved July 31, 2021, from https://www.etymonline.com/search?q=apple.

Diamond, Jared. Guns, Germs, and Steel: The Fates of Human Societies. New York: W. W. Norton, 1997.

“Fruit in Mythology.” Encyclopedia of Myths. Retrieved July 31, 2021, from http://www.mythencyclopedia.com/Fi-Go/Fruit-in-Mythology.html.

Kettler, Sarah. “7 Facts About Johnny Appleseed.” Biography, June 11, 2020. https://www.biography.com/news/johnny-appleseed-story-facts.

“Melon.” Online Etymology Dictionary. Retrieved July 31, 2021, from https://www.etymonline.com/word/melon.

Merriam-Webster. “Why Is It Called an Adam’s Apple?” Word History. https://www.merriam-webster.com/words-at-play/why-is-it-called-an-adams-apple-word-history.

“Pomona.” Online Etymology Dictionary. Retrieved July 31, 2021, from https://www.etymonline.com/word/Pomona.

Smithfield, Brad. “In Ancient Greece, Throwing an Apple at Someone Was Considered a Marriage Proposal.” The Vintage News, September 10, 2016. https://www.thevintagenews.com/2016/09/10/ancient-greece-throwing-apple-someone-considered-marriage-proposal/.

Tearle, Oliver. “The Curious Symbolism of Apples in Literature and Myth.” Interesting Literature, April 2021. https://interestingliterature.com/2021/04/apples-symbolism-in-literature-myth-meaning-analysis/.

Wikipedia. “Apple of My Eye.” Retrieved July 27, 2021, from https://en.wikipedia.org/wiki/Apple_of_my_eye.

Spelling Bee

Why do good spellers compete in a spelling “bee”? The answer involves all the favorite subjects of a spelling bee winner—etymology, philology, and, of course, spelling.

The Queen Bee

According to Merriam-Webster, lookups of the word “murraya” spiked 100,000% on July 8–9, 2021.

Patterns of word usage ebb and flow over time, and—based on current events, pop culture, and other new or recycled ideas—so does our interest in certain words. One of those events is the Scripps National Spelling Bee.

On July 8, 2021, eighth grader Zaila Avant-garde of Louisiana became the first Black American to win the highest honor that may be bestowed upon the orthographically gifted. The winning word “murraya,” which most of us have probably never heard before, refers to a genus of tropical Asiatic and Australian trees named for Swedish botanist Johan A. Murray.

And Zaila’s ability to spell obscure words is just the beginning of her talents—she also holds three basketball-related world records, she can unicycle and juggle simultaneously, and her side interests include gene editing and neuroscience.

The Helpful Bee

Unlike “murraya,” the word bee itself isn’t likely to turn up on a spelling bee word list. But most people, even spelling champions, probably don’t know the origin of the word. (Are honeybees particularly good at spelling competitions?)

As used in the context of a spelling bee, “bee” is an alteration of a word that was rendered “been” in some dialects of English. The word descends from the Middle English “bene,” which denoted “voluntary help given by neighbors toward the accomplishment of a particular task” (Merriam-Webster). “Bene” is also related to the English word “boon,” which similarly indicates a blessing, benefit, or favor.

This word “bee” has been used to describe community activities where neighbors made a social event out of helping each other with tasks. Historically, you might have attended a quilting bee, a (corn) husking bee, or a (barn) raising bee.

A pioneer quilting bee. The Quilt That Walked to Golden, p. 31.

And yes, some linguists also connect this term with the insect type of bee. The industrious and cooperative nature of bees provides an apt metaphor for a group of friendly neighbors working together to accomplish a task.

“Spelling bee” began to show up in print sources around the turn of the twentieth century. However, it was often modified with terms like “old-fashioned,” indicating that the spelling bee had been around for quite some time but under different names. Before then, a spelling competition might have been called trials in spelling, spelling school, spelling match, spelling-fight, spelling combat, or spelldown (these are all beginning to sound more like a Wizard’s Duel than anything else!).

The spelling bee, which is often described as a “brain sport,” is typically seen as competitive rather than cooperative. But the hard work required to prepare for such a competition and the buzzing of young contestants reciting letters point us toward the characteristics of the honeybee.

And what’s more, bees are foundational to our ecosystem. They pollinate the flowering plants we depend on for food, raw materials, and beauty, and they turn nectar into sweet, sweet honey. Likewise, letters and words are the foundational elements of the English language, which has been cross-pollinated with Latin, French, and many other linguistic influences; they combine to produce the rich vocabulary and ever-evolving possibilities of expression that English offers today.

The Spelling Bee

In English, there was no such thing as “correct” spelling until the eighteenth century. Before then, writers freely spelled the same words in different ways. Readers could generally understand what they meant, but frustration mounted over the lack of regularity in the written language. This frustration, along with a Protestant push to increase literacy so that common people could read the Bible, led to the publication of English dictionaries. Samuel Johnson’s dictionary of 1755 was especially influential in prescribing English spelling and word usage. We can view the dictionary as a record of sometimes arbitrary decisions about which spelling of a word would be considered correct—decisions that we now see as indisputable.

Once there was an agreed-upon standard, “correct” language use came to be a sign of education. In class-conscious Britain, correct pronunciation was the mark of the elite, while in America, correct spelling was the signature of a scholar. By the mid-eighteenth century, it was common for American schools to hold spelling competitions for students—and thus, the spelling bee was born from the uniquely American obsession with prescribing how to write the English language. As mentioned previously, these competitions went by different names until the turn of the twentieth century.

Norman Rockwell, Cousin Reginald Spells Peloponnesus (Spelling Bee), 1918. Image from Wikimedia Commons.

The spelling bee was first held on the national level in 1925, when nine newspapers joined together to host the National Spelling Bee to promote literacy. The bee has been held every year since, except during World War II and the COVID-19 pandemic. Merriam-Webster has supplied the bee’s official dictionary since 1957; today that dictionary is Webster’s Third New International Dictionary.

Spelling bees have spread to many other countries around the world, but they are generally limited to English-speaking areas. Why? Because other languages have much more predictable spelling systems. English is one of the few languages in which so much memorization is required!

Winning a spelling bee takes more than raw talent. It requires an exceptional degree of diligence and discipline for daily study, a love for the English language and its historical development, and support from expert coaches and commercial word lists. The 2006 film Akeelah and the Bee underscores the role of adult and community support, following a young girl from an urban neighborhood and a single-parent household who rises through the ranks to the national spelling bee. Akeelah is a truly brilliant speller who overcomes both self-doubt and mockery from others. With courage and intelligence, she beats the odds and inspires all those who have rallied around her.

As elite spellers pass on their wisdom to the next generation, and as coaching and commercial resources have become essential for success in the Scripps National Spelling Bee, the bar is raised higher each year. Data analysis now helps spellers pinpoint their weaknesses and study more efficiently. We are getting better and better at the game.

In 2019, the bee ended in an eight-way tie as the contestants blew through round after round of challenging words as if they were a breeze. The Octo-Champs, as they are known, broke the game. Sports Illustrated wrote, “They hadn’t beaten one another. Instead, together, they’d beaten the dictionary.” Merriam-Webster responded: “The Dictionary concedes and adds that it is SO. PROUD.” After the astounding win, the rules were changed to include multiple-choice vocabulary questions and a lightning round to eliminate the possibility of a tie.

Whether it is a fierce international competition or a local elementary school contest, the spelling bee is a celebration of “correct” orthography, which, while still not entirely fixed, is far less fluid than in times past, and mastering it is a mark of dedication and education. In fact, one former champion describes winning the spelling bee as an embodiment of the American meritocracy, as it requires both individual discipline and access to study resources to beat the competition (Sealfon, 2019).

Sources

Baccellieri, Emma. “How the Octo-Champs of the 2019 National Spelling Bee Have Changed the Game.” Sports Illustrated, June 7, 2019. https://www.si.com/more-sports/2019/06/07/scripps-national-spelling-bee-8-way-tie-unprecedented-result-merriam-webster-dictionary.

Bowman, Emma. “National Spelling Bee Adds New Rules to Help Winners Sting the Competition.” NPR, April 23, 2021. https://www.npr.org/2021/04/23/990400434/national-spelling-bee-adds-new-rules-to-help-winners-sting-the-competition.

Fogarty, Mignon. “Why Is It Called a ‘Spelling Bee’?” Quick and Dirty Tips, June 7, 2018. https://www.quickanddirtytips.com/education/grammar/why-is-it-called-a-spelling-bee.

Merriam-Webster. “6 Actual Names for Historical Spelling Bees.” Word History. Merriam-Webster online dictionary. Retrieved July 16, 2021, from https://www.merriam-webster.com/words-at-play/alternate-spelling-bee-titles.

Merriam-Webster. “Trending: Murraya.” Merriam-Webster Trend Watch. Merriam-Webster online dictionary. https://www.merriam-webster.com/news-trend-watch/zaila-avant-garde-wins-bee-with-murraya-20210709.

Sealfon, Rebecca. “The History of the Spelling Bee.” Smithsonian Magazine, May 2019. https://www.smithsonianmag.com/arts-culture/history-spelling-bee-180971916/.

Shankar, Shalini. “Why It’s Big News When a Black Girl Wins the Scripps National Spelling Bee.” Chicago Sun-Times, July 12, 2021. https://chicago.suntimes.com/2021/7/12/22574106/scripps-national-spelling-bee-black-education-zaila-avant-garde-sun-times.

The Editors of Encyclopaedia Britannica. “Dictionary.” Retrieved July 17, 2021, from https://www.britannica.com/summary/dictionary.

Jumping on the Bandwagon

What’s a bandwagon, and why is everyone jumping on it? The answer involves the circus, Theodore Roosevelt, and cognitive biases.

The phrase “to jump on the bandwagon” means to join the most popular side or party or to go along with something that is growing in popularity. But today, there is generally no wagon in sight.

The Original Bandwagons

The word bandwagon dates back to the late 1840s and originally referred to a large, ornate wagon that carried the band in a circus procession. P.T. Barnum, the famous circus owner and “greatest showman,” recorded, “At Vicksburg we sold all our land conveyances excepting our horses and the ‘band wagon’”—one of the first printed references to such an attraction. Barnum’s circus put on parades through the towns where it performed before setting up the show. These spectacles were made grander by brightly decorated wagons, carved with circus animals, that carried the performing musicians. It was a publicity stunt, and an effective one, too.

Barnum’s circus was not the only one on the bandwagon. An 1847 Louisiana newspaper described “a magnificent band-wagon, capable of holding twenty musicians” belonging to Messrs Stone and McCollum’s Circus.

Bandwagons began to be used in parades during other celebrations and political processions, especially for the Fourth of July. In the late 1800s, politicians caught on to the publicity potential as well, and they began using bandwagons in parades on the campaign trail.

When a campaign began to gather steam, other politicians and aspiring leaders rented seats on the bandwagon and rode through town in hopes of gaining an association with the successful candidate.

Riding in the bandwagon was a mark of favor and popularity, a must for any budding politician seeking exposure among future constituents.

The Bandwagon as a Metaphor

In 1884, the Woodstock Sentinel of Illinois published an article called “Anything to Beat Hamilton,” an interview about political tactics used by candidates to beat out incumbent Governor John Marshall Hamilton. The article reported:

“The principal candidates for attorney general are Geo. Hunt, of Paris, Ill., and the present incumbent. I understand that Hunt is running under the wing of Oglesby. If so, he’ll beat McCartney, provided the latter allows Hunt to load him into the Hamilton band-wagon.”

“It is very evident, Judge, that you consider that any man who goes into Hamilton’s band-wagon is liable to get left.”

This is the earliest known reference that took the idea of riding in a politician’s bandwagon beyond the physical act. The bandwagon was now a metaphor for ingratiating or identifying oneself with a popular political figure in hopes that some of their success would rub off by association.

The metaphor would begin to take on a largely negative, opportunistic tone as well, in the vein of going along with the crowd and getting behind whoever was popular at the moment to raise one’s own status. As this article warned, jumping onto the bandwagon was not always the key to success—if that politician fell out of favor, so would everyone else parading along with them. Political speeches throughout the 1890s similarly warned voters not to jump on any candidate’s bandwagon too quickly.

In Theodore Roosevelt’s Letters, 1899, the future U.S. president generalized the phrase to refer to going along with anything that was popular, not just a political candidate: “When I once became sure of one majority they tumbled over each other to get aboard the band wagon.”

The Bandwagon Effect

The bandwagon effect is recognized as a cognitive bias whereby people tend to adopt certain behaviors or beliefs just because many other people are doing it. Marketers, propagandists, and political candidates all use the bandwagon effect to gain followers. “Everyone else is doing it, and so should you” is a tempting appeal for consumers and constituents alike. As Roosevelt observed, people will tumble over each other to avoid being left behind, to conform to group norms, to feel included and accepted.

Psychologists have proposed several factors that play into the bandwagon effect. First, because social connection is key to human survival and well-being, we desperately want to feel that we belong to a group, so we often feel pressure to conform to an idea or trend that is popular among people we know. Second, the fear of missing out on something important is a strong motivator of human behavior. Third, using group norms as a benchmark acts as a shortcut in individual decision-making. Finally, we all like to be on the winning side of things—and the person, idea, or behavior most likely to win is the one that is most popular. The bandwagon effect is reinforced through a positive feedback loop: as more people become aware of something or actively do it (as more people jump on the bandwagon), others become more likely to accept it and jump on as well.

The dark side of the bandwagon effect emerges when group norms are ethically or morally questionable. The bandwagon has much potential for good—such as in developing positive views toward populations that were historically marginalized in society—but it can also contribute to the growth of extreme or dangerous social and political movements. The pressure to conform can also confine individual expression and lead to feelings of exclusion for those who do not jump on the bandwagon. It can be difficult to hold out against something one believes to be wrong when it seems like everyone else is doing it; in truth, however, it is rarely the case that everyone else is doing it. Your perception that everyone around you is participating in a belief or behavior you find unconscionable may itself be the product of cognitive bias, and media portrayals can feed that bias by presenting a distorted view of reality.

In the digital age, the potential to exploit the bandwagon appeal is magnified by social media platforms. If an influencer has tens of thousands of followers, you might assume that they have something important to say or that people you know are benefiting from their content or product recommendations. Propagandists use bots and other fake accounts to build a large following and convince real users that everyone else is following them, so they must be legitimate.

Additionally, mass media and social media can create a false perception of public opinion about a given practice or issue. Those who make decisions about what content to promote to a large audience may choose to frame extreme, biased, or obscure ideas in a way that asserts that “everyone” believes this—and so should you! Those who influence the audience’s perception can then, in fact, induce a bandwagon effect.

The reverse bandwagon effect occurs when people avoid something precisely because they perceive it as popular; they refuse to do it because everyone else is doing it. For the contrarians among us, just know that you are still under the sway of a cognitive bias that shapes the way you act!

From cheering for sports teams to playing the stock market, from fashion trends to human rights, the bandwagon effect can be seen in just about every area of society. None of us are immune to this bias. However, when we are aware of our cognitive biases, we can be better prepared to make decisions that are consistent with who we are rather than what we perceive others are doing.

Sources

“Anything to Beat Hamilton.” Woodstock Sentinel, February 14, 1884.

“Bandwagon.” Online Etymology Dictionary. https://www.etymonline.com/word/bandwagon#etymonline_v_260.

Delwiche, Aaron. “Bandwagon.” The Propaganda Critic, August 8, 2018. https://propagandacritic.com/index.php/how-to-decode-propaganda/bandwagon/.

Tréguer, Pascal. “Origin of ‘to Jump on the Bandwagon.’” Word Histories, January 22, 2018. https://wordhistories.net/2018/01/22/jump-bandwagon-origin/.

The Decision Lab. “Why Do We Support Opinions as They Become More Popular?” Retrieved June 27, 2021, from https://thedecisionlab.com/biases/bandwagon-effect/.

Upton, Emily. “The Origin of the Phrase ‘Jump on the Bandwagon.’” Today I Found Out, April 24, 2014. http://www.todayifoundout.com/index.php/2014/04/origin-phrase-jump-bandwagon/.

Cover image by Freekee, July 21, 2009, public domain via Wikimedia Commons.

Why Are Salads “Salted”?

Why does the word salad sound suspiciously like the word for salted in many languages? And where did salads come from, anyway? The answer takes us from ancient Rome to the high-class hotels of New York to Tijuana, Mexico.

Let’s talk about salads. From garden salad to pasta salad to taco salad to glorified rice (which is a real thing), the dishes we apply this name to are vastly different from one another. A salad can be served at any point in a meal—as a first course, as a side dish, as a main entrée, or even as a dessert. The only requirement for a salad is that various food items be mixed together, a definition broad enough to put a mix of tuna and mayonnaise in the same category as a mix of tropical fruits and jello.

The Roman Salad

The word salad came into English by way of the Old French salade, from the Latin salata. The meaning is just what it sounds like: “salted.” In many Romance languages, the word for salad means “salted” (e.g., ensalada in Spanish). German, Swedish, and Russian likewise borrowed the word as salat.

The original salads were herba salata, a popular Roman dish of leafy greens flavored with salt, olive oil, and vinegar. Similar dishes made from lettuce and other greens were widely enjoyed in ancient Egypt, Greece, and Rome.

The Greek physician Hippocrates reinforced the practice of eating salads by claiming that raw vegetables eaten before a meal helped clear the intestines and ensure healthy digestion. Others countered that the salad should be eaten after the meal because the vinegar in the dressing would clash with the flavor of the wine drunk during the meal.

Despite its popularity in the Roman Empire, salad was not consumed in every area of the world. In China, for example, salads were not appealing, and raw vegetables were considered to carry a risk of illness. Instead, vegetables were boiled or cooked in stir-fries.

Shakespeare’s Salads

During the Middle Ages, salads were a staple food for common people and royalty alike, composed of flowers, herbs, wild berries, and vegetables grown in household gardens. A salad was typically served as a starter for a meal.

In Renaissance Europe, salads still consisted mainly of greens, but the seventeenth-century “grand salad” included small bits of meat for the first time.

In his 1606 play Antony and Cleopatra, Shakespeare coined the phrase “salad days” to describe youthful inexperience—akin to being called a “greenie” when you’re new to a group or an activity. Cleopatra, regretting her youthful fling with Caesar, says:

My salad days, / When I was green in judgment, cold in blood, / To say as I said then!

In 1699, British author, horticulturist, and vegetarian John Evelyn published a masterwork on salads called Acetaria: A Discourse of Sallets. While many Britons saw meat and grains as the more desirable parts of a meal, Evelyn believed wholeheartedly in the benefits of eating salads and promoted a meatless diet. He also brought a spirit of experimentation and innovation to the kitchen and published many unique salad recipes.

Modern Salad and Social Stereotypes

According to Laura Shapiro, author of Perfection Salad, salads became a fixation of home cooks during the 1920s era of home economics and scientific cooking. She writes, “The object of scientific salad making was to subdue the raw greens until they bore as little resemblance as possible to their natural state. . . . This arduous approach to salad making became an identifying feature of cooking-school cookery and the signature of a refined household.” Because of the fussy and “dainty” nature they took on during this period, salads became associated with the women who typically prepared them. This led to a gendering of the salad, which had not been viewed in those terms before. Salads also came to be seen as a food for the wealthy: fresh ingredients were available only to those who could afford both the vegetables and the means to store and refrigerate them.

In later decades, pressure to be thin and conform to societal beauty standards further reinforced the perception of salads as “ladies’ food,” in company with yogurt, chicken, and other “diet” foods. Salads took on a halo of health, even though heavier salads with creamy dressings and fried toppings may not have been the most nutritious choice. Still, the bare-bones salad made of little more than lettuce and chopped vegetables came to be seen as something you should eat but won’t enjoy, a code for “joyless healthy eating”—although marketers tried their best to convince women that they were supposed to love eating salad (remember Women Laughing Alone with Salad?). The food that was once favored in ancient Rome and enjoyed by upper and lower classes alike was now a symbol of deprivation. It was a symbol of never quite feeling satisfied, of never quite filling your own needs, of shrinking oneself down to please other people. The salad now comes with a higher price tag, yet it remains stigmatized: in part because it meant not feeling full or satisfied, in part because it was associated with women, and in part because it just didn’t seem like a complete meal to most Americans.

Salad is shedding its gender stereotypes and its associations with deprivation and daintiness. A salad can be eaten by men and women, boys and girls—hopefully with the understanding that nothing about salad is inherently gendered, and that something typically ascribed to women is not somehow inferior.

A salad can also be substantial and delicious, filled with a variety of ingredients and toppings, or it can accompany an entrée as a side. Thanks both to cultural pushback against salad stereotypes and to the innovations of chefs, salad restaurants are becoming popular lunch destinations that offer a wide variety of options for salad lovers. You can have your salad and feel satisfied too!

On that note, let’s take a look at the origins of some common, delicious salads whose names you may never have questioned before.

Caesar Salad (Which Caesar?)

Image by Geoff Peters, CC BY 2.0 via Wikimedia Commons.

Despite the name, Caesar salad actually originated in Tijuana, Mexico, in 1924. The salad is named for the restaurateur, Caesar Cardini, who ran an Italian restaurant called Caesar’s. On the Fourth of July weekend, the restaurant was incredibly busy with American tourists, and Cardini improvised a new salad to feed hungry guests. Waiters made the salad tableside: garlic croutons, grated parmesan cheese, soft-boiled eggs, and a vinaigrette made with Italian olive oil and Worcestershire sauce were all placed atop pieces of romaine lettuce, which diners could eat with their hands.

In 1926, Cardini’s brother Alex, who had flown for the Italian Air Force in World War I, joined Caesar at the restaurant and modified the salad recipe. Alex’s version had the creamy, anchovy-flavored Caesar dressing we know today. His “Aviator Salad,” as he called it, became more popular than the original and was renamed “Caesar’s salad.” During the Prohibition era, Hollywood celebrities and ordinary Californians made the trip to Tijuana to dine at Caesar’s, as excited for the Caesar salad as they were for the alcohol.

Soon, Caesar’s salad began to appear in restaurants across the U.S. and Europe. This may be attributed to the influence of Wallis Warfield Simpson, the mistress and eventual wife of Edward, Prince of Wales, who briefly reigned as King Edward VIII before abdicating. A frequent traveler to Tijuana and San Diego, she met the prince during her travels and also frequented Caesar’s. As duchess-to-be, she attempted to instruct European chefs on how to re-create her favorite salad. She also began to cut the lettuce into smaller pieces rather than eating the salad with her fingers, exemplifying what she believed was the proper etiquette for a royal lady.

Cobb Salad (Where’s the Corn?)

Image by miheco, CC BY-SA 2.0 via Flickr.

Cobb salad was created in the 1930s at the Brown Derby, a Los Angeles restaurant owned by Robert Howard Cobb. The most likely origin story holds that in 1937, Cobb fixed himself a late-night meal, using up anything and everything left in the kitchen to satisfy his hunger. He then mentioned his cobbled-together creation to Sid Grauman, a top Hollywood promoter, who asked to try the dish. Grauman loved it: the original mix of chopped avocado, celery, tomato, chives, watercress, hard-boiled eggs, chicken, bacon strips, and Roquefort cheese packed a whole lot of variety and flavor onto a single plate.

With the backing of Hollywood-style promotion, the Cobb salad was on its way to international success. The modern Cobb typically contains chopped chicken, bacon, hard-boiled eggs, bleu cheese, tomatoes, avocado, and lettuce, though there are lots of variations.

Chef Salad (Who’s the Chef?)

Image by arbyreed, CC BY-NC-SA 2.0 via Flickr.

Historians disagree about the exact origin of the chef salad and the ingredients that compose it. Many claim that the original chef salad emerged in seventeenth-century England as a dish called “salmagundi,” a mixture of chopped meat, anchovies, eggs, onions, and olive oil. Other historians claim that the chef salad was invented by chef Victor Seydoux at the Hotel Buffalo in New York. Guests liked his improvised salad of meat, cheese, tomatoes, cucumbers, and hard-boiled eggs so much that it was added to the menu. When given the honor of naming the salad, Seydoux remarked, “Well, it’s really a chef’s salad.” Seydoux later worked at the Ritz-Carlton, where chef Louis Diat propelled the salad to fame and served it to many well-to-do travelers. Diat added smoked ox tongue and French dressing, neither of which is found on a typical chef salad today. In the end, “chef salad” is more of a label for a variety of composed salads and can be adapted to whatever you have in the kitchen.

And that’s the beauty of salads—they lend themselves to imagination, improvisation, and innovation. From just a few ingredients to elaborate mixtures of exotic flavors, from appetizers to dessert, we’ve just scratched the surface of the dish that lends itself to endless variety and experimentation.

Sources

Beck, Julie. “The Sad Ballad of Salad.” The Atlantic, July 28, 2016. https://www.theatlantic.com/health/archive/2016/07/the-sad-ballad-of-salad/493274/.

Briskin, Lizzy. “The History behind Your Favorite Salads.” Munchery, September 11, 2020. https://www.munchery.com/blog/history-behind-your-favorite-salads/.

“Here Is the Definitive History of Mankind’s Finest Food: The Salad.” HuffPost. https://www.huffpost.com/entry/evolution-of-the-salad_n_7101632.

Olver, Lynn. “Salads.” The Food Timeline. Retrieved June 22, 2012, from https://www.foodtimeline.org/foodsalads.html.

Ridder, Knight. “The Word Salad Was Coined from the Latin for Salted Greens.” Chicago Tribune, August 30, 2000. https://www.chicagotribune.com/news/ct-xpm-2000-08-30-0008300394-story.html.

“Salad.” Online Etymology Dictionary. Retrieved June 22, 2012, from https://www.etymonline.com/word/salad#etymonline_v_22616.

Sarah B. “Salat: A Medieval Aromatic Salad.” A Dollop of History, August 25, 2016. https://historydollop.com/2016/08/25/salat-an-aromatic-medieval-salad/.

Shakespeare. Antony and Cleopatra. Act 1, Scene 5.

Shapiro, Laura. Perfection Salad: Women and Cooking at the Turn of the Century. New York: North Point Press, 1986, pp. 96–99.

Stradley, Linda. “History of Salads and Salad Dressings.” What’s Cooking America. Retrieved June 21, 2021, from https://whatscookingamerica.net/History/SaladHistory.htm.

Wikipedia. “Chef Salad.” Retrieved June 21, 2012, from https://en.wikipedia.org/wiki/Chef_salad.

Running for Fun

Why do people run for fun—not because they’re being chased by a tiger or forced to run the mile in gym class? The answer involves the Olympics, the police, and advocacy for women’s athletics.

“If you don’t think you were born to run, you’re not only denying history. You’re denying who you are.” –Christopher McDougall, Born to Run

Run to Live

People have been running since the dawn of humanity. Evolutionary biologists posit that certain anatomical characteristics unique to humans enhance our ability to run while conveying no additional benefit for walking.

Many other animals can run, of course, but humans are uniquely suited to distance running. From literally head to toe, we are made to run. Our upright posture, enhanced neck and head stability, and skeletal and muscle adaptations that enable long-distance, bipedal running are some of the evolutionary traits that make us human. Long legs relative to our body size help lengthen our stride. Springlike tendons in our legs and feet work like large elastics to absorb shock and conserve energy. Our sweat glands allow us to cool off without panting and keep our bodies from overheating. Large gluteal muscles are critical for stabilizing the body during running. The arrangement of bones in the feet and toes provides a stiff arch that can push off the ground more efficiently. Evolutionary biologists Dennis Bramble and Daniel Lieberman state that “the fossil evidence of these features suggests that endurance running . . . may have been instrumental in the evolution of the human body form” (Bramble and Lieberman, 2004). This means that running is encoded into our genes. The trade-off of all these beneficial biological adaptations was that our species is no longer well-suited to live in trees as our primate ancestors did.

According to Lieberman, about 2.6 million years ago, early human species began to eat meat, which could be obtained through either scavenging or hunting. About 2 million years ago, these distance-running adaptations became characteristic of Homo erectus populations, the ancestors of modern humans, as those who were better runners were better able to survive. Maybe we couldn’t run faster than a cheetah, but we could outrun any animal on earth in terms of distance. Persistence hunting was thus a primary survival strategy for early human species. The slow but steady pursuit of prey yielded great rewards as humans simply outlasted their target. Armed with only simple weapons, they tracked and chased animals during the hottest part of the day and made their prey run faster than it could sustain for long, eventually overpowering an animal as it developed hyperthermia and collapsed. Sweaty and relentless, early humans used this strategy to great advantage (Powell, 2017).

Additionally, the consumption of animal meat provided more calories than plants alone, which fueled the growth of larger body sizes. Meat also provided amino acids and nutrients needed to develop more complex brains and higher-level cognitive functioning.

Some anthropologists criticize this hypothesis, pointing out that few populations practice persistence hunting today and that the strategy works well in hot grassland or savanna environments but may be less effective in other climates. It’s true that persistence hunting is not common among present-day hunter-gatherer societies; however, more advanced technology like spears and projectile weapons has made it less necessary. Additionally, our ancestral environments 2 million years ago likely differed from the present-day environments in which ethnographic research has been conducted.

Live to Run

Some other animals, like dogs and horses, can run great distances if they are forced to or to escape from danger. But humans are better at it, and what’s more, humans voluntarily run for miles on end.

Track and field is one of the oldest sports in the world. Since prehistoric times, humans have put their natural abilities in running, jumping, and throwing to use in athletic contests. If there was ever an official beginning of running as a sport, it was the first ancient Olympic Games, held in 776 BCE, in which the stadion footrace was the only event.

Running has also long been used as an exercise to build and maintain physical fitness for military activities.

Running as a form of recreation—not just for athletic competition, military conditioning, or survival—began to gain traction first in New Zealand. Cross-country running coach Arthur Lydiard promoted jogging through his Auckland Joggers Club. Bill Bowerman, a University of Oregon running coach and later cofounder of Nike, returned from a trip to New Zealand in 1962 and brought back Lydiard’s wisdom to the United States. Bowerman published a pamphlet and a book on the topic of jogging, casting it as an activity not just for pro athletes but for the average person who wanted to live a healthy lifestyle. He showed through research that running improved cardiovascular health, and his book was endorsed by medical professionals.

In the late 1960s, running for exercise started to catch on in the United States, though it was still considered strange. Professional athletes and boxers ran as part of their training, but now ordinary people were starting to join in. It was weird. Everyone stared at them. In 1968, the Chicago Tribune devoted a whole page to a strange new fitness trend called jogging, and the New York Times poked fun at the new “in” sport. What freaks, these people who chose to run in their free time! The most dedicated ones would run up to an entire mile.

Joggers in these early years attracted the attention of suspicious neighbors and police officers who were alarmed at grown men and women running down the street, suspecting “folly” at play. Runners were sometimes stopped on the street and given citations for their unusual use of the road.

The Running Boom

In the 1970s, nearly 25 million people hit the ground running in races, on trails, and on roads throughout America, largely inspired by Frank Shorter’s victory in the 1972 Olympic marathon. The 26.2-mile race was relatively unknown to Americans up until this point.

Shorter’s 2:12:19 finish marked the first American gold medal in the marathon since 1908. But the finish was upstaged by a German impostor who darted into the stadium ahead of Shorter: a college student who had put on a track uniform and joined the race for the last kilometer, first to cheering from the crowd and then to booing as officials got word of the hoax. “That’s not Frank, it’s an imposter, get that guy off the track! It’s a fraud, Frank!” the commentator called out on the air. (The Washington Post named this one of the most memorable sports calls in history.) The coverage of this event changed the way the nation thought about long-distance running. The marathon, once an obscure event that you’d have to be crazy to attempt, was now front and center.

During the “running boom,” road racing events spread throughout the country, allowing public participation rather than restricting participation to exclusive, members-only athletic clubs. Ordinary people were doing 5Ks, 10Ks, and even marathons now. Australia, the UK, and other European nations saw a similar trend. Additional factors that contributed to the craze included several other books and studies about the health benefits of running, professional and Olympic runners such as Steve Prefontaine, and companies like Nike that gave a high profile to running in popular culture. Now it was not only acceptable but cool to be a runner.

Around this time, women’s participation in athletic events was gaining more acceptance. Title IX opened more opportunities to compete at the college level, and universities expanded cross-country and track teams for women to fulfill Title IX requirements. Women found road running and marathon running to be a prime entry point into the world of professional and college athletics. The feats of pioneering women runners like Kathrine Switzer (the first woman to run the Boston Marathon as an official entrant), Jacqueline Hansen (two-time world record holder and successful advocate for adding the women’s marathon, 5,000 meter, and 10,000 meter events to the Olympics), Miki Gorman (elite marathoner famous for a New York City–Boston–New York City triple win in 1976–77), and Joan Benoit (first winner of the women’s Olympic marathon) inspired women to take up running—for recreation and for aspiring to competitive speeds and distances, for health and for ambition and for fun.

Running made a smooth transition from survival strategy to sport, and from sport to many other roles in play, exercise, stress relief, and community and social life.

Benefits of Running

It turns out that running has great health benefits as well. The cognitive and physical benefits of running and other types of aerobic activity have been studied extensively. Running facilitates cell growth and expansion in the hippocampus, which is the area of the brain associated with memory. It improves neural plasticity and promotes neurogenesis, which in turn lead to better memory and learning capabilities (Schulkin, 2016).

Running is often a way for people to relieve stress. This works because the body releases endorphins during and after running, producing a feeling of euphoria. Schulkin (2016) writes, “Long-distance running partially involves combating pain and discomfort. . . . To struggle is to succeed, and to cope with struggling, the human body has evolved to release hormones associated with euphoric states so that when one is faced with a particularly trying physical feat, the [brain] is permeated with chemicals that induce a sense of calmness.”

Physically, running improves cardiovascular and respiratory health and strengthens the immune system; it also supports mental health and can boost feelings of confidence and self-esteem.

Running is not for everyone, but it is for a lot of us. Distance running is in our genes—it’s one of the most quintessentially human things we can do, and it helps us become physically and mentally resilient. And the good news is that you don’t have to run 26.2 miles to reap many of the benefits of running any more than you need to hunt for your food in the grasslands.

BONUS: Check out “When Running Was for Weirdos” below.

Sources

Alex, Bridget. “Running Made Us Human: How We Evolved to Run Marathons.” Discover Magazine, April 12, 2019. https://www.discovermagazine.com/planet-earth/running-made-us-human-how-we-evolved-to-run-marathons.

Bramble, Dennis M., and Daniel E. Lieberman. “Endurance Running and the Evolution of Homo.” Nature 432, no. 7015 (2004): 345–52. https://doi.org/10.1038/nature03052.

Lieberman, Daniel E., Dennis M. Bramble, David A. Raichlen, and John J. Shea. “The Evolution of Endurance Running and the Tyranny of Ethnography: A Reply to Pickering and Bunn.” Journal of Human Evolution 53, no. 4 (2007): 439–442. https://dash.harvard.edu/handle/1/3743587.

Edwards, Phil. “When Running for Exercise Was for Weirdos.” Vox, August 9, 2015. https://www.vox.com/2015/8/9/9115981/running-jogging-history.

Pobiner, Briana. “Evidence for Meat-Eating by Early Humans.” Nature Education Knowledge 4, no. 6 (2013): 1. Human Origins Program, Smithsonian Institution.

Powell, Alvin. “Humans Hot, Sweaty, Natural-Born Runners.” The Harvard Gazette, April 19, 2017. https://news.harvard.edu/gazette/story/2007/04/humans-hot-sweaty-natural-born-runners/.

Schulkin, Jay. “Evolutionary Basis of Human Running and Its Impact on Neural Function.” Frontiers in Systems Neuroscience 10, no. 59 (2016). https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4939291/.

University of Utah. “How Running Made Us Human: Endurance Running Let Us Evolve to Look the Way We Do.” ScienceDaily, November 24, 2004. https://www.sciencedaily.com/releases/2004/11/041123163757.htm.

“When Did the History of Running Begin?” My Running Tips. http://www.myrunningtips.com/history-of-running.html.

Wikipedia. “Running Boom of the 1970s.” https://en.wikipedia.org/wiki/Running_boom_of_the_1970s.