The Archetypal Apple

Why are apples seen as the “default” fruit in Western culture? The answer involves Greek myths, Latin spelling mistakes, and English semantic narrowing.

The English word apple comes from the Old English æppel, which meant not only “apple” but “any kind of fruit” or “fruit in general.” It’s an old, old word stemming from Proto-Indo-European *ab(e)l-, meaning “apple.” In Middle English and Early Modern English, eppel or appel was mostly used as a generic term for all types of fruit, excluding berries but including nuts. Dates were fingeræppla (“finger apples”), cucumbers were eorþæppla (“earth apples”), and bananas were appels of paradis (more on that later!).

The simple answer to our question, then, is that it is a matter of semantic narrowing. Apple went from being a general term for fruit to denoting the fruit we know today as an apple. Greek and the languages descended from Latin went through a similar process with their words for fruit. The Greek word melon originally meant “apple,” but it was combined with other roots to form words like mēlopepon, “gourd-apple,” and was used in Greek as a generic term for any unfamiliar fruit. The French word pomme comes from the Latin pōmum, “fruit,” a word connected to Pomona, the goddess of fruit trees. Pomme has likewise been used to refer to fruit in general or to apples specifically (the French phrase pomme de terre, literally “earth apple,” is the term for “potato”). The semantic value of the apple lies in the fact that it is an archetype for fruit, a pattern or prototype for all other fruits.

The apple holds great meaning in many cultural traditions throughout the world. Fruit in general is often seen as a symbol of fertility due to both its form and function. The Penguin Dictionary of Symbols records that the various meanings attached to the apple are—at their core—all interconnected. The apple is seen as a key to knowledge and wisdom of some kind, whether that be knowledge of mortal life and humanity, knowledge about oneself, or intimately knowing another. Let’s take a look at some of the ways the world views the apple.

Apples in Olympus

Greek, Chinese, and Norse tradition all contain various references to and stories about apples wherein they are symbols of fertility, beauty, and eternal youth. Apples can also be a negative symbol of temptation or vanity.

In Greek mythology, Eris, the goddess of chaos and discord, threw an apple into the wedding party of Thetis and Peleus out of anger that she had not been invited. (The “apple” in the story was actually a now-extinct fruit grown in the Balkans that was similar to a pomegranate.) She inscribed into the apple kallisti (To the Prettiest One). The goddesses Hera, Athena, and Aphrodite all claimed the apple of discord, and Zeus appointed Paris of Troy to select which of the three the apple should belong to. Aphrodite, goddess of love and fertility, persuaded Paris to give her the apple by promising him that she would make Helen—her half-sister and the most beautiful woman in the world—fall in love with him. The resulting relationship between Helen and Paris precipitated the Trojan War. Thus, an “apple of discord” is the kernel of a small argument that leads to a much bigger dispute!

Golden Apple of Discord by Jacob Jordaens, 1633. Museo del Prado.

Based on this story, the apple became a sacred symbol of Aphrodite (or Venus, in Roman tradition). Throwing an apple at someone was the ancient Greek version of a marriage proposal or declaration of love, and to catch the apple was to accept. In Greek tradition, newlyweds share an apple on their wedding night to ensure a “fruitful” union.

Apples in the Garden

The fig tree, native to the Mediterranean and the Middle East, held symbolic meanings similar to those of the apple, as evidenced in ancient religious texts. The fig was one of the earliest domesticated fruits in the world, along with the olive and the grape—all of which have their origins in the Fertile Crescent, one of several cradles of human civilization.

The Garden of Eden, the location of the creation narrative in Abrahamic religions, contained an abundant variety of trees and plants. Adam and Eve, the first human beings, are commanded by God to eat freely from the garden except from the Tree of Knowledge of Good and Evil.

In the Hebrew Bible, the Tree of Knowledge is not identified with a particular type of fruit. Eve is tempted by a serpent to eat the fruit of the Tree of Knowledge, and she and Adam both choose to do so. Their eyes are opened as they receive knowledge, and they cover themselves with fig leaves when they realize they are naked. In Hebrew tradition, the Tree of Knowledge itself is considered to be a fig tree, though this is not stated in the text.

In Islamic tradition, the Tree of Immortality, as it is known, is often portrayed as a fig or olive tree.

Manafi al-Hayawan (The Usefulness of Animals) by Abu Said Ubayd Allah Ibn Bakhtishu.
1294–99 CE. Maragheh, Iran.

Buddhist tradition sees the fig as a symbol of enlightenment. The Buddha reached enlightenment under a Bodhi tree, a species of fig tree. This symbolism from another part of the world echoes the metaphor in the Garden of Eden account: the fruit of knowledge is enlightenment.

In addition to representing knowledge, the fig is strongly associated with fertility and abundant life in many cultures. It is a symbol of male and female joining together—its plump shape is a metaphor for female fertility, and the sap of the tree represents male fertility.

In Christian tradition, the fruit of the Tree of Knowledge is often portrayed as an apple, though it is variously seen as a fig, pear, or pomegranate—all richly evocative of the ideas of fertility, the cycle of life, and desire due to their resemblance to human sexual anatomy. One fruit contains many seeds, each with the potential to produce a tree, which will then produce more fruit, and the cycle continues forever. The dual meanings of temptation and fertility are thus strongly associated with fruit in general, and the apple and fig in particular. Some Christians hold that one of the results of Adam and Eve’s choice to eat the fruit is the physical condition needed for procreation, as well as the knowledge needed to navigate mortality.

La tentation d’Adam et Ève, thirteenth century.

So how did the fig get turned into an apple?

The idea that the fruit of the Tree of Knowledge was an apple comes from a mix-up of the Latin words mălum, meaning “evil,” and mālum, meaning “apple.” The fruit of the Tree of Knowledge of Good and Evil was turned into the fruit of the Tree of Knowledge of Good and Apples!

Later literature that drew from the Bible, such as John Milton’s Paradise Lost (1667), continued to cast the fruit as an apple, which only reinforced this mistranslation—but also reinforced the existing link between apples and knowledge. Renaissance art often featured the apple as the “forbidden fruit” in the Garden of Eden story.

Another matter to consider is that in contrast to figs, “apples were historically among the most difficult fruit trees to cultivate and among the last major ones to be domesticated in Eurasia, because their propagation requires the difficult technique of grafting” (Diamond, 1997, p. 150). Perhaps the cultural and dietary significance of apples was greater for Latin speakers at the time they were interpreting the Bible, while figs were more prominent for Hebrew speakers of an earlier era—though this is no more than conjecture.

The apple is referenced elsewhere in the Old Testament as well—readers are instructed to keep God and God’s commandments as “the apple of thine eye” (Proverbs 7:2, see also Deuteronomy 32:10, Psalms 17:8). The “apple” of one’s eye refers to the pupil, which resides in the very center of one’s eye and is fixed upon the thing one desires. In Hebrew, the word used for “apple” in these verses literally means “dark part of the eye.” The word “apple” was substituted in English translations of the Bible, using an idiom that first appeared in Old English around the ninth century. The phrasing in the English translation indicates that, to an English speaker, an apple represents the thing most desired or cherished above all others.

In the Song of Solomon, the apple is likewise a metaphor for beauty and desire: “As the apple tree among the trees of the wood, so is my beloved among the sons. I sat down under his shadow with great delight, and his fruit was sweet to my taste” (Song of Solomon 2:3). The word “apple” here is variously translated as “orange” or “citron”—the idea being that fruit of any kind is a symbol of desire and sweetness, among other meanings.

If an apple represents a thing that is most desirable, it makes sense for English speakers to cast the tree in the Garden of Eden as an apple tree, which Eve saw was “to be desired to make one wise” (Genesis 3:6, emphasis added).

Adam’s Apple

The Adam’s apple, a laryngeal protuberance formed by cartilage, is present in all humans, but is much more pronounced in men.

You may have heard the folk etymology behind the Adam’s apple as something like this: After Eve ate the fruit (“apple”) of the Tree of Knowledge, she convinced Adam to taste it, and a chunk of it got stuck in his throat as a reminder of his transgression. He then passed this on to all of his posterity in the form of a protuberance in the throat. It was later given the name “Adam’s apple” as a reminder of the Garden of Eden account.

Though this seems like a plausible origin for the name in a society that held the Bible in high regard, it was not really the inspiration behind the term. The English term as applied to human anatomy has been in use since 1625. The French pomme d’Adam and the German Adamsapfel both refer to the same thing. From the medieval period until the 1700s, a term meaning “Adam’s apple” was also used in various languages to describe literal fruits—pomelos, citrons, and plantains, for example, were all called “Adam’s apple” at one point. A Mediterranean variety of lime with indentations resembling the mark of a person’s teeth was a particularly vivid reminder of Adam biting into the fruit in the Garden of Eden.

Medieval Latin texts use the term pomum Adami as a name for several different fruits, including the pomegranate. This name implied that these were among the “fruits of paradise” enjoyed in the Garden of Eden. Around the same time, medieval Arab medical scholars were cataloging the anatomy of the throat and settled on a word meaning “pomegranate” as the name for the laryngeal protuberance. We don’t know exactly why they chose this metaphor, but the pomegranate, too, is highly symbolic in Islamic religious tradition and beyond. European writers adopted the Latin translation, pomum granatum, for the laryngeal protuberance, then applied the synonym already in existence: pomum Adami.

And there you have it—the apple features prominently in mythology and religious thought, while etymologically capturing the essence of fruit itself. It is a symbol of temptation and knowledge, desire and abundant life.

The various meanings of the apple show up elsewhere in everyday life and popular culture. Snow White in her naivety was tempted to eat a poison apple that put her under a curse that only a prince could break (which strongly parallels Christian themes, if you think about it). Johnny Appleseed went down in American folk history as the sower of both apple seeds and religious ideals, spreading fruit and wisdom in service of nature and his fellow humans everywhere he went. Your laptop and phone most likely have an Apple logo on them. A student presents an apple to the teacher as a gift of knowledge.

How do you like them apples?

Sources

“Adam’s Apple.” Online Etymology Dictionary. Retrieved July 31, 2021, from https://www.etymonline.com/word/Adam’s%20apple#etymonline_v_40638.

“Apple.” Online Etymology Dictionary. Retrieved July 31, 2021, from https://www.etymonline.com/search?q=apple.

Diamond, Jared. Guns, Germs, and Steel: The Fates of Human Societies. New York: W.W. Norton, 1997.

“Fruit in Mythology.” Encyclopedia of Myths. Retrieved July 31, 2021, from http://www.mythencyclopedia.com/Fi-Go/Fruit-in-Mythology.html.

Kettler, Sarah. “7 Facts About Johnny Appleseed.” Biography, June 11, 2020. https://www.biography.com/news/johnny-appleseed-story-facts.

“Melon.” Online Etymology Dictionary. Retrieved July 31, 2021, from https://www.etymonline.com/word/melon.

Merriam-Webster. “Why Is It Called an Adam’s Apple?” Word History. https://www.merriam-webster.com/words-at-play/why-is-it-called-an-adams-apple-word-history.

“Pomona.” Online Etymology Dictionary. Retrieved July 31, 2021, from https://www.etymonline.com/word/Pomona.

Smithfield, Brad. “In Ancient Greece, Throwing an Apple at Someone Was Considered a Marriage Proposal.” The Vintage News, September 10, 2016. https://www.thevintagenews.com/2016/09/10/ancient-greece-throwing-apple-someone-considered-marriage-proposal/.

Tearle, Oliver. “The Curious Symbolism of Apples in Literature and Myth.” Interesting Literature, April 2021. https://interestingliterature.com/2021/04/apples-symbolism-in-literature-myth-meaning-analysis/.

Wikipedia. “Apple of My Eye.” Retrieved July 27, 2021, from https://en.wikipedia.org/wiki/Apple_of_my_eye.

Spelling Bee

Why do good spellers compete in a spelling “bee”? The answer involves all the favorite subjects of a spelling bee winner—etymology, philology, and, of course, spelling.

The Queen Bee

According to Merriam-Webster, lookups of the word “murraya” spiked 100,000% on July 8–9, 2021.

Patterns of word usage ebb and flow over time, and—based on current events, pop culture, and other new or recycled ideas—so does our interest in certain words. One of those events is the Scripps National Spelling Bee.

On July 8, 2021, eighth grader Zaila Avant-garde of Louisiana became the first Black American to win the highest honor that may be bestowed upon the orthographically gifted. The winning word “murraya,” which most of us have probably never heard before, refers to a genus of tropical Asiatic and Australian trees named for Swedish botanist Johan A. Murray.

And Zaila’s ability to spell obscure words is just the beginning of her talents—she also holds three basketball-related world records, she can unicycle and juggle simultaneously, and her side interests include gene editing and neuroscience.

The Helpful Bee

Unlike “murraya,” the word bee itself isn’t likely to turn up on a spelling bee word list. But most people, even spelling champions, probably don’t know the origin of the word. (Are honeybees particularly good at spelling competitions?)

As used in the context of a spelling bee, “bee” is an alteration of a word that was rendered “been” in some dialects of English. The word descends from the Middle English “bene,” which denoted “voluntary help given by neighbors toward the accomplishment of a particular task” (Merriam-Webster). “Bene” is also related to the English word “boon,” which similarly indicates a blessing, benefit, or favor.

This word “bee” has been used to describe community activities where neighbors made a social event out of helping each other with tasks. Historically, you might have attended a quilting bee, a (corn) husking bee, or a (barn) raising bee.

A pioneer quilting bee. The Quilt That Walked to Golden, p. 31.

And yes, some linguists also connect this term with the insect type of bee. The industrious and cooperative nature of bees provides an apt metaphor for a group of friendly neighbors working together to accomplish a task.

“Spelling bee” began to show up in print sources around the turn of the twentieth century. However, it was often modified with terms like “old-fashioned,” indicating that the spelling bee had been around for quite some time but under different names. Before then, a spelling competition might have been called trials in spelling, spelling school, spelling match, spelling-fight, spelling combat, or spelldown (these are all beginning to sound more like a Wizard’s Duel than anything else!).

The spelling bee, which is often described as a “brain sport,” is typically seen as competitive rather than cooperative. But the hard work required to prepare for such a competition and the buzzing of young contestants reciting letters point us toward the characteristics of the honeybee.

And what’s more, bees are foundational to our ecosystem. They pollinate the flowering plants we depend on for food, raw materials, and beauty, and they turn nectar into sweet, sweet honey. Likewise, letters and words are the foundational elements of the English language, which has been cross-pollinated by Latin, French, and many other linguistic influences to produce the rich vocabulary and ever-evolving possibilities of expression that English offers today.

The Spelling Bee

In English, there was no such thing as “correct” spelling until the eighteenth century. Before then, writers freely spelled the same words in different ways. While others could generally understand what they meant, there was mounting frustration over the lack of regularity in the written language. This frustration, along with a Protestant push to increase literacy so that common people could read the Bible, led to the publication of English dictionaries. Samuel Johnson’s dictionary of 1755 was the most influential of these, prescribing English spelling and word usage. We can view the dictionary as a record of sometimes arbitrary decisions about which spelling of a word would be considered correct—decisions that we now treat as indisputable.

Once there was an agreed-upon standard, “correct” language use came to be a sign of education. In class-conscious Britain, correct pronunciation was the mark of the elite, while in America, correct spelling was the signature of a scholar. By the mid-eighteenth century, it was common for American schools to hold spelling competitions for students—and thus, the spelling bee was born from the uniquely American obsession with prescribing how to write the English language. As mentioned previously, these competitions went by different names until the turn of the twentieth century.

Norman Rockwell, Cousin Reginald Spells Peloponnesus (Spelling Bee), 1918. Image from Wikimedia Commons.

The spelling bee was first held on the national level in 1925, when nine newspapers joined together to host the National Spelling Bee to promote literacy. The bee has been held every year since then except during World War II and the COVID-19 pandemic. In 1957, Scripps adopted Merriam-Webster’s unabridged dictionary as the official dictionary of the bee.

Spelling bees have spread to many other countries around the world, but they are generally limited to English-speaking areas. Why? Because most other languages have far more predictable spelling systems. English is one of the few languages that demands so much memorization!

To win a spelling bee takes more than just raw talent. It requires an exceptional degree of diligence and discipline for daily study, a love for the English language and its historical development, and support from expert coaches and commercial word lists. The 2006 film Akeelah and the Bee underscores the role of adult and community support, following a young girl from an urban neighborhood and single-parent household who rises through the ranks to the national spelling bee. Akeelah is a truly brilliant speller who overcomes both self-doubt and mocking from others. With courage and intelligence, she beats the odds and inspires all those who have rallied around her.

As elite spellers pass on their wisdom to the next generation and as coaching and commercial resources have become essential for success in the Scripps National Spelling Bee, the bar is raised higher each year. Data analysis offers a way to pinpoint weaknesses and make studying more efficient. We are getting better and better at the game.

In 2019, the bee ended in an eight-way tie as the contestants blew through round after round of challenging words as if they were a breeze. The Octo-Champs, as they are known, broke the game. Sports Illustrated wrote, “They hadn’t beaten one another. Instead, together, they’d beaten the dictionary.” Merriam-Webster responded: “The Dictionary concedes and adds that it is SO. PROUD.” After the astounding win, the rules were changed to include multiple-choice vocabulary questions and a lightning round to eliminate the possibility of a tie.

Whether it is a fierce competition on an international level or a local elementary school contest, the spelling bee is a celebration of the “correct” orthography that—while still not entirely fixed, much less fluid than in times past—is a mark of dedication and education. In fact, one former champion describes winning the spelling bee as an embodiment of the American meritocracy, as it requires both individual discipline and access to study resources to beat the competition (Sealfon, 2019).

Sources

Baccellieri, Emma. “How the Octo-Champs of the 2019 National Spelling Bee Have Changed the Game.” Sports Illustrated, June 7, 2019. https://www.si.com/more-sports/2019/06/07/scripps-national-spelling-bee-8-way-tie-unprecedented-result-merriam-webster-dictionary.

Bowman, Emma. “National Spelling Bee Adds New Rules to Help Winners Sting the Competition.” NPR, April 23, 2021. https://www.npr.org/2021/04/23/990400434/national-spelling-bee-adds-new-rules-to-help-winners-sting-the-competition.

Fogarty, Mignon. “Why Is It Called a ‘Spelling Bee’?” Quick and Dirty Tips, June 7, 2018. https://www.quickanddirtytips.com/education/grammar/why-is-it-called-a-spelling-bee.

Merriam-Webster. “6 Actual Names for Historical Spelling Bees.” Word History. Merriam-Webster online dictionary. Retrieved July 16, 2021, from https://www.merriam-webster.com/words-at-play/alternate-spelling-bee-titles.

Merriam-Webster. “Trending: Murraya.” Merriam-Webster Trend Watch. Merriam-Webster online dictionary. https://www.merriam-webster.com/news-trend-watch/zaila-avant-garde-wins-bee-with-murraya-20210709.

Sealfon, Rebecca. “The History of the Spelling Bee.” Smithsonian Magazine, May 2019. https://www.smithsonianmag.com/arts-culture/history-spelling-bee-180971916/.

Shankar, Shalini. “Why It’s Big News When a Black Girl Wins the Scripps National Spelling Bee.” Chicago Sun-Times, July 12, 2021. https://chicago.suntimes.com/2021/7/12/22574106/scripps-national-spelling-bee-black-education-zaila-avant-garde-sun-times.

The Editors of Encyclopaedia Britannica. “Dictionary.” Retrieved July 17, 2021, from https://www.britannica.com/summary/dictionary.

Jumping on the Bandwagon

What’s a bandwagon, and why is everyone jumping on it? The answer involves the circus, Theodore Roosevelt, and cognitive biases.

The phrase “to jump on the bandwagon” means to join the most popular side or party or to go along with something that is growing in popularity. But today, there is generally no wagon in sight.

The Original Bandwagons

The word bandwagon dates back to the late 1840s and originally referred to a large, ornate wagon that carried the band in a circus procession. P.T. Barnum, the famous circus owner and “greatest showman,” recorded, “At Vicksburg we sold all our land conveyances excepting our horses and the ‘band wagon’”—one of the first printed references to such an attraction. Barnum’s circus put on parades through the towns where they performed before they set up their show. These spectacles were made grander by brightly decorated wagons, carved and painted with circus animals, that carried the performing musicians. It was a publicity stunt, and an effective one, too.

Barnum’s circus was not the only one on the bandwagon. An 1847 Louisiana newspaper described “a magnificent band-wagon, capable of holding twenty musicians” belonging to Messrs Stone and McCollum’s Circus.

Bandwagons began to be used in parades during other celebrations and political processions, especially for the Fourth of July. In the late 1800s, politicians caught on to the publicity potential as well, and they began using bandwagons in parades on the campaign trail.

When a campaign began to generate steam, other politicians and aspiring leaders rented seats on the bandwagon and rode through town in hopes of gaining an association with the successful candidate.

Riding on the bandwagon was a mark of favor and popularity, a must for any budding politician looking to get exposure among their future constituents.

The Bandwagon as a Metaphor

In 1884, the Woodstock Sentinel of Illinois published an article called “Anything to Beat Hamilton,” an interview about political tactics used by candidates to beat out incumbent Governor John Marshall Hamilton. The article reported:

“The principal candidates for attorney general are Geo. Hunt, of Paris, Ill., and the present incumbent. I understand that Hunt is running under the wing of Oglesby. If so, he’ll beat McCartney, provided the latter allows Hunt to load him into the Hamilton band-wagon.”

“It is very evident, Judge, that you consider that any man who goes into Hamilton’s band-wagon is liable to get left.”

This is the earliest known reference that took the idea of riding in a politician’s bandwagon beyond the physical act. The bandwagon was now a metaphor for ingratiating or identifying oneself with a popular political figure in hopes that some of their success would rub off by association.

The metaphor would begin to take on a largely negative, opportunistic tone as well, in the vein of going along with the crowd and getting behind whoever was popular at the moment to raise one’s own status. As this article warned, jumping onto the bandwagon was not always the key to success—if that politician fell out of favor, so would everyone else parading along with them. Political speeches throughout the 1890s similarly warned voters not to jump on any candidate’s bandwagon too quickly.

In Theodore Roosevelt’s Letters, 1899, the future U.S. president generalized the phrase to refer to going along with anything that was popular, not just a political candidate: “When I once became sure of one majority they tumbled over each other to get aboard the band wagon.”

The Bandwagon Effect

The bandwagon effect is recognized as a cognitive bias whereby people tend to adopt certain behaviors or beliefs just because many other people are doing it. Marketers, propagandists, and political candidates all use the bandwagon effect to gain followers. “Everyone else is doing it, and so should you” is a tempting appeal for consumers and constituents alike. As Roosevelt observed, people will tumble over each other to avoid being left behind, to conform to group norms, to feel included and accepted.

Psychologists have proposed several different factors that play into the bandwagon effect. First, because social connection is key for human survival and well-being, we desperately want to feel like we belong as part of a group, so we often feel pressure to conform to an idea or trend that is popular among people we know. Second, the fear of missing out on something important is also a strong motivator for human behavior. Third, using group norms as a benchmark acts as a shortcut in the individual decision-making process. Finally, we all like to be on the winning side of things—and the person, idea, or behavior that is most likely to win is the one that is most popular. The bandwagon is reinforced through a positive feedback loop. This means that when more people are aware of or actively doing something (when more people jump on the bandwagon), other people are more likely to accept it and jump on the bandwagon as well.
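
That feedback loop is easy to illustrate with a toy simulation. The sketch below is my own illustration rather than a model drawn from any of the sources cited here; it simply assumes that each holdout’s chance of adopting a trend grows with the share of people who have already adopted it, so early popularity compounds.

```python
import random

def simulate_bandwagon(population=1000, initial_adopters=10, rounds=15, base_rate=0.01):
    """Toy bandwagon model: each round, every holdout adopts with a probability
    that rises with the fraction of the population that has already adopted."""
    adopters = initial_adopters
    history = [adopters]
    for _ in range(rounds):
        fraction = adopters / population          # current popularity
        p_adopt = min(1.0, base_rate + fraction)  # popularity feeds back into adoption
        holdouts = population - adopters
        new_adopters = sum(1 for _ in range(holdouts) if random.random() < p_adopt)
        adopters += new_adopters
        history.append(adopters)
    return history

if __name__ == "__main__":
    # Prints a rising curve: slow at first, then snowballing as the wagon fills up.
    print(simulate_bandwagon())
```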

The dark side of the bandwagon effect emerges when the group norms are questionable from an ethical or moral standpoint. The bandwagon has much potential for good—such as in developing positive views toward populations that were historically marginalized in society—but it can also contribute to the growth of extreme or dangerous social and political movements. The pressure to conform can also confine individual expression and lead to feelings of exclusion for those who do not jump on the bandwagon. It can be difficult to hold out against something one believes to be wrong when it seems like everyone else is doing it; in truth, though, it is rarely the case that everyone else is doing it. Your brain’s perception that everyone around you is participating in a belief or behavior you deem unconscionable could be the result of a cognitive bias, and media portrayals may play into this bias by presenting a distorted view of reality.

In the digital age, the potential to exploit the bandwagon appeal is magnified by social media platforms. If an influencer has tens of thousands of followers, you might assume that they have something important to say or that people you know are benefiting from their content or product recommendations. Propagandists use bots and other fake accounts to build a large following and convince real users that everyone else is following them, so they must be legitimate.

Additionally, mass media and social media can create a false perception of public opinion about a given practice or issue. Those who make decisions about what content to promote to a large audience may choose to frame extreme, biased, or obscure ideas in a way that asserts that “everyone” believes this—and so should you! Those who influence the audience’s perception can then, in fact, induce a bandwagon effect.

The reverse bandwagon effect occurs when people avoid doing something precisely because it is popular; they refuse to join in simply because everyone else has. For the contrarians among us, just know that you are still under the sway of a cognitive bias that shapes the way you act!

From cheering for sports teams to playing the stock market, from fashion trends to human rights, the bandwagon effect can be seen in just about every area of society. None of us are immune to this bias. However, when we are aware of our cognitive biases, we can be better prepared to make decisions that are consistent with who we are rather than what we perceive others are doing.

Sources

“Anything to Beat Hamilton.” Woodstock Sentinel, February 14, 1884.

“Bandwagon.” Online Etymology Dictionary. https://www.etymonline.com/word/bandwagon#etymonline_v_260.

Delwiche, Aaron. “Bandwagon.” The Propaganda Critic, August 8, 2018. https://propagandacritic.com/index.php/how-to-decode-propaganda/bandwagon/.

Tréguer, Pascal. “Origin of ‘to Jump on the Bandwagon.’” Word Histories, January 22, 2018. https://wordhistories.net/2018/01/22/jump-bandwagon-origin/.

The Decision Lab. “Why Do We Support Opinions as They Become More Popular?” Retrieved June 27, 2021, from https://thedecisionlab.com/biases/bandwagon-effect/.

Upton, Emily. “The Origin of the Phrase ‘Jump on the Bandwagon.’” Today I Found Out, April 24, 2014. http://www.todayifoundout.com/index.php/2014/04/origin-phrase-jump-bandwagon/.

Cover image by Freekee, July 21, 2009, public domain via Wikimedia Commons.

Why Are Salads “Salted”?

Why does the word salad sound suspiciously like the word for salted in many languages? And where did salads come from, anyway? The answer takes us from ancient Rome to the high-class hotels of New York to Tijuana, Mexico.

Let’s talk about salads. From garden salad to pasta salad to taco salad to glorified rice (which is a real thing), the dishes we apply this name to are vastly different from one another. A salad can be served at any point in a meal—as a first course, as a side dish, as a main course, or even as a dessert. The only requirement for a dish to be considered a salad is that it have various food items mixed together, a definition broad enough to put a mix of tuna and mayonnaise in the same category as a mix of tropical fruits and jello.

The Roman Salad

The word salad came into English by way of the Old French salade, from the Latin salata. The meaning is just what it sounds like: “salted.” In many Romance languages, the word for salad still means “salted” (e.g., ensalada in Spanish), and German, Swedish, and Russian all borrowed the word as salat.

The original salads were herba salata, a popular Roman dish of leafy greens flavored with salt, olive oil, and vinegar. Similar dishes made from lettuce and other greens were widely enjoyed in ancient Egypt, Greece, and Rome.

The Greek physician Hippocrates reinforced the practice of eating salads by claiming that eating raw vegetables before a meal helped clear the intestines and ensure healthy digestion. Others countered that the salad should be eaten after the meal because the vinegar in the dressing would conflict with the flavor of the wine drunk during the meal.

Despite its popularity in the Roman Empire, salad was not eaten everywhere in the world. In China, for example, raw vegetables were considered to carry a risk of illness, so salads never caught on; vegetables were boiled or cooked in stir-fries instead.

Shakespeare’s Salads

During the Middle Ages, salads were a staple food for common people and royalty alike, composed of flowers, herbs, wild berries, and vegetables grown in household gardens. A salad was typically served as a starter for a meal.

In Renaissance Europe, salads still consisted mainly of greens, but the seventeenth-century “grand salad” incorporated small bits of meat for the first time.

In the 1606 play Antony and Cleopatra, Shakespeare coined the phrase “salad days” to describe youthful inexperience—akin to being called a “greenie” when you’re new to a group or an activity. Cleopatra, regretting her youthful fling with Caesar, says:

My salad days, / When I was green in judgment, cold in blood, / To say as I said then!

In 1699, British author, horticulturist, and vegetarian John Evelyn published a master work on salads called Acetaria: A Discourse of Sallets. While many Britons saw meat and grains as the more desirable parts of a meal, Evelyn believed wholeheartedly in the benefits of eating salads and promoted a meatless diet. He also brought a spirit of experimentation and innovation to the kitchen and published many unique salad recipes.

Modern Salad and Social Stereotypes

According to Laura Shapiro, author of Perfection Salad, salads became a fixation of home cooks during the 1920s era of home economics and scientific cooking. She writes, “The object of scientific salad making was to subdue the raw greens until they bore as little resemblance as possible to their natural state. . . . This arduous approach to salad making became an identifying feature of cooking-school cookery and the signature of a refined household.” Because of the fussy and “dainty” nature they took on during this period, salads became associated with the women who typically prepared them. This led to a gendering of the salad, which had not been seen in those terms before. Salads were also now seen as a food for the wealthy: fresh ingredients were available only to those who could afford both the vegetables and the means to store and refrigerate them.

In later decades, pressure to be thin and conform to societal beauty standards further reinforced the reputation of salads as “ladies’ food,” in company with yogurt, chicken, and other “diet” foods. Salads took on a halo of health, even though heavier salads with creamy dressings and fried toppings may not have been the most nutritious choice. Still, the bare-bones salad made of little more than lettuce and chopped vegetables came to be seen as something you should eat but won’t enjoy, a code for “joyless healthy eating”—although marketers tried their best to convince women that they were supposed to love eating salad (remember Women Laughing Alone with Salad?). The food that was once favored in ancient Rome and enjoyed by upper and lower classes alike was now a symbol of deprivation: of never quite feeling satisfied, of never quite filling one’s own needs, of shrinking oneself down in order to please other people. The salad also came with a higher price tag, yet it was stigmatized: in part because it meant not feeling full or satisfied, in part because it was associated with women, and in part because it just didn’t seem like a complete meal to most Americans.

Salad is shedding its gender stereotypes and its associations with deprivation and daintiness. A salad can be eaten by men and women, boys and girls—hopefully with the understanding that there is nothing inherent in salad that makes it gendered, and that something typically ascribed to women is not somehow inferior.

A salad can also be substantial and delicious, filled with a variety of ingredients and toppings, or it can accompany an entrée as a side. Thanks to both cultural pushback against salad stereotypes and the innovations of chefs, salad restaurants are becoming popular lunch destinations that offer a wide variety of options for salad lovers. You can have your salad and feel satisfied too!

On that note, let’s take a look at the origins of some common, delicious salads whose names you may never have questioned before.

Caesar Salad (Which Caesar?)

Image by Geoff Peters, CC BY 2.0 via Wikimedia Commons.

Despite the name, Caesar salad actually originated in Tijuana, Mexico, in 1924. The salad is named for restaurateur Caesar Cardini, who ran an Italian restaurant there called Caesar’s. Over the Fourth of July weekend, the restaurant was incredibly busy with American tourists, and Cardini improvised a new salad to feed hungry guests. Waiters made the salad tableside: garlic croutons, grated parmesan cheese, soft-boiled eggs, and a vinaigrette made with Italian olive oil and Worcestershire sauce were all placed atop pieces of romaine lettuce, which diners could eat with their hands.

In 1926, Cardini’s brother Alex, a former World War I pilot in the Italian Air Force, joined Caesar at the restaurant and modified his salad recipe. Alex’s version had the creamy Caesar dressing we know today, flavored with anchovies. Alex’s “Aviator Salad,” as he called it, became the more popular version and was renamed “Caesar’s salad.” During the Prohibition era, Hollywood celebrities and ordinary Californians made the trip to Tijuana to dine at Caesar’s, as excited for the Caesar’s salad as they were for the alcohol.

Soon, Caesar’s salad began to appear in restaurants across the U.S. and Europe. This may be attributed to the influence of Wallis Warfield Simpson, the mistress and eventual wife of Edward, Prince of Wales, who later reigned briefly as King Edward VIII. A frequent traveler to Tijuana and San Diego, she met the prince during her travels and also frequented Caesar’s. As duchess-to-be, she attempted to instruct European chefs on how to re-create her favorite salad. She also began to cut the lettuce into smaller pieces rather than eating the salad with her fingers, exemplifying what she believed was the proper etiquette for a royal lady.

Cobb Salad (Where’s the Corn?)

Image by miheco, CC BY-SA 2.0 via Flickr.

Cobb salad was created in the 1930s at a Los Angeles restaurant called the Brown Derby, owned by Robert Howard Cobb. The most likely origin story holds that in 1937, Cobb fixed himself a late-night meal, using up anything and everything left in the kitchen to satisfy his hunger. He then mentioned his cobbled-together creation to Sid Grauman, a top Hollywood promoter, who asked to try the dish. Grauman loved it: the original mix of chopped avocado, celery, tomato, chives, watercress, hard-boiled eggs, chicken, bacon strips, and Roquefort cheese was a whole lot of variety and flavor all on one plate.

With the backing of Hollywood-style promotion, the Cobb salad was on its way to international success. The modern Cobb typically contains chopped chicken, bacon, hard-boiled eggs, bleu cheese, tomatoes, avocado, and lettuce, though there are lots of variations.

Chef Salad (Who’s the Chef?)

Image by arbyreed, CC BY-NC-SA 2.0 via Flickr.

Historians disagree on the exact origin of the chef salad and the ingredients that compose it. Many claim that the original chef salad emerged in seventeenth-century England as a dish called “salmagundi,” a mixture of chopped meat, anchovies, eggs, onions, and olive oil. Other historians claim that the chef salad was invented by chef Victor Seydoux at the Hotel Buffalo in New York. Guests liked his improvised salad of meat, cheese, tomatoes, cucumbers, and hard-boiled eggs so much that it was added to the menu. When given the honor of naming the salad, Seydoux remarked, “Well, it’s really a chef’s salad.” Seydoux later worked at the Ritz-Carlton, where chef Louis Diat propelled the salad to fame and served it to many well-to-do travelers. Diat added smoked ox tongue and French dressing, which are not found on your typical chef salad today. Chef salad is really more of a label for a variety of composed salads and can be adapted to whatever you have in the kitchen.

And that’s the beauty of salads—they lend themselves to imagination, improvisation, and innovation. From just a few ingredients to elaborate mixtures of exotic flavors, from appetizers to dessert, we’ve just scratched the surface of the dish that lends itself to endless variety and experimentation.

Sources

Beck, Julie. “The Sad Ballad of Salad.” The Atlantic, July 28, 2016. https://www.theatlantic.com/health/archive/2016/07/the-sad-ballad-of-salad/493274/.

Briskin, Lizzy. “The History behind Your Favorite Salads.” Munchery, September 11, 2020. https://www.munchery.com/blog/history-behind-your-favorite-salads/.

“Here Is the Definitive History of Mankind’s Finest Food: The Salad.” HuffPost. https://www.huffpost.com/entry/evolution-of-the-salad_n_7101632.

Olver, Lynn. “Salads.” The Food Timeline. Retrieved June 22, 2021, from https://www.foodtimeline.org/foodsalads.html.

Ridder, Knight. “The Word Salad Was Coined from the Latin for Salted Greens.” Chicago Tribune, August 30, 2000. https://www.chicagotribune.com/news/ct-xpm-2000-08-30-0008300394-story.html.

“Salad.” Etymology Online Dictionary. Retrieved June 22, 2021, from https://www.etymonline.com/word/salad#etymonline_v_22616.

Sarah B. “Salat: A Medieval Aromatic Salad.” A Dollop of History, August 25, 2016. https://historydollop.com/2016/08/25/salat-an-aromatic-medieval-salad/.

Shakespeare. Antony and Cleopatra. Act 1, Scene 5.

Shapiro, Laura. Perfection Salad: Women and Cooking at the Turn of the Century. New York: North Point Press, 1986 (pp. 96–99).

Stradley, Linda. “History of Salads and Salad Dressings.” What’s Cooking America. Retrieved June 21, 2021, from https://whatscookingamerica.net/History/SaladHistory.htm.

Wikipedia. “Chef Salad.” Retrieved June 21, 2021, from https://en.wikipedia.org/wiki/Chef_salad.

Running for Fun

Why do people run for fun—not because they’re being chased by a tiger or forced to run the mile in gym class? The answer involves the Olympics, the police, and advocacy for women’s athletics.

“If you don’t think you were born to run, you’re not only denying history. You’re denying who you are.” –Christopher McDougall, Born to Run

Run to Live

People have been running since the dawn of humanity. Evolutionary biologists posit that specific anatomical characteristics unique to humans enhance our ability to run while conferring no comparable benefit for walking.

Many other animals can run, of course, but humans are uniquely suited to distance running. From literally head to toe, we are made to run. Our upright posture, enhanced neck and head stability, and skeletal and muscle adaptations that enable long-distance, bipedal running are some of the evolutionary traits that make us human. Long legs relative to our body size help lengthen our stride. Springlike tendons in our legs and feet work like large elastics to absorb shock and conserve energy. Our sweat glands allow us to cool off without panting and keep our bodies from overheating. Large gluteal muscles are critical for stabilizing the body during running. The arrangement of bones in the feet and toes provides a stiff arch that can push off the ground more efficiently. Evolutionary biologists Dennis Bramble and Daniel Lieberman state that “the fossil evidence of these features suggests that endurance running . . . may have been instrumental in the evolution of the human body form” (Bramble and Lieberman, 2004). This means that running is encoded into our genes. The trade-off of all these beneficial biological adaptations was that our species is no longer well-suited to live in trees as our primate ancestors did.

According to Lieberman, about 2.6 million years ago, early human species began to eat meat, which could be obtained through either scavenging or hunting. About 2 million years ago, these distance-running adaptations became characteristic of Homo erectus populations, the ancestors of modern humans, as those who were better runners were better able to survive. Maybe we couldn’t run faster than a cheetah, but we could outrun any animal on earth in terms of distance. Persistence hunting was thus a primary survival strategy for early human species. The slow but steady pursuit of prey yielded great rewards as humans simply outlasted their target. Armed with only simple weapons, they tracked and chased animals during the hottest part of the day and made their prey run faster than it could sustain for long, eventually overpowering an animal as it developed hyperthermia and collapsed. Sweaty and relentless, early humans used this strategy to great advantage (Powell, 2017).

Additionally, the consumption of animal meat provided more calories than plants alone, which fueled the growth of larger body sizes. Meat also provided amino acids and nutrients needed to develop more complex brains and higher-level cognitive functioning.

Some anthropologists criticize this hypothesis because few populations practice persistence hunting today and because the strategy works best in hot, grassland or savanna-type environments and may not be as effective in other climates. It’s true that persistence hunting is not common among present-day hunter-gatherer societies; however, more advanced technologies like spears and projectile weapons have made it less necessary. Additionally, our ancestral environments 2 million years ago likely differed from the environments in which present-day ethnographic research has been conducted.

Live to Run

Some other animals, like dogs and horses, can run great distances if they are forced to or to escape from danger. But humans are better at it, and what’s more, humans voluntarily run for miles on end.

Track and field is one of the oldest sports in the world. Since prehistoric times, humans have put their natural abilities in running, jumping, and throwing to use in athletic events. If there was ever an official beginning of running as a sport, it was the first ancient Olympic Games, held in 776 BCE, in which the stadion footrace was the only event.

Running has also long been used as an exercise to build and maintain physical fitness for military activities.

Running as a form of recreation—not just for athletic competition, military conditioning, or survival—began to gain traction first in New Zealand. Cross-country running coach Arthur Lydiard promoted jogging through his Auckland Joggers Club. Bill Bowerman, a University of Oregon running coach and later cofounder of Nike, returned from a trip to New Zealand in 1962 and brought back Lydiard’s wisdom to the United States. Bowerman published a pamphlet and a book on the topic of jogging, casting it as an activity not just for pro athletes but for the average person who wanted to live a healthy lifestyle. He showed through research that running improved cardiovascular health, and his book was endorsed by medical professionals.

In the late 1960s, running for exercise started to gain traction in the United States, though it was still considered strange. Professional athletes and boxers ran as part of their training, but now, ordinary people were starting to join in. It was weird. Everyone stared at them. In 1968, the Chicago Tribune devoted a whole page to a strange new fitness trend called jogging, and the New York Times poked fun at the new “in” sport. What freaks, these people who chose to run in their free time! The most dedicated ones would run up to an entire mile.

Joggers in these early years attracted the attention of suspicious neighbors and police officers, who were alarmed at the sight of grown men and women running down the street and suspected “folly” at play. Runners were sometimes stopped on the street and given citations for their unusual use of the road.

The Running Boom

In the 1970s, nearly 25 million people hit the ground running in races, on trails, and on roads throughout America, largely inspired by Frank Shorter’s victory in the 1972 Olympic marathon. The 26.2-mile race had been relatively unknown to Americans up until that point.

Shorter’s 2:12:19 finish marked the first American gold medal in the marathon since 1908. His finish was upstaged by a German imposter who darted into the stadium ahead of him. The imposter was a college student who had put on a track uniform and joined the race for the last kilometer, first to cheering from the crowd and then to booing as officials got word of the hoax. “That’s not Frank, it’s an imposter, get that guy off the track! It’s a fraud, Frank!” the commentator called out during the broadcast. (The Washington Post named this one of the most memorable sports calls in history.) The coverage of this event changed the way the nation thought about long-distance running. The marathon, once an obscure event that you’d have to be crazy to attempt, was now front and center.

During the “running boom,” road racing events spread throughout the country, allowing public participation rather than restricting participation to exclusive, members-only athletic clubs. Ordinary people were doing 5Ks, 10Ks, and even marathons now. Australia, the UK, and other European nations saw a similar trend. Additional factors that contributed to the craze included several other books and studies about the health benefits of running, professional and Olympic runners such as Steve Prefontaine, and companies like Nike that gave a high profile to running in popular culture. Now it was not only acceptable but cool to be a runner.

Around this time, women’s participation in athletic events was gaining more acceptance. Title IX opened more opportunities to compete in events at the college level, and universities expanded cross-country and track teams for women to fulfill Title IX requirements. Women found road running and marathon running to be a prime entry point into the world of professional and college athletics. The feats of pioneering women runners like Kathrine Switzer (the first woman to run the Boston Marathon as an officially registered entrant), Jacqueline Hansen (two-time world record holder and successful advocate for adding the women’s marathon, 5,000 meter, and 10,000 meter events to the Olympics), Miki Gorman (elite marathoner famous for a New York City-Boston-New York City triple win in 1976–77), and Joan Benoit (the first winner of the women’s Olympic marathon) inspired women to take up running—for recreation and for competitive speeds and distances, for health and for ambition and for fun.

Running made a smooth transition from survival strategy to sport, and from sport to many other roles in play, exercise, stress relief, and community and social life.

Benefits of Running

It turns out that running has great health benefits as well. The cognitive and physical benefits of running and other types of aerobic activity have been studied extensively. Running facilitates cell growth and expansion in the hippocampus, which is the area of the brain associated with memory. It improves neural plasticity and promotes neurogenesis, which in turn lead to better memory and learning capabilities (Schulkin, 2016).

Running is often a way for people to relieve stress. This works because the body releases endorphins during and after running, producing a feeling of euphoria. Schulkin (2016) writes, “Long-distance running partially involves combating pain and discomfort. . . . To struggle is to succeed, and to cope with struggling, the human body has evolved to release hormones associated with euphoric states so that when one is faced with a particularly trying physical feat, the [brain] is permeated with chemicals that induce a sense of calmness.”

Physically, running improves cardiovascular and respiratory health and strengthens the immune system; it also benefits mental health and can boost feelings of confidence and self-esteem.

Running is not for everyone, but it is for a lot of us. Distance running is in our genes—it’s one of the most quintessentially human things we can do, and it helps us become physically and mentally resilient. And the good news is that you don’t have to run 26.2 miles to reap many of the benefits of running any more than you need to hunt for your food in the grasslands.

BONUS: Check out “When Running Was for Weirdos” below.

Sources

Alex, Bridget. “Running Made Us Human: How We Evolved to Run Marathons.” Discover Magazine, April 12, 2019. https://www.discovermagazine.com/planet-earth/running-made-us-human-how-we-evolved-to-run-marathons.

Bramble, Dennis M., and Daniel E. Lieberman. “Endurance Running and the Evolution of Homo.” Nature 432, no. 7015 (2004): 345–52. https://doi.org/10.1038/nature03052.

Lieberman, Daniel E., Dennis M. Bramble, David A. Raichlen, and John J. Shea. “The Evolution of Endurance Running and the Tyranny of Ethnography: A Reply to Pickering and Bunn.” Journal of Human Evolution 53, no. 4 (2007): 439–442. https://dash.harvard.edu/handle/1/3743587.

Edwards, Phil. “When Running for Exercise Was for Weirdos.” Vox, August 9, 2015. https://www.vox.com/2015/8/9/9115981/running-jogging-history.

Pobiner, Briana. “Evidence for Meat-Eating by Early Humans.” Nature Education Knowledge 4, no. 6 (2013): 1. Human Origins Program, Smithsonian Institution.

Powell, Alvin. “Humans Hot, Sweaty, Natural-Born Runners.” The Harvard Gazette, April 19, 2017. https://news.harvard.edu/gazette/story/2007/04/humans-hot-sweaty-natural-born-runners/.

Schulkin, Jay. “Evolutionary Basis of Human Running and Its Impact on Neural Function.” Frontiers in Systems Neuroscience 10, no. 59 (2016). https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4939291/.

University of Utah. “How Running Made Us Human: Endurance Running Let Us Evolve to Look the Way We Do.” ScienceDaily, November 24, 2004. https://www.sciencedaily.com/releases/2004/11/041123163757.htm.

“When Did the History of Running Begin?” My Running Tips. http://www.myrunningtips.com/history-of-running.html.

Wikipedia. “Running Boom of the 1970s.” https://en.wikipedia.org/wiki/Running_boom_of_the_1970s.

Why Does Red Mean Stop and Green Mean Go?

Why do we know to automatically stop at a red traffic light and go at a green light? The answer involves a train crash, a gas explosion, and the Model T Ford.

Railroad Signals

In the 1830s, the railroad industry developed a system of signals to direct train engineers to stop or go. Like modern traffic lights, the system used three lights to signal which action the train should take. Red, the color of blood, has been a signal of danger for thousands of years, so it easily lent itself to being the color for stop. At the time, green was chosen as the color for caution, and white was the color for go.

Soon, it became apparent that white was a bad choice. It was easily confused with other white lights. In 1914, a red lens fell out of a light fixture and left it shining white light, turning a stop signal into a go signal. A train zoomed through the white signal and crashed into a train going the opposite direction. To prevent similar incidents, green was reassigned to mean go, and yellow was chosen to represent caution. Yellow was different enough from the other two colors that it stood out and was readily visible to train engineers.

Around the same time the colored light signals were developed, railroads began using a mechanical signaling system called a semaphore. These were poles with an attached arm that pivoted to different positions to signal train drivers. Today, most countries have phased out semaphores in favor of colored lights. The term “semaphore” is now also used as a synonym for a traffic light and as a more general term for any visual signaling system. It comes from the Greek sema (“sign” or “signal”) and phoros (“bearer”), literally meaning “a bearer of signals.”

A railroad semaphore.
Photo by David Ingham, June 8, 2008, CC BY-SA 2.0 via Wikimedia Commons.

To the Streets

In 1865, London faced a growing problem with clashes between horse-drawn traffic and pedestrians in the streets. Railway manager and train engineer John Peake Knight, who specialized in building signaling systems for British railways, presented a lighted signaling system to the Metropolitan Police as a solution for controlling traffic and preventing accidents. His design combined a semaphore for use during the day and a system of red and green gas-powered lights for the night.

On December 10, 1865, Knight’s semaphore/light signal was implemented at an intersection near the Parliament building. It worked well for about a month, until a leak in the gas line supplying the lights caused an explosion. The police officer operating the semaphore was badly burned, and the semaphore/light system was immediately discontinued.

The Model T

In 1913, the year Ford’s moving assembly line began turning out Model Ts in large numbers, about 4,000 people died in car crashes in the United States. Roads and highways were simply not designed for vehicles that could travel at 40 miles per hour (lightning fast compared to the roughly 15-mile-per-hour maximum of horse-drawn carriages). As the Model T made cars more affordable for the middle class, more and more people were driving on the road. Soon, crowded intersections became confusing and dangerous for the motorists, pedestrians, horses, and cyclists competing for the right of way.

Police officers stationed in traffic towers manually signaled drivers using lights, semaphores, or their arms. When they used lights, red meant stop and green meant go, but they did not use a yellow light—instead, they blew a whistle to alert drivers that they were about to change the signal. However, few drivers paid attention, especially at busy intersections, and crashes continued to occur.

The Electric Traffic Light

With the growing use of electric lights rather than gas lamps in the late 1800s, the stage was set for the invention of the electric traffic light. The first in history was Lester Wire’s 1912 handmade contraption in Salt Lake City. A police officer exasperated with a growing number of traffic incidents in the city, Wire constructed a wooden box that looked like a birdhouse, set red and green lights in it, and raised it up on a ten-foot pole. While Wire was truly the first to invent the electric traffic signal, he is often overshadowed by others who came later and had civil authorities and patents on their side.

In 1914, Cleveland engineer James Hoge also had the idea to borrow the red-green light system used by railroads. He suspended electric lights on a wire above the intersection of Euclid Avenue and East 105th Street in Cleveland, creating the first “municipal traffic control system.” Hoge’s traffic apparatus was similar to Knight’s and Wire’s in that a police officer had to sit in a traffic tower and switch the light every so often. Unlike these more rudimentary systems, however, Hoge’s caught on quickly.

A police officer in a traffic tower.
Photo by Olle Karlsson, July 24, 1953, Sweden, from Wikimedia Commons.

In 1920, Detroit police officer William L. Potts invented a traffic signal suitable for a four-way intersection that used all three colors present in the railway system—red, yellow, and green. However, Potts’s system still required someone to manually change the light, a tedious and costly way to manage traffic.

Automatic signals were thus in high demand, and several systems were invented throughout the 1920s. The first ones changed the color of the lights at timed intervals, but this meant that vehicles had to stop even when no other cars were crossing the intersection. Charles Adler Jr. then invented a signal that could detect a car’s horn: to get the light to change, a driver simply honked. Although the signal was then locked for at least 10 seconds before it could change again, the system never became popular because of all the noise from honking cars.

Later on, more efficient and less noisy systems were invented to sense when cars were present at an intersection and time the traffic lights accordingly. By the 1930s, traffic lights were beginning to spread to other countries, becoming a signal of progress, growth, and industry in the US and abroad. Additionally, in 1935, the various systems in use in the US were standardized by the federal government, and all cities with stoplights were required to adopt the red, yellow, and green light system to avoid confusion and inconsistency from one city to the next. The colors also had to be lined up in the order red, yellow, green from top to bottom (an arrangement that helps colorblind drivers distinguish which light is on).

How effective was the traffic light at preventing accidents? Smithsonian Magazine explains that due to the traffic light, “motor vehicle fatality rates in the United States fell by more than 50 percent between 1914 and 1930” (Nelson, 2018).

Socialization, Semiotics, and Gamification

Unfortunately, the traffic light also contributed to the rise of road rage on the streets. As the Smithsonian Magazine notes, pedestrians and drivers no longer had to acknowledge one another at intersections; they merely waited until the lights signaled they could go. Patience wore thin as people began to grumble when waiting for red lights to change.

From a young age, the public began to be socialized into the knowledge of traffic light signals along with models for good citizenship. As early as 1919, a schoolteacher in Cleveland came up with a game called “Red Light, Green Light,” which taught children to recognize traffic signals. The red/yellow/green system soon became instinctual as people learned the simple set of actions the colors represented. In the words of communication experts, “our ability to respond appropriately” to conventions like the traffic light “depends on our ability to use cultural experience to interpret signs and symbols appropriately, instantly, and instinctively.” The study of these types of symbols is called semiotics. To interpret symbols like the traffic light, “humans rely on signifiers and message shortcuts, whose meanings develop over time into almost universally accepted aspects of language” (Rackham and Gray, 2021). Because the traffic light is a signifier, we don’t have to be told to stop when the light is red—we have a message shortcut that bridges the gap between seeing a red light and knowing to stop.

Soon, the red/yellow/green light scheme was ingrained in many aspects of culture, an assumed semiotic system that permeated many different areas of life. Traffic signals were incorporated into other children’s games and toys. Educational programs on everything from nutrition to healthy relationships use green, yellow, and red to signify when to proceed with an action, when to slow down or use caution, and when to stop. Your boss may give you the “green light” to proceed with a project, or you might receive a “yellow light” while negotiations are on hold.

Yellow: The Color of Ambiguity

While the green light is unambiguously a signal to go and the red light unequivocally means stop, the yellow light is—well, somewhat up to interpretation and context. Some drivers see it as a sign to slow down and prepare to stop, while others see it as an indication to speed up and get through the light before it turns red. Laws regulating yellow lights are intentionally vague, as drivers are expected to use their own discretion and common sense to navigate an intersection at a yellow light. Semiotics is not always so simple, it seems.

BONUS: Look up some of the interesting and unique crosswalk signals found around the world.

Sources

Adams, Cecil. “Who Decided Red Means ‘Stop’ and Green Means ‘Go’?” The Straight Dope, March 7, 1986. https://www.straightdope.com/21341613/who-decided-red-means-stop-and-green-means-go.

History.com Editors. “First Electric Traffic Signal Installed.” This Day in History. History.com, August 3, 2020. https://www.history.com/this-day-in-history/first-electric-traffic-signal-installed.

I Drive Safely. “The History and Meaning of Colored Traffic Lights.” Retrieved May 29, 2021, from https://www.idrivesafely.com/defensive-driving/trending/history-and-meaning-colored-traffic-lights.

Marusek, Sarah. “Visual Jurisprudence of the American Yellow Traffic Light.” International Journal for the Semiotics of Law 27: 183–191. https://link.springer.com/article/10.1007/s11196-013-9323-z.

Nelson, Megan Kate. “A Brief History of the Stoplight.” Smithsonian Magazine, May 2018. https://www.smithsonianmag.com/innovation/brief-history-stoplight-180968734/.

Rackham, Scott, and Paxton Gray. Social Media Communication. Orem, UT: MyEducator, 2021.

Scott. “The Origin of the Green, Yellow, and Red Color Scheme for Traffic Lights.” Today I Found Out, March 8, 2012. https://www.todayifoundout.com/index.php/2012/03/the-origin-of-the-green-yellow-and-red-color-scheme-for-traffic-lights/.

“Semaphore.” Online Etymology Dictionary. Accessed May 29, 2021, from https://www.etymonline.com/search?q=semaphore.

Birthday Candles: Satanic Ritual, Moon Worship, or the Gift of the Industrial Revolution?

Why do we blow out candles on birthday cakes? The answer involves Egyptian theocracy, the moon goddess, and (as for many holidays) the mixing of Christian and pagan tradition.

A Tumblr post that went around the internet a while back drew attention to the strange, ritualistic custom that takes place during birthday celebrations:

“A small gathering of people huddle around an object on fire, chanting ritualistically a repetitive song in unison until the fire is blown out and a knife is stabbed into the object.”

The post jokingly called the whole thing a bit satanic. The real story of the birthday cake does have a little something to do with evil spirits—but more in the vein of warding them off with candlelight and merrymaking.

The origin of the birthday celebration itself must be put together piece by piece, drawing from different time periods and cultural traditions.

Ancient Egypt, Greece, and Rome

When a pharaoh was crowned in ancient Egypt, it was believed that he or she was transformed or reborn into a god. As early as 3,000 BCE, Egyptians celebrated the pharaoh’s coronation day as the birthday of a god. The Greeks may have inherited this custom of celebrating the birth of a god from the Egyptians. In many pagan belief systems, days of major change in the world or in a person’s life were thought to invite evil spirits into the world. When birthdays began to be celebrated for common people rather than just religious figures, a widespread belief was that evil spirits would visit people on their birthdays, so a party had to be held to scare them away. Party-goers helped the birthday person feel cheerful, made a racket with noisemakers, and brought candles as a light in the darkness to ward away the spirits. These early birthday parties were thus considered a form of protection against evil.

But what’s a birthday party without some cake?

The candles came first. In ancient Greece, round cakes were baked in honor of Artemis, the moon goddess. Candles were placed on top to represent the glow of the moon. In Greece, as in many ancient societies, it was believed that the smoke of the candles would carry prayers to heaven (in this case, to the moon).

The birthday cake came second. Ancient Romans may have been the first to celebrate birthdays of non-religious figures. Birthday celebrations included a sweet, bread-like pastry made from flour, nuts, honey, and yeast. (Cake and bread were largely interchangeable terms until more recently in history, and sugar in its many refined forms was not used in the Mediterranean until about the thirteenth century.) These honey cakes were typically found at birthday celebrations for members of the imperial family, private birthday celebrations for family and friends, and also at weddings. Fiftieth birthday celebrations merited a special pastry made from flour, grated cheese, honey, and olive oil.

Rome may have had the first birthday celebrations and birthday cakes for the common man, but it was really just the common man. Blatant inequality between the sexes meant that women’s birthdays were not acknowledged until the twelfth century, hundreds of years later.

Germany’s Kinderfeste

Early Christians considered these celebrations inappropriately pagan and did not observe birthdays until about the fourth century, when they began to celebrate the birth of Jesus. In medieval Germany, a sweet bread was baked in the shape of baby Jesus to commemorate His birth.

Sometime between the fifteenth and eighteenth centuries, Germans began to celebrate a child’s birthday with a Kinderfeste party. In the morning, a cake called a Geburtstagstorte would be topped with the number of candles corresponding to the child’s age, plus one candle representing the “light of life,” the hope for another year of life to come. The candles were lit and left to burn all day until after dinner, at which point the child would make a wish, try to blow out all the candles, and then eat the cake. Blowing out the candles signified that the birthday wishes would reach God as the smoke floated to the heavens—a highly Christian interpretation. At the Kinderfeste, the child was surrounded by family and friends, which was supposed to provide protection from evil spirits who might attempt to steal their soul—a relic of pagan superstition.

Ein Kinderfest
By Ludwig Knaus, 1868, oil on canvas, image from Wikimedia Commons.

Though birthday cakes were mostly for children at this time, wealthy people of all ages also had fabulous birthday desserts. In 1746, traveler Andreas Frey wrote of Count Ludwig von Zinzendorf’s extravagant birthday celebration that featured a cake with candles: “There was a Cake as large as any Oven could be found to bake it, and Holes made in the Cake according to the Years of the Person’s Age, every one having a Candle stuck into it, and one in the Middle” (Frey, 15).

Let Them Eat Cake

Geburtstagstorten were originally similar to the lightly sweetened Roman birthday pastries, but as time went on, it became more common to bake sweeter cakes with sugar. In the seventeenth century, multiple layers, icing, and decorations were introduced as well. This luxury dessert would only have been available to the upper class.

The Industrial Revolution changed everything, as mass production of sugar and other ingredients and utensils made birthday cakes available to almost everyone. Additionally, bakeries could now offer pre-made cakes at reasonable prices.

COVID-19 may have put a temporary damper on blowing all over a cake everyone is about to eat (and also on gathering in groups for a party), but that hasn’t stopped anyone from quarantine-baking some delicious cakes. It’s someone’s birthday somewhere, right?

Sources

Frey, Andreas. A True and Authentic Account of Andrew Frey. Containing the Occasion of His Coming among the … Moravians [&c.]. Transl, Volume 8. Oxford University, 1753.

madcenturion. Tumblr. April 23, 2013.

McCormick, Chloe. “The Real Reason We Eat Cake on Birthdays.” Spoon University. https://spoonuniversity.com/lifestyle/origin-of-birthday-cake.

Origjanska, Magda. “Finding the Origin of the Birthday Cake with Candles (and Song) Tradition.” The Vintage News, January 8, 2018. https://www.thevintagenews.com/2018/01/08/birthday-cake/.

Pump It Up Admin. “How Did the Tradition of Birthdays Begin?” February 3, 2017. https://www.pumpitupparty.com/blog/how-did-the-tradition-of-birthdays-begin/.

Sterling, Justine. “A Brief History of the Birthday Cake.” Food & Wine, May 23, 2017. https://www.foodandwine.com/desserts/cake/brief-history-birthday-cake.

Van Lulling, Todd. “This Is Why You Get to Celebrate Your Birthday Every Year.” HuffPost, November 11, 2013. https://www.huffpost.com/entry/history-of-birthdays_n_4227366.

What’s Cooking America. “Birthday Cake History.” https://whatscookingamerica.net/History/Cakes/BirthdayCake.htm/.

A Short History of Hamburgers

Why are hamburgers called hamburgers if they’re not made out of ham? The answer spans time and space from the Mongol invasion of Russia to the German revolutions of 1848 to the McDonald’s Big Mac.

A hamburger and French fries is probably the most American meal you could think of. Let’s consider that for a moment . . . a hamburger, named not for the meat it’s made of but for Hamburg, Germany, and French fries, whose name nods to France, are considered quintessentially American. This is yet another testament to the powerful influences of immigration and cultural exchange that continue to shape the culture of America today.

“Two All-Beef Patties . . .”

There is considerable controversy over the origin of the hamburger. Because ground beef steak and bread have been eaten separately in many different countries for centuries, it is unknown exactly how the hamburger as we know it came to be. For one, there are similar dishes found throughout Europe. Isicia omentata from fourth-century Rome was a baked beef patty mixed with pine kernels, peppercorns, and white wine. Steak tartare had its origins in the thirteenth-century Mongol invasion of Russia, when Mongol invaders stashed meat under their saddles to tenderize it while they rode to battle and then ate it raw. The preparation came to be called steak tartare after the Tartars, as the Mongols were known in Europe. When ships from the port of Hamburg came to Russia to trade, they brought back steak tartare as raw ground beef shaped into a patty with a raw egg yolk on top.

Modern steak tartare.
Image by Rainier Zenz, CC BY-SA 3.0 via Wikimedia Commons.

A more direct German precursor to the hamburger is the seventeenth-century Frikadeller, which were flat, pan-fried beef meatballs. In eighteenth-century England and America, the Hamburgh sausage was prepared with chopped beef, spices, and wine and was supposedly a recipe that mimicked the preferences of immigrants and visitors from Hamburg. A nineteenth-century adaptation, called the Hamburg steak, is the most recognizably hamburger-like preparation and carried the Hamburg name. It was a minced beef filet, sometimes mixed with onions and bread crumbs, then salted and smoked and served raw in a pan sauce.

Hamburg steak.
Image by OiMax, CC BY 2.0 via Wikimedia Commons.

In 1848, political revolutions throughout the German Confederation pushed many Germans to immigrate to the United States. Known as the “Forty-Eighters,” these immigrants were just the first of a wave of European immigrants. The 1850s saw a larger increase in the immigrant population in the US relative to the overall population than any other decade in history. The German-born population alone increased 118.6% during this decade as immigrants arrived in New York by boat and spread throughout the East and Midwest states.

And here’s where the controversy comes in. The first version of the hamburger origin story claims that Germans arriving on the Hamburg-America line had already been preparing and consuming the Hamburg steak, as it was a popular meal among workers, and the smoked preparation kept well at sea. Immigrants enjoyed the meal and continued to make Hamburg steaks out of fresh meat once they got to America.

The second version goes that the Hamburg steak was created to meet demand for quick, cheap food for German sailors and immigrants arriving in America. Hamburg was known as an exporter of high-quality beef, so the Hamburg steak was offered in America as an idea of what might appeal to those arriving from Germany. Street vendors opened up along the coast where the Hamburg-America line docked, selling lightly grilled meat patties in the “Hamburg style” as a quick meal, perhaps accompanied with bread.

Port of Hamburg, 1862.
Image from Library of Congress via Wikimedia Commons.

In the second half of the nineteenth century, following this wave of immigration, the Hamburg steak was found at restaurants all over the port of New York. It was rather expensive at first (a whole 11 cents!), but with the growth in rural beef production and railroad transportation, the cost of beef decreased, and the meal became more widely available. Cookbooks of the time included detailed instructions for preparing the “hamburger steak,” as it was known from 1889 on.

The hamburger steak was soon viewed as a quintessentially American food, influenced as it was by the waves of immigrants who formed the character of the country. As Chicago and other cities in the East developed into major centers for the large-scale processing of beef, the hamburger steak became widely affordable and available to the average consumer—it was the “American beef dream.”

“Special Sauce, Lettuce, Cheese, Pickles, Onions”

By the early 1900s, the term hamburger steak was shortened to simply hamburger. Sometime between 1885 and 1904, someone decided to put the hamburger steak between two slices of bread, thus inventing the hamburger sandwich we know today. Some credit the founder of fast-food joint White Castle as the inventor of the hamburger sandwich, while others cite small-town cooks in Texas or Oklahoma or Ohio who placed a Hamburg steak between two slices of bread. Hamburgers were served on two thick slices of toast at the St. Louis World’s Fair in 1904, where they gained major exposure and created a sensation among fair-goers. Various claims exist and are not well-documented, and it’s likely that multiple people had the idea for a hamburger sandwich around the same time.

Toppings soon followed. Onions had long accompanied the hamburger steak, and other vegetables like lettuce and tomatoes were added for a fresher appearance. Ketchup, which Henry Heinz began bottling commercially in 1876, soon became a near-universal condiment for the hamburger.

“. . . On a Sesame-Seed Bun”

Fast-food restaurants played a major role in cementing the hamburger as the all-American meal. White Castle, which opened its doors in 1921, is regarded as one of the first true fast-food restaurants. When Upton Sinclair’s book The Jungle, published in 1906, caused public outrage and anxiety over the state of meat processing in the country, restaurants had to deal with negative perceptions of products made from ground meat. White Castle made a point of promoting itself as a clean and hygienic operation and paired this image with rapid service and a simple menu centered on hamburgers and coffee.

Like many foods and practices that had their origins in Germany, the hamburger may have lost popularity during World War I due to anti-German sentiment. Additionally, the word “hamburger” conjured up images of greasy, cheap fair food for some consumers. For these reasons, White Castle hamburgers were rebranded as “sliders” to avoid referencing a German city or invoking other unsavory connotations. (Frankfurters received a similar treatment and were called “hot dogs” from then on, and they never quite regained their name.)

But other fast-food companies did not necessarily follow suit, and the term hamburger was still in use during the Great Depression as White Castle’s production methods became faster, more efficient, and more standardized, providing customers with a predictable meal and experience no matter where they were in the country. This concept would revolutionize the world of restaurants as the birth of fast food.

By the 1940s, the term “hamburger” was shortened to “burger,” which became a new combining form—giving us the parts we needed to build words like cheeseburger, veggie burger, and baconburger. Around the same time, McDonald’s came onto the scene, building on White Castle’s system and adding drive-in service. A competing chain called Bob’s Big Boy lays claim to the first documented instance of making a hamburger with the now-standard sesame-seed bun, due to a request from a customer who wanted “something different.” This order also resulted in the first double-decker burger as the chef cut the bun in three pieces to hold two hamburger patties. (Though the sesame seeds used for hamburger buns today have been rendered tasteless, they add visual appeal and cause people to salivate when they see them.)

By Peter Klashorst, CC BY 2.0 via Wikimedia Commons.

When the Big Mac was introduced in 1967, all bets were off: McDonald’s was the leader in fast food and the main driver behind the popularity of the American-style hamburger worldwide. “Two all-beef patties, special sauce, lettuce, cheese, pickles, onions on a sesame-seed bun” was now the established recipe for a premium-quality fast-food hamburger.

Influenced by immigrants and innovation, the hamburger has now become an internationally recognized symbol of American culture and of globalization. Just ask Inspector Clouseau.

Sources

Barksdale, Nate. “How the Hamburger Began—And How It Became an Iconic American Food.” History.com, January 6, 2021. https://www.history.com/news/hamburger-helpers-the-history-of-americas-favorite-sandwich.

“German Immigration in the 1850s.” Ancestry.com.  https://www.ancestry.com/historicalinsights/german-immigration-1848.

“Hamburger.” The Merriam-Webster New Book of Word Histories. Springfield, Massachusetts: Merriam-Webster Inc., 1991, p. 210. https://archive.org/details/merriamwebsterne00merr/page/210/mode/2up.

“History and Legends of Hamburgers.” What’s Cooking America. Retrieved May 8, 2021, from https://whatscookingamerica.net/History/HamburgerHistory.htm.  

Ozersky, Josh. The Hamburger: A History. Yale University Press, 2008.

Satran, J. “How Did Hamburger Buns Get Their Seeds?” HuffPost, April 10, 2015. https://www.huffpost.com/entry/hamburger-bun-history_n_7029310.

Walhout, Hannah. “A History of the Burger: From Ancient Rome to the Drive-Thru.” Food & Wine, June 20, 2017. https://www.foodandwine.com/comfort-food/burgers/burger-timeline.

Wikipedia. “Hamburg Steak.” Retrieved May 12, 2021, from https://en.wikipedia.org/wiki/Hamburg_steak.

Wikipedia. “History of the Hamburger.” Retrieved May 12, 2021, from https://en.wikipedia.org/wiki/History_of_the_hamburger.

Wittke, Carl. Refugees of Revolution: The German Forty-Eighters in America. University of Pennsylvania Press, 1952.

The Secret History of White Chocolate

Is white chocolate actually chocolate—and where did it come from in the first place? The answer involves the snow-capped mountains of Switzerland, questionable pharmaceutical cookbooks, and children’s vitamins.

White Chocolate vs. Chocolate

First, let’s talk about chocolate. According to the Chicago Tribune, the cacao bean—the main ingredient of chocolate—contains about equal parts cocoa butter and cacao nibs. The cocoa butter provides the creamy, smooth mouthfeel of chocolate while the cacao nibs are responsible for the distinctive taste and aroma. Chocolate is made from both cocoa butter and cacao nibs, along with sugar and often milk. Per FDA regulations, chocolate must contain at least 10% cocoa mass to be labeled as chocolate. (Also called chocolate liquor, cocoa mass is the result of finely grinding cacao nibs and includes cocoa butter that is present in the cacao nibs.) White chocolate, on the other hand, is made from cocoa butter without the cacao nibs. By law, it must contain at least 20% cocoa butter and 14% milk products to be labeled as white chocolate.

Some chocolate purists thumb their noses at the idea that white chocolate belongs alongside its milk and dark chocolate cousins, since it is missing a crucial ingredient. Others argue that because white chocolate is made from part of the cacao bean, it should be grouped with the other types of chocolate. The law settles the debate: legally, white chocolate cannot simply be called chocolate.

The Unwritten History

Despite its sweet, innocent taste, white chocolate has a hidden past. The history of white chocolate is less than straightforward, and Nestlé has long tried to claim ownership. Until recently, the story went that Nestlé developed white chocolate in 1936 as a way to use up excess milk powder that had been produced during World War I. However, this story skips over many earlier uses of white chocolate—and the real story has less to do with powdered milk as a wartime product and more to do with infant formula.

Food historian Sarah Wassberg Johnson has uncovered several sources showing that white chocolate had actually been made as early as the 1860s. A new technique in the world of chocolate making called the Broma process may have had something to do with it. Developed in 1865, the Broma process involves placing cacao beans into a bag at a warm temperature to allow the cocoa butter to drip out, leaving the beans ready for processing into cocoa powder. Johnson proposes that this created a surplus of cocoa butter and spurred experimentation to find new ways to use it.

Recipes for white chocolate began to appear in cookbooks shortly thereafter, although the formulas were quite different from modern white chocolate.

An 1869 recipe for white chocolate caramels was simply a recipe for caramels with the addition of cocoa butter. This recipe appeared in a cookbook by two French chefs.

Recipe for white chocolate caramel tablets

One recipe from an 1872 American cookbook included tapioca, powdered sugar, oatmeal, and Iceland moss (yum) along with “concentrated tincture of Caraccas cacao,” and it was designated as a suitable composition for “delicate persons.”

Another cookbook, specifically for druggists, directed the cook to mix sugar, rice flour, arrowroot, vanilla, and powdered gum Arabic (because everyone keeps that on hand . . .) with cacao butter and then pour the mixture into molds.

The White Chocolate Candy Bar

In the 1910s and ’20s, these experiments moved outside the realm of home kitchens and pharmacies as Swiss chocolatiers began to produce white chocolate. A skeptical article in the International Confectioner sneered at rumors of “snow white chocolate” in Switzerland, which the author, T. B. McRobert, saw as an imaginary nod to the snowy Swiss Alps. McRobert said he would never eat such a thing, as it would have to be bleached with toxic gases to produce the white color. But the rumors grew, and it turns out white chocolate was real (and safe to eat!). Swiss white chocolate was made from cocoa butter and sugar, sometimes with milk powder, chestnut meal, or vanilla.

The early twentieth century saw rapid growth in the candy industry, especially during World War I, which paved the way for the first commercially produced white chocolate. The Double Zero Bar was introduced in 1920 by the Hollywood Brands company in Minnesota. The novel confection consisted of a caramel, peanut, and almond nougat covered in white chocolate fudge, a unique look for a candy bar. If all this talk about chocolate is making you hungry, you’re in luck—this candy bar is still produced today, though it’s now called the ZERO Bar and sold by Hershey’s.

The original Zero bar wrapper.
Image from the Candy Wrapper Museum.

Nestlé’s White Chocolate Vitamins

While Nestlé certainly doesn’t have a claim to producing the first white chocolate, it does have a claim to being the first to commercially produce solid white chocolate.

German-Swiss chemist Henri Nestlé had spent part of his career working on an infant formula that could help alleviate the high infant mortality rate in Germany. He subsequently experimented with powdered and condensed milk products that could improve people’s quality of life. In 1879, he founded the Nestlé Company with chocolatier Daniel Peter in Switzerland. The duo had perfected a recipe for milk chocolate in 1875 using Nestlé condensed milk, and their partnership continued to prove fruitful. The company went on to use this scientific expertise to develop new, innovative products both in the candy industry and in the areas of medicine and health.

In 1936, the Nestlé Company worked with the pharmaceutical company Roche to develop a new product called Nestrovit, a tablet made from vitamin-enriched condensed milk that would help provide children with essential nutrients for growth and development. The company faced the challenge of finding a coating for the tablet that would protect the ingredients from damage and preserve their nutritional benefits. Drawing on its expertise in chocolate production, Nestlé added some cocoa butter to the Nestrovit formula and created a white chocolate coating for the tablet.

Aside from creating a successful health supplement, Nestlé saw the potential for even greater value in this new variety of chocolate. In 1936, the company launched the Galak bar (branded as the Milkybar in the UK), a pure white chocolate bar with a sweet and creamy flavor. In 1948, the Alpine White bar with almonds came on the scene and truly popularized white chocolate bars in the US and Canadian markets. Marketing for the Alpine White bar drew upon the snow-capped Swiss Alps—coming back full circle to prove the early skeptics of white chocolate wrong.

Milkybar, sold in the UK
Image by Evan-Amos from Wikimedia Commons.

White Chocolate Today

White chocolate is often passed over for its cocoa mass-containing counterparts, and many people associate it with cheaper, waxy-textured novelty candy. However, some chocolatiers have begun to take it more seriously. Specialty chocolate makers see white chocolate as a blank canvas for other flavors and creative add-ins, without the taste of cacao nibs to overpower delicate flavors. Rosemary and sea salt, roasted strawberry, turmeric and pomegranate, and caramelized or “blond” white chocolate are only some of the unique flavors that artisan chocolatiers have dreamed up.

White chocolate may still be the underdog, and it may not actually be considered chocolate, but it seems that it has great potential for innovation in the future!

Sources

Blakely, Henry. The Druggist’s General Receipt Book: Containing a Copious Veterinary Formulary: Numerous Recipes in Patent and Proprietary Medicines, Druggists’ Nostrums, Etc.: Perfumery and Cosmetics: Beverages, Dietetic Articles, and Condiments: Trade Chemicals, Scientific Processes, and an Appendix of Useful Tables. Philadelphia: Lindsay & Blakiston, 1871. Digitized by Harvard University. https://catalog.hathitrust.org/Record/100598206.

Gouffé, Jules, and Alphonse Gouffé. The Royal Cookery Book (le Livre de Cuisine). 1869. Digitized by Harvard University.

Guittard. “Glossary of Terms.” https://www.guittard.com/in-the-kitchen/article/glossary-of-terms.

“Henri Nestlé.” Wikipedia. Retrieved May 5, 2021, from https://en.wikipedia.org/wiki/Henri_Nestl%C3%A9.

Johnson, Sarah Wassberg. “Before Nestle: A History of White Chocolate.” The Food Historian, February 14, 2021. https://www.thefoodhistorian.com/blog/before-nestle-a-history-of-white-chocolate.

Marchetti, Silvia. “How White Chocolate Evolved from a Coating for Kids’ Medicine into a Sweet, Creamy Treat,” November 9, 2019. https://www.scmp.com/magazines/style/news-trends/article/3036673/how-white-chocolate-evolved-coating-kids-medicine-sweet.

Seth, Simran. “For Those Who Think White Chocolate Isn’t ‘Real’ Chocolate, Have We Got Bars for You.” Chicago Tribune, November 28, 2017. https://www.chicagotribune.com/dining/recipes/ct-white-chocolate-is-real-chocolate-20171128-story.html.

TCHO. “Is White Chocolate Actually Chocolate?” January 9, 2018. https://tcho.com/blogs/news/is-white-chocolate-actually-chocolate.

The Dessert Book: A Complete Manual from the Best American and Foreign Authorities. With Original Economical Recipes. Boston: J. E. Tilton and Company, 1872. Digitized by Harvard University. https://babel.hathitrust.org/cgi/pt?id=hvd.32044087496899&view=1up&seq=125.

The Garden of Children

Why is the first year of school for children called kindergarten? The answer involves a nature mystic, a case of mistaken identity, and a socialism scare.

Kindergarten stands out from the other required years of education in the United States for its unique name. First grade, second grade, and third grade follow, all the way up to twelfth grade (plus some alternate names for the high school years). So why the special name for kindergarten?

The Founding of Kindergarten

The word kindergarten was coined in 1840 by German teacher and educational reformer Friedrich Fröbel, from the words Kinder (“children”) and Garten (“garden”). Like all nouns in German, the word Kindergarten is capitalized, but this styling is usually not carried over into English.

Friedrich Fröbel, by C. W. Bardeen, 1897.
Image from the Library of Congress.

Fröbel used the word in a proposal that called for the development of early childhood education as a necessary part of widespread educational and social reform. He advocated for the unique needs of young children and opened up an experimental infant school in Prussia called the Child Nurture and Activity Institute. He later renamed it Kindergarten, reflecting his philosophy that young children should be nurtured like “plants in a garden.” Schools for young children in the 1700s and 1800s had formerly been glorified babysitters, philanthropic endeavors to care for impoverished children, or discipline in preparation for adulthood. Fröbel’s school instead focused on encouraging self-expression and learning through play, singing, gardening, and group activities, and it formed the basis for early childhood education techniques used today.

In 1851, Kindergarten schools were banned in Prussia due to a mix-up of Fröbel with his nephew Karl, who was a socialist and had published a treatise proclaiming more radical views about education. The government mistakenly attributed Karl’s “atheistic and demagogic” views to his uncle, who was sincerely religious (in the form of nature mysticism and pantheism) and dedicated to improving childhood education. The ban on Kindergarten led to a diaspora of German teachers to other countries in Europe and the United States, where they spread their teaching model to other schools. In 1856, Margaretha Meyer-Schurz opened the first German-speaking kindergarten in the U.S. A few years later, Elizabeth Palmer Peabody embraced Fröbel’s model after visiting Germany and opened the first English-speaking kindergarten in the U.S. Peabody is largely credited with popularizing the concept of kindergarten in America.

Im Kindergarten, by Hugo Oehmichen, 1879.
Image from Wikimedia Commons.

Translating Kindergarten

English borrowed the word kindergarten from German without translating it, but it is translated into Romance languages word for word in a way that preserves the original meaning of the Kinder + Garten roots. In French, the term is jardin d’enfants (“garden of children”); in Spanish, jardín de infancia (“garden of childhood”); and in Portuguese, jardim de infância (“garden of childhood”). A few non-Romance languages such as Hebrew do the same thing: gan yeladim means “garden of children.” A word that is borrowed by translating it piece by piece like this is called a calque. Other words involved in similar word-for-word translations include honeymoon, Adam’s apple, and loanword itself.

These words are not very common in Romance languages anymore, nor is the term kindergarten widely used in the UK. During and after World War II, German language and culture were looked down upon in many nations, and some have claimed that these calques of Kindergarten were eclipsed by other terms devoid of German roots.

Kindergarten around the World

In many countries, children from ages three to seven attend kindergarten or the equivalent. Where the United States distinguishes between preschool and kindergarten, many other countries do not, and kindergarten is instead part of the preschool system. Children may attend the same kindergarten/preschool for two years or more before beginning their primary education.

Fröbel was one of the most influential educational reformers in the modern educational system, and the effects of his work—and his words—are still seen today. Kindergarten is a place where we can begin to explore and learn without many of the social pressures of older childhood—where we don’t have to be anyone but ourselves.

Sources

Curtis, Stanley James. “Friedrich Froebel.” Encyclopaedia Britannica.  https://www.britannica.com/biography/Friedrich-Froebel.

Eschner, Kat. “A Little History of American Kindergartens.” Smithsonian Magazine, May 16, 2017. https://www.smithsonianmag.com/smart-news/little-history-american-kindergartens-180963263/.

“kindergarten (n.).” Online Etymology Dictionary. Retrieved April 26, 2021, from https://www.etymonline.com/search?q=kindergarten.

“kindergarten (n.).” Oxford English Dictionary. Retrieved April 26, 2021.

Wikipedia. “Friedrich Fröbel.” Retrieved April 27, 2021, from https://en.wikipedia.org/wiki/Friedrich_Fr%C3%B6bel.

Wikipedia. “Kindergarten.” Retrieved April 26, 2021, from https://en.wikipedia.org/wiki/Kindergarten.

Wikipedia. “List of calques.” Retrieved April 26, 2021, from https://en.wikipedia.org/wiki/List_of_calques.