Running for Fun

Why do people run for fun—not because they’re being chased by a tiger or forced to run the mile in gym class? The answer involves the Olympics, the police, and advocacy for women’s athletics.

“If you don’t think you were born to run, you’re not only denying history. You’re denying who you are.” –Christopher McDougall, Born to Run

Run to Live

People have been running since the dawn of humanity. Evolutionary biologists posit that specific anatomical characteristics unique to humans enhance our ability to run while conferring no additional benefit for walking.

Many other animals can run, of course, but humans are uniquely suited to distance running. From literally head to toe, we are made to run. Our upright posture, enhanced neck and head stability, and skeletal and muscle adaptations that enable long-distance, bipedal running are some of the evolutionary traits that make us human. Long legs relative to our body size help lengthen our stride. Springlike tendons in our legs and feet work like large elastics to absorb shock and conserve energy. Our sweat glands allow us to cool off without panting and keep our bodies from overheating. Large gluteal muscles are critical for stabilizing the body during running. The arrangement of bones in the feet and toes provides a stiff arch that can push off the ground more efficiently. Evolutionary biologists Dennis Bramble and Daniel Lieberman state that “the fossil evidence of these features suggests that endurance running . . . may have been instrumental in the evolution of the human body form” (Bramble and Lieberman, 2004). This means that running is encoded into our genes. The trade-off of all these beneficial biological adaptations was that our species is no longer well-suited to live in trees as our primate ancestors did.

According to Lieberman, about 2.6 million years ago, early human species began to eat meat, which could be obtained through either scavenging or hunting. About 2 million years ago, these distance running adaptations became characteristic of Homo erectus populations, the ancestors of modern humans, as those who were better runners were better able to survive. Maybe we couldn’t run faster than a cheetah, but we could outrun any animal on earth in terms of distance. Persistence hunting was thus a primary survival strategy for early human species. The slow but steady pursuit of prey yielded great rewards as humans simply outlasted their target. Armed with only simple weapons, they tracked and chased animals during the hottest part of the day and made their prey run faster than it could sustain for long, eventually overpowering an animal as it developed hyperthermia and collapsed. Sweaty and relentless, early humans used this strategy to great advantage (Powell, 2007).

Additionally, the consumption of animal meat provided more calories than plants alone, which fueled the growth of larger body sizes. Meat also provided amino acids and nutrients needed to develop more complex brains and higher-level cognitive functioning.

Some anthropologists criticize this hypothesis because few populations practice persistence hunting today and because the strategy works best in hot, grassland or savanna-type environments and may be less effective in other climates. It’s true that this strategy is not common among present-day hunter-gatherer societies; however, more advanced technology like spears and projectile weapons has made persistence hunting less necessary. Additionally, our ancestral environments 2 million years ago likely differed from the current environments in which ethnographic research has been conducted.

Live to Run

Some other animals, like dogs and horses, can run great distances if they are forced to or to escape from danger. But humans are better at it, and what’s more, humans voluntarily run for miles on end.

Track and field is one of the oldest sports in the world. Since prehistoric times, humans have put their natural abilities in running, jumping, and throwing to use in athletic events. If there was ever an official beginning of running as a sport, it was the first ancient Olympic Games held in 776 BCE, in which the stadion footrace was the only event.

Running has also long been used as an exercise to build and maintain physical fitness for military activities.

Running as a form of recreation—not just for athletic competition, military conditioning, or survival—began to gain traction first in New Zealand. Cross-country running coach Arthur Lydiard promoted jogging through his Auckland Joggers Club. Bill Bowerman, a University of Oregon running coach and later cofounder of Nike, returned from a trip to New Zealand in 1962 and brought back Lydiard’s wisdom to the United States. Bowerman published a pamphlet and a book on the topic of jogging, casting it as an activity not just for pro athletes but for the average person who wanted to live a healthy lifestyle. He showed through research that running improved cardiovascular health, and his book was endorsed by medical professionals.

In the late 1960s, running for exercise started to gain traction in the United States, though it was still considered strange. Professional athletes and boxers ran as part of their training, but now, ordinary people were starting to join in. It was weird. Everyone stared at them. In 1968, the Chicago Tribune devoted a whole page to a strange new fitness trend called jogging, and the New York Times poked fun at the new “in” sport. What freaks, these people who chose to run in their free time! The most dedicated ones would run up to an entire mile.

Joggers in these early years attracted the attention of suspicious neighbors and police officers who were alarmed at grown men and women running down the street, suspecting “folly” at play. Runners were sometimes stopped on the street and given citations for their unusual use of the road.

The Running Boom

In the 1970s, nearly 25 million people hit the ground running in races, on trails, and on roads throughout America, largely inspired by Frank Shorter’s victory in the 1972 Olympic marathon. The 26.2-mile race was relatively unknown to Americans up until this point.

Shorter’s 2:12:19 finish marked the first American gold medal in the marathon since 1908. The finish was nearly upstaged by a German impostor who darted into the stadium ahead of Shorter. He was a college student who had put on a track uniform and joined the race for the last kilometer, first to the tune of cheering from the audience and then to booing as officials got word of the hoax. “That’s not Frank, it’s an imposter, get that guy off the track! It’s a fraud, Frank!” the commentator called out over the radio. (The Washington Post named this one of the most memorable sports calls in history.) The coverage of this event changed the way the nation thought about long-distance running. The marathon, once an obscure event that you’d have to be crazy to attempt, was now front and center.

During the “running boom,” road racing events spread throughout the country, allowing public participation rather than restricting it to exclusive, members-only athletic clubs. Ordinary people were doing 5Ks, 10Ks, and even marathons now. Australia, the UK, and several European nations saw a similar trend. Additional factors that contributed to the craze included several other books and studies about the health benefits of running, professional and Olympic runners such as Steve Prefontaine, and companies like Nike that gave a high profile to running in popular culture. Now it was not only acceptable but cool to be a runner.

Around this time, women’s participation in athletic events was gaining more acceptance. Title IX opened more opportunities to compete in events at the college level, and universities expanded cross-country and track teams for women to fulfill Title IX requirements. Women found road running and marathon running to be a prime entry point into the world of professional and college athletics. The feats of pioneering women runners like Kathrine Switzer (first woman to run the Boston Marathon as an official entrant), Jacqueline Hansen (two-time world record holder and successful advocate for adding the women’s marathon, 5,000 meter, and 10,000 meter events to the Olympics), Miki Gorman (elite marathoner famous for a New York City–Boston–New York City triple win in 1976–77), and Joan Benoit (first to win the women’s Olympic marathon) inspired women to take up running—for recreation and for competition, for health and for ambition and for fun.

Running made a smooth transition from survival strategy to sport, and from sport to many other roles in play, exercise, stress relief, and community and social life.

Benefits of Running

It turns out that running has great health benefits as well. The cognitive and physical benefits of running and other types of aerobic activity have been studied extensively. Running facilitates cell growth and expansion in the hippocampus, which is the area of the brain associated with memory. It improves neural plasticity and promotes neurogenesis, which in turn lead to better memory and learning capabilities (Schulkin, 2016).

Running is often a way for people to relieve stress. This works because the body releases endorphins during and after running, producing a feeling of euphoria. Schulkin (2016) writes, “Long-distance running partially involves combating pain and discomfort. . . . To struggle is to succeed, and to cope with struggling, the human body has evolved to release hormones associated with euphoric states so that when one is faced with a particularly trying physical feat, the [brain] is permeated with chemicals that induce a sense of calmness.”

Physically, running improves cardiovascular and respiratory health and strengthens the immune system; it can also bolster mental health, confidence, and self-esteem.

Running is not for everyone, but it is for a lot of us. Distance running is in our genes—it’s one of the most quintessentially human things we can do, and it helps us become physically and mentally resilient. And the good news is that you don’t have to run 26.2 miles to reap many of the benefits of running any more than you need to hunt for your food in the grasslands.

BONUS: Check out “When Running Was for Weirdos” below.

Sources

Alex, Bridget. “Running Made Us Human: How We Evolved to Run Marathons.” Discover Magazine, April 12, 2019. https://www.discovermagazine.com/planet-earth/running-made-us-human-how-we-evolved-to-run-marathons.

Bramble, Dennis M., and Daniel E. Lieberman. “Endurance Running and the Evolution of Homo.” Nature 432, no. 7015 (2004): 345–52. https://doi.org/10.1038/nature03052.

Lieberman, Daniel E., Dennis M. Bramble, David A. Raichlen, and John J. Shea. “The Evolution of Endurance Running and the Tyranny of Ethnography: A Reply to Pickering and Bunn.” Journal of Human Evolution 53, no. 4 (2007): 439–442. https://dash.harvard.edu/handle/1/3743587.

Edwards, Phil. “When Running for Exercise Was for Weirdos.” Vox, August 9, 2015. https://www.vox.com/2015/8/9/9115981/running-jogging-history.

Pobiner, Briana. “Evidence for Meat-Eating by Early Humans.” Nature Education Knowledge 4, no. 6 (2013): 1. Human Origins Program, Smithsonian Institution.

Powell, Alvin. “Humans Hot, Sweaty, Natural-Born Runners.” The Harvard Gazette, April 19, 2007. https://news.harvard.edu/gazette/story/2007/04/humans-hot-sweaty-natural-born-runners/.

Schulkin, Jay. “Evolutionary Basis of Human Running and Its Impact on Neural Function.” Frontiers in Systems Neuroscience 10 (2016): 59. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4939291/.

University of Utah. “How Running Made Us Human: Endurance Running Let Us Evolve to Look the Way We Do.” ScienceDaily, November 24, 2004. https://www.sciencedaily.com/releases/2004/11/041123163757.htm.

“When Did the History of Running Begin?” My Running Tips.  http://www.myrunningtips.com/history-of-running.html.

Wikipedia. “Running Boom of the 1970s.” https://en.wikipedia.org/wiki/Running_boom_of_the_1970s.

Why Does Red Mean Stop and Green Mean Go?

Why do we know to automatically stop at a red traffic light and go at a green light? The answer involves a train crash, a gas explosion, and the Model T Ford.

Railroad Signals

In the 1830s, the railroad industry developed a system of signals that would direct train engineers to stop or go. Like modern traffic lights, they used three lights to signal which action the trains should take. Red, the color of blood, has been a signal of danger for thousands of years. It easily lent itself to being the color for stop. At this time, green was chosen as the color for caution, and white was the color for go.

Soon, it became apparent that white was a bad choice. It was easily confused with other white lights. In 1914, a red lens fell out of a light fixture and left it shining white light, turning a stop signal into a go signal. A train zoomed through the white signal and crashed into a train going the opposite direction. To prevent similar incidents, green was reassigned to mean go, and yellow was chosen to represent caution. Yellow was different enough from the other two colors that it stood out and was readily visible to train engineers.

Around the same time the colored light signals were developed, railroads began using a mechanical signaling system called a semaphore. These were poles with an attached arm that pivoted to different positions to signal train drivers. Today, most countries have phased out semaphores in favor of colored lights. The term “semaphore” is now also used as a synonym for a traffic light and as a more general term for any visual signaling system. It comes from the Greek sema (“sign” or “signal”) and phoros (“bearer”), literally meaning “a bearer of signals.”

A railroad semaphore.
Photo by David Ingham, June 8, 2008, CC BY-SA 2.0 via Wikimedia Commons.

To the Streets

In 1865, London faced a growing problem with clashes between horse-drawn traffic and pedestrians in the streets. Railway manager and train engineer John Peake Knight, who specialized in building signaling systems for British railways, presented a lighted signaling system to the Metropolitan Police as a solution for controlling traffic and preventing accidents. His design combined a semaphore for use during the day and a system of red and green gas-powered lights for the night.

On December 10, 1865, Knight’s semaphore/light signal was implemented at an intersection near the Parliament building. It worked well for about a month, until a leak in the gas line supplying the lights caused an explosion. The police officer operating the semaphore was badly burned, and the semaphore/light system was immediately discontinued.

The Model T

In 1913, the year the Model T Ford was rolled out, about 4,000 people died in car crashes in the United States. Roads and highways were simply not designed for vehicles that could travel at 40 miles per hour (which was lightning fast compared to about 15 miles per hour max for horse-drawn carriages). As the Model T made cars more affordable for the middle class, more and more people were driving on the road. Soon, crowded intersections became confusing and dangerous for motorists, pedestrians, horses, and cyclists competing for the right of way.

Police officers stationed in traffic towers manually signaled drivers using lights, semaphores, or their arms. When they used lights, red meant stop and green meant go, but they did not use a yellow light—instead, they blew a whistle to alert drivers that they were about to change the signal. However, few drivers paid attention, especially at busy intersections, and crashes continued to occur.

The Electric Traffic Light

With the growing use of electric lights rather than gas lamps in the late 1800s, the stage was set for the invention of the electric traffic light. The first in history was Lester Wire’s 1912 handmade contraption in Salt Lake City. A police officer exasperated with a growing number of traffic incidents in the city, Wire constructed a wooden box that looked like a birdhouse, set red and green lights in it, and raised it up on a ten-foot pole. While Wire was truly the first to invent the electric traffic signal, he is often overshadowed by others who came later and had civil authorities and patents on their side.

In 1914, Cleveland engineer James Hoge also had the idea to borrow the red-green light system used by railroads. He suspended electric lights on a wire above the intersection of Euclid Avenue and East 105th Street in Cleveland, creating the first “municipal traffic control system.” Hoge’s traffic apparatus was similar to Knight’s and Wire’s in that a police officer had to sit in a traffic tower and switch the light every so often. Unlike these more rudimentary systems, however, Hoge’s caught on quickly.

A police officer in a traffic tower.
Photo by Olle Karlsson, July 24, 1953, Sweden, from Wikimedia Commons.

In 1920, Detroit police officer William L. Potts invented a traffic signal suitable for a four-way intersection that used all three colors present in the railway system—red, yellow, and green. However, Potts’s system still required someone to manually change the light, a tedious and costly way to manage traffic.

Automatic signals were thus in high demand, and several systems were invented throughout the 1920s. The first ones changed the color of the lights at timed intervals, but this meant that vehicles had to stop even when there were no other cars crossing the intersection. Charles Adler Jr. then invented a signal that drivers could trigger by honking their horns; once the light changed, it could not change again for at least 10 seconds. This system was not popular because of all the noise from honking cars.
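To make that timing logic concrete, here is a minimal sketch of a signal that changes on request but refuses to change again until a minimum hold time has passed. This is purely a modern illustration, not a reconstruction of Adler’s actual device; the class name, the color sequence, and the use of a 10-second hold are assumptions drawn only from the description above.

```python
import time

# Toy model of an early demand-activated signal: a change can be requested
# (think of a driver honking), but the light will not change again until a
# minimum hold time has elapsed. Names and values here are illustrative only.

CYCLE = ["green", "yellow", "red"]
MIN_HOLD_SECONDS = 10  # the "at least 10 seconds" constraint described above


class ToySignal:
    def __init__(self):
        self.state = "red"
        self.last_change = time.monotonic()

    def request_change(self) -> bool:
        """Advance to the next color, but only if the minimum hold has passed."""
        now = time.monotonic()
        if now - self.last_change < MIN_HOLD_SECONDS:
            return False  # too soon; ignore this request (another honk, say)
        self.state = CYCLE[(CYCLE.index(self.state) + 1) % len(CYCLE)]
        self.last_change = now
        return True


if __name__ == "__main__":
    signal = ToySignal()
    # Right after start-up the hold time has not elapsed, so this prints False.
    print(signal.request_change(), signal.state)
```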

Later on, more efficient and less noisy systems were invented to sense when cars were present at an intersection and time the traffic lights accordingly. By the 1930s, traffic lights were beginning to spread to other countries around the world, becoming a signal of progress, growth, and industry in the US and abroad. Additionally, in 1935, the various systems in use in the US were standardized by the federal government, and all cities with stoplights were required to adopt the red, yellow, green light system to avoid confusion and inconsistency from one city to the next. The colors must also be lined up in the order red, yellow, green from top to bottom (which helps colorblind drivers distinguish which light is on).

How effective was the traffic light at preventing accidents? Smithsonian Magazine explains that due to the traffic light, “motor vehicle fatality rates in the United States fell by more than 50 percent between 1914 and 1930” (Nelson, 2018).

Socialization, Semiotics, and Gamification

Unfortunately, the traffic light also contributed to the rise of road rage on the streets. As the Smithsonian Magazine notes, pedestrians and drivers no longer had to acknowledge one another at intersections; they merely waited until the lights signaled they could go. Patience wore thin as people began to grumble when waiting for red lights to change.

From a young age, the public began to be socialized into the knowledge of traffic light signals along with models for good citizenship. As early as 1919, a schoolteacher in Cleveland came up with a game called “Red Light, Green Light,” which taught children to recognize traffic signals. The red/yellow/green system soon became instinctual as people learned the simple set of actions they represented. In the words of communication experts, “our ability to respond appropriately” to conventions like the traffic light “depends on our ability to use cultural experience to interpret signs and symbols appropriately, instantly, and instinctively.” The study of these types of symbols is called semiotics. To interpret symbols like the traffic light, “humans rely on signifiers and message shortcuts, whose meanings develop over time into almost universally accepted aspects of language” (Rackham and Gray, 2021). With the traffic light as a signifier, we don’t have to be told to stop when the light is red—we have a message shortcut that bridges the gap between seeing a red light and knowing to stop.

Soon, the red/yellow/green light scheme was ingrained in many aspects of culture, an assumed semiotic system that permeated many different areas of life. Traffic signals were incorporated into other children’s games and toys. Educational programs on everything from nutrition to healthy relationships use green, yellow, and red to signify when to proceed with an action, when to slow down or use caution, and when to stop. Your boss may give you the “green light” to proceed with a project, or you might receive a “yellow light” while negotiations are on hold.

Yellow: The Color of Ambiguity

While the green light is unambiguously a signal to go and the red light unequivocally means stop, the yellow light is—well, somewhat up to interpretation and context. Some drivers see it as a sign to slow down and prepare to stop, while others see it as an indication to speed up and get through the light before it turns red. Laws regulating yellow lights are intentionally vague, as drivers are expected to use their own discretion and common sense to navigate an intersection at a yellow light. Semiotics is not always so simple, it seems.

BONUS: Click here for some interesting and unique crosswalk signals from around the world.

Sources

Adams, Cecil. “Who Decided Red Means ‘Stop’ and Green Means ‘Go’?” The Straight Dope, March 7, 1986. https://www.straightdope.com/21341613/who-decided-red-means-stop-and-green-means-go.

History.com Editors. “First Electric Traffic Signal Installed.” This Day in History. History.com, August 3, 2020. https://www.history.com/this-day-in-history/first-electric-traffic-signal-installed.

I Drive Safely. “The History and Meaning of Colored Traffic Lights.” Retrieved May 29, 2021, from https://www.idrivesafely.com/defensive-driving/trending/history-and-meaning-colored-traffic-lights.

Marusek, Sarah. “Visual Jurisprudence of the American Yellow Traffic Light.” International Journal for the Semiotics of Law 27: 183–191. https://link.springer.com/article/10.1007/s11196-013-9323-z.

Nelson, Megan Kate. “A Brief History of the Stoplight.” Smithsonian Magazine, May 2018. https://www.smithsonianmag.com/innovation/brief-history-stoplight-180968734/.

Rackham, Scott, and Paxton Gray. Social Media Communication. Orem, UT: MyEducator, 2021.

Scott. “The Origin of the Green, Yellow, and Red Color Scheme for Traffic Lights.” Today I Found Out, March 8, 2012. https://www.todayifoundout.com/index.php/2012/03/the-origin-of-the-green-yellow-and-red-color-scheme-for-traffic-lights/.

“Semaphore.” Online Etymology Dictionary. Accessed May 29, 2021, from https://www.etymonline.com/search?q=semaphore.

Birthday Candles: Satanic Ritual, Moon Worship, or the Gift of the Industrial Revolution?

Why do we blow out candles on birthday cakes? The answer involves Egyptian theocracy, the moon goddess, and (as for many holidays) the mixing of Christian and pagan tradition.

A Tumblr post that went around the internet a while back drew attention to the strange, ritualistic custom that takes place during birthday celebrations:

“A small gathering of people huddle around an object on fire, chanting ritualistically a repetitive song in unison until the fire is blown out and a knife is stabbed into the object.”

The post laughingly called this a bit satanic. The real story of the birthday cake does have a little something to do with evil spirits—but more in the vein of warding them away with candlelight and merrymaking.

The origin of the birthday celebration itself must be put together piece by piece, drawing from different time periods and cultural traditions.

Ancient Egypt, Greece, and Rome

When a pharaoh was crowned in ancient Egypt, it was believed that he or she was transformed or reborn into a god. As early as 3,000 BCE, Egyptians celebrated the pharaoh’s coronation day as the birthday of a god. The Greeks may have inherited this custom of celebrating the birth of a god from the Egyptians. In many pagan belief systems, days of major change in the world or in a person’s life were thought to invite evil spirits into the world. When birthdays began to be celebrated for common people rather than just religious figures, a widespread belief was that evil spirits would visit people on their birthdays, so a party had to be held to scare them away. Party-goers helped the birthday person feel cheerful, made a racket with noisemakers, and brought candles as a light in the darkness to ward away the spirits. These early birthday parties were thus considered a form of protection against evil.

But what’s a birthday party without some cake?

The candles came first. In ancient Greece, round cakes were baked in honor of Artemis, the moon goddess. Candles were placed on top to represent the glow of the moon. In Greece, as in many ancient societies, it was believed that the smoke of the candles would carry their prayers to heaven (in this case, to the moon).

The birthday cake came second. Ancient Romans may have been the first to celebrate birthdays of non-religious figures. Birthday celebrations included a sweet, bread-like pastry made from flour, nuts, honey, and yeast. (Cake and bread were largely interchangeable terms until relatively recently in history, and sugar in its many refined forms was not used in the Mediterranean until about the thirteenth century.) These honey cakes were typically found at birthday celebrations for members of the imperial family, private birthday celebrations for family and friends, and also at weddings. Fiftieth birthday celebrations merited a special pastry made from flour, grated cheese, honey, and olive oil.

Rome may have had the first birthday celebrations and birthday cakes for the common man, but it was really just the common man. Blatant inequality between the sexes meant that women’s birthdays were not acknowledged until the twelfth century, hundreds of years later.

Germany’s Kinderfeste

Early Christians considered these celebrations inappropriately pagan and did not observe birthdays until about the fourth century, when they began to celebrate the birth of Jesus. In medieval Germany, a sweet bread was baked in the shape of baby Jesus to commemorate His birth.

Sometime between the fifteenth and eighteenth centuries, Germans began to celebrate a child’s birthday with a Kinderfeste party. In the morning, a cake called a Geburtstagstorte would be topped with the number of candles corresponding to the child’s age, plus one candle representing the “light of life,” the hope for another year of life to come. The candles were lit and left to burn all day until after dinner, at which point the child would make a wish, try to blow out all the candles, and then eat the cake. Blowing out the candles signified that the birthday wishes would reach God as the smoke floated to the heavens—a highly Christian interpretation. At the Kinderfeste, the child was surrounded by family and friends, which was supposed to provide protection from evil spirits who might attempt to steal their soul—a relic of pagan superstition.

Ein Kinderfest
By Ludwig Knaus, 1868, oil on canvas, image from Wikimedia Commons.

Though birthday cakes were mostly for children at this time, wealthy people of all ages also had fabulous birthday desserts. In 1746, traveler Andreas Frey wrote of Count Ludwig von Zinzendorf’s extravagant birthday celebration that featured a cake with candles: “There was a Cake as large as any Oven could be found to bake it, and Holes made in the Cake according to the Years of the Person’s Age, every one having a Candle stuck into it, and one in the Middle” (Frey, 15).

Let Them Eat Cake

Geburtstagstorten were originally similar to the lightly sweetened Roman birthday pastries, but as time went on, it became more common to bake sweeter cakes with sugar. In the seventeenth century, multiple layers, icing, and decorations were introduced as well. This luxury dessert would only have been available to the upper class.

The Industrial Revolution changed everything, as mass production of sugar and other ingredients and utensils made birthday cakes available to almost everyone. Additionally, bakeries could now offer pre-made cakes at reasonable prices.

COVID-19 may have put a temporary damper on blowing all over a cake everyone is about to eat (and also on gathering in groups for a party), but that hasn’t stopped anyone from quarantine-baking some delicious cakes. It’s someone’s birthday somewhere, right?

Sources

Frey, Andreas. A True and Authentic Account of Andrew Frey. Containing the Occasion of His Coming among the … Moravians [&c.]. Transl, Volume 8. Oxford University, 1753.

madcenturion. Tumblr. April 23, 2013.

McCormick, Chloe. “The Real Reason We Eat Cake on Birthdays.” Spoon University. https://spoonuniversity.com/lifestyle/origin-of-birthday-cake.

Origjanska, Magda. “Finding the Origin of the Birthday Cake with Candles (and Song) Tradition.” The Vintage News, January 8, 2018. https://www.thevintagenews.com/2018/01/08/birthday-cake/.

Pump It Up Admin. “How Did the Tradition of Birthdays Begin?” February 3, 2017. https://www.pumpitupparty.com/blog/how-did-the-tradition-of-birthdays-begin/.

Sterling, Justine. “A Brief History of the Birthday Cake.” Food & Wine, May 23, 2017. https://www.foodandwine.com/desserts/cake/brief-history-birthday-cake.

Van Lulling, Todd. “This Is Why You Get to Celebrate Your Birthday Every Year.” HuffPost, November 11, 2013. https://www.huffpost.com/entry/history-of-birthdays_n_4227366.

What’s Cooking America. “Birthday Cake History.” https://whatscookingamerica.net/History/Cakes/BirthdayCake.htm/.

A Short History of Hamburgers

Why are hamburgers called hamburgers if they’re not made out of ham? The answer spans time and space from the Mongol invasion of Russia to the German revolutions of 1848 to the McDonald’s Big Mac.

A hamburger and French fries is probably the most American meal you could think of. Let’s consider that for a moment . . . a hamburger (named not for the meat it’s made of but for Hamburg, Germany) and French fries are considered quintessentially American. This is yet another testament to the powerful influences of immigration and cultural exchange that continue to shape the culture of America today.

“Two All-Beef Patties . . .”

There is considerable controversy over the origin of the hamburger. Because ground beef steak and bread have been eaten separately in many different countries for centuries, it is unknown exactly how the hamburger as we know it came to be. For one, there are similar dishes found throughout Europe. Isicia omentata from fourth-century Rome was a baked beef patty mixed with pine kernels, peppercorns, and white wine. Steak tartare had its origins in the thirteenth-century Mongol invasion of Russia, when Mongol invaders stashed meat under their saddles to tenderize it while they rode to battle and then ate it raw. Russians called this preparation steak tartare, after their name for the Mongols. When ships from the port of Hamburg came to Russia to trade, they brought back steak tartare as raw ground beef shaped into a patty with a raw egg yolk on top.

Modern steak tartare.
Image by Rainier Zenz, CC BY-SA 3.0 via Wikimedia Commons.

A more direct German precursor to the hamburger is the seventeenth-century Frikadeller, which were flat, pan-fried beef meatballs. In eighteenth-century England and America, the Hamburgh sausage was prepared with chopped beef, spices, and wine and was supposedly a recipe that mimicked the preferences of immigrants and visitors from Hamburg. A nineteenth-century adaptation, called the Hamburg steak, is the most recognizably hamburger-like preparation and carried the Hamburg name. It was a minced beef filet, sometimes mixed with onions and bread crumbs, then salted and smoked and served raw in a pan sauce.

Hamburg steak
Hamburg steak.
Image by OiMax, CC BY 2.0 via Wikimedia Commons.

In 1848, political revolutions throughout the German Confederation pushed many Germans to immigrate to the United States. Known as the “Forty-Eighters,” these immigrants were just the first of a wave of European immigrants. The 1850s saw a larger increase in the immigrant population in the US relative to the overall population than any other decade in history. The German-born population alone increased 118.6% during this decade as immigrants arrived in New York by boat and spread throughout the East and Midwest states.

And here’s where the controversy comes in. The first version of the hamburger origin story claims that Germans arriving on the Hamburg-America line had already been preparing and consuming the Hamburg steak, as it was a popular meal among workers, and the smoked preparation kept well at sea. Immigrants enjoyed the meal and continued to make Hamburg steaks out of fresh meat once they got to America.

The second version goes that the Hamburg steak was created to meet demand for quick, cheap food for German sailors and immigrants arriving in America. Hamburg was known as an exporter of high-quality beef, so the Hamburg steak was offered in America as an idea of what might appeal to those arriving from Germany. Street vendors opened up along the coast where the Hamburg-America line docked, selling lightly grilled meat patties in the “Hamburg style” as a quick meal, perhaps accompanied with bread.

Port of Hamburg, 1862.
Image from Library of Congress via Wikimedia Commons.

In the second half of the nineteenth century, following this wave of immigration, the Hamburg steak was found at restaurants all over the port of New York. It was rather expensive at first (a whole 11 cents!), but with the growth in rural beef production and railroad transportation, the cost of beef decreased, and the meal became more widely available. Cookbooks of the time included detailed instructions for preparing the “hamburger steak,” as it was known from 1889 on.

The hamburger steak was soon viewed as a quintessentially American food, influenced as it was by the waves of immigrants who formed the character of the country. As Chicago and other cities in the East developed into major centers for the large-scale processing of beef, the hamburger steak became widely affordable and available to the average consumer—it was the “American beef dream.”

“Special Sauce, Lettuce, Cheese, Pickles, Onions”

By the early 1900s, the term hamburger steak was shortened to simply hamburger. Sometime between 1885 and 1904, someone decided to put the hamburger steak between two slices of bread, thus inventing the hamburger sandwich we know today. Some credit the founder of fast-food joint White Castle as the inventor of the hamburger sandwich, while others cite small-town cooks in Texas or Oklahoma or Ohio who placed a Hamburg steak between two slices of bread. Hamburgers were served on two thick slices of toast at the St. Louis World’s Fair in 1904, where they gained major exposure and created a sensation among fair-goers. Various claims exist and are not well-documented, and it’s likely that multiple people had the idea for a hamburger sandwich around the same time.

Toppings soon followed. Onions had long accompanied the hamburger steak, and other vegetables like lettuce and tomatoes were added for a fresher appearance. Ketchup was first commercially produced in 1869 by Henry Heinz and soon became a near-universal condiment for the hamburger.

“. . . On a Sesame-Seed Bun”

Fast-food restaurants played a major role in cementing the hamburger as the all-American meal. White Castle, which opened its doors in 1921, is regarded as one of the first true fast-food restaurants. When Upton Sinclair’s book The Jungle, published in 1906, caused public outrage and anxiety over the state of meat processing in the country, restaurants had to deal with negative perceptions of products made from ground meat. White Castle made efforts to promote itself as a clean and hygienic facility and paired this with rapid service and a simple menu centered around hamburgers and coffee.

Like many foods and practices that had their origins in Germany, the hamburger may have lost popularity during World War I due to anti-German sentiment. Additionally, the word “hamburger” conjured up images of greasy, cheap fair food for some consumers. For these reasons, White Castle hamburgers were rebranded as “sliders” to avoid referencing a German city or invoking other unsavory connotations. (Frankfurters received a similar treatment and were called “hot dogs” from then on, and they never quite regained their name.)

But other fast-food companies did not necessarily follow suit, and the term hamburger was still in use during the Great Depression as White Castle’s production methods became faster, more efficient, and more standardized, providing customers with a predictable meal and experience no matter where they were in the country. This concept would revolutionize the world of restaurants as the birth of fast food.

By the 1940s, the term “hamburger” was shortened to “burger,” which became a new combining form—giving us the parts we needed to build words like cheeseburger, veggie burger, and baconburger. Around the same time, McDonald’s came onto the scene, building on White Castle’s system and adding drive-in service. A competing chain called Bob’s Big Boy lays claim to the first documented instance of making a hamburger with the now-standard sesame-seed bun, due to a request from a customer who wanted “something different.” This order also resulted in the first double-decker burger as the chef cut the bun in three pieces to hold two hamburger patties. (Though the sesame seeds used for hamburger buns today have been rendered tasteless, they add visual appeal and cause people to salivate when they see them.)

By Peter Klashorst, CC BY 2.0 via Wikimedia Commons.

When the Big Mac was introduced in 1967, all bets were off: McDonald’s was the leader in fast food and the main driver behind the popularity of the American-style hamburger worldwide. “Two all-beef patties, special sauce, lettuce, cheese, pickles, onions on a sesame-seed bun” was now the established recipe for a premium-quality fast-food hamburger.

Influenced by immigrants and innovation, the hamburger has now become an internationally recognized symbol of American culture and of globalization. Just ask Inspector Clouseau.

Sources

Barksdale, Nate. “How the Hamburger Began—And How It Became an Iconic American Food.” History.com, January 6, 2021. https://www.history.com/news/hamburger-helpers-the-history-of-americas-favorite-sandwich.

“German Immigration in the 1850s.” Ancestry.com.  https://www.ancestry.com/historicalinsights/german-immigration-1848.

“Hamburger.” The Merriam-Webster New Book of Word Histories. Springfield, Massachusetts: Merriam-Webster Inc., 1991, p. 210. https://archive.org/details/merriamwebsterne00merr/page/210/mode/2up.

“History and Legends of Hamburgers.” What’s Cooking America. Retrieved May 8, 2021, from https://whatscookingamerica.net/History/HamburgerHistory.htm.  

Ozersky, Josh. The Hamburger: A History. Yale University Press, 2008.

Satran, J. “How Did Hamburger Buns Get Their Seeds?” HuffPost, April 10, 2015. https://www.huffpost.com/entry/hamburger-bun-history_n_7029310.

Walhout, Hannah. “A History of the Burger: From Ancient Rome to the Drive-Thru.” Food & Wine, June 20, 2017. https://www.foodandwine.com/comfort-food/burgers/burger-timeline.

Wikipedia. “Hamburg Steak.” Retrieved May 12, 2021, from https://en.wikipedia.org/wiki/Hamburg_steak.

Wikipedia. “History of the Hamburger.” Retrieved May 12, 2021, from https://en.wikipedia.org/wiki/History_of_the_hamburger.

Wittke, Carl. Refugees of Revolution: The German Forty-Eighters in America. University of Pennsylvania Press, 1952.

The Secret History of White Chocolate

Is white chocolate actually chocolate—and where did it come from in the first place? The answer involves the snow-capped mountains of Switzerland, questionable pharmaceutical cookbooks, and children’s vitamins.

White Chocolate vs. Chocolate

First, let’s talk about chocolate. According to the Chicago Tribune, the cacao bean—the main ingredient of chocolate—contains about equal parts cocoa butter and cacao nibs. The cocoa butter provides the creamy, smooth mouthfeel of chocolate while the cacao nibs are responsible for the distinctive taste and aroma. Chocolate is made from both cocoa butter and cacao nibs, along with sugar and often milk. Per FDA regulations, chocolate must contain at least 10% cocoa mass to be labeled as chocolate. (Also called chocolate liquor, cocoa mass is the result of finely grinding cacao nibs and includes cocoa butter that is present in the cacao nibs.) White chocolate, on the other hand, is made from cocoa butter without the cacao nibs. By law, it must contain at least 20% cocoa butter and 14% milk products to be labeled as white chocolate.
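To make those labeling thresholds concrete, here is a minimal sketch of the rules exactly as quoted above. The function name and its inputs are hypothetical, and the percentages come only from the figures in this paragraph, not from the full FDA standards of identity, which contain additional requirements.

```python
def label_candidate(cocoa_mass_pct: float, cocoa_butter_pct: float,
                    milk_solids_pct: float) -> str:
    """Classify a confection using only the thresholds quoted above.

    An illustrative simplification: the real FDA standards of identity
    include many more requirements than these three percentages.
    """
    if cocoa_mass_pct >= 10:
        return "chocolate"            # at least 10% cocoa mass
    if cocoa_butter_pct >= 20 and milk_solids_pct >= 14:
        return "white chocolate"      # 20% cocoa butter and 14% milk products
    return "neither (per these simplified rules)"


# Example: no cocoa mass, 30% cocoa butter, 20% milk solids -> "white chocolate"
print(label_candidate(0, 30, 20))
```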

Some chocolate purists thumb their noses at white chocolate for daring to pretend that it is on par with its milk and dark chocolate cousins when it is missing a crucial ingredient. Others argue that because white chocolate is made from part of the cacao bean, it should be grouped with other types of chocolate. The law, however, settles this debate: legally, white chocolate cannot simply be called chocolate.

The Unwritten History

Despite its sweet, innocent taste, white chocolate has a hidden past. The history of white chocolate is less than straightforward, and Nestlé has long tried to claim ownership. Until recently, the story went that Nestlé developed white chocolate in 1936 as a way to use up excess milk powder that had been produced during World War I. However, this story skips over many earlier uses of white chocolate—and the real story has less to do with powdered milk as a wartime product and more to do with infant formula.

Food historian Sarah Wassberg Johnson has uncovered several sources showing that white chocolate had actually been made as early as the 1860s. A new technique in the world of chocolate making called the Broma process may have had something to do with it. Developed in 1865, the Broma process involves placing cacao beans into a bag at a warm temperature to allow the cocoa butter to drip out, leaving the beans ready for processing into cocoa powder. Johnson proposes that this created a surplus of cocoa butter and spurred experimentation to find new ways to use it.

Recipes for white chocolate begin to appear in cookbooks shortly thereafter, although the formulas were quite different from modern white chocolate.

An 1869 recipe for white chocolate caramels was simply a recipe for caramels with the addition of cocoa butter. This recipe appeared in a cookbook by two French chefs.

Recipe for white chocolate caramel tablets

One recipe from an 1872 American cookbook included tapioca, powdered sugar, oatmeal, and Iceland moss (yum) along with “concentrated tincture of Caraccas cacao,” and it was designated as a suitable composition for “delicate persons.”

Another cookbook, specifically for druggists, directed the cook to mix sugar, rice flour, arrowroot, vanilla, and powdered gum Arabic (because everyone keeps that on hand . . .) with cacao butter and then pour the mixture into molds.

The White Chocolate Candy Bar

In the 1910s and ’20s, these experiments moved outside the realm of home kitchens and pharmacies as Swiss chocolatiers began to produce white chocolate. A skeptical article in the International Confectioner sneered at rumors of “snow white chocolate” in Switzerland, which the author, T. B. McRobert, saw as an imaginary nod to the snowy Swiss Alps. McRobert said he would never eat such a thing, as it would have to be bleached with toxic gases to produce the white color. But the rumors grew, and it turns out white chocolate was real (and safe to eat!). Swiss white chocolate was made from cocoa butter and sugar, sometimes with milk powder, chestnut meal, or vanilla.

The early twentieth century saw rapid growth in the candy industry, especially during World War I, which paved the way for the first commercially produced white chocolate. The Double Zero Bar was introduced in 1920 by the Hollywood Brands company in Minnesota. The novel confection consisted of a caramel, peanut, and almond nougat covered in white chocolate fudge, a unique look for a candy bar. If all this talk about chocolate is making you hungry, you’re in luck—this candy bar is still produced today, though it’s now called the ZERO Bar and sold by Hershey’s.

The original Zero bar wrapper.
Image from the Candy Wrapper Museum.

Nestlé’s White Chocolate Vitamins

While Nestlé certainly doesn’t have a claim to producing the first white chocolate, it does have a claim to being the first to commercially produce solid white chocolate.

German-Swiss chemist Henri Nestlé had spent part of his career working on an infant formula that could help alleviate the high infant mortality rate in Germany. He subsequently experimented with powdered and condensed milk products that could improve people’s quality of life. In 1879, he founded the Nestlé Company with chocolatier Daniel Peter in Switzerland. The duo had perfected a recipe for milk chocolate in 1875 using Nestlé condensed milk, and their partnership continued to prove fruitful. Nestlé began using his scientific expertise to develop new, innovative products both in the candy industry and in the areas of medicine and health.

In 1936, the Nestlé Company worked with the pharmaceutical company Roche to develop a new product called Nestrovit, a tablet made from vitamin-enriched condensed milk that would help provide children with essential nutrients for growth and development. The company faced the challenge of finding a coating for the tablet that would protect the ingredients from damage and preserve their nutritional benefits. Drawing on its expertise in chocolate production, Nestlé added some cocoa butter to the Nestrovit formula and created a white chocolate coating for the tablet.

Aside from creating a successful health supplement, Nestlé saw the potential for even greater value in the new variety of chocolate it had made. In 1936, the Nestlé Company launched the Galak bar (branded as Milkybar in the UK), a pure white chocolate bar with a sweet and creamy flavor. In 1948, the Alpine White bar with almonds came on the scene and truly popularized white chocolate bars in the US and Canadian markets. Marketing for the Alpine White bar drew upon the snow-capped Swiss Alps—coming back full circle to prove the early skeptics of white chocolate wrong.

Milkybar, sold in the UK
Image by Evan-Amos from Wikimedia Commons.

White Chocolate Today

White chocolate is often passed over for its cocoa mass-containing counterparts, and many people associate it with cheaper, waxy-textured novelty candy. However, some chocolatiers have begun to take it more seriously. Specialty chocolate makers see white chocolate as a blank canvas for other flavors and creative add-ins, without the taste of cacao nibs to overpower delicate flavors. Rosemary and sea salt, roasted strawberry, turmeric and pomegranate, and caramelized or “blond” white chocolate are only some of the unique flavors that artisan chocolatiers have dreamed up.

White chocolate may still be the underdog, and it may not actually be considered chocolate, but it seems that it has great potential for innovation in the future!

Sources

Blakely, Henry. The Druggist’s General Receipt Book: Containing a Copious Veterinary Formulary: Numerous Recipes in Patent and Proprietary Medicines, Druggists’ Nostrums, Etc.: Perfumery and Cosmetics: Beverages, Dietetic Articles, and Condiments: Trade Chemicals, Scientific Processes, and an Appendix of Useful Tables. Philadelphia: Lindsay & Blakiston, 1871. Digitized by Harvard University. https://catalog.hathitrust.org/Record/100598206.

Gouffé, Jules, and Alphonse Gouffé. The Royal Cookery Book (le Livre de Cuisine). 1869. Digitized by Harvard University.

Guittard. “Glossary of Terms.” https://www.guittard.com/in-the-kitchen/article/glossary-of-terms.

“Henri Nestlé.” Wikipedia. Retrieved May 5, 2021, from https://en.wikipedia.org/wiki/Henri_Nestl%C3%A9.

Johnson, Sarah Wassberg. “Before Nestle: A History of White Chocolate.” The Food Historian, February 14, 2021. https://www.thefoodhistorian.com/blog/before-nestle-a-history-of-white-chocolate.

Marchetti, Silvia. “How White Chocolate Evolved from a Coating for Kids’ Medicine into a Sweet, Creamy Treat,” November 9, 2019. https://www.scmp.com/magazines/style/news-trends/article/3036673/how-white-chocolate-evolved-coating-kids-medicine-sweet.

Sethi, Simran. “For Those Who Think White Chocolate Isn’t ‘Real’ Chocolate, Have We Got Bars for You.” Chicago Tribune, November 28, 2017. https://www.chicagotribune.com/dining/recipes/ct-white-chocolate-is-real-chocolate-20171128-story.html.

TCHO. “Is White Chocolate Actually Chocolate?” January 9, 2018. https://tcho.com/blogs/news/is-white-chocolate-actually-chocolate.

The Dessert Book: A Complete Manual from the Best American and Foreign Authorities. With Original Economical Recipes. Boston: J. E. Tilton and Company, 1872. Digitized by Harvard University. https://babel.hathitrust.org/cgi/pt?id=hvd.32044087496899&view=1up&seq=125.

The Garden of Children

Why is the first year of school for children called kindergarten? The answer involves a nature mystic, a case of mistaken identity, and a socialism scare.

Kindergarten stands out from the other required years of education in the United States for its unique name. First grade, second grade, and third grade follow, all the way up to twelfth grade (plus some alternate names for the high school years). So why the special name for kindergarten?

The Founding of Kindergarten

The word kindergarten was coined in 1840 by German teacher and educational reformer Friedrich Fröbel, from the words Kinder (“children”) and Garten (“garden”). Like all nouns in German, the word Kindergarten is capitalized, but this styling is usually not carried over into English.

Friedrich Fröbel, by C. W. Bardeen, 1897.
Image from the Library of Congress.

Fröbel used the word in a proposal that called for the development of early childhood education as a necessary part of widespread educational and social reform. He advocated for the unique needs of young children and opened an experimental infant school in Prussia called the Child Nurture and Activity Institute. He later renamed it Kindergarten, reflecting his philosophy that young children should be nurtured like “plants in a garden.” Schools for young children in the 1700s and 1800s had been glorified babysitters, philanthropic endeavors to care for impoverished children, or exercises in discipline to prepare children for adulthood. Fröbel’s school instead focused on encouraging self-expression and learning through play, singing, gardening, and group activities, and it formed the basis for early childhood education techniques used today.

In 1851, Kindergarten schools were banned in Prussia due to a mix-up of Fröbel with his nephew Karl, who was a socialist and had published a treatise proclaiming more radical views about education. The government mistakenly attributed Karl’s “atheistic and demagogic” views to his uncle, who was sincerely religious (in the form of nature mysticism and pantheism) and dedicated to improving childhood education. The ban on Kindergarten led to a diaspora of German teachers to other countries in Europe and the United States, where they spread their teaching model to other schools. In 1856, Margaretha Meyer-Schurz opened the first German-speaking kindergarten in the U.S. A few years later, Elizabeth Palmer Peabody embraced Fröbel’s model after visiting Germany and opened the first English-speaking kindergarten in the U.S. Peabody is largely credited with popularizing the concept of kindergarten in America.

Im Kindergarten, by Hugo Oehmichen, 1879.
Image from Wikimedia Commons.

Translating Kindergarten

English borrowed the word kindergarten from German without translating it, but it is translated into Romance languages word for word in a way that preserves the original meaning of the Kinder + Garten roots. In French, the term is jardin d’enfants (“garden of children”), in Spanish, jardín de infancia (“garden of childhood”), and in Portuguese, jardim de infância (“garden of childhood”). A few non-Romance languages such as Hebrew do the same thing: gan yeladim means “garden of children.” A borrowed word or phrase that is translated piece by piece like this is called a calque, or loan translation. Other words that use a similar translation scheme include honeymoon, Adam’s apple, and loanword itself.

These words are not very common in Romance languages anymore, nor is the term kindergarten widely used in the UK. During and after World War II, German language and culture were looked down upon in many nations, and some have claimed that these calques of Kindergarten were eclipsed by other terms devoid of German roots.

Kindergarten around the World

In many countries, children from ages three to seven attend kindergarten or the equivalent. Where the United States distinguishes between preschool and kindergarten, many other countries do not, and kindergarten is instead part of the preschool system. Children may attend the same kindergarten/preschool for two years or more before beginning their primary education.

Fröbel was one of the most influential educational reformers in the modern educational system, and the effects of his work—and his words—are still seen today. Kindergarten is a place where we can begin to explore and learn without many of the social pressures of older childhood—where we don’t have to be anyone but ourselves.

Sources

Curtis, Stanley James. “Friedrich Froebel.” Encyclopaedia Britannica.  https://www.britannica.com/biography/Friedrich-Froebel.

Eschner, Kat. “A Little History of American Kindergartens.” Smithsonian Magazine, May 16, 2017. https://www.smithsonianmag.com/smart-news/little-history-american-kindergartens-180963263/.

“kindergarten (n.).” Etymology Online Dictionary. Retrieved April 26, 2021, from https://www.etymonline.com/search?q=kindergarten.

“kindergarten (n.).” Oxford English Dictionary. Retrieved April 26, 2021.

Wikipedia. “Friedrich Fröbel.” Retrieved April 27, 2021, from https://en.wikipedia.org/wiki/Friedrich_Fr%C3%B6bel.

Wikipedia. “Kindergarten.” Retrieved April 26, 2021, from https://en.wikipedia.org/wiki/Kindergarten.

Wikipedia. “List of calques.” Retrieved April 26, 2021, from https://en.wikipedia.org/wiki/List_of_calques.

Honeymoon

Why does the word honeymoon refer to a vacation a couple takes after getting married? The answer involves myths about mead, poetry about love, and a warning about waning.

In some cultures, the period shortly following marriage is seen as a time for a couple to withdraw from the world and spend time with each other. In others, this period is still a time of celebration with friends and family, and couples are given little time alone. Both of these inclinations gave rise to the honeymoon, a vacation that a newlywed couple takes together immediately after getting married. But have you ever wondered about the origins of the honeymoon?

Some Ancient Theories

Some historians claim that the honeymoon “dates from the days of marriage by capture when, after snatching his bride, the groom swept her away to a secret location, safe from discovery by her angry kin” (Waggoner, 2020). The groom kept the bride hidden from her family until they stopped looking for her, with the intent to get her pregnant. Later, marriage-by-capture became ritualized, and the groom took the bride away with her family’s knowledge and the understanding that he would later offer a bride price. However, this terrifying practice may not be directly related to post-marriage celebrations today. It could rather be an analogy from a time and place when marriage was more of a forced transaction than an act of love. Some tend to cite ritual kidnapping as the precursor to the honeymoon, but it seems that we lack evidence of a direct link between the two practices.

Another “fanciful” theory held that honeymoon came from an ancient tradition in which guests would gift a newlywed couple with mead, which is made from honey. The couple supposedly drank the mead together during their first month of marriage to improve their chances of conception. This may seem like a plausible origin, but it is largely regarded as a myth made up in the eighteenth century. (No one can decide where this tradition originated or what language these people would have spoken, and for this reason they probably would not have called it a “honeymoon.”)

Having covered the two depressing notes of kidnapping and drunkenness, let’s move on to the happier precursors of the modern honeymoon.

The Origin of the Word

The term honeymoon originally referred to the first month of marriage—encapsulating the idea that the first month, or moon, is when marriage is supposedly the sweetest and is filled with love and happiness. It came into use in the mid-sixteenth century. The earliest recorded use was in a 1546 book of English proverbs by John Heywood, who described newlywed bliss thus: “It was yet but hony moone.”

By the end of the sixteenth century, the word honeymoon had been extended to mean a period of peace, good relations, or goodwill between people, groups, or nations, often in a political context. A 1655 church history of Britain by Thomas Fuller noted, “Kingdoms have their honey-moon, when new Princes are married unto them.” A partnership between two businesses might undergo a honeymoon period, as could a country that has just elected a new leader. But history tells us that the honeymoon period often does not last for long.

The comparison of marriage to the phases of the moon implied that a couple’s love would wane over time: “And now their hony-moone, that late was cleare, Did pale, obscure, and tenebrous appeare,” lamented the poet William Fenner in his 1612 Cornucopiæ.

The Bridal Tour

The custom of a newlywed couple taking a vacation after their wedding originated in Great Britain in the 1820s, when it became fashionable for upper-class couples to take a “bridal tour” to visit family and friends who could not attend their wedding. (Working-class couples generally did not have time off work to take any sort of vacation.) Because this trip occurred during the honeymoon period, the sweet first month of marriage, the term honeymoon began to be applied to the trip rather than the time period.

The bridal tour custom soon spread to the European continent. In many European languages, the word that represents the concept of a honeymoon can be translated word-for-word into “honey” and “moon.” The French term is lune de miel (“moon of honey”), for example. Farther away, the Russian word is медовый месяц (“honey month”), and the Persian word is mah-e-asal (“month of honey” or “moon of honey”).

Later in the nineteenth century, during the period known as the Belle Époque, newlyweds began taking a trip for fun and relaxation rather than to visit family. This is considered one of the earliest forms of modern mass tourism, and it was facilitated by the general peace and stability among European nations in the period between the Franco-Prussian War and World War I.

Modern Honeymoons

Following European fashion, the honeymoon picked up steam in the United States toward the end of the nineteenth century. Advances in transportation made travel easier and more accessible to couples at all levels of society in the mid-twentieth century. The honeymoon is now considered a near-indispensable part of the wedding ritual in the United States and Great Britain.

Today, there are several different takes on the honeymoon. Many couples opt for a romantic, relaxing getaway for some time together. Some couples go on a short mini-moon after their wedding and then save up for a bigger trip in the future. In a strange twenty-first-century twist, solomoons or unimoons involve each partner taking a separate vacation on their own, often because the two can’t agree on a destination (which is probably not a predictor of a happy marriage!). And finally, babymoons are vacations a couple takes before the birth of a child.

Cycles of Love

Today, we might view the idea of the “honeymoon period” positively but acknowledge that the initial excitement of marriage may wear off. A couple’s love may go through seasons and phases but grow through adversity and wax sweet again and again throughout their marriage. The excitement and passion of newlyweds can simply wane—or it can transform into a partnership built on mutual love and respect that grows stronger with each new phase of life.

Sources

Braff, Danielle. “Until Honeymoon We Do Part.” New York Times, March 12, 2019. https://www.nytimes.com/2019/03/13/fashion/weddings/until-honeymoon-we-do-part.html.

Brohaugh, William. Everything You Know about English Is Wrong. Naperville, IL: Sourcebooks, 2008. https://archive.org/details/everythingyoukno0000broh/page/92/mode/2up.

“Honeymoon.” Oxford English Dictionary. https://www.oed.com/view/Entry/88181.

Monger, George P. Marriage Customs of the World: An Encyclopedia of Dating Customs and Marriage Traditions. Vol. 1. ABC-CLIO, 2013.

Shamsian, Jacob. “The Mysterious Origin of the Word ‘Honeymoon.’” Business Insider, March 27, 2017. https://www.insider.com/honeymoon-word-meaning-etymology-2017-3.

Waggoner, Susan, quoted in Susong, Liz. “The Gloomy History behind Honeymoons.” Brides, May 4, 2020. https://www.brides.com/story/the-gloomy-history-behind-honeymoons.

Wikipedia. “Belle Époque.” April 17, 2021. https://en.wikipedia.org/wiki/Belle_%C3%89poque.

Wikipedia. “Honeymoon.” April 17, 2021. https://en.wikipedia.org/wiki/Honeymoon.

Cookies (The Slightly Less Delicious Ones)

Why are the tracking files that websites place on your computer called cookies? The answer (somewhat) involves shopping carts, Chinese takeout, and a German fairy tale.

We’ve talked about cookies in the past, but browser cookies (also called internet cookies or HTTP cookies) are a little more crumbly and a little less delicious.

What Is an Internet Cookie?

A cookie is a small text file that a web server sends to your browser when you visit a website. The next time you visit that website, your computer checks to see if it has stored a cookie, and if it has, it sends the information in the cookie back to the website. The cookie tells the website your previous browsing preferences—such as your preferred language, items you’ve added to your cart when shopping online, what links you’ve clicked on and pages you’ve visited before, and so on. The purpose of a cookie is to help you have a better browsing experience by making the site more relevant and useful to you as a unique user. Cookies are also used to protect sensitive information and authenticate users when they log in to their account.
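To make the exchange concrete, here is a minimal sketch in Python, using only the standard library, of the round trip described above: on the first visit the server asks the browser to store a cookie, and on later visits it reads that cookie back. The cookie name preferred_language and the port number are assumptions made up for this example, not part of any real website.

from http.cookies import SimpleCookie
from http.server import BaseHTTPRequestHandler, HTTPServer

class CookieDemoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Did the browser send back a cookie it stored on an earlier visit?
        cookies = SimpleCookie(self.headers.get("Cookie", ""))
        lang = cookies["preferred_language"].value if "preferred_language" in cookies else None

        self.send_response(200)
        if lang is None:
            # First visit: ask the browser to remember a preference for next time.
            self.send_header("Set-Cookie", "preferred_language=en; Max-Age=86400; Path=/")
            body = "Welcome! Remembering your language preference ('en') for next time."
        else:
            body = f"Welcome back! Your stored language preference is '{lang}'."
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body.encode("utf-8"))

if __name__ == "__main__":
    # Visit http://localhost:8000/ twice to watch the cookie get set and then sent back.
    HTTPServer(("localhost", 8000), CookieDemoHandler).serve_forever()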

Despite the usefulness of cookies, there is controversy about the amount of data collected by some websites and the way that data is used. The fact that they collect and store information without the user necessarily knowing about it makes some people uneasy, and we’ve all encountered ads that seemed way too personalized due to third-party tracking cookies. Luckily, most cookies are harmless and simply speed up browsing and make websites more dynamic.

So Why Are They Called Cookies?

Web programmer Lou Montulli of Netscape Communications first used the term “cookie” in 1994 in reference to a package of data that a program receives and sends back without changing. This type of file had already been used in computing and was called a “magic cookie,” but Montulli ingeniously adapted the concept for use on the web. He created a system for an online store to solve the problem of servers that were overloaded with user shopping cart information. Passing small cookie files between the server and users’ computers was a much more efficient way of accessing shopping cart data when needed.
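As a rough illustration of that idea (a sketch of the general technique, not Netscape’s actual design), a store could keep the cart itself inside the cookie and let the browser carry it back and forth, so the server stores nothing between visits. The cookie name cart and the colon-separated list of item IDs are assumptions made up for this example.

from http.cookies import SimpleCookie

def read_cart(cookie_header: str) -> list[str]:
    # Parse the item IDs stored in the hypothetical "cart" cookie, if one was sent.
    cookies = SimpleCookie(cookie_header or "")
    if "cart" not in cookies or not cookies["cart"].value:
        return []
    return cookies["cart"].value.split(":")

def add_to_cart(cookie_header: str, item_id: str) -> str:
    # Return the Set-Cookie header value the server would send back,
    # with the new item appended to whatever the browser already had.
    items = read_cart(cookie_header)
    items.append(item_id)
    return "cart=" + ":".join(items) + "; Max-Age=604800; Path=/"

# Round trip: the browser simply echoes back whatever it was told to store.
print(add_to_cart("", "sku123"))             # cart=sku123; Max-Age=604800; Path=/
print(add_to_cart("cart=sku123", "sku456"))  # cart=sku123:sku456; Max-Age=604800; Path=/
print(read_cart("cart=sku123:sku456"))       # ['sku123', 'sku456']

Most modern shops store only a short session ID in the cookie and keep the cart itself in a database, but the underlying mechanism of handing a small piece of state back and forth is the same.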

Another use of this type of file was in Unix’s “fortune” program, which presented the user with a random quote, joke, or poem—like a virtual fortune cookie. The files that stored these messages were “magic cookie” files.

Some have also compared internet cookies to the story of Hansel and Gretel, who left a trail of bread crumbs behind them to mark their path through the forest. In the same way, internet cookies mark the trail of a user’s browsing history on a website.

Sources

Create a Pro Website. “What Are Cookies? And How They Work | Explained for Beginners!” August 31, 2009. https://www.youtube.com/watch?v=rdVPflECed8.

“How Google Uses Cookies.” Google Privacy & Terms. https://policies.google.com/technologies/cookies?hl=en-US.

“HTTP Cookies.” Wikipedia, April 12, 2021. https://en.wikipedia.org/wiki/HTTP_cookie.

“What Are Cookies?” Cookie Controller. https://cookiecontroller.com/what-are-cookies/.

Left, Right, Center

Why are conservatives referred to as the “right” and liberals referred to as the “left” in politics? The answer involves the French Revolution, the quick spread of information through newspapers, and the tense interlude between the two World Wars.

Political beliefs are often described as being on a spectrum from left to right. Left refers to liberal views, such as advocating for progressive reforms and seeking economic equality by redistributing wealth through social programs. On the far left, we have revolutionary ideologies like socialism and communism. Right refers to conservative views, such as maintaining existing institutions and traditional values while limiting government power. On the far right, we have nationalistic ideologies like fascism.

Vive la France

The political descriptors left and right originally referred to the seating arrangements for members of the French National Assembly in 1789, who convened during the French Revolution to draft a new constitution. From the position of the speaker of the assembly, those seated on the right side of the room were nobility and high-ranking religious authorities. Those seated on the left side of the room were commoners and lower-ranking clergy members.

The division originally arose over the issue of how much authority the king should have. Those in favor of the king having absolute veto power sat on the right, and those who favored limiting the king’s veto power sat on the left.

The Estates General (Les États Généraux) of 1789
Engraving by Alphonse Lamotte, 1889, public domain from Wikimedia Commons.

The higher-ranking members of society tended to be more pro-aristocracy and generally were more reactionary in their political views, while the lower-ranking members of society tended to be pro-revolution, more radical, and more centered on the needs of the lower and middle classes. Those who sat closer to the center of the room tended to be more moderate in their views than others in their faction. The left was “the party of movement,” and the right was “the party of order.”

Newspapers reported on left-wing and right-wing views, and the terms left and right spread quickly into popular usage in France.

Over the next century, the seating arrangements in the French legislature persisted at some times and were discouraged at other times. When the French Third Republic was established in 1871, the terms left, right, and center were used in the names of political parties themselves: the Republican Left, the Centre Right, the Centre Left, the Extreme Left, and the Radical Left were the major political parties of the day.

The Interwar Years

Right and left became widely used throughout Europe in the 1920s and 1930s, the years between the two World Wars, when people “wrestled with the politics of nation and class” and found these labels to be a simplified way to describe complex political ideologies. Marci Shore, professor of European history, writes, “The interwar years were a time of a polarizing political spectrum: the Right became more radical, the Left became more radical; the liberal center ‘melted into air’ (to use Marx’s phrase)” (Carlisle, 2019).

Left and Right in America

Right and left entered usage in America in the 1920s and 1930s as well, but some shied away from the terms, especially left, throughout the mid-twentieth century because of their associations with extreme ideologies. The 1960s saw a shift toward people defining themselves more consistently with these terms in an effort to differentiate their views from others’, as both liberals and conservatives were dissatisfied with the political consensus of the day. We see again that left and right were used as shorthand ways of categorizing people—a person on the right sees a person on the left as the “other,” and vice versa.

In America, “left” is often synonymous with the Democratic Party, while “right” is often equated with the Republican Party. However, political views span a wide spectrum, and some may fall in between the positions of the parties or way outside the bounds of either one. The definitions of left, right, and center are dynamic and change relative to one another throughout time. The terms meant something different during the French Revolution, in the Soviet Union, during the New Deal, and in America in 2021 and will continue to shift as parties and policies realign in a changing political climate.

“The Nature of Public Opinion,” OpenEd CUNY, CC BY-NC-SA 4.0.

The Easter Bunny

Why does a rabbit leave colored eggs, candy, and nonedible novelties for children on Easter morning? The answer involves little ones leaving out an item of clothing overnight with the expectation that it will be filled with gifts, families providing a favorite snack for the mythical bringer of presents, and naughty children receiving a lump of coal . . . sound familiar?

The Easter Bunny is a curiously unexplored phenomenon—the jolly figure of Santa Claus appears in Coca-Cola ads, Christmas cards, and the minds of children around the world, but his rabbit friend in the pantheon of holiday figures has no single recognizable image. Santa Claus, his elves, his workshop at the North Pole, his big bag of presents, and his magical sleigh pulled by reindeer are a cohesive set of traditions. But where does the Easter Bunny live? What does he look like? Where does the Easter Bunny get candy and eggs and other little trinkets to put in Easter baskets? And why, in the name of the spring fertility goddess, is the Easter Bunny (a mammal, we might remind you) associated with eggs?

Easter Symbols

As the most important holiday in Christianity, Easter is a celebration of the new and everlasting life that comes through the Resurrection of Jesus Christ. Springtime holidays from pagan and secular traditions also focus on celebrating life and fertility as the world begins to blossom and the sun begins to shine after the darkness and coldness of winter. Many of the symbols we have come to associate with Easter draw from this wellspring of new life. Eggs and baby animals are living proof of fertility and new life. Pastel colors reflect the newly budding blossoms of spring. Easter lilies, which grow from a bulb in the ground into pure white, trumpet-shaped flowers, “symbolize the rebirth and hope of Christ’s resurrection” (History.com, 2021).

Ostara by Johannes Gehrts, 1901, shows one artist’s imagining of the goddess. Whether and how the Saxons and other Germanic tribes worshiped her is up for debate.

Rabbits, too, are a prominent symbol of new life: they breed like, well, rabbits. Some have estimated that a female rabbit might have up to 100 babies per season, or a total of up to 1,000 babies over a lifetime! (MentalFloss, 2015). For this reason, they are also an ancient symbol of fertility and thus have a natural association with spring holidays. The Easter holiday is the Christian celebration of the Resurrection of Jesus Christ, but the name Easter comes from the festival of Eostre, the Saxon fertility goddess, whose German name is Ostara. Some have conjectured that the Saxons believed Eostre’s animal symbol was a bunny or she had a hare as a companion, though there is little evidence in the historical record for such a claim. (Instead, later scholars may have theorized such an association to retroactively explain traditions that existed in Europe later on.) Stephen Winick at the Library of Congress explains that common observations about rabbits, eggs, and the budding of new life in the spring led to many similar traditions throughout time, whether or not a direct relationship is present between any of these traditions: “In short, we don’t need a pagan fertility goddess to connect bunnies and eggs with Easter—springtime makes the connection for us all by itself” (Winick, 2016).

A Puck Magazine cover showing a young woman in a fancy dress, the Easter Bunny wearing clothing and carrying a basket of colored eggs, and an exasperated monk.

This Puck magazine cover was typical of early twentieth-century depictions of the Easter Bunny. The Bunny appears fully clothed, accompanying a young woman who evokes the idea of Eostre, while an exasperated monk protests the secular celebration.
Illustration by L.M. Glackens, March 26, 1902, Library of Congress.

The Osterhase and His Hasen-Eier

Like many holiday traditions celebrated in America, the Easter Bunny has its origins in Germany. German immigrants to Pennsylvania in the 1700s brought stories about an egg-laying rabbit called the Osterhase (“Easter Hare”). Among the Pennsylvania Dutch, children made nests for the Osterhase and left carrots for him to eat as fuel on his journey, in hopes that he would leave colored eggs for them the night before Easter. Children often used hats and bonnets as nests, sometimes placing them outside in a garden or a barn where a bunny would have the easiest access. In some versions of the story, the bunny lays the eggs, while in others he brings them in a basket. The Easter Bunny was also a judge: tradition has it that he only gave eggs to children if they were good, to encourage children to behave themselves during Eastertide. Misbehaving children might receive rabbit droppings or coal instead.

Georg Franck von Franckenau’s 1682 essay “De ovis paschalibus” (“About Easter Eggs”) describes an Easter egg hunt of sorts, where the Easter Hare lays Hasen-Eier (“hare eggs”) hidden in the garden and grass for children to find. They would then feast on the eggs (real ones rather than candy-filled plastic!). Eating so many eggs without salt or butter would cause a stomachache, doctors warned. We bet they never envisioned the mass sugar rush children today get from feasting on chocolate eggs.

A True Renaissance Rabbit

So why does the bunny deliver eggs, and why is he male?

In ancient times, it was believed that hares were hermaphrodites, meaning that they had the reproductive equipment of both a female and a male. Pliny, Plutarch, and other great thinkers thought that hares could switch sexes at will and even impregnate themselves. So although we speak of the Easter Bunny as a he (even though it wouldn’t make sense for a male, or a rabbit for that matter, to lay eggs), hares were not believed to be strictly male or female well into the Renaissance. This led to an association of the hare with the Virgin Mary, due to its supposed capacity to reproduce while remaining a virgin (which we now know is definitely not true). Renaissance art reflects this association.

The Madonna of the Rabbit: a depiction of the Virgin Mary with the Christ Child, a hare, St. Katherine, and John the Baptist.
The Madonna of the Rabbit
By Titian, 1520, oil on canvas, image from Wikimedia Commons.

Whether the bunny actually lays the eggs or just delivers them (did he steal them from a chicken?), eggs represent the potential for new life when a baby chick hatches, as well as the emergence of Christ from the tomb. Because of this dual symbolism, the Easter Bunny pays a visit to people of different faiths or no faith. It exists as a tradition that draws upon symbols that can be interpreted in light of different religious beliefs, whether Christian or not. This widespread appeal likely contributed to the growing popularity of the Easter Bunny throughout the nineteenth and twentieth centuries in America.

Over the course of the twentieth century, nests turned into baskets, real eggs turned into plastic eggs, and the Easter Bunny’s gifts expanded to include chocolate, jelly beans, and small toys. Candy companies capitalized on the Easter Bunny tradition by marketing spring-themed candy and other odds and ends for Easter baskets, further reinforcing the practice.

Other Bringers of Easter Cheer

The Easter Bunny isn’t the only bringer of springtime cheer. In Switzerland, the Easter Cuckoo makes the rounds, while some parts of Germany receive visits from the Easter Fox or the Easter Rooster. In Australia, the Easter Bilby initiates the springtime festivities (and don’t mention the Easter Bunny to an Australian—the overabundance of rabbits as an invasive species introduced in the eighteenth century has led to the endangerment of native animals).

A chocolate Easter Bilby.
Image by Nicole Kearney, April 21, 2019, CC BY-SA 4.0 from Wikimedia Commons.

We could have just had an Easter Hen. That would have made much more sense, and baby chicks are already associated with springtime festivities. But if we’re making up a mythical creature, we might as well stretch our imagination a little further!

Sources

Crew, Bec. “Australia’s Easter Bunny: The Long-Eared Greater Bilby.” Scientific American, April 19, 2014. https://blogs.scientificamerican.com/running-ponies/australiae28099s-easter-bunny-the-long-eared-greater-bilby/.

History.com Editors. “Easter Symbols and Traditions.” History.com, March 24, 2021. https://www.history.com/topics/holidays/easter-symbols.

Jeon, Hannah. “What Are the Easter Bunny’s Origins? Here’s the Fascinating History of the Easter Bunny.” Good Housekeeping, March 4, 2020. https://www.goodhousekeeping.com/holidays/easter-ideas/a31226078/easter-bunny-originshistory/.

Sifferlin, Alexandra. “What’s the Origin of the Easter Bunny?” Time, February 21, 2020. https://time.com/3767518/easter-bunny-origins-history/.

Soniak, M. “Are Rabbits as Prolific as Everyone Says?” MentalFloss, January 20, 2015. https://www.mentalfloss.com/article/29870/are-rabbits-prolific-everybody-says.

Wikipedia. “Easter Bunny.” Accessed March 25, 2021, from https://en.wikipedia.org/wiki/Easter_Bunny.

Winick, Stephen. “On the Bunny Trail: In Search of the Easter Bunny.” March 22, 2016. Library of Congress. https://blogs.loc.gov/folklife/2016/03/easter-bunny/.