A Short History of Hamburgers

Why are hamburgers called hamburgers if they’re not made out of ham? The answer spans time and space from the Mongol invasion of Russia to the German revolutions of 1848 to the McDonald’s Big Mac.

A hamburger and French fries is probably the most American meal you could think of. Let’s consider that for a moment . . . a hamburger, named not for the meat it’s made of but for Hamburg, Germany, and French fries are considered quintessentially American. This is yet another testament to the powerful influences of immigration and cultural exchange that continue to shape the culture of America today.

“Two All-Beef Patties . . .”

There is considerable controversy over the origin of the hamburger. Because ground beef and bread have been eaten separately in many different countries for centuries, it is unknown exactly how the hamburger as we know it came to be. For one thing, similar dishes are found throughout Europe. Isicia omentata, from fourth-century Rome, was a baked beef patty mixed with pine kernels, peppercorns, and white wine. Steak tartare, according to legend, had its origins in the thirteenth-century Mongol invasion of Russia, when Mongol riders stashed meat under their saddles to tenderize it while they rode to battle and then ate it raw. The preparation came to be called steak tartare after the Tartars, as the Mongols were known in Europe. When ships from the port of Hamburg came to Russia to trade, they brought the dish home, serving it as raw ground beef shaped into a patty with a raw egg yolk on top.

Modern steak tartare.
Image by Rainier Zenz, CC BY-SA 3.0 via Wikimedia Commons.

A more direct German precursor to the hamburger is the seventeenth-century Frikadelle, a flat, pan-fried beef meatball. In eighteenth-century England and America, the Hamburgh sausage was prepared with chopped beef, spices, and wine and was supposedly a recipe that mimicked the preferences of immigrants and visitors from Hamburg. A nineteenth-century adaptation, the Hamburg steak, is the most recognizably hamburger-like preparation to carry the Hamburg name: minced beef, sometimes mixed with onions and bread crumbs, that was lightly salted, often smoked, and typically served raw in a pan sauce.

Hamburg steak.
Image by OiMax, CC BY 2.0 via Wikimedia Commons.

In 1848, political revolutions throughout the German Confederation pushed many Germans to immigrate to the United States. Known as the “Forty-Eighters,” these immigrants were just the first of a wave of European immigrants. The 1850s saw a larger increase in the immigrant population in the US relative to the overall population than any other decade in history. The German-born population alone increased 118.6% during this decade as immigrants arrived in New York by boat and spread throughout the East and Midwest states.

And here’s where the controversy comes in. The first version of the hamburger origin story claims that Germans arriving on the Hamburg-America line had already been preparing and consuming the Hamburg steak, as it was a popular meal among workers, and the smoked preparation kept well at sea. Immigrants enjoyed the meal and continued to make Hamburg steaks out of fresh meat once they got to America.

The second version goes that the Hamburg steak was created to meet demand for quick, cheap food for German sailors and immigrants arriving in America. Hamburg was known as an exporter of high-quality beef, so the Hamburg steak was offered in America as an idea of what might appeal to those arriving from Germany. Street vendors opened up along the coast where the Hamburg-America line docked, selling lightly grilled meat patties in the “Hamburg style” as a quick meal, perhaps accompanied with bread.

Port of Hamburg, 1862.
Image from Library of Congress via Wikimedia Commons.

In the second half of the nineteenth century, following this wave of immigration, the Hamburg steak was found at restaurants all over the port of New York. It was rather expensive at first (a whole 11 cents!), but with the growth in rural beef production and railroad transportation, the cost of beef decreased, and the meal became more widely available. Cookbooks of the time included detailed instructions for preparing the “hamburger steak,” as it was known from 1889 on.

The hamburger steak was soon viewed as a quintessentially American food, influenced as it was by the waves of immigrants who formed the character of the country. As Chicago and other cities developed into major centers for the large-scale processing of beef, the hamburger steak became widely affordable and available to the average consumer—it was the “American beef dream.”

“Special Sauce, Lettuce, Cheese, Pickles, Onions”

By the early 1900s, the term hamburger steak was shortened to simply hamburger. Sometime between 1885 and 1904, someone decided to put the hamburger steak between two slices of bread, thus inventing the hamburger sandwich we know today. Some credit the founder of fast-food joint White Castle as the inventor of the hamburger sandwich, while others cite small-town cooks in Texas, Oklahoma, or Ohio who placed a Hamburg steak between two slices of bread. Hamburgers served on two thick slices of toast created a sensation at the St. Louis World’s Fair in 1904, where they gained major exposure among fair-goers. The various claims are poorly documented, and it’s likely that multiple people had the idea for a hamburger sandwich around the same time.

Toppings soon followed. Onions had long accompanied the hamburger steak, and other vegetables like lettuce and tomatoes were added for a fresher appearance. Henry J. Heinz began producing ketchup commercially in 1876, and it soon became a near-universal condiment for the hamburger.

“. . . On a Sesame-Seed Bun”

Fast-food restaurants played a major role in cementing the hamburger as the all-American meal. White Castle, which opened its doors in 1921, is regarded as one of the first true fast-food restaurants. Upton Sinclair’s book The Jungle, published in 1906, had caused public outrage and anxiety over the state of meat processing in the country, so restaurants had to contend with negative perceptions of products made from ground meat. White Castle took pains to promote itself as a clean and hygienic facility and paired this with rapid service and a simple menu centered on hamburgers and coffee.

Like many foods and practices that had their origins in Germany, the hamburger may have lost popularity during World War I due to anti-German sentiment. Additionally, the word “hamburger” conjured up images of greasy, cheap fair food for some consumers. For these reasons, White Castle hamburgers were rebranded as “sliders” to avoid referencing a German city or invoking other unsavory connotations. (Frankfurters received a similar treatment and were called “hot dogs” from then on, and they never quite regained their name.)

But other fast-food companies did not necessarily follow suit, and the term hamburger remained in use through the Great Depression. Meanwhile, White Castle’s production methods became faster, more efficient, and more standardized, providing customers with a predictable meal and experience no matter where they were in the country. This concept would revolutionize the world of restaurants as the birth of fast food.

By the 1940s, the term “hamburger” was shortened to “burger,” which became a new combining form—giving us the parts we needed to build words like cheeseburger, veggie burger, and baconburger. Around the same time, McDonald’s came onto the scene, building on White Castle’s system and adding drive-in service. A competing chain called Bob’s Big Boy lays claim to the first documented instance of making a hamburger with the now-standard sesame-seed bun, due to a request from a customer who wanted “something different.” This order also resulted in the first double-decker burger as the chef cut the bun in three pieces to hold two hamburger patties. (Though the sesame seeds used for hamburger buns today have been rendered tasteless, they add visual appeal and cause people to salivate when they see them.)

By Peter Klashorst, CC BY 2.0 via Wikimedia Commons.

When the Big Mac was introduced in 1967, all bets were off: McDonald’s was the leader in fast food and the main driver behind the popularity of the American-style hamburger worldwide. “Two all-beef patties, special sauce, lettuce, cheese, pickles, onions on a sesame-seed bun” was now the established recipe for a premium-quality fast-food hamburger.

Influenced by immigrants and innovation, the hamburger has now become an internationally recognized symbol of American culture and of globalization. Just ask Inspector Clouseau.

Sources

Barksdale, Nate. “How the Hamburger Began—And How It Became an Iconic American Food.” History.com, January 6, 2021. https://www.history.com/news/hamburger-helpers-the-history-of-americas-favorite-sandwich.

“German Immigration in the 1850s.” Ancestry.com.  https://www.ancestry.com/historicalinsights/german-immigration-1848.

“Hamburger.” The Merriam-Webster New Book of Word Histories. Springfield, Massachusetts: Merriam-Webster Inc., 1991, p. 210. https://archive.org/details/merriamwebsterne00merr/page/210/mode/2up.

“History and Legends of Hamburgers.” What’s Cooking America. Retrieved May 8, 2021, from https://whatscookingamerica.net/History/HamburgerHistory.htm.  

Ozersky, Josh. The Hamburger: A History. Yale University Press, 2008.

Satran, J. “How Did Hamburger Buns Get Their Seeds?” HuffPost, April 10, 2015. https://www.huffpost.com/entry/hamburger-bun-history_n_7029310.

Walhout, Hannah. “A History of the Burger: From Ancient Rome to the Drive-Thru.” Food & Wine, June 20, 2017. https://www.foodandwine.com/comfort-food/burgers/burger-timeline.

Wikipedia. “Hamburg Steak.” Retrieved May 12, 2021, from https://en.wikipedia.org/wiki/Hamburg_steak.

Wikipedia. “History of the Hamburger.” Retrieved May 12, 2021, from https://en.wikipedia.org/wiki/History_of_the_hamburger.

Wittke, Carl. Refugees of Revolution: The German Forty-Eighters in America. University of Pennsylvania Press, 1952.

The Secret History of White Chocolate

Is white chocolate actually chocolate—and where did it come from in the first place? The answer involves the snow-capped mountains of Switzerland, questionable pharmaceutical cookbooks, and children’s vitamins.

White Chocolate vs. Chocolate

First, let’s talk about chocolate. According to the Chicago Tribune, the cacao bean—the main ingredient of chocolate—contains about equal parts cocoa butter and cacao nibs. The cocoa butter provides the creamy, smooth mouthfeel of chocolate while the cacao nibs are responsible for the distinctive taste and aroma. Chocolate is made from both cocoa butter and cacao nibs, along with sugar and often milk. Per FDA regulations, chocolate must contain at least 10% cocoa mass to be labeled as chocolate. (Also called chocolate liquor, cocoa mass is the result of finely grinding cacao nibs and includes cocoa butter that is present in the cacao nibs.) White chocolate, on the other hand, is made from cocoa butter without the cacao nibs. By law, it must contain at least 20% cocoa butter and 14% milk products to be labeled as white chocolate.

Some chocolate purists thumb their noses at white chocolate for daring to pretend it is on par with its milk and dark chocolate cousins when it is missing a crucial ingredient. Others argue that because white chocolate is made from part of the cacao bean, it should be grouped with the other types of chocolate. The law settles the debate: legally, white chocolate cannot simply be called chocolate.

The Unwritten History

Despite its sweet, innocent taste, white chocolate has a hidden past. The history of white chocolate is less than straightforward, and Nestlé has long tried to claim ownership. Until recently, the story went that Nestlé developed white chocolate in 1936 as a way to use up excess milk powder that had been produced during World War I. However, this story skips over many earlier uses of white chocolate—and the real story has less to do with powdered milk as a wartime product and more to do with infant formula.

Food historian Sarah Wassberg Johnson has uncovered several sources showing that white chocolate had actually been made as early as the 1860s. A new technique in the world of chocolate making called the Broma process may have had something to do with it. Developed in 1865, the Broma process involves placing cacao beans into a bag at a warm temperature to allow the cocoa butter to drip out, leaving the beans ready for processing into cocoa powder. Johnson proposes that this created a surplus of cocoa butter and spurred experimentation to find new ways to use it.

Recipes for white chocolate begin to appear in cookbooks shortly thereafter, although the formulas were quite different from modern white chocolate.

An 1869 recipe for white chocolate caramels, published in a cookbook by two French chefs, was simply a recipe for caramels with the addition of cocoa butter.

Recipe for white chocolate caramel tablets

One recipe from an 1872 American cookbook included tapioca, powdered sugar, oatmeal, and Iceland moss (yum) along with “concentrated tincture of Caraccas cacao,” and it was designated as a suitable composition for “delicate persons.”

Another cookbook, specifically for druggists, directed the cook to mix sugar, rice flour, arrowroot, vanilla, and powdered gum Arabic (because everyone keeps that on hand . . .) with cacao butter and then pour the mixture into molds.

The White Chocolate Candy Bar

In the 1910s and ’20s, these experiments moved outside the realm of home kitchens and pharmacies as Swiss chocolatiers began to produce white chocolate. A skeptical article in the International Confectioner sneered at rumors of “snow white chocolate” in Switzerland, which the author, T. B. McRobert, saw as an imaginary nod to the snowy Swiss Alps. McRobert said he would never eat such a thing, as it would have to be bleached with toxic gases to produce the white color. But the rumors grew, and it turns out white chocolate was real (and safe to eat!). Swiss white chocolate was made from cocoa butter and sugar, sometimes with milk powder, chestnut meal, or vanilla.

The early twentieth century saw rapid growth in the candy industry, especially during World War I, which paved the way for the first commercially produced white chocolate. The Double Zero Bar was introduced in 1920 by the Hollywood Brands company in Minnesota. The novel confection consisted of a caramel, peanut, and almond nougat covered in white chocolate fudge, a unique look for a candy bar. If all this talk about chocolate is making you hungry, you’re in luck—this candy bar is still produced today, though it’s now called the ZERO Bar and is sold by Hershey’s.

The original Zero bar wrapper.
Image from the Candy Wrapper Museum.

Nestlé’s White Chocolate Vitamins

While Nestlé certainly doesn’t have a claim to producing the first white chocolate, it does have a claim to being the first to commercially produce solid white chocolate.

German-Swiss chemist Henri Nestlé had spent part of his career working on an infant formula that could help alleviate the high infant mortality rate in Germany. He subsequently experimented with powdered and condensed milk products that could improve people’s quality of life. In 1879, he founded the Nestlé Company with chocolatier Daniel Peter in Switzerland. The duo had perfected a recipe for milk chocolate in 1875 using Nestlé condensed milk, and their partnership continued to prove fruitful. Nestlé began using his scientific expertise to develop new, innovative products both in the candy industry and in the areas of medicine and health.

In 1936, the Nestlé Company worked with the pharmaceutical company Roche to develop a new product called Nestrovit, a tablet made from vitamin-enriched condensed milk that would help provide children with essential nutrients for growth and development. Nestlé faced the challenge of finding a coating for the tablet that would protect the ingredients from damage and preserve their nutritional benefits. Drawing on its expertise in chocolate production, the company added some cocoa butter to the Nestrovit formula and created a white chocolate coating for the tablet.

Aside from creating a successful health supplement, Nestlé saw the potential for even greater value in the new variety of chocolate it had made. In 1936, the company launched the Galak bar (branded as the Milkybar in the UK), a pure white chocolate bar with a sweet and creamy flavor. In 1948, the Alpine White bar with almonds came on the scene and truly popularized white chocolate bars in the US and Canadian markets. Marketing for the Alpine White bar drew upon the snow-capped Swiss Alps—coming back full circle to prove the early skeptics of white chocolate wrong.

Milkybar, sold in the UK
Image by Evan-Amos from Wikimedia Commons.

White Chocolate Today

White chocolate is often passed over for its cocoa mass-containing counterparts, and many people associate it with cheaper, waxy-textured novelty candy. However, some chocolatiers have begun to take it more seriously. Specialty chocolate makers see white chocolate as a blank canvas for other flavors and creative add-ins, without the taste of cacao nibs to overpower delicate flavors. Rosemary and sea salt, roasted strawberry, turmeric and pomegranate, and caramelized or “blond” white chocolate are only some of the unique flavors that artisan chocolatiers have dreamed up.

White chocolate may still be the underdog, and it may not actually be considered chocolate, but it seems that it has great potential for innovation in the future!

Sources

Blakely, Henry. The Druggist’s General Receipt Book: Containing a Copious Veterinary Formulary: Numerous Recipes in Patent and Proprietary Medicines, Druggists’ Nostrums, Etc.: Perfumery and Cosmetics: Beverages, Dietetic Articles, and Condiments: Trade Chemicals, Scientific Processes, and an Appendix of Useful Tables. Philadelphia: Lindsay & Blakiston, 1871. Digitized by Harvard University. https://catalog.hathitrust.org/Record/100598206.

Gouffé, Jules, and Alphonse Gouffé. The Royal Cookery Book (le Livre de Cuisine). 1869. Digitized by Harvard University.

Guittard. “Glossary of Terms.” https://www.guittard.com/in-the-kitchen/article/glossary-of-terms.

“Henri Nestlé.” Wikipedia. Retrieved May 5, 2021, from https://en.wikipedia.org/wiki/Henri_Nestl%C3%A9.

Johnson, Sarah Wassberg. “Before Nestle: A History of White Chocolate.” The Food Historian, February 14, 2021. https://www.thefoodhistorian.com/blog/before-nestle-a-history-of-white-chocolate.

Marchetti, Silvia. “How White Chocolate Evolved from a Coating for Kids’ Medicine into a Sweet, Creamy Treat,” November 9, 2019. https://www.scmp.com/magazines/style/news-trends/article/3036673/how-white-chocolate-evolved-coating-kids-medicine-sweet.

Seth, Simran. “For Those Who Think White Chocolate Isn’t ‘Real’ Chocolate, Have We Got Bars for You.” Chicago Tribune, November 28, 2017. https://www.chicagotribune.com/dining/recipes/ct-white-chocolate-is-real-chocolate-20171128-story.html.

TCHO. “Is White Chocolate Actually Chocolate?” January 9, 2018. https://tcho.com/blogs/news/is-white-chocolate-actually-chocolate.

The Dessert Book: A Complete Manual from the Best American and Foreign Authorities. With Original Economical Recipes. Boston: J. E. Tilton and Company, 1872. Digitized by Harvard University. https://babel.hathitrust.org/cgi/pt?id=hvd.32044087496899&view=1up&seq=125.

The Garden of Children

Why is the first year of school for children called kindergarten? The answer involves a nature mystic, a case of mistaken identity, and a socialism scare.

Kindergarten stands out from the other required years of education in the United States for its unique name. First grade, second grade, and third grade follow, all the way up to twelfth grade (plus some alternate names for the high school years). So why the special name for kindergarten?

The Founding of Kindergarten

The word kindergarten was coined in 1840 by German teacher and educational reformer Friedrich Fröbel, from the words Kinder (“children”) and Garten (“garden”). Like all nouns in German, the word Kindergarten is capitalized, but this styling is usually not carried over into English.

Friedrich Fröbel, by C. W. Bardeen, 1897.
Image from the Library of Congress.

Fröbel used the word in a proposal that called for the development of early childhood education as a necessary part of widespread educational and social reform. He advocated for the unique needs of young children and opened an experimental infant school in Prussia called the Child Nurture and Activity Institute. He later renamed it Kindergarten, reflecting his philosophy that young children should be nurtured like “plants in a garden.” Schools for young children in the 1700s and 1800s had been glorified babysitting services, philanthropic endeavors to care for impoverished children, or exercises in discipline to prepare children for adulthood. Fröbel’s school instead focused on encouraging self-expression and learning through play, singing, gardening, and group activities, and it formed the basis for early childhood education techniques used today.

In 1851, Kindergarten schools were banned in Prussia due to a mix-up of Fröbel with his nephew Karl, who was a socialist and had published a treatise proclaiming more radical views about education. The government mistakenly attributed Karl’s “atheistic and demagogic” views to his uncle, who was sincerely religious (in the form of nature mysticism and pantheism) and dedicated to improving childhood education. The ban on Kindergarten led to a diaspora of German teachers to other countries in Europe and the United States, where they spread their teaching model to other schools. In 1856, Margaretha Meyer-Schurz opened the first German-speaking kindergarten in the U.S. A few years later, Elizabeth Palmer Peabody embraced Fröbel’s model after visiting Germany and opened the first English-speaking kindergarten in the U.S. Peabody is largely credited with popularizing the concept of kindergarten in America.

Im Kindergarten, by Hugo Oehmichen, 1879.
Image from Wikimedia Commons.

Translating Kindergarten

English borrowed the word kindergarten from German without translating it, but Romance languages translated it word for word, preserving the original meaning of the Kinder + Garten roots. In French, the term is jardin d’enfants (“garden of children”); in Spanish, jardín de infancia (“garden of childhood”); and in Portuguese, jardim de infância (“garden of childhood”). A few non-Romance languages such as Hebrew do the same thing: gan yeladim means “garden of children.” A word formed by translating a foreign word piece by piece like this is called a calque (a word borrowed intact, like kindergarten in English, is a loanword). Other words that arose through this kind of translation include honeymoon, Adam’s apple, and loanword itself.

These calques are no longer very common in Romance languages, nor is the term kindergarten widely used in the UK. During and after World War II, German language and culture were looked down upon in many nations, and some have claimed that these calques of Kindergarten were eclipsed by other terms devoid of German roots.

Kindergarten around the World

In many countries, children from ages three to seven attend kindergarten or the equivalent. Where the United States distinguishes between preschool and kindergarten, many other countries do not, and kindergarten is instead part of the preschool system. Children may attend the same kindergarten/preschool for two years or more before beginning their primary education.

Fröbel was one of the most influential educational reformers in the modern educational system, and the effects of his work—and his words—are still seen today. Kindergarten is a place where we can begin to explore and learn without many of the social pressures of older childhood—where we don’t have to be anyone but ourselves.

Sources

Curtis, Stanley James. “Friedrich Froebel.” Encyclopaedia Britannica.  https://www.britannica.com/biography/Friedrich-Froebel.

Eschner, Kat. “A Little History of American Kindergartens.” Smithsonian Magazine, May 16, 2017. https://www.smithsonianmag.com/smart-news/little-history-american-kindergartens-180963263/.

“kindergarten (n.).” Etymology Online Dictionary. Retrieved April 26, 2021, from https://www.etymonline.com/search?q=kindergarten.

“kindergarten (n.).” Oxford English Dictionary. Retrieved April 26, 2021.

Wikipedia. “Friedrich Fröbel.” Retrieved April 27, 2021, from https://en.wikipedia.org/wiki/Friedrich_Fr%C3%B6bel.

Wikipedia. “Kindergarten.” Retrieved April 26, 2021, from https://en.wikipedia.org/wiki/Kindergarten.

Wikipedia. “List of calques.” Retrieved April 26, 2021, from https://en.wikipedia.org/wiki/List_of_calques.

Honeymoon

Why does the word honeymoon refer to a vacation a couple takes after getting married? The answer involves myths about mead, poetry about love, and a warning about waning.

In some cultures, the period shortly following marriage is seen as a time for a couple to withdraw from the world and spend time with each other. In others, this period is still a time of celebration with friends and family, and couples are given little time alone. Both of these inclinations gave rise to the honeymoon, a vacation that a newlywed couple takes together immediately after getting married. But have you ever wondered about the origins of the honeymoon?

Some Ancient Theories

Some historians claim that the honeymoon “dates from the days of marriage by capture when, after snatching his bride, the groom swept her away to a secret location, safe from discovery by her angry kin” (Waggoner, 2020). The groom kept the bride hidden from her family until they stopped looking for her, with the intent to get her pregnant. Later, marriage-by-capture became ritualized, and the groom took the bride away with her family’s knowledge and the understanding that he would later offer a bride price. However, this terrifying practice may not be directly related to post-marriage celebrations today. It could rather be an analogy from a time and place when marriage was more of a forced transaction than an act of love. Some tend to cite ritual kidnapping as the precursor to the honeymoon, but it seems that we lack evidence of a direct link between the two practices.

Another “fanciful” theory held that honeymoon came from an ancient tradition in which guests gifted a newlywed couple with mead, which is made from honey. The couple supposedly drank the mead together during their first month of marriage to improve their chances of conception. This may seem like a plausible origin, but it is largely regarded as a myth made up in the eighteenth century. (No one can say where the custom supposedly originated or what language its practitioners spoke, so they probably would not have called it a “honeymoon” in any case.)

And with those two depressing notes of kidnapping and drunkenness behind us, let’s take a look at the happier precursors of the modern honeymoon.

The Origin of the Word

The term honeymoon originally referred to the first month of marriage—encapsulating the idea that the first month, or moon, is when marriage is supposedly at its sweetest, filled with love and happiness. It came into use in the mid-sixteenth century; the earliest recorded instance appears in a 1546 book of English proverbs by John Heywood, who described newlywed bliss thus: “It was yet but hony moone.”

By the end of the sixteenth century, the word honeymoon had been extended to mean a period of peace, good relations, or goodwill between people, groups, or nations, often in a political context. A 1655 church history of Britain by Thomas Fuller noted, “Kingdoms have their honey-moon, when new Princes are married unto them.” A partnership between two businesses might undergo a honeymoon period, as could a country that has just elected a new leader. But history tells us that the honeymoon period often does not last for long.

The comparison of marriage to the phases of the moon implied that a couple’s love would wane over time: “And now their hony-moone, that late was cleare, Did pale, obscure, and tenebrous appeare,” lamented the poet William Fenner in his 1612 Cornucopiæ.

The Bridal Tour

The custom of a newlywed couple taking a vacation after their wedding originated in Great Britain in the 1820s, when it became fashionable for upper-class couples to take a “bridal tour” to visit family and friends who could not attend their wedding. (Working-class couples generally did not have time off work to take any sort of vacation.) Because this trip occurred during the honeymoon period, the sweet first month of marriage, the term honeymoon began to be applied to the trip rather than the time period.

The bridal tour custom soon spread to the European continent. In many European languages, the word for honeymoon translates word for word into “honey” and “moon.” The French term is lune de miel (“moon of honey”), for example. Farther away, the Russian word is медовый месяц (“honey month”), and the Persian word is mah-e-asal (“month of honey” or “moon of honey”).

Later in the nineteenth century, during the period known as the Belle Époque, newlyweds began taking a trip for fun and relaxation rather than for visiting family. This was considered one of the first modern practices driving mass tourism, and it was facilitated by general peace and stability between European nations in the period between the Franco-Prussian War and World War I.

Modern Honeymoons

Following European fashion, the honeymoon picked up steam in the United States toward the end of the nineteenth century. Advances in transportation made travel easier and more accessible to couples at all levels of society in the mid-twentieth century. The honeymoon is now considered a near-indispensable part of the wedding ritual in the United States and Great Britain.

Today, there are several different takes on the honeymoon. Many couples opt for a romantic, relaxing getaway for some time together. Some couples go on a short mini-moon after their wedding and then save up for a bigger trip in the future. In a strange twenty-first-century twist, solomoons or unimoons are vacations the two partners take separately, often because they can’t agree on a destination (which is probably not a predictor of a happy marriage!). And finally, babymoons are vacations couples take before the birth of a child.

Cycles of Love

Today, we might view the idea of the “honeymoon period” positively but acknowledge that the initial excitement of marriage may wear off. A couple’s love may go through seasons and phases but grow through adversity and wax sweet again and again throughout their marriage. The excitement and passion of newlyweds can simply wane—or it can transform into a partnership built on mutual love and respect that grows stronger with each new phase of life.

Sources

Braff, Danielle. “Until Honeymoon We Do Part.” New York Times, March 12, 2019. https://www.nytimes.com/2019/03/13/fashion/weddings/until-honeymoon-we-do-part.html.

Brohaugh, William. Everything You Know about English Is Wrong. Naperville, IL: Sourcebooks, 2008. https://archive.org/details/everythingyoukno0000broh/page/92/mode/2up.

“Honeymoon.” Oxford English Dictionary. https://www.oed.com/view/Entry/88181.

Monger, George P. Marriage Customs of the World: An Encyclopedia of Dating Customs and Marriage Traditions. Vol. 1. ABC-CLIO, 2013.

Shamsian, Jacob. “The Mysterious Origin of the Word ‘Honeymoon.’” Business Insider, March 27, 2017. https://www.insider.com/honeymoon-word-meaning-etymology-2017-3.

Waggoner, Susan, quoted in Susong, Liz. “The Gloomy History behind Honeymoons.” Brides, May 4, 2020. https://www.brides.com/story/the-gloomy-history-behind-honeymoons.

Wikipedia. “Belle Époque.” April 17, 2021.  https://en.wikipedia.org/wiki/Belle_%C3%89poque.

Wikipedia. “Honeymoon.” April 17, 2021. https://en.wikipedia.org/wiki/Honeymoon.

Cookies (The Slightly Less Delicious Ones)

Why are the tracking files that websites place on your computer called cookies? The answer (somewhat) involves shopping carts, Chinese takeout, and a German fairy tale.

We’ve talked about cookies in the past, but browser cookies (also called internet cookies or HTTP cookies) are a little more crumbly and a little less delicious.

What Is an Internet Cookie?

A cookie is a small text file that a web server sends to your browser when you visit a website. The next time you visit that website, your computer checks to see if it has stored a cookie, and if it has, it sends the information in the cookie back to the website. The cookie tells the website your previous browsing preferences—such as your preferred language, items you’ve added to your cart when shopping online, what links you’ve clicked on and pages you’ve visited before, and so on. The purpose of a cookie is to help you have a better browsing experience by making the site more relevant and useful to you as a unique user. Cookies are also used to protect sensitive information and authenticate users when they log in to their account.
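To make that round trip concrete, here’s a minimal sketch using Python’s standard http.cookies module (the cookie name and values are hypothetical, chosen just for illustration). The server serializes a preference into a Set-Cookie response header; on the next visit, the browser echoes the stored pair back in a Cookie request header, which the server parses.

```python
from http.cookies import SimpleCookie

# Server side: record a browsing preference and serialize it
# into a standard Set-Cookie response header.
cookie = SimpleCookie()
cookie["lang"] = "en-US"                  # the preference to remember
cookie["lang"]["path"] = "/"              # valid for the whole site
cookie["lang"]["max-age"] = 60 * 60 * 24  # expire after one day

print(cookie.output())
# Set-Cookie: lang=en-US; Max-Age=86400; Path=/

# Browser side (simulated): the browser sends the stored pair back
# in a Cookie request header, which the server parses the same way.
returned = SimpleCookie()
returned.load("lang=en-US")
print(returned["lang"].value)  # en-US
```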

Despite the usefulness of cookies, there is controversy about the amount of data collected by some websites and the way that data is used. The fact that cookies collect and store information without the user necessarily knowing about it makes some people uneasy, and we’ve all encountered ads that seemed way too personalized thanks to third-party tracking cookies. Luckily, most cookies are harmless and simply speed up browsing and make websites more dynamic.

So Why Are They Called Cookies?

Web programmer Lou Montulli of Netscape Communications first used the term “cookie” in 1994 in reference to a package of data that a program receives and sends back unchanged. This type of file had already been used in computing, where it was called a “magic cookie,” and Montulli ingeniously adapted the concept for the web. He created a system for an online store whose servers were overloaded with user shopping cart information: passing small cookie files between the server and users’ computers was a much more efficient way to access shopping cart data when needed.
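As a rough illustration of the idea (a sketch, not Netscape’s actual implementation), the cart state itself can ride along in the cookie value, so the server doesn’t have to hold per-user state in memory:

```python
import json
from urllib.parse import quote, unquote

# Hypothetical cart contents; in a cookie-based scheme this state
# travels with the browser instead of living on the server.
cart = {"sku-1234": 2, "sku-5678": 1}

# Serialize and percent-encode the cart so it is safe in a header value.
value = quote(json.dumps(cart))
print(f"Set-Cookie: cart={value}; Path=/checkout")

# When the browser sends the cookie back, the server restores the cart.
restored = json.loads(unquote(value))
assert restored == cart
```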

Another use of this type of file was in Unix’s “fortune” program, which presented the user with a random quote, joke, or poem—like a virtual fortune cookie. The files that stored these messages were “magic cookie” files.

Some have also compared internet cookies to the story of Hansel and Gretel, who left a trail of bread crumbs behind them to mark their path through the forest. In the same way, internet cookies mark the trail of a user’s browsing history on a website.

Sources

Create a Pro Website. “What Are Cookies? And How They Work | Explained for Beginners!” August 31, 2009. https://www.youtube.com/watch?v=rdVPflECed8.

“How Google Uses Cookies.” Google Privacy & Terms. https://policies.google.com/technologies/cookies?hl=en-US.

“HTTP Cookies.” Wikipedia, April 12, 2021. https://en.wikipedia.org/wiki/HTTP_cookie.

“What Are Cookies?” Cookie Controller. https://cookiecontroller.com/what-are-cookies/.

Left, Right, Center

Why are conservatives referred to as the “right” and liberals referred to as the “left” in politics? The answer involves the French Revolution, the quick spread of information through newspapers, and the tense interlude between the two World Wars.

Political beliefs are often described as being on a spectrum from left to right. Left refers to liberal views, such as advocating for progressive reforms and seeking economic equality by redistributing wealth through social programs. On the far left, we have revolutionary ideologies like socialism and communism. Right refers to conservative views, such as maintaining existing institutions and traditional values while limiting government power. On the far right, we have nationalistic ideologies like fascism.

Vive la France

The political descriptors left and right originally referred to the seating arrangements for members of the French National Assembly in 1789, who convened during the French Revolution to draft a new constitution. From the position of the speaker of the assembly, those seated on the right side of the room were nobility and high-ranking religious authorities. Those seated on the left side of the room were commoners and lower-ranking clergy members.

The division originally arose over the issue of how much authority the king should have. Those in favor of the king having absolute veto power sat on the right, and those who favored limiting the king’s veto power sat on the left.

Les Etats Généraux of 1789
Engraving by Alphonse Lamotte, 1889, public domain from Wikimedia Commons.

The higher-ranking members of society tended to be more pro-aristocracy and generally were more reactionary in their political views, while the lower-ranking members of society tended to be pro-revolution, more radical, and more centered on the needs of the lower and middle classes. Those who sat closer to the center of the room tended to be more moderate in their views than others in their faction. The left was “the party of movement,” and the right was “the party of order.”

Newspapers reported on left-wing and right-wing views, and the terms left and right spread quickly into popular usage in France.

Over the next century, the seating arrangements in the French legislature persisted at some times and were discouraged at other times. When the French Third Republic was established in 1871, the terms left, right, and center were used in the names of political parties themselves: the Republican Left, the Centre Right, the Centre Left, the Extreme Left, and the Radical Left were the major political parties of the day.

The Interwar Years

Right and left became widely used throughout Europe in the 1920s and 1930s, the years between the two World Wars, when people “wrestled with the politics of nation and class” and found these labels to be a simplified way to describe complex political ideologies. Marci Shore, a professor of European history, writes, “The interwar years were a time of a polarizing political spectrum: the Right became more radical, the Left became more radical; the liberal center ‘melted into air’ (to use Marx’s phrase)” (Carlisle, 2019).

Left and Right in America

Right and left entered usage in America in the 1920s and 1930s as well, but some shied away from the terms, especially left, throughout the mid-twentieth century because of their associations with extremist ideologies. The 1960s saw a shift toward people defining themselves more consistently with these terms in an effort to differentiate their views from others, as both liberals and conservatives were dissatisfied with the political consensus of the day. We see again that left and right were used as shorthand ways of categorizing people—a person on the right sees a person on the left as the “other,” and vice versa.

In America, “left” is often synonymous with the Democratic Party, while “right” is often equated with the Republican Party. However, political views span a wide spectrum, and some may fall in between the positions of the parties or way outside the bounds of either one. The definitions of left, right, and center are dynamic and change relative to one another throughout time. The terms meant something different during the French Revolution, in the Soviet Union, during the New Deal, and in America in 2021 and will continue to shift as parties and policies realign in a changing political climate.

“The Nature of Public Opinion,” OpenEd CUNY, CC BY-NC-SA 4.0.

The Easter Bunny

Why does a rabbit leave colored eggs, candy, and nonedible novelties for children on Easter morning? The answer involves little ones leaving out an item of clothing overnight with the expectation that it will be filled with gifts, families providing a favorite snack for the mythical bringer of presents, and naughty children receiving a lump of coal . . . sounds familiar.

The Easter Bunny is a curiously unexplored phenomenon—the jolly figure of Santa Claus appears in Coca-Cola ads, Christmas cards, and the minds of children around the world, but his rabbit friend in the pantheon of holiday figures has no single recognizable image. Santa Claus, his elves, his workshop at the North Pole, his big bag of presents, and his magical sleigh pulled by reindeer form a cohesive set of traditions. But where does the Easter Bunny live? What does he look like? Where does he get the candy and eggs and other little trinkets he puts in Easter baskets? And why, in the name of the spring fertility goddess, is the Easter Bunny (a mammal, we might remind you) associated with eggs?

Easter Symbols

As the most important holiday in Christianity, Easter is a celebration of the new and everlasting life that comes through the Resurrection of Jesus Christ. Springtime holidays from pagan and secular traditions also focus on celebrating life and fertility as the world begins to blossom and the sun begins to shine after the darkness and coldness of winter. Many of the symbols we have come to associate with Easter draw from this fountain of youth. Eggs and baby animals are living proof of fertility and new life. Pastel colors reflect the newly budding blossoms of spring. And Easter lilies, which grow from a bulb in the ground into pure white, trumpet-shaped flowers, “symbolize the rebirth and hope of Christ’s resurrection” (History.com, 2021).

Ostara by Johannes Gehrts, 1901, shows one artist’s imagining of the goddess. Whether and how the Saxons and other Germanic tribes worshiped her is up for debate.

Rabbits, too, are a prominent symbol of new life: they breed like, well, rabbits. Some have estimated that a female rabbit might have up to 100 babies per season, or a total of up to 1,000 babies over a lifetime! (MentalFloss, 2015). For this reason, they are also an ancient symbol of fertility and thus have a natural association with spring holidays. The Easter holiday is the Christian celebration of the Resurrection of Jesus Christ, but the name Easter comes from the festival of Eostre, the Saxon fertility goddess, whose German name is Ostara. Some have conjectured that the Saxons believed Eostre’s animal symbol was a bunny or she had a hare as a companion, though there is little evidence in the historical record for such a claim. (Instead, later scholars may have theorized such an association to retroactively explain traditions that existed in Europe later on.) Stephen Winick at the Library of Congress explains that common observations about rabbits, eggs, and the budding of new life in the spring led to many similar traditions throughout time, whether or not a direct relationship is present between any of these traditions: “In short, we don’t need a pagan fertility goddess to connect bunnies and eggs with Easter—springtime makes the connection for us all by itself” (Winick, 2016).


This Puck magazine cover was typical of early twentieth-century depictions of the Easter Bunny. The Bunny appears fully clothed, in the company of a young woman who evokes the idea of Eostre, while an exasperated monk protests the secular celebration.
Illustration by L.M. Glackens, March 26, 1902, Library of Congress.

The Osterhase and His Hasen-Eier

Like many holiday traditions celebrated in America, the Easter Bunny has its origins in Germany. German immigrants to Pennsylvania in the 1700s brought stories about an egg-laying rabbit called the Osterhase (“Easter Hare”). Among the Pennsylvania Dutch, children made nests for the Osterhase and left carrots for him to eat as fuel on his journey, in hopes that he would leave colored eggs for them the night before Easter. Children often used hats and bonnets as nests, sometimes placing them outside in a garden or a barn where a bunny would have the easiest access. In some versions of the story, the bunny lays the eggs, while in others he brings them in a basket. The Easter Bunny was also a judge: tradition has it that he only gave eggs to children if they were good, to encourage children to behave themselves during Eastertide. Misbehaving children might receive rabbit droppings or coal instead.

Georg Franck von Franckenau’s 1682 essay “De ovis paschalibus” (“About Easter Eggs”) describes an Easter egg hunt of sorts, where the Easter Hare lays Hasen-Eier (“hare eggs”) hidden in the garden and grass for children to find. They would then feast on the eggs (real ones rather than candy-filled plastic!). Eating so many eggs without salt or butter would cause a stomachache, doctors warned—bet they never envisioned the mass sugar rush children today have from feasting on chocolate eggs.

A True Renaissance Rabbit

So why does the bunny deliver eggs, and why is he male?

In antiquity, hares were believed to be hermaphrodites, meaning that they had the reproductive equipment of both female and male. Pliny, Plutarch, and other great thinkers thought that hares could switch sexes at will and even impregnate themselves. So although we speak of the Easter Bunny as a he (even though it wouldn’t make sense for a male, or a bunny for that matter, to lay eggs), well into the Renaissance hares were not believed to be strictly male or female. This led to an association of the hare with the Virgin Mary, due to its supposed capacity to reproduce while remaining a virgin (which we now know is definitely not true). Renaissance art reflects this association.

The Madonna of the Rabbit, depicting the Virgin Mary with the Christ Child, a hare, St. Katherine, and John the Baptist.
By Titian, 1520, oil on canvas, image from Wikimedia Commons.

Whether the bunny lays the eggs himself or just delivers them (did he steal them from a chicken?), eggs represent the potential for new life when a baby chick hatches, as well as the emergence of Christ from the tomb. Because of this dual symbolism, the Easter Bunny pays a visit to people of different faiths or no faith at all. The tradition draws upon symbols that can be interpreted in light of different religious beliefs, Christian or not, and this widespread appeal likely contributed to the growing popularity of the Easter Bunny throughout the nineteenth and twentieth centuries in America.

As the twentieth century progressed, nests turned into baskets, real eggs turned into plastic eggs, and the Easter Bunny’s gifts expanded to include chocolate, jelly beans, and small toys. Candy companies capitalized on the Easter Bunny tradition by marketing spring-themed candy and other odds and ends for Easter baskets, further reinforcing the practice.

Other Bringers of Easter Cheer

The Easter Bunny isn’t the only bringer of springtime cheer. In Switzerland, the Easter Cuckoo makes the rounds, while some parts of Germany receive visits from the Easter Fox or the Easter Rooster. In Australia, the Easter Bilby initiates the springtime festivities (and don’t mention the Easter Bunny to an Australian—the overabundance of rabbits as an invasive species introduced in the eighteenth century has led to the endangerment of native animals).

A chocolate Easter Bilby.
Image by Nicole Kearney, April 21, 2019, CC BY-SA 4.0 from Wikimedia Commons.

We could have just had an Easter Hen. That would have made much more sense, and baby chicks are already associated with springtime festivities. But if we’re making up a mythical creature, we might as well stretch our imagination a little further!

Sources

Crew, Bec. “Australia’s Easter Bunny: The Long-Eared Greater Bilby.” Scientific American, April 19, 2014. https://blogs.scientificamerican.com/running-ponies/australiae28099s-easter-bunny-the-long-eared-greater-bilby/.

History.com Editors. “Easter Symbols and Traditions.” History.com, March 24, 2021. https://www.history.com/topics/holidays/easter-symbols.

Jeon, Hannah. “What Are the Easter Bunny’s Origins? Here’s the Fascinating History of the Easter Bunny.” Good Housekeeping, March 4, 2020. https://www.goodhousekeeping.com/holidays/easter-ideas/a31226078/easter-bunny-originshistory/#:~:text=As%20for%20how%20the%20specific,goes%2C%20the%20rabbit%20would%20lay.

Sifferlin, Alexandra. “What’s the Origin of the Easter Bunny?” Time, February 21, 2020. https://time.com/3767518/easter-bunny-origins-history/.

Soniak, M. “Are Rabbits as Prolific as Everyone Says?” MentalFloss, January 20, 2015. https://www.mentalfloss.com/article/29870/are-rabbits-prolific-everybody-says.

Wikipedia. “Easter Bunny.” Accessed March 25, 2021, from https://en.wikipedia.org/wiki/Easter_Bunny.

Winick, Stephen. “On the Bunny Trail: In Search of the Easter Bunny.” March 22, 2016. Library of Congress. https://blogs.loc.gov/folklife/2016/03/easter-bunny/.

Pink Onesie, Blue Onesie: Infant Gender-Coding through Color

Why are baby girls dressed in pink and baby boys in blue? The answer involves marketing tactics, a pair of famously misconstrued paintings, and ultrasound technology.

White Dresses for All

Throughout history, socially defined rules have dictated which types of clothing are suitable for which people. What you might not realize is that socially defined rules have also determined the age at which gender distinctions begin to matter: men, women, and children are often seen as different categories of people, each with their own typical styles of clothing.

Take a look at young Franklin D. Roosevelt:

Franklin D. Roosevelt, age 2 1/2.
Photo 1884, public domain from Wikimedia Commons.

This picture, taken in 1884, shows two-and-a-half-year-old Roosevelt wearing a white dress, a feathered hat, and a long head of hair. These are things that today would be considered more suitable for a little girl, but they were typical for both genders of the upper class in the nineteenth century and earlier. In the Victorian Era, gender was not considered significant in a child’s life until about the age of seven, and little boys and girls generally wore the same types of clothing.

At age seven, boys went through a rite of passage called “breeching,” which involved dressing in pants and getting a haircut. Girls continued to wear short dresses, and as they grew older, their prescribed hemline length grew longer until their dresses reached the ankles around age 16.

Practically, having both little girls and little boys wear dresses saved parents a lot of time. Slipping a dress over a child’s head was much easier than buttoning up pants, and it simplified potty training. Clothes could also be reused for another child in the future, regardless of the child’s gender.

In earlier centuries, infants and young children had worn colored dresses in many different hues irrespective of gender. At other times, they had worn clothing that resembled that of their adult parents, reflecting a view of children as merely small adults who needed to grow up and begin working as soon as possible to help provide for the family. But in the age of bleaching, cheap cotton, and a new appreciation for childhood, white dresses were the norm. White also had a connotation of purity and innocence, which seemed appropriate for small children.

Among Catholics, both girls and boys were sometimes dressed in blue to honor the Virgin Mary. (The same thing sometimes occurred for wedding dresses.)

Lighter tones and pastel colors followed and came to be associated with babies, though these colors were not gender-specific.

Pink and Blue as Gender Identifiers

Beginning around the mid-nineteenth century, the colors pink and blue came to be used as gender signifiers.

Items like ribbons, bows, and baby blankets were made in shades of light blue or pink to indicate whether a child was a girl or a boy. Dresses and other clothing soon followed.

Until the 1940s, two conflicting traditions existed. Magazines, advice columns, and other literary references were divided in the advice they gave to new parents. Some continued to recommend light, pastel colors in general. Some recommended mixing pink and blue for a lavender color. Some, like the 1890 Ladies’ Home Journal, explained:

“Pure white is used for all babies. Blue is for girls and pink is for boys, when a color is wished.”

(Emma M. Hooper, “Hints on Home Dress-Making,” Ladies’ Home Journal, November 1890, p. 23)

Others such as Godey’s Lady’s Book noted, taking a page from sources in London and Paris,

“Blue is the color appropriated to male children, as rose or pink to those of the opposite sex.”

(Godey’s Lady’s Book, volumes 52–53, edited by Louis Antoine Godey and Sarah Josepha Buell Hale)

Marketing copy, magazines, and literary sources often cited “pink for girls, blue for boys” as the French fashion, which was a convincing reason for many people to follow this trend. The beloved 1869 novel Little Women showed this inclination:

“Are they boys? What are you going to name them?”

“Boy and girl, aren’t they beauties?” . . .

“Amy put a blue ribbon on the boy and a pink on the girl, French fashion, so you can always tell.”  

(Louisa May Alcott, Little Women, Chapter 28)

These two conflicting gender assignments for pink and blue continued well into the twentieth century, and other countries had similarly mixed traditions: from Mexico to Switzerland to Korea, baby boys were dressed in pink and blue was the preferred color for girls, while other countries reflected the fashions of England, the United States, and France. Some have attempted to explain that little girls wore blue because it was associated with the Virgin Mary and was seen as a more delicate and calm color, and little boys wore pink because it was a lighter version of red, which was seen as a strong, active, passionate color.

The Shift toward Gender Coding

According to historian Jo B. Paoletti, around the turn of the twentieth century, psychological studies on child development led some child care experts to conclude that parents should make a greater distinction between the appearance of girls and boys from a younger age. It was common for mothers to be told to dress their little boys in pink so that they grew up to be more masculine and to dress their little girls in blue so that they grew up to be more feminine. Not everyone was comfortable with this at the time due to the tendency to see children as “sexless cherubs” (see Paoletti, p. 89). Though pink-blue gender coding was known even during the Victorian Era, as we have seen, it did not necessarily become widespread in the United States until about the 1950s.

However, the shift toward gender coding in terms of color had begun, along with shifts in the styles of clothing deemed appropriate for babies. In 1927, Time magazine published a chart describing the appropriate colors for girls (blue) and boys (pink). Though the assignment of the colors differed regionally, department stores gave similar advice, and they had good reason to: if they could convince parents that a baby girl and a baby boy each needed a whole new wardrobe, parents would end up buying more baby clothes rather than reusing them.

In the 1940s, however, clothing manufacturers and popular advice columns flipped the script and began promoting pink as the color of choice for girls and blue as the appropriate pick for boys. During World War II, little boys began to be dressed in pants and had short hair, emphasizing a particular view of masculinity that reflected the clothing their fathers wore, whereas little girls continued to wear dresses like their mothers. Children began to be dressed as mini adults in a way that emphasized their gender.

What we’re looking at is not a full-scale reversal of the colors assigned to girls and boys, but a larger-scale promotion of one practice and the quiet discontinuation of the other.

The Blue Boy and Pinkie

Art history has something to say about gender coding as well. When millionaire Henry Huntington purchased two eighteenth-century paintings, The Blue Boy and Pinkie, the paintings were widely publicized by the press, and suddenly Americans began to think that “pink for girls, blue for boys” had been right all along. The Blue Boy and Pinkie are inseparably connected in the minds of many viewers, their misguided takeaway being that the colors indicate a long-standing tradition in gender color coding. (In fact, the paintings were done about 25 years apart by different artists, and the clothing styles represented in the paintings are separated by about 150 years. The artists had no conceivable gender-coding agenda in mind, either.)

Pinkie, a portrait of a young girl in a pink dress
Pinkie.
By Thomas Lawrence, 1794, oil on canvas.
The Blue Boy, a young boy dressed in a blue outfit
The Blue Boy.
By Thomas Gainsborough, 1770, oil on canvas.

Rejection and Revival

The 1960s and ’70s saw a rejection of gendered clothing and color amid second-wave feminism and other countercultural movements. Unisex clothing became more popular for young adults and children alike. In addition, feminist activists launched an anti-pink crusade in the 1970s as part of a larger movement to reject traditional gender norms and free women from the many cultural constraints placed upon their sex. Ironically, this solidified pink in the minds of many as essentially feminine.

In the 1980s, gender color coding came back into fashion, stronger than ever. Ultrasound technology that allowed parents to learn the gender of their child before birth contributed to a revival of pink-blue gender coding. Now parents could announce the baby’s gender beforehand, friends and family could give pink or blue gifts at the baby shower, and companies found new ways to market baby products of all kinds based on color. Clothing manufacturers and retailers targeted this market aggressively, pushing the “pink for girls, blue for boys” tradition for baby clothes. The pink-blue divide became more visible and more firmly embedded in the minds of American consumers. And in the age of pregnancy announcements on social media and gender reveal parties, “It’s a boy!” might as well be “It’s a blue!”

Pink and Blue Today

Today, few parents would think of dressing their baby boy in pink, and many would think twice about dressing a baby girl in blue without also marking her gender in another way (such as the style of her clothing or a bow in her hair). Both men and women wear blue freely, as they have for centuries. But when a grown man wears pink, it can come off as a social statement about defying gender roles, or he may feel the need to justify his clothing choice, since wearing pink is often seen as too feminine for men.

And the prejudice against men wearing pink is really a prejudice against women—the fear of appearing effeminate stems from society devaluing a color (or anything else) that has been culturally assigned to women, reinforcing sexism at a deeper level. Older girls sometimes protest wearing pink out of a desire not to appear like a “girly girl,” as if that were a negative thing. The devaluing of women and anything seen as feminine (though there is nothing inherently feminine or masculine about pink or blue) hurts both boys and girls: boys are told not to appear feminine, and girls are told not to appear too feminine, regardless of how they may personally want to express themselves. It sends the message that anything too “female” is less important, less valuable, less capable of being taken seriously, whereas anything “male” is the default.

Cultural bias against women is changing, and with it, perhaps pink-blue gender coding as well. It is becoming more and more acceptable for men to wear pink, especially for the younger generations. A push to see gender on a spectrum rather than a male/female binary has also influenced attitudes toward gender coding in childhood. “Gender-neutral” often still means “not pink or blue,” but it is becoming more common for babies to wear gender-neutral colors, receive gender-neutral names, and sleep in a neutral-colored nursery room.

The future of gender color coding is in flux. Between the opposing influences of gender reveals and gender-neutral baby products, pink and blue could become just colors, or they could be reinforced even further as gender signifiers.

Sources

Bhattacharjee, Puja. “The Complicated Gender History of Pink.” CNN, January 12, 2018. https://www.cnn.com/2018/01/12/health/colorscope-pink-boy-girl-gender/index.html.

Bilal, Khadija. “Here’s Why It All Changed: Pink Used to Be a Boy’s Color & Blue for Girls.” The Vintage News, May 1, 2019. https://www.thevintagenews.com/2019/05/01/pink-blue/.

Blazeski, Goran. “Most Victorian-Era Boys Wore Dresses and the Reasons Were Practical.” The Vintage News, April 8, 2018. https://www.thevintagenews.com/2018/04/08/breeching-boys-2/.

DeVito, Jacklyn. “Mini Portraits: An Exploration of Childrenswear in the Nineteenth and Early Twentieth Centuries.” Cornell Fashion + Textile Collection Blog, April 4, 2018. https://blogs.cornell.edu/cornellcostume/2018/04/04/mini-portraits-an-exploration-in-childrenswear-of-the-nineteenth-and-early-twentieth-centuries/.

Maglaty, Jeanne. “When Did Girls Start Wearing Pink?” Smithsonian Magazine, April 7, 2011. https://www.smithsonianmag.com/arts-culture/when-did-girls-start-wearing-pink-1370097/.

Paoletti, Jo B. “Clothing and Gender in America: Children’s Fashions, 1890–1920.” Signs: Journal of Women in Culture and Society, vol. 13, no. 1 (1987): 136. https://doi.org/10.1086/494390.

Paoletti, Jo Barraclough. Pink and Blue: Telling the Boys from the Girls in America. Bloomington, Indiana: Indiana University Press, 2012.

Bolton, Terynn. “The Surprisingly Recent Time Period When Boys Wore Pink, Girls Wore Blue, and Both Wore Dresses.” Today I Found Out, October 17, 2014. http://www.todayifoundout.com/index.php/2014/10/pink-used-common-color-boys-blue-girls/.

Wikipedia. “List of Historical Sources for Pink and Blue as Gender Signifiers.” Accessed March 20, 2021, from https://en.wikipedia.org/wiki/List_of_historical_sources_for_pink_and_blue_as_gender_signifiers.

Rock, Paper, Scissors

Where did Rock, Paper, Scissors come from? The answer involves a Japanese game called jan-ken but probably does not involve Celtic settlers in Portugal or the French general who aided George Washington during the Revolutionary War.

First, let’s clear something up—is it “rock, paper, scissors, shoot” or just “rock, paper, scissors”? “Rock, paper, scissors” or “paper, rock, scissors”? Best two out of three? How do we agree on the rules? Maybe we could decide with a tiebreaker, a hand game of sorts . . .

Sansukumi-Ken: The Origin of Rock, Paper, Scissors

The first known reference to a game using finger signs is a painting on an Egyptian tomb wall dating to 2000 BCE. A precursor to Rock, Paper, Scissors using three distinct hand gestures was first played in China during the Han dynasty, around 200 BCE. The game was called shoushiling, according to Xie Zhaozhi in his book Wuzazu, written in the 1600s.

This game was then introduced to Japan, spurring an entire genre of hand games known as sansukumi-ken. The name translates to “the ken (fists) of three who are afraid of one another,” referring to the three hand gestures used in the games, where A beats B, B beats C, and C beats A. These hand games were often coupled with drinking and were sometimes played in brothels; one speech made in 1809 recounts a ken tournament in Nagasaki’s red-light district with feasting and dancing. At some point, the games shed their association with drinking, stripping, and prostitution and began to be played by children.
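The logic of a sansukumi-ken game is simple enough to spell out in a few lines of code. Here is a minimal sketch in Python (the gesture names and the play function are illustrative, not part of any historical game) of the circular relation in which each sign defeats exactly one other and is defeated by the remaining one:

    # Minimal sketch of the sansukumi-ken relation: a three-way cycle in
    # which each gesture beats exactly one other and loses to the third
    # (A beats B, B beats C, C beats A). Gesture names are illustrative.
    BEATS = {
        "rock": "scissors",   # rock blunts scissors
        "scissors": "paper",  # scissors cut paper
        "paper": "rock",      # paper covers rock
    }

    def play(a: str, b: str) -> str:
        """Resolve one round between gestures a and b."""
        if a == b:
            return "draw"
        return f"{a} wins" if BEATS[a] == b else f"{b} wins"

    print(play("rock", "scissors"))   # rock wins
    print(play("paper", "scissors"))  # scissors wins

Swap in a frog, a slug, and a snake and the same structure still works, as the earliest Japanese versions show.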

An image showing how to play mushi-ken. The three hand gestures include the pinky, the thumb, and the index finger.
Mushi-ken, or Slug, Frog, Snake.
Linhart, Sepp. “Die Repräsentation Von Tieren Im Japanischen Ken-Spiel: Versuch Einer Interpretation.” Asiatische Studien: Zeitschrift Der Schweizerischen Asiengesellschaft 65.2 (2011): 541-61.

The earliest recorded sansukumi-ken game was mushi-ken. It used three gestures: the frog (the thumb), the slug (the pinky finger), and the snake (the index finger). The frog defeats the slug, which defeats the snake, which defeats the frog. Another popular version, kitsune-ken, featured the supernatural fox (kitsune) of Japanese mythology, which defeats the village head, who defeats the hunter, who defeats the fox.

So the game could have been called “Frog, Snake, Slug” instead—or maybe “Foxhunt.”

Kitsune-ken, or Hunter, Village Head, Fox.
Linhart, Sepp. “Die Repräsentation Von Tieren Im Japanischen Ken-Spiel: Versuch Einer Interpretation.” Asiatische Studien: Zeitschrift Der Schweizerischen Asiengesellschaft 65.2 (2011): 541-61.

The most common version today is called jan-ken and features rock, paper, and scissors. This variation developed in the nineteenth century and spread beyond East Asia for the first time in the early twentieth century. Sepp Linhart, author of “From Kendo to Jan-Ken: The Deterioration of a Game from Exoticism into Ordinariness,” argues that the global appeal of the jan-ken version stems from its use of simple, everyday objects familiar to a wide audience.

Through increased contact between East and West, sansukumi-ken games from Japan were introduced to England, Australia, the United States, and France. Newspaper articles and letters in the 1920s and 1930s described the game as a method of casting lots, gambling, or settling disputes, detailing its rules for readers still unfamiliar with it. The game was also known as “zhot” or “jan-ken-pon.”

Alternate Theories

There are other potential sources of Rock, Paper, Scissors, since similar games are found in cultures around the world, and internet legends abound. According to the Straight Dope, some have purported that the hand game entered common knowledge by way of a Celtic tribe that settled in Portugal in the sixth century BCE. In this telling, the game spread throughout Portugal in the following centuries; “Pihedra, Papelsh e Tijhera,” as the legend renders the name in Portuguese, spread further with the Roman invasion of the Iberian Peninsula and subsequent intercultural contact, but the game was seen as a potential threat to Roman rule and was suppressed in the British Isles until 350 CE. This explanation lacks any real evidence, but it’s one example of potential parallels across cultures. The hand game played today in many countries around the world most likely spread from Japan rather than from similar hand games found among the Celts or any other group.

Roshambo

Jean-Baptiste de Rochambeau
Jean-Baptiste Donatien de Vimeur, Comte de Rochambeau (1725–1807)
By Charles-Philippe Larivière, public domain via Wikimedia Commons.

Why is Rock, Paper, Scissors sometimes called roshambo? For reasons unknown, the game became associated with Jean-Baptiste Donatien de Vimeur, Comte de Rochambeau, who commanded the French expeditionary force sent to help the United States during the Revolutionary War. His name was used as a code word during the Battle of Yorktown, in which the British army surrendered to the United States. Since Rock, Paper, Scissors was not widely known in the West in the eighteenth century, there is little basis for believing that Rochambeau knew of the game or used it to settle a dispute. Additionally, the earliest known use of “roshambo” is from 1936, in a book called Handbook for Recreation Leaders.

Linguist and language commentator Ben Zimmer hypothesizes that in the 1930s, children in the San Francisco Bay Area (home to many East Asian immigrants) may have combined what they had learned in school about the defeat of the British at Yorktown with the popular new hand game in which they tried to defeat their opponents or settle disputes. The famous general’s name was Americanized and became roshambo. (Anyone up for a game of roshambo? Who wants to be the British?)

Other Variations

And just for fun, here are some other variations on Rock, Paper, Scissors:

  • Rock, Paper, Scissors, Lizard, Spock (United States)
  • Ant, Human, Elephant (Indonesia)
  • Tiger, Village Chief, Village Chief’s Mother (Japan)
  • Bird, Water, Stone (Malaysia)
  • Muk-zzi-ppa, where the goal is to get your opponent to play the same sign as you (Korea)

Sources

Carlisle, Rodney P. Encyclopedia of Play in Today’s Society. Newbury Park, California: SAGE Publications, 2009, p. 603.

Ferro, Shaunacy. “Why Do People Call Rock-Paper-Scissors ‘Roshambo?’” Mental Floss. https://www.mentalfloss.com/article/80201/why-do-people-call-rock-paper-scissors-roshambo.

Schwab, Katharine. “A Cultural History of Rock-Paper-Scissors.” The Atlantic, December 23, 2015. https://www.theatlantic.com/entertainment/archive/2015/12/how-rock-paper-scissors-went-viral/418455/.

Straight Dope Staff. “What’s the Origin of ‘Rock, Paper, Scissors’?” The Straight Dope, July 10, 2001. https://www.straightdope.com/21343076/what-s-the-origin-of-rock-paper-scissors.

Wikipedia. “Rock Paper Scissors.” Retrieved March 11, 2021, from https://en.wikipedia.org/wiki/Rock_paper_scissors#:~:text=A%20beats%20B%2C%20B%20beats,was%20imported%20directly%20from%20China.

Wikipedia. “Sansukumi-ken.” Retrieved March 11, 2021, from https://en.wikipedia.org/wiki/Sansukumi-ken.

World Rock Paper Scissors Association. “The Official History of Rock Paper Scissors.” Retrieved March 11, 2021, from https://www.wrpsa.com/the-official-history-of-rock-paper-scissors/.

Charley Horse

Why is the painful cramp you sometimes get in your leg called a charley horse? The answer involves baseball and continual adaptation of oral history.

What Is a Charley Horse?

A charley horse occurs when a muscle contracts involuntarily, causing a painful cramp that can last from a few seconds to a whole day. Charley horses occur most commonly in the legs and feet but can happen elsewhere in the body.

These cramps can be caused by a number of things, including inadequate blood flow to the muscles, injuries, overusing a muscle, and stress. Another common cause is a mineral imbalance due to inadequate potassium, calcium, or sodium in the blood, which can be caused by dehydration.

Charley horse formerly referred to a muscle injury in the leg that caused blood to pool outside of the blood vessels. This is now known as a dead leg and often causes pain and limited mobility for several weeks.

So Who’s Charley?

The origin of the term charley horse to describe a muscle cramp is murky, but all sources point toward an origin in baseball.

Image by L. Prang & Co., 1887, public domain via Wikimedia Commons.

The oldest use of the term was in an 1886 letter published in the Louisville Courier-Journal. Jim Hart, manager of the Louisville Colonels baseball team, wrote:

Ely is still suffering from a sore arm, and Reccius has what is known by ball players as “Charley Horse,” which is a lameness in the thigh, caused by straining the cord.

(Jim Hart, March 21, 1886, Courier-Journal, via Wordorigins.org)

One well-known origin story of the term holds that “Charley” was a lame horse that pulled the roller to prepare the field at the Chicago White Sox ballpark (World Wide Words).

In a similar vein, baseball official Bill Brandt explained the term as a reference to a lame horse named Charley in Chattanooga, Tennessee, who pulled things around the ballpark. Between practice and the start of a game, the players watched as Charley dragged a dust-brush around the baseball diamond. When a player on the team suffered from a pulled tendon or other injury that caused limping, the other players would jokingly refer to him as “Charley Horse” (Shulman, 1949).

However, Brandt offered a different explanation shortly after making this statement. He cited a joke by coach Billy Sunday about a hobbling baseball player, comparing him to a horse in a race the players had bet on. This explanation is doubtful as well, and some have conjectured that Brandt changed his story to honor Sunday shortly after Sunday’s death.

H. L. Mencken, author of The American Language, investigated the term at the request of the editors of Webster’s New International Dictionary, Second Edition. His research turned up several different explanations, none more plausible than the rest:

  • In 1934, Baltimore Orioles second baseman Bill Clarke claimed that the term referred to “Charley Esper, a left-handed pitcher, who walked like a lame horse.” However, the term was in use long before Charley Esper ever joined the Orioles.
  • In 1944, Billy Earle, a catcher who jumped around to several teams and dabbled in hypnotism and spiritual healing on the side, said the term was suggested by a Sioux City groundskeeper named Charley who had a horse.
  • In 1943, Dr. Logan Clendening claimed that a charley horse was a ruptured muscle (based on the earlier medical definition of the term) and that it occurred in the same way that a horse suffers stringhalt. He seems to have connected the two based on pathology, though his explanation leaves unclear exactly who “Charley” was.

None of Mencken’s proposed etymologies truly fits the bill in light of the earlier usage of the term. Apparently, Webster’s agreed: in Webster’s New International Dictionary, Third Edition (1961), charley horse was said to come from “the occurrence of Charley as a typical name for old lame horses kept for family use” (Woolf, 1973).

As one last explanation, the American Dialect Society cites a 1907 article in the Washington Post that attempted to explain the term, which by then had already been in use for a few decades. The article postulated that charley horse referred to pitcher Charley Radbourne, affectionately nicknamed “Old Hoss,” who suffered a muscle cramp during a game in the 1880s. To describe the condition, players combined the pitcher’s first name, Charley, with part of his nickname, Hoss (a variant of horse and slang for a large, strong, respected person). Thus, charley horse.

The Boston Beaneaters and New York Giants on Major League Baseball Opening Day 1886.
First from the left, Charles “Old Hoss” Radbourne gives the finger to the cameraman.
Photo by F.L. Howe, April 29, 1886, public domain via Wikimedia Commons.

Sorry, Charley—no one knows exactly where the term charley horse came from. All we can say for sure is that it became popular among baseball players in the 1880s and 1890s. Though there are many competing theories, etymologists and historians continue to disagree about who coined the term. It’s likely that players used it to poke fun at one another and that each retelling reshaped the origin story from the teller’s own point of view. This continual shaping and reshaping of oral history was a way for baseball players to make the term uniquely their own and stake a claim in the lingo of the game.

Sources

“Charley Horse.” Wordorigins.org, May 29, 2020. https://www.wordorigins.org/big-list-entries/charley-horse.

Joannes, Gerard. “Charley Horse.” World Wide Words. http://www.worldwidewords.org/qa/qa-cha1.htm.

Mencken, Henry Louis. The American Language: Supplement II. New York: Alfred A. Knopf, 1948, p. 735.

Moore, Kristine. “Charley Horse.” Healthline, September 18, 2009. https://www.healthline.com/health/charley-horse#causes.

O’Conner, Patricia T., and Stewart Kellerman. “Who’s the ‘Charley’ in ‘Charley Horse’?” Grammarphobia, January 1, 2007. https://www.grammarphobia.com/blog/2007/01/charley-horse.html.

Shulman, David. “Whence ‘Charley Horse’?” American Speech, vol. 24, no. 2 (1949): 100–104. Duke University Press. https://doi.org/10.2307/486616.

Wiktionary. “Hoss.” Accessed March 6, 2021, from https://en.wiktionary.org/wiki/hoss.

Woolf, H. B. “Mencken as Etymologist: Charley Horse and Lobster Trick.” American Speech, vol. 48, no. 3/4 (Autumn–Winter, 1973): 229–238. Duke University Press. https://doi.org/10.2307/3087830.