7 Animals You Should Never Take Selfies With

Kittens, puppies. The occasional squirrel. Most of these animals make good partners for selfie pics. Sure, they don’t have the charisma of a cougar or the magnetism of a bear, but they’re much better choices — for obvious safety reasons.

So the next time your brows are so perfect that you can’t help but photobomb a wildlife scene, keep one thing in mind: “That’s not how this works. That’s not how any of this works.”

No more sweet snaps with bears, ‘mkay?

1. Bears

So many people have been taking selfies with wild bears that officials closed a Colorado park. Park managers at the Waterton Canyon recreation area near Denver temporarily stopped people from entering until the bears went into hibernation. Similar Insta-worthy situations have been spotted in Yosemite and other parks.

2. Bison

A snap with a bison? It’s too close for comfort, say Yellowstone National Park authorities. Several people have been hurt after taking pictures with a bison in the background — some as near as 3 to 6 feet (1 to 2 meters) away. Countless more have just been lucky to get away with a shareable moment and not serious injuries.

3. Any Snake, Ever

Everything turned out OK after the extreme close-up with a snake pictured below, but one California man wasn’t so lucky. He may lose his hand after taking a selfie with a rattlesnake — and then receiving a $150,000 hospital bill for anti-venom. Which hurts more?

4. Crocodiles

The freshwater crocodiles that sun themselves near pedestrians in a nature park in Australia’s Top End may not be as powerful as their saltwater cousins. They are, however, still deadly. Authorities have been warning tourists not to take selfies with crocs within arm’s reach. You can imagine why, even though many others apparently couldn’t.

5. Camels

You know that moment, right before it all goes wrong? Yeah, we’re thinking this selfie ended with a painful nip, too. Although not as bad as this one. And, things really went wrong for this guy taking a snap with a kick-happy camel. This gentleman is about to have his hat reshaped.

6. This Cat

While filming a plea for someone — anyone — to re-home her “lovely cat,” a now-infamous feline repeatedly bites and scratches this woman. Turns out, the feelings are probably mutual.

7. Tigers

So many single guys were sneaking up on tigers to take Tinder profile snaps that the New York State Assembly actually passed legislation to prevent the pics. People caught photobombing a big cat will now be fined. No word on whether #guyswithtigers are more popular on dating sites. #doubtit 

Now That’s Interesting

Japanese photographer and Minolta employee Hiroshi Ueda developed a version of the selfie stick in the early 1980s, though Canadian inventor Wayne Fromm claims credit for popularizing the gadget in the smartphone era.

Looks and Acts Like a Hummingbird? Could Be a Hummingbird Moth

You’ve spotted a 2-inch (5-centimeter) creature zipping from flower to flower, its wings beating so fast they’re little more than a blur. A hummingbird, you declare, feeling fortunate to catch a glimpse of this beneficial pollinator, long believed to be a harbinger of spring.

As you look closer, though, something doesn’t add up. Although it has the tongue-like proboscis of a hummingbird, this hovering creature is much smaller than any hummingbird you’ve ever seen and has antennae. Its tail is actually a bit of fluff that resembles feathers.

It’s not a hummingbird after all. It’s a hummingbird moth, a bug that looks like a bird.

There are 23 species of hummingbird moths, but only five are found in North America: the hummingbird clearwing (Hemaris thysbe), snowberry clearwing (Hemaris diffinis), slender clearwing (Hemaris gracilis), Rocky Mountain clearwing (Hemaris thetis) and the white-lined sphinx moth (Hyles lineata).

While the territory of these five North American species of hummingbird moths ranges from Canada to Mexico, only two — the snowberry clearwing and the hummingbird clearwing — are commonly seen. Snowberry clearwings sport distinctive yellow and black coloring, with a horizontal black line running down each side of the body from eye to tail, while hummingbird clearwings have a distinctive reddish-brown abdomen and olive green back.

Hummingbird moths, like the hummingbirds they mimic, are adept at hovering and moving side-to-side or backward with helicopter-like precision. Although the body of a hummingbird moth takes on the cylindrical, barrel-chested proportions of a hummingbird, it is typically not as long. Hummingbird moths reach 1 to 2.5 inches (2.5 to 6 centimeters) at maturity, while hummingbirds are usually 3 to 4 inches (8 to 10 centimeters) in length. You’re also likely to spot a hummingbird moth’s six legs dangling as it hovers, while a hummingbird will tuck its pair of slender legs into its downy belly feathers as it sups.

If you’d like to increase your odds of spotting a hummingbird moth, plant a variety of flowers. Hummingbird moths, like hummingbirds, feast on the nectar of a variety of long-necked flowers, such as trumpet creeper (Campsis radicans) or cardinal flowers (Lobelia cardinalis). Sometimes you may see a hummingbird and a hummingbird moth dining on the nectar of the same flowers; the two species often amicably share the same territory.

When it comes to leaving a legacy, however, hummingbird moths prefer specific plants like honeysuckle, dogbane and hawthorn, and trees like cherry and plum on which to lay their eggs. When the eggs hatch, the resulting caterpillars — the next generation of hummingbird moths — can then dine on their ideal plants. Although you may first see hummingbird moths in the spring, they are likely to make a reappearance in mid to late summer as their preferred flowers pump out more nectar.

Now That’s Cool

The hummingbird moth’s family made history in 1991 when one of its relatives starred in “The Silence of the Lambs.” Central to the story was the death’s-head hawk moth, several of which were featured throughout filming. The death’s-head hawk moth — a member of the same hawk moth family as the hummingbird moth — has a distinctive skull-like mark on its thorax.

Ridiculous History: Beloved Author Roald Dahl Was Also a Suave British Spy

“And above all, watch with glittering eyes the whole world around you, because the greatest secrets are always hidden in the most unlikely of places.” Roald Dahl penned these words for “The Minpins,” the final of 34 children’s books he wrote between 1943 and his death in 1990.

But unlike in this excerpt, there was nothing fictional about Dahl’s search for secrets. During World War II, the soon-to-be-beloved author of books including “Charlie and the Chocolate Factory,” “Danny, the Champion of the World” and “The Witches” served as a fighter pilot and an officer in the British Royal Air Force. Following that, he assumed a lifestyle reminiscent of superspy James Bond, joining a secret organization based in the United States known as the British Security Coordination.

That spy network’s primary goals were to offset Nazi propaganda while protecting the interests of the United Kingdom. So, while Dahl dreamed up imaginative children’s stories, he also lived as a secret intelligence officer under the cover of working a public relations job at the British embassy in Washington, D.C.

By all reports, he was both very good and very bad at it. Dahl was especially talented at being a ladies’ man, a skill that came in handy when convincing politicians and heiresses alike to part with closely guarded secrets. One biography described his romantic skill with particularly colorful language, following reports that Dahl had affairs with, among others, Millicent Rogers, heiress to the Standard Oil fortune, and Clare Boothe Luce, an influential congresswoman who later became an ambassador and foreign affairs advisor to presidents Nixon and Ford.

But as good as he was at the “sleeping” part, Dahl came up short at keeping secrets. According to his daughter Lucy, Dahl was a prolific gossip. “Dad never could keep his mouth shut,” she’s quoted saying in Donald Sturrock’s 2010 biography “Storyteller: The Life of Roald Dahl.”

Despite his penchant for spilling the beans, Dahl did manage to come across some interesting intelligence at cocktail parties — or perhaps it was thanks to his bedside manner. As early as 1944, he’d uncovered early U.S. talk of landing a man on the moon. He also reportedly believed rumors that Franklin D. Roosevelt and the Norwegian crown princess Martha were having an illicit affair (a claim most historians discount), passing that information along with other intelligence directly to Winston Churchill.

Despite his Bond-style role in world affairs, Dahl will probably always be best remembered as one of the greatest children’s storytellers of all time. Many of his children’s books have been turned into movies, including “The BFG,” the tale of a friendly giant who befriends a young girl, then races against time to protect her from danger. It’s just the sort of story Dahl could truly appreciate.

Now That’s Cool

In a piece of real life intersecting with fiction, Roald Dahl wrote the screenplay for the 1967 James Bond thriller “You Only Live Twice.” 

The Surprisingly Radical History of Mother’s Day

Mother’s Day, one of the largest holidays in the world, has become an unstoppable idea. Whether it’s a set of earrings or a dozen roses, few can imagine allowing a Mother’s Day to come and go without giving Mom a gift.

In 2022, 84 percent of Americans planned to celebrate Mother’s Day, spending an average of more than $245 per person on gifts. This figure, which has grown without fail for decades, doesn’t even count handmade perks like breakfast in bed. The top three gift categories were greeting cards, flowers and special outings [source: National Retail Federation].

How did the second Sunday of May become a milestone to mark the contributions of mothers in the first place? In the beginning, it wasn’t all sweetness and light. In fact, it was far from a feel-good holiday designed to celebrate women and how they care for their families.

Mother’s Day was built on radical ideals. It was an international movement meant to change the world, one that began through the collective efforts of influential women who sought to free the world from injustice and warfare.

The modern concept of Mother’s Day grew out of a seed planted in 1858, when Ann Reeves Jarvis began organizing Mothers’ Day Work Clubs to rail against the disease-causing conditions endured by western Appalachia’s poorest workers. She believed too many of the workers’ children were dying from illnesses brought on by filthy conditions, so under the advice of her physician brother, Jarvis taught mothers how to boil water for drinking and keep food from spoiling. The practical nature of her Mothers’ Day Work Clubs served as a model for nearby towns, and by 1860 the idea had spread across West Virginia [source: The Library of Congress].

Just as Jarvis’ concept was gaining traction, her attention was drawn to another challenge. The American Civil War, which would play out from 1861 to 1865, was erupting right in her front yard.

Mother’s Day as a Political Movement 

By the 1860s, the Mothers’ Day Work Clubs launched by Ann Reeves Jarvis had become successful, but now she faced another complication. The area near her West Virginia home was a pivotal stop on the Baltimore and Ohio Railroad during the Civil War [source: West Virginia Division of Culture and History]. Caught in a storm of rising tensions deep in this crossroads of the Civil War, Jarvis took a stand — by not taking a stand.

She insisted the Mothers’ Day clubs become neutral ports in a sea of political differences. Jarvis and other club members fed and clothed Union and Confederate soldiers, treated their wounds and, just as they’d done with Appalachia’s poorest workers, taught life-saving sanitation methods.

It was a passion that grew out of Jarvis’ own tragedies. Years earlier, four of her children had died from contagious diseases such as diphtheria, scarlet fever and whooping cough [source: Tyler-McGraw]. Their deaths would be the first blows of many. Four more of her 12 children would be buried before reaching their 18th birthdays.

Even the end of the Civil War in 1865 did not bring peace. Tensions increased as Union and Confederate soldiers returned from war and found themselves occupying the same space in Grafton, still seething with bitter hostility. Jarvis once again took action. She organized a Mothers’ Friendship Day at a local courthouse. Although she publicized the event as a way to honor mothers, its real purpose was to bring together a fractured community by gathering battle-worn soldiers and their families — whatever side they had been on.

Amid rumors the event would erupt in violence, Jarvis’ Mothers’ Friendship Day opened with a prayer and a band that played “Should Auld Acquaintance Be Forgot.” According to an account by Sen. John D. Rockefeller, “By the time they reached the word ‘forgot,’ neighbors were weeping and shaking hands.” Mothers’ Friendship Day became an annual pacifist event [source: The Library of Congress].

Modern Mother’s Day

Ann Reeves Jarvis was one of several postwar influencers who promoted peace. In 1870, abolitionist Julia Ward Howe also took a stand, notably as author of a well-publicized “Mother’s Day Proclamation” urging women to promote peace through political means.

Howe, who had written the Civil War anthem “Battle Hymn of the Republic” years earlier, had become a pacifist after living through the Civil War (where 620,000 soldiers died) and reading accounts of the Franco-Prussian War that followed [source: Kohls]. Howe and other women organized events driven by their pacifist leanings, including Mother’s Days for Peace, which were held annually in various locations across the U.S. on June 2 [source: Rosen]. These eventually fell out of favor.

By the time of Jarvis’ death in 1905, her daughter Anna was ready to take up the cause. She held a church service on the second Sunday in May 1908 to honor her mother. Mrs. Jarvis had once said that she hoped there would one day be a memorial day for mothers, adding, “There are many days for men, but none for mothers.”

The custom spread, as Anna publicized it through letters to newspapers and politicians. She lobbied enthusiastically to institute a national holiday that would personalize the mothers’ movement by encouraging sons and daughters to honor their own mothers. In 1914, then-U.S. President Woodrow Wilson officially named the second Sunday in May as Mother’s Day [source: Pruitt].

Almost as soon as the proclamation ink dried, merchants began advertising candies, flowers and greeting cards to commemorate the day. This disturbed Anna, who believed the holiday was being corrupted, pulled far from her vision of an intimate celebration of one’s own mother.

In the years that followed, she tried to reverse the commercialization of Mother’s Day, spending her sizable inheritance, along with her energy, on boycotts and lawsuits against groups that violated the spirit of the day. In 1923, she crashed a confectioners’ convention. In 1925, she protested the American War Mothers convention, which used Mother’s Day as a fundraising event by selling carnations. She was arrested for disturbing the peace [source: Handwerk].

Her efforts, though nearly impossible to ignore, went largely unanswered. She died penniless in a sanitarium in 1948, having no children of her own. Mother’s Day continued to gain momentum. Today, it’s an international holiday celebrated in 152 countries [source: The Library of Congress].

A Museum and a Shrine
For 20 years, Ann Reeves Jarvis taught Sunday school at the Andrews Methodist Episcopal Church in Grafton, W.Va., which is now the International Mother’s Day Shrine. Since 1908, when the first Mother’s Day service was held, the building has been home to a celebration for mothers. Later, Jarvis’ home in Grafton (also the birthplace of her daughter Anna) became the Anna Jarvis Birthplace Museum. Notably, the home once served as the headquarters of Gen. George McClellan during the Civil War [sources: Anna Jarvis Birthplace Museum, International Mother’s Day Shrine].

Author’s Note: What does Mother’s Day have to do with the Civil War?
I had no idea. Yes, I’d heard of Ann Reeves Jarvis, her daughter and Julia Ward Howe. But Mother’s Day as a radical movement meant to disseminate peace? For me, that was a first — and one that changed my perspective. I’ve always thought of Mother’s Day as a perfunctory holiday. First, as an excuse to honor my mom (very deserving, by the way), then as a holiday for myself that is more often spent at graduations or ball tournaments than something specifically meant for me. Knowing its roots, I’m ready to rethink my celebration.

How Deadheading Helps Flowering Plants Flourish

You’ve taken multiple trips to the plant nursery, selected a variety of plants and can already envision how they’re going to brighten up your flower beds throughout the spring and summer. But soon enough (too soon, in fact) these colorful additions lose their luster and you find yourself surrounded, not by the gorgeous landscape you’d planned, but by faded and dead blooms. Now what?

Before you throw those gardening gloves in the trash right along with your dreams of a beautiful botanical space, take a beat. There is a solution … and it is way simpler than you might think.

Deadhead. No, we’re not referring to those diehard fans who once traveled the continent seeing the Grateful Dead as many times as possible. Deadheading is the process of manually removing a spent bloom, whether on an annual or perennial plant, and it not only preserves the beauty of your plants, but encourages them to look their best for longer.

How to Deadhead, and Why
To deadhead is to do just as it sounds: remove the dead “head” — or blooming portion — of a plant. Often, this means using one’s thumb and forefinger to pinch and remove the stem of a spent bloom. For some tough-stemmed plants, however, garden snips or pruning shears may be needed. A sprawling mass of ground cover can even be deadheaded with the careful sweep of a somewhat indelicate garden tool, such as a weed eater.

“Remove the spent blossom as close to the larger main stem as possible because this helps you avoid leaving behind unattractive and flowerless stems,” says Erinn Witz, a garden expert and the co-founder of Seedsandspades.com, an educational website for people in any stage of their gardening experience. “What you want is a clean break in the stem, not a break from a pulling or twisting motion.”

In general, flowering plants are resilient, so if you are unsure where to remove a dead flower, simply pinch off the stem right under the flower. The important thing is that the faded bloom is no longer attached to the plant.

“Deadheading is basically stopping a plant from developing fruit and seeds so the energy is refocused on making more flowers,” emails Charlotte Ekker Wiggins, a University of Missouri master gardener emeritus and blogger at Gardening Charlotte.

After a plant blooms, it usually suspends the process of making new flowers so it can put its energy into forming seeds. Deadheading not only enhances a flowering plant’s performance by causing it to produce more blooms, but it can keep its form shapely and compact. Deadheading has the added bonus of removing the faded and browning blooms from view, so even as you wait for plants to rebloom you are rewarded with greenery.

Not all flowers require deadheading, according to Fiskars. Most bulbs produce only one round of flowers per season, as do flowers such as peonies and liatris, and therefore don’t need deadheading. Most flowering vines, periwinkles and impatiens don’t need it either. Here is a list, by no means exhaustive, of some annuals and perennials that will benefit from deadheading:

  • Zinnias
  • Hardy geraniums
  • Cosmos
  • Marigolds
  • Delphinium
  • Snapdragons
  • Marguerite daisies
  • Petunias
  • Blanket flowers
  • Roses
  • Sweet peas
  • Bee balm
  • Campanula
  • Salvia

Just a Pinch, But How Often?
There’s one rule of thumb you need to know about deadheading, and it’s catchy enough to remember from year to year: early and often. The idea is to begin deadheading in the spring, right after the first blooms are spent. Every few days, tour your yard and observe any plants that bloom, then spend a few minutes removing faded or dead flowers. If you wait until late summer or early fall, the process will probably become overwhelming because of the sheer quantity of dead blooms that need your attention.

“Annual flowers that need to be replanted every year and perennial flowers that live more than two years will produce more flowers if they are deadheaded on a regular basis,” says garden designer Joanna VonBergen from GinghamGardens.com, in an email.

If you are growing geraniums or another type of flowering annual, it’s likely that deadheading the blooms will encourage them to reflower throughout the season. But, even amongst those plants that benefit from deadheading, there can be differences in where to cut them back.

“How you deadhead depends on the flowering plant,” says Chey Mullin, flower farmer and blogger at Farmhouse and Blooms, in an email. “Some plants require deadheading of the whole stem. Other plants benefit from a light pruning of spent blooms just back to the center stem. Then others only require the spent bloom to be removed just under the flower, such as with daylilies.”

There are, however, some exceptions to the deadheading rule. Some perennials, like peonies, won’t bloom again, even if you deadhead them. And some perennials, such as ‘Autumn Joy’ sedum, a stonecrop, will reward you with interesting seedheads throughout the winter if you don’t lop off their blooms. Hollyhocks and foxgloves are good examples of perennials that should be left alone after blooming so they can produce and drop seeds for the next growing season.

Now That’s Interesting
Need another good reason to deadhead? When you remove the faded blooms after a plant’s first flowering of the season, you’ll be rewarded for your efforts with a second bloom — and this second bloom will usually last much longer than the first. Instead of sending its energy and nutrients into making seeds, the plant will focus solely on new blossoms.

Which Is the Sweetest Grapefruit — White, Red or Pink?

Grapefruit can be red, pink or white and, while it is definitely known for its pucker power, can range from intense to mild in flavor. Regardless of the type you choose, grapefruit is known for its balance of tart and sweet, and is packed with a host of health benefits.

With a hefty dose of vitamins A and C in each juicy segment, grapefruit can help boost immunity and has even been found to lower blood pressure and be beneficial in lowering triglyceride and “bad” LDL cholesterol levels. And, because grapefruits are more than 90 percent water, adding them to your daily diet can also boost hydration.

Grapefruit is usually easy to find, too. Once available primarily during its growing season, which runs October through June, grapefruit now can be purchased year-round in most U.S. markets. When selecting grapefruit, regardless of variety, look for the heaviest fruits because they tend to be the juiciest. Grapefruit that is fully ripe, without green color on the skin and without soft spots, will likely taste best.

Grapefruit has long been a popular fruit eaten for breakfast, but keep in mind that, no matter which variety you choose, the potential of this versatile superfruit reaches far beyond the morning.

Red Grapefruit
Possibly the most popular grapefruit variety — and one of the most widely available — is the red grapefruit. Although sometimes simply labeled as “red grapefruit,” there are several cultivars, or varieties, including Ruby Red and Rio Red, both of which top the grapefruit chart for sweetness. Generally, red grapefruit will be sweeter than pink or white grapefruit, although there will always be some exceptions. The rich scarlet color of the red grapefruit’s pulp comes from its high levels of lycopene, an antioxidant that helps protect the body against chronic diseases and environmental toxins.

“Red grapefruit are an excellent source of vitamin A,” says Jessica Randhawa, owner and head chef of The Forked Spoon, which features grapefruit as an ingredient in everything from cocktails to ceviche.

In addition to its sweet taste, red grapefruit tends to have a thinner skin than other types, maximizing the amount of tender, seedless fruit inside.

Pink Grapefruit
Rosy-colored pink grapefruit tastes similar to red grapefruit but offers a balance of flavor that is unequaled among fruit of its kind, according to Randhawa.

“The right balance of sweet and tart is found in pink grapefruit. Their flesh is typically very juicy and not sour. Not only do they taste great, but these grapefruits are rich in vitamin C, fiber and vital antioxidants like beta-carotene and lycopene,” Randhawa says.

Although pink grapefruit — named for its blush-hued flesh — is typically not as sweet as red grapefruit, the complexity of its flavor profile makes it ideal for eating solo, adding to salads or even perking up plain water. Like red grapefruit, pink grapefruit is generally easy to find and is typically available year-round in most U.S. markets.

White Grapefruit
White grapefruit, unlike its red or pink cousins, has pale, yellow-colored flesh. It has a thin rind — thinner than that of most oranges, in fact — that ranges from a distinctive green to light yellow in color. As the fruit ripens, the rind shifts to a darker yellow. White grapefruit is sometimes referred to as “yellow” or “gold” grapefruit for the yellow shade of its ripe skin.

Cultivated less for its flesh and more for its slightly bitter juice, the white grapefruit is a familiar ingredient in sodas and mixed drinks. “White grapefruits are the least sweet variety,” Randhawa says. “However, they possess an intense flavor that is good for making juices and syrups.”

The white grapefruit’s prized bitterness comes from its high acid content and from the thick layer of albedo that lies between the peel and the flesh. Peel or cut into a white grapefruit and you’ll encounter its bright, intense and acidic scent.

Notable Grapefruit Varieties
Although grapefruit may be categorized as red, pink or white, there are dozens of cultivars and related fruits that can be difficult to distinguish from each other.

The white grapefruit is sometimes mistaken for the Oroblanco grapefruit. Although the two are both members of the citrus family and look very much alike, the similarities end when it comes to parentage and flavor. The Oroblanco and the white grapefruit are different species. The Oroblanco is part pomelo — a mild citrus fruit indigenous to Southeast Asia that can grow to the size of a watermelon — and part grapefruit.

“Oroblanco, which means ‘white gold’ in Spanish, has a thick rind, lemon-yellow skin and is almost seedless,” Randhawa says, but unlike the tart white grapefruit, the flesh of the Oroblanco “is juicy and sweet with little to no bitterness.”

Likewise, the melogold grapefruit is not a “true” grapefruit, but a cross between a pomelo and a grapefruit. It has pale, yellow-tinged, sweet-tart flesh and a green-tinged exterior.

Now That’s Interesting
The pomelo and grapefruit look and taste similar, but are different fruits. Widely considered the largest citrus fruit in the world, the pomelo (Citrus maxima or Citrus grandis) can grow to the size of a cantaloupe or watermelon. Native to Southeast Asia, the pomelo is a direct ancestor of the grapefruit. Pomelos, when crossed with sweet oranges, produced the first grapefruits.

Why Do British Lawyers Still Wear Wigs?

The drama of a criminal trial has a macabre allure. In America, strangers line up to enter courtrooms as spectators of high-profile proceedings. Those who can’t be there in person watch live-streamed versions on televisions and tablets. And when there’s downtime from real-life court battles, many turn instead to pseudo-fictional prime-time portrayals.

But in the U.K., nothing is more British than the iconic white wig judges and attorneys — or barristers as they’re known — wear during formal courtroom proceedings. Many of the judges and barristers who wear wigs say the headpiece — also known as a peruke — brings a sense of formality and solemnity to the courtroom.

“In fact, that is the overwhelming point for having them,” Kevin Newton, a Washington, D.C.-based lawyer who studied law at the University of London, said when we originally spoke to him in 2017. Newton added that barristers’ counterparts, known as solicitors, meet with clients outside the courtroom and don’t wear wigs.

A Desire for Uniformity
Like the robes the lawyers wear, the wigs are worn as a symbol of anonymity, Newton said. The wigs are part of a uniform that creates a visual separation between the law and those being brought up before it. Wigs are so much a part of British criminal courts that if a barrister doesn’t wear one, it’s seen as an insult to the court.

Barrister wigs are curled at the crown, with horizontal curls on the sides and back. Judges’ wigs — also called bench wigs — look similar, but are typically more ornate. They’re fuller at the top and transition into tight curls that fall just below the shoulders.

Most are handmade from 100 percent horsehair, though synthetic versions are available today, too. Horsehair wigs aren’t cheap, either, especially when they’re handmade using the age-old craft of styling, sewing and gluing. A judge’s full-length wig can cost more than $3,000, while the shorter ones worn by barristers cost more than $600.

Wigs may have fallen out of fashion over the centuries, but when they first made their appearance in a courtroom around 1685, they were part and parcel of being a well-dressed professional.

In the 17th century, only the elite wore powdered wigs made of horsehair. Those who couldn’t afford the best garb but wanted to look the part wore wigs made of goat hair, spooled cotton or hair taken from human corpses. There was also a steady trade in which living people sold their long hair for wigs, though horsehair remained the ideal.

But why did powdered wigs come on the fashion scene in the first place? Why top one’s head with an itchy, sweat-inducing mass of artificial curls? Blame it on syphilis.

Historical Hair
Wigs began to catch on in the late 16th century, when an increasing number of people in Europe were contracting the STI. Without widespread antibiotic treatment (Sir Alexander Fleming didn’t discover penicillin, the eventual treatment for syphilis, until 1928), people with syphilis were plagued by rashes, blindness, dementia, open sores and hair loss. The hair loss was particularly problematic in social circles. Long hair was all the rage, and premature balding was a dead giveaway that someone had contracted syphilis.

Wigs, when not used to cover syphilis-related hair loss, were also helpful for those who had lice. After all, it was much more difficult to treat and pick through the hair on one’s head than it was to sanitize a wig.

When it comes to trend-starters, no one had a bigger influence on British wigs than Louis XIV of France. During his reign from 1643 to 1715, the Sun King disguised his prematurely balding scalp — historians believe it was caused by syphilis — by wearing a wig. In doing so, he started a trend that was widely followed by the European upper- and middle-class, including his cousin, Charles II, the King of England (also rumored to have contracted syphilis), who reigned from 1660 to 1685.

Although aristocrats and those who wished to remain in good social standing were quick to adopt the practice of wearing wigs, English courtrooms were slower to act. In the early 1680s, judicial portraits still showed a natural, no-wig look. By 1685, however, full, shoulder-length wigs had become part of the proper court dress.

A Persistent Legacy
Over time, wigs fell out of fashion with society as a whole. During the reign of England’s King George III, from 1760 to 1820, wigs were worn by only a few — namely bishops, coachmen and those in the legal profession. And bishops were permitted to stop wearing them in the 1830s. But the courts kept wigs for hundreds of years more.

In 2007, though, new dress rules did away with barrister wigs — for the most part. Wigs were no longer required during family or civil court appearances, or when appearing before the Supreme Court of the United Kingdom. Wigs, however, remain in use in criminal cases.

And in Ireland, judges continued to wear wigs until 2011, when the practice was discontinued. In other former English and British colonies — Canada, for instance, whose provinces abandoned the wigs throughout the 19th and 20th centuries, or Jamaica, which removed the wigs in 2013 — lawyers and judges now wear wigs only for ceremonies.

Yet, wearing wigs still enjoys popularity among British lawyers, the Guardian reported in 2021. “If you don’t meet the physical stereotypes of a barrister — male, white, perhaps older — it is helpful to wear the uniform because it stops any awkward conversations,” barrister Zoe Chapman told the publication.

Now That’s Interesting
Before the adoption of wigs in the 17th century, British lawyers had a dress code that would seem positively modern. They were expected to appear in court with short hair and neatly trimmed beards.

Is Induction Cooking Better Than Gas or Electric?

Boil a pan of water in under three minutes? Melt butter or chocolate quickly, yet without scorching? It’s all possible with an induction cooktop.

While cooktops powered by induction heating have been favored across Europe for decades, they are now steadily gaining traction in the United States, where the National Kitchen + Bath Association expects them to eventually replace electric cooktops altogether, and market-watcher Technavio anticipates the induction cookware market will blossom to $1.38 billion by 2025.

Gas, Electric or Induction?
There are three main types of cooktops: gas, electric and induction.

If a cooktop is powered by gas or electricity, it relies on thermal conduction, either via gas flames or an electric coil. The heat source conducts heat to the burner itself and then to the pot or pan atop the burner. Whether the burner is an exposed circular heating element, or is covered by a glass or ceramic surface, it requires thermal conduction.

An induction cooktop, however, eliminates the need for anything to conduct heat to the pan. There is no glowing heating element beneath the pot, nor are there gas flames, because an induction cooktop does not rely on thermal conduction. Instead, it generates heat directly in the pot or pan itself — and this direct transfer of energy is becoming a preferred method among chefs and home cooks alike.

How Does Induction Cooking Work?
An induction cooktop has a heat-resistant glass or ceramic surface that may look like an ordinary electric cooktop, but the similarities stop there. Underneath this smooth surface lies a coil of copper wire. When the burner is switched on, an alternating electric current passes through this coil, creating a rapidly fluctuating magnetic field that radiates outward from the coil in all directions but which, on its own, produces no heat.

And therein lies the kicker: an induction cooktop doesn’t produce heat through its underlying electromagnetic coil until — and only until — a cooking pan is placed atop the burner.

When a pan or pot is placed on an induction cooktop, the fluctuating magnetic field passes through the ferromagnetic metal of the pan’s base and induces looping electric currents within it. These looping currents are known as eddy currents, or Foucault currents, the latter named for the French physicist Jean Bernard Léon Foucault, who studied them in the 1850s. As the eddy currents flow against the electrical resistance of the pan’s base, they dissipate energy as heat, and that heat warms the contents of the pot.
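
If you like to see the underlying math, here is a minimal sketch in Python of the relationship at work, Faraday’s law of induction: the faster the magnetic flux through the pan’s base changes, the larger the voltage induced in it. The switching frequency, field strength and loop area below are purely illustrative assumptions, not measurements from any real cooktop.

```python
import math

# Minimal sketch of Faraday's law as it applies to induction cooking.
# Every number here is an illustrative assumption, not a manufacturer spec.

frequency_hz = 24_000      # induction cooktops typically switch at tens of kilohertz
peak_field_tesla = 0.01    # assumed amplitude of the oscillating field under the pan
loop_area_m2 = 0.02        # assumed area of the pan base threaded by that field

# For a sinusoidal field B(t) = B0 * sin(2*pi*f*t), the flux through the pan
# base is B(t) * A, so the peak induced voltage (EMF) is 2*pi*f * B0 * A.
peak_emf_volts = 2 * math.pi * frequency_hz * peak_field_tesla * loop_area_m2

print(f"Peak voltage induced in the pan base: about {peak_emf_volts:.0f} volts")
# That voltage drives the looping eddy currents, and their resistive losses
# in the metal base are what actually heat the food.
```

The exact voltage isn’t the takeaway; the point is that a faster-changing field induces a larger voltage, which is part of why induction coils switch tens of thousands of times per second rather than at the 50 or 60 hertz of household mains power.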

Benefits of Induction Cooking
Induction cooktops are more energy-efficient than other cooktop options primarily because less of the energy they draw is wasted. During induction cooking there is little heat loss: up to 90 percent of the heat energy generated is used to heat the contents of a pan instead of the air around it. A gas or electric stovetop, in comparison, loses up to 35 percent of the heat it generates during cooking.
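
To put those efficiency figures in rough perspective, here is a back-of-the-envelope sketch in Python. Only the 90 percent and roughly 65 percent efficiency figures come from the paragraph above; the 2,000-watt burner and the 1.5 liters of room-temperature tap water are assumptions chosen purely for illustration.

```python
# Back-of-the-envelope comparison: same burner power, different efficiency.
# The 0.90 and 0.65 efficiency figures come from the text above; the burner
# wattage and the pan of water are illustrative assumptions.

WATER_SPECIFIC_HEAT = 4186  # joules per kilogram per degree Celsius

def minutes_to_boil(liters, start_temp_c, burner_watts, efficiency):
    """Estimate minutes to bring water to 100 C, ignoring losses from the pan."""
    kilograms = liters  # 1 liter of water weighs roughly 1 kilogram
    energy_joules = kilograms * WATER_SPECIFIC_HEAT * (100 - start_temp_c)
    useful_watts = burner_watts * efficiency
    return energy_joules / useful_watts / 60

induction = minutes_to_boil(1.5, 20, 2000, 0.90)        # ~90% of the energy reaches the food
gas_or_electric = minutes_to_boil(1.5, 20, 2000, 0.65)  # up to ~35% of the heat is lost

print(f"Induction:       about {induction:.1f} minutes")
print(f"Gas or electric: about {gas_or_electric:.1f} minutes")
```

Real-world times vary widely with burner power, pan size and starting temperature, but the relative gap is the point: far less of an induction burner’s energy goes into warming the kitchen instead of the water.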

An induction cooktop heats faster — and at more precise temperatures — than a gas or electric cooktop, making it a preferred method for professional chefs.

“Electric cooktops are generally known for hot spots on pans, and induction does not get hot spots like electric cooktops, while also allowing the same precision cooking experience usually associated with gas cooking,” says Jessica Randhawa, head chef and recipe creator at The Forked Spoon, in an email interview. “I find that the precision is much more consistent [with induction cooktops] than gas cooktops, allowing for much better overall temperature control of the food being cooked.”

Induction cooktops also have the potential to reduce burns and other related injuries, in large part because the surface of an induction cooktop stays cool to the touch, even when the heating element is turned on.

According to the National Fire Protection Association, “Cooking is the leading cause of reported home fires and home fire injuries in the U.S., as well as a leading cause of home fire death.” Even more prevalent? Non-fire cooking burns caused by “contact with hot equipment, hot cooking liquids or hot food.” Of these injuries, contact with a hot range was the most common source of non-fire cooking burns treated at emergency departments from 2015 to 2019.

“Gas is highly flammable, and the misalignment of a burner on low can create a situation where the flame goes out, and gas floods the kitchen or house; this has happened to me before,” Randhawa says. “Both gas and electric cooktops get very hot when started and tend to hold their heat long after the cooking is done,” she says, “whereas induction cooktops heat up only when the pot or pan is placed directly on the induction zone, and it also cools off quite a bit faster.”

Downsides to Induction Cooking
Cost is one of the chief drawbacks of induction cooking, particularly for home cooks, with induction cooktop ranges averaging anywhere from $2,000 to $4,000 — several times the approximately $500 cost of an average electric range.

And you may need to purchase a different set of pots and pans made specifically for an induction cooktop. Induction works only if a pot or pan is made of ferromagnetic metal, such as cast iron, enameled cast iron and some stainless steels. Stainless steel that is high in nickel may not work, because the nickel makes the alloy less magnetic and keeps it from coupling with the cooktop’s field. In addition, older types of aluminum, copper or glass cookware are not compatible with an induction cooktop, although some manufacturers are now adding a magnetic layer to the bottom of these items.

“Induction cookware is also well known for its temperature acceleration, reducing the cooking times for simple tasks like water boiling by half,” Randhawa says, which may help turn the tide for consumers who are wary of induction cooking.

If you’re unsure whether your cookware will work with an induction cooktop, you can test it by holding a magnet near the bottom of the pan. If it adheres, it is magnetic and — as long as it has a flat bottom surface — will work well with an induction cooktop.

New cookware aside, induction may well be the cooktop of the future. Induction cooktops have long been employed in professional kitchens throughout Europe and have been gaining ground among professionals and enthusiasts alike throughout the United States as concerns about climate change push kitchen cookery from natural gas to all-electric — prompting some cities to bar natural gas lines in newly constructed homes.

Now That’s Interesting
When compared to gas cooktops, induction cooktops may help reduce indoor air pollutants such as methane. A Stanford University study found that the methane leaking from natural gas-burning stoves in U.S. homes has a climate impact equivalent to the carbon dioxide emissions of about 500,000 gas-powered cars.

How Long Should You Really Go Without Washing Your Jeans?

Those jeans you’re wearing? They’re part of a decades-old debate about whether — and when — you should wash denim. It’s a contested topic filled with pseudoscience and conjecture, one centered around an Odyssean journey designed to coax a legendary article of clothing into the perfectly worn pair of jeans.

How often jeans should be laundered is dependent on a number of factors, including fabric, dye and your personal feelings on bacteria. But first, Ben Bowlin, host of our accompanying BrainStuff video, lets you in on some surprising information. Denim is only partially dyed, so if you prefer a deep indigo color, think long and hard before putting those pants in the washing machine and definitely turn them inside out. (Actually, you may want to think long and hard about how attached you are to that indigo color anyway, considering how water-intensive and toxic the process of dyeing denim can be. But that’s a separate article.)

Denim is created when cotton fibers are made into a twill weave. In a twill weave, a yarn called the weft is woven crosswise, passing over and under vertically placed warp fibers. Typically, only the warp threads are dyed. This means the weft threads remain white, a quality that gives the inside of blue jeans its lighter color.

Plus, the blue shade on the warp threads comes from an indigo dye — a dye that doesn’t penetrate cotton fibers. Indigo sits atop the surface of each thread that makes up the yarn, its molecules chipping away over time and causing the fabric to fade.

Each fade pattern is so distinctive that the FBI can analyze denim fade patterns to track criminals, identifying telltale whisker patterns on the front and honeycomb patterns behind the knees.

Washed and artificially distressed denim has already been broken in, allowing you to see its particular fade right away.

Raw denim — you’ll know it by its stiff feel — will fade naturally over time in a pattern based on your activities. The longer you go without washing raw denim, the more personalized the jeans will become, displaying a customized set of fading patterns. If you can wait to wash, you’ll also preserve the indigo and stiff texture of the denim. You can take care of any odors by spritzing your jeans with some fabric spray, too.

But what about the bacteria colonizing the denim on your lower hemisphere? In 2011, a microbiology student at the University of Alberta put it to the test. He went 15 months without washing his jeans, then tested the denim’s bacterial content. He compared the findings to another pair of jeans that had been washed a mere two weeks earlier. The bacteria content on both pairs of jeans was nearly identical. No less a denim authority than Levi’s recommends washing your jeans once every 10 wears, at most, adding that some Levi’s staffers have jeans they’ve never washed. Ever.

So if you don’t want to wash your jeans, how do you keep them clean? Levi’s used to recommend freezing jeans to kill bacteria and odor, something that was later proven to be a myth.

Most of the bacteria on our jeans comes from our skin, and these germs are adapted to living at low temperatures. Stephen Cary, a frozen microbe expert at the University of Delaware, says you’d be better off heating the jeans to 121 degrees Celsius (249 degrees Fahrenheit) for 10 minutes. Or, he adds, you could just, y’know … wash them.

Now That’s Interesting
Since 1999, a Denim Day campaign has encouraged people to wear jeans on a Wednesday in April to raise awareness about sexual violence. It started after the Italian Supreme Court overturned the conviction of a driving instructor who raped a female student, citing the tightness of her jeans. In 2022, Denim Day fell on April 27.

The Atlas Moth Is a Behe-moth, Plus 5 Other Facts

Among the roughly 160,000 species of moths in the order Lepidoptera, the Atlas moth stands out as one of the largest in size and one of the shortest in life span. The Atlas moth also has a striking appearance, with large and colorful wings that could easily rival any beautiful butterfly.

Although the Atlas moth life cycle may seem simple, this giant moth plays a complex role in its natural habitat, where its adult life is driven by a primary reproductive urge as soon as the sun sets. What other secrets does the Atlas moth carry? Check out these surprising facts about the Atlas moth.

1. The Atlas Is a Massive Moth
The Atlas moth is probably best known for its size. The Atlas moth (Attacus atlas) is a species of moth in the giant silkworm family Saturniidae, and it has one of the largest wingspans — and wing surface areas — among its kind. The average female Atlas moth is usually larger than the male and has a wingspan of up to 12 inches (30.5 centimeters) with a total wing surface area of up to about 62 square inches (400 square centimeters).

2. The Wings Come With a Warning
An Atlas moth that is perched on a branch with its wings closed may seem like an easy target to any of its natural predators, but by making just one move — opening its wings — the Atlas moth can flip the script. The upper corner of each Atlas moth wing mimics the distinctive profile of a cobra’s head, acting as an immediate deterrent (times two!) to the birds or lizards that would otherwise consider it a readily available meal.

When the Atlas moth becomes frightened, it opens its wings and shakes them to imitate the movements of a snake, and in doing so, may fend off its attacker. In addition, the pattern on the wings of the Atlas moth will sometimes resemble the pupils of watching eyes. These “eyes” may also deter predators.

3. The Atlas Is Fond of Forests
The Atlas moth lives in forested areas of Asia, ranging from India to the Philippines and south to Indonesia. It has adapted to life in a variety of forested climates, from tropical and lowland to upper mountain forests. The female Atlas moth lays eggs on the underside of a leaf and after seven to 14 days, the eggs hatch into large caterpillars.

4. The Atlas Has an Underdeveloped Mouth and Does Not Eat
The caterpillar that eventually emerges from a cocoon as an Atlas moth has gotten to this point by storing up food reserves. As an adult, it won’t taste another bite. “The adult form of the Atlas moth no longer eats food because it has an underdeveloped mouth with a tiny and non-functioning proboscis,” says Craig Miller, co-founder of Academia Labs, in an email. He has taught courses about the Atlas moth, collaborated with Atlas moth researchers and witnessed the moth in its natural habitat during a visit to Southeast Asia. “The adult Atlas moth mainly relies on the reserves it stored when it was still in the caterpillar stage.” These reserves are designed to offer about a week of energy to an Atlas moth, allowing it to survive long enough to mate.

5. The Male Has Mating on the Mind
If these giants-among-moths were starring in a new reality series, it would probably be titled “Atlas Moths: After Dark.” The male Atlas moth waits until the sun goes down to seek out a female and will locate her, not by sight, but by following the heady scent of the mating pheromones she releases. Once united, male and female Atlas moths will mate (sometimes for up to 24 hours) and then the female will lay more than 100 eggs on the underside of a leaf. This life cycle — from moth to mate to mother — will take place in a single week and will end in death. Likewise, the male Atlas moth dies after mating.

6. Atlas Cocoon Silk Is Used to Make Coats
The cocoons of the Atlas moth caterpillar are composed of broken strands of light brown fagara silk, a durable silk spun by the caterpillar as it prepares to pupate. Cocoons of the Atlas moth are typically about 2 inches (6 centimeters) in length. The name for the silk, ‘fagara silk,’ is believed to have originated from an archaic generic name for a genus of trees upon which Atlas caterpillars are known to feed, the Zanthoxylum genus. When harvested commercially, the silk of the Atlas moth is used to make purses, light coats, shirts, scarves and other wearable items.

Now That’s Interesting
The impressive wings of an Atlas moth are not only large, but they’re also colorful, with deep shades of red and brown outlined at times by black, yellow or eggplant. By contrast, the cocoon from which an Atlas moth hatches is a monochrome and muted light brown. Once abandoned by the Atlas moth, the tightly woven and durable cocoons are sometimes collected and repurposed as small coin purses.