
Sunday, November 30, 2014

Fruitcake Upgrade: A Recipe for Christmas Stollen

This German sweet bread makes a delicious gift—especially once you ditch the Day-Glo cherries. A dash of nutmeg and a nip of brandy give this recipe its festive kick

By Georgia Freedman in the Wall Street Journal

IT ARRIVED EVERY year in mid-December, a hefty brick of a package wrapped in red cellophane and tied with a thin ribbon. Inside was my Great-Aunt Barb’s stollen, a traditional German Christmas bread that she made in huge batches and sent out to family and friends. The stollen was dense and buttery and brandy-spiked, studded with almonds and jewel-like chunks of candied orange and citron, dried pineapple and golden raisins. The bread itself was not very sweet, but Aunt Barb topped each loaf with snow-white frosting and cherries dyed a red so bright they practically glowed like the lights on our Christmas tree.
Throughout my childhood, Aunt Barb’s stollen was an integral part of our Christmas ritual, a treat eaten midmorning, between the frenzied free-for-all of the stockings and the more measured, ceremonial opening of the presents under the tree. My mother served slices of it on Christmas-themed napkins, often with a thick slab of brie on top.
Made from yeasted dough, traditional stollen becomes soft and moist with the addition of butter and rum, which preserve the bread so that it can ripen for a few weeks and absorb the flavors of the fruit. It’s a beloved Christmas treat all over Germany, where it has been made in one form or another for centuries. In Dresden, there’s even a “protection association” dedicated to ensuring that loaves bearing the name contain the right proportions of butter, fruit and almonds.
Aunt Barb’s family had German roots, but in many ways her stollen was very American. The candied fruits she used reflected her cosmopolitan life in New York, where she had access to specialty stores like Zabar’s. Additionally, her version was preserved with brandy rather than rum, giving it a flavor that must have been popular with a generation used to drinking Sidecars at cocktail hour.
That combination of flavors defined my Christmases for years. But then, when I was in college, Aunt Barb passed away suddenly, and, just like that, stollen disappeared from our family ritual. No one felt up to the challenge of such an ambitious baking project.
This year, however, will be different. For the first time, I will be on the other side of the Christmas equation: With the birth of my daughter this past spring, I have become not just a mother but the Tooth Fairy, the Easter Bunny and Santa Claus. To share my family traditions with the next generation, I have decided to become the stollen maker, too.
A few weeks ago I reached out to Aunt Barb’s children, and they sent me a copy of her original recipe. Only then did I discover that our family’s stollen tradition had begun with Barb’s grandmother (my great-great-grandmother), Margaret Georg, who had left Barb a copy of the recipe in elegant handwriting. Over the years, Barb had tweaked the recipe to suit her own tastes and updated it in painstaking detail.
For my first try at stollen, I followed Barb’s instructions closely. I soaked the fruit overnight in the brandy, made a yeast starter, added butter, sugar, eggs and more brandy, then kneaded in flour until I had a pliant, stretchy dough. After letting the dough rise, I added an astonishing volume of fruit and nuts and kneaded the whole mixture until it somehow all stayed together. When the loaves were baked they tasted exactly as I had remembered, boozy and wonderful, but a little artificial from the industrial-grade candied fruit.
Curious about how more traditional iterations would compare, I sought out other recipes, and soon I was testing stollen filled with the classic mix of nuts, raisins and citrus peel; loaves flavored with a caravan’s worth of spices; and ones made with various liquors. Each was lovely in its way, but the more I tried, the more I felt pulled back to Barb’s recipe.
What I really wanted, I realized, was to take the stollen I grew up with and update it, just as Barb had done. I went back to her instructions, but swapped out the neon-hued candied fruits for undyed natural versions. I also added a touch of nutmeg to the dough, to complement the flavor of the brandy. And when the loaves came out of the oven, I brushed them with melted butter and topped them with powdered sugar instead of frosting. The result was a perfect balance of old and new, and so good that I’m planning a whole day of baking in early December so that I can make a huge batch of stollen to send to friends and relatives. Wrapped in red cellophane, of course.

The Failure of Tribal Schools

Despite being seen as a way up for Native Americans, tribal colleges often fail to produce results. With high costs and low graduation rates, their existence is being questioned.

By Sarah Butrymowicz in The Atlantic

Breanne Lugar says the only reason she enrolled in college was so she could move away from the house she shared on the Standing Rock Indian Reservation with her parents, her boyfriend, and her five children.
"I never wanted to come to school," says Lugar, 26, who signed up at Sitting Bull College, one of the nation’s tribal colleges and universities located on Indian reservations and run entirely by tribes. "I hated school."
But after a semester of classes toward a degree in business administration helped her move from a job as a blackjack dealer to the finance department of the tribal casino, Lugar, a sophomore, has become a fervent advocate of the college.

She and other Native Americans say the best way for their tribes to solve their problems, including poverty and high rates of drug use and suicide, is through higher education. On Standing Rock, one of the nation’s poorest reservations, 43 percent of people lived in poverty in 2012, according to Census figures—three times the national average. Meanwhile, only 15 percent had bachelor’s degrees, compared to more than 30 percent of all Americans.
There are 32 accredited tribal colleges and at least five non-accredited schools offering associate, bachelor’s, and even some master’s degrees. Tribal college advocates say that the schools give opportunities to students in sprawling, geographically isolated Native communities and that their mission is broader than producing degrees. Many offer language classes to all those living on reservations to help prevent Native languages from going extinct; they also work with local businesses and attempt to address social problems on the reservation.
But despite getting more than $100 million a year in federal funding—including grants low-income students use to pay tuition—tribal colleges often have abysmal success rates. The average percentage of students who earn four-year degrees within six years (or two-year degrees within three years) at these schools is only 20 percent, according to a Hechinger Report analysis of federal graduation data—one third the national average and half the rate of Native students at non-tribal schools. These statistics only include first-time, full-time students, but at some tribal colleges, fewer than one in 10 of them ever finish.
"There’s not a lot of value for the student or for the tribes or the economies where they are," says Tom Burnett, a former Montana state senator who has been critical of tribal colleges.
The schools, most of which allow anyone to attend, say their poor outcomes are largely due to the many challenges students face before college even begins, including poor preparation in primary and secondary schools. Less than 70 percent of Native students graduate from high school, according to research by the U.S. Department of Education.
"The dilemma that we’re facing is we’re open admissions," said Thomas Shortbull, president of Oglala Lakota College on the Pine Ridge Reservation in South Dakota. "We do have a major problem with our students’ [preparedness]."
College accountability advocates are sympathetic to this argument—but only to a point.
"You can’t just say, 'That college has opened its doors wide and it has a low graduation rate, therefore it’s terrible,'" says Mark Schneider, vice president of the American Institutes for Research. "On the other hand, you can’t just say, ‘What do you expect?’"
Schneider and others argue that taxpayers spending tens of millions on tribal colleges and universities deserve to get more for their money.
"In higher education the federal government has essentially had a hands-off approach to their federal investment," says Mary Nguyen Barry, a policy analyst at Education Reform Now and co-author of Tough Love: Bottom-Line Quality Standards for Colleges, who says the government should try to help low-performing schools improve their graduation rates. If they can’t, Barry says, they should be cut off.
Struggling tribal schools would likely welcome extra support. Congress sets tribal college funding and is authorized by federal law to give schools a maximum of $8,000 per student. But in reality the schools get $5,850 per student on average. And that funding can be used only for Native American students; nearly a fifth of those enrolled don’t identify as Native. By comparison, Howard University, a historically black college, averages more than $20,000 per student from the federal government.
"We want to see that the federal government is supporting our tribal colleges and universities as they are supporting any other minority-serving institution or state institution," says Victoria Vasques, former director of the White House Initiative on Tribal Colleges under President George W. Bush.
"Without tribal colleges, who would try to help these people?"
But Burnett says the better way of calculating this is by looking at the cost per degree awarded, not the cost per enrolled student. For example, the tribal Institute of American Indian Arts in New Mexico spends $504,000 for every degree it confers, he says—more than Harvard University or the Massachusetts Institute of Technology. Officials at the school, when contacted, would not comment about these costs.
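To see why Burnett prefers that measure, here is a minimal sketch in Python of the underlying arithmetic. The spending, enrollment, and degree counts below are purely hypothetical placeholders (only the idea of comparing per-student and per-degree cost comes from the article); they simply show how a school can look inexpensive per enrolled student yet extraordinarily expensive per degree when few students graduate.

    # Minimal sketch of Burnett's comparison: cost per enrolled student
    # versus cost per degree conferred. All figures below are hypothetical
    # placeholders, not data reported in the article.

    def cost_per_student(total_spending: float, enrolled: int) -> float:
        """Average spending per enrolled student."""
        return total_spending / enrolled

    def cost_per_degree(total_spending: float, degrees_awarded: int) -> float:
        """Average spending per degree actually conferred."""
        return total_spending / degrees_awarded

    spending = 5_000_000   # hypothetical annual spending
    enrolled = 600         # hypothetical enrollment
    degrees = 12           # hypothetical degrees conferred that year

    print(f"Per student: ${cost_per_student(spending, enrolled):,.0f}")  # ~$8,333
    print(f"Per degree:  ${cost_per_degree(spending, degrees):,.0f}")    # ~$416,667

With few graduates, the per-degree figure balloons even when per-student spending looks modest, which is the gap Burnett's framing is meant to expose.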
In 2011, President Barack Obama signed an executive order creating the White House Initiative on American Indian and Alaska Native Education, which has supported programs to teach Native languages, including those at tribal colleges, and focused on informing tribal college administrators of grant opportunities. In a June speech at Standing Rock, Obama spoke of the need for the federal government to support economic and education development on reservations.  
Treaties between Indian tribes and the United States require the federal government to pay for education on reservations. Originally, this was taken to mean K-12 schools, but that assumption was questioned in the 1960s, according to Shortbull. It was around that time that Shortbull graduated from high school and was one of 20 students in his class to go on to the University of South Dakota. Four years later, he said, only two of them had earned degrees. "Our elders on the reservation said that this was unacceptable," he said. "Why couldn’t we create a college to educate our own people?"
By 1975, elders from tribes around the country were lobbying Congress, and the first tribal colleges began to open. Today, they collectively enroll nearly a tenth of Native Americans who attend colleges and universities nationwide.  
At Shortbull’s college, the most popular degrees offered are nursing and elementary education, two of the biggest careers in Pine Ridge. Nearly two-thirds of nurses on the reservation are graduates of Oglala Lakota, as are about 45 percent of the teachers, according to Shortbull. But many students struggle to make it past their first year. Two-thirds arrive at Oglala Lakota needing at least one remedial class in math or English to make up for material they should’ve learned in primary and secondary school but didn’t. Of those, two-thirds never get any further. In 2012, only 12 percent of Oglala Lakota students graduated after six years, according to federal data.
Part of the problem is that there aren’t many jobs on reservations, meaning even college graduates can easily be unemployed, says Stephanie Sorbel, who manages the college’s campus center in Kyle, one of the reservation’s largest towns. Anti-drug and alcohol messages painted on plywood flank the main road into Kyle, and suicide-prevention notices hang outside every room of the Oglala Lakota center.
From the parking lot, Sorbel can point to nearly all of Kyle’s employment opportunities. There are a few jobs requiring a college education at the health clinic up the road and at the daycare center on the campus. Tanka, a buffalo meat snack company, is headquartered in Kyle, but openings there are rare. There is one sign of growth, though: Some Oglala Lakota grads just opened a movie theater.
Despite his school’s low success rate, Shortbull says, its existence is vital. "Without tribal colleges, who would try to help these people?"
Like Lugar, many Native American students choose tribal colleges because they’re more convenient than other higher-education institutions and they feel more comfortable staying on the reservation.
"History tells us that if we didn’t have the colleges here many of our students would go off [the reservation] and they wouldn’t do well," Sitting Bull College President Laurel Vermillion said, adding that the majority of her students transfer there from an off-the-reservation school.
But Burnett argues that attending low-performing schools won’t help students. "Going back to a safe harbor that leads you nowhere is no solution."

This story was produced in collaboration with The Hechinger Report.

Thanksgiving: Who's on First

By Jeremy Lott

Before entering our mandatory food comas this year, we pause to reflect on the curious fact that several historical societies regard the official Plymouth Plantation Thanksgiving story as a bit of Massachusetts pro-Pilgrim propaganda.
These local history buffs are not anti-Thanksgiving -- not even anti-Pilgrim, necessarily. They point out that the first of the Pilgrims' two feasts, held in October 1621 (the second followed in July 1623), was more of a harvest celebration than an official cycle of fasting and gorging and thanksgiving to God.
The boosters tend to make these points and then slyly slip it in there that their very own favored Thanksgiving(s) occurred first.
When and where were these tentative turkey days, these rival First Thanksgivings? So glad that you asked...
"The very first Thanksgiving occurred in Virginia." Full marks to the Commonwealth of Virginia's website for not letting Massachusetts have those beloved holiday tourism dollars without a fight.
"Each year," explains VirginiaBot, "visitors are invited to join in the festivities at the Virginia Thanksgiving Festival hosted by Berkeley Plantation, site of the very first Thanksgiving in 1619. Enjoy this day dedicated to history and food, and including house tours of the beloved 1726 Berkeley Plantation manor house."
National Geographic reminds us the Commonwealth may have an even earlier claim. The Jamestown colonists were mightily thankful and held a feast in 1610, "when the arrival of a food-laden ship ended a brutal famine." Yet that still may not be early enough to win the Thanksgiving revisionist no-prize.
In 1607, the colonists of a short-lived English settlement in Maine made landfall on August 18 and held a Thanksgiving celebration the very next day. Though these colonists had a reputation for being "riffraff and scoundrels," wrote the late Christian Science Monitor columnist John Gould, "when ordered to attend church services, they went."
These men "devoutly took part in a Christian service of thanksgiving, followed by a feast of thanksgiving cooked on the shore. ...The first thanksgiving dinner in the New World was Maine lobster with steamed mussels and boiled dried peas. Some of the men found pearls in the mussels."
"No Indians attended," wrote Gould. He speculated, "Squanto might have come if he'd been invited, as he lived at the Indian village of Pemaquid, just a moccasin step upstream." Instead, he would have to wait a few years for a different Thanksgiving celebration.
Robyn Gioia is the schoolteacher responsible for "firing the next shot across the Mayflower's bow," according to USA Today. Gioia wrote America's REAL First Thanksgiving: St. Augustine, Florida, September 8, 1565, a title which gets us most of the way there, explanation-wise.
"What does REAL mean?" asked reporter Craig Wilson. "Well, [Gioia's] not talking turkey and cranberry sauce. She's talking a Spanish explorer who landed here on September 8, 1565, and celebrated a feast of thanksgiving with Timucua Indians. They dined on bean soup."
Texas Governor Rick Perry has weighed in on the Thanksgiving debate, claiming first-in-nation status for his own state. According to the Texas Legislative Reference Library, "the first Thanksgiving celebration in the United States took place in 1598 near El Paso."
The Library explains that an expedition "led by Spanish explorer Don Juan de Onate journeyed from Mexico and, after months of arduous travel, arrived at the Rio Grande near what is now San Elizario. The exploration party and the indigenous people celebrated their accomplishments with a feast and Catholic ceremonies -- 23 years before the Pilgrims held their famous dinner at Plymouth Rock."
Texas has an even earlier claim that predates Florida's Thanksgiving, which it curiously refuses to press. "In 1541 Spaniard Francisco Vasquez de Coronado and his troops celebrated a 'Thanksgiving' while searching for New World gold in what's now the Texas Panhandle," reports National Geographic.
Robert Malkin is a trolley driver and tour guide in St. Augustine, Florida. He enjoys pointing out the town's storied history, yet he doesn't mention the town's first Thanksgiving, so USA Today asks him about it. "Well, it's very arguable. I also don't think they called it Thanksgiving. You can't even call it Thanksgiving if it's not even English. Thanksgiving is an English word," he says.
That sort of answer raises St. Augustine Historical Society director Susan Parker's hackles. "There's a tradition of diminishing the Catholic presence of our early history," Parker complains.
Parker blames a reflexive "Protestant twist" in American historiography for the fact that neither Florida nor Texas is acknowledged for its first Thanksgiving. They just don't have Mass appeal.
The trolley driver has half a point with his notion of Thanksgiving as an English thing, though it's narrower than that. The Catholic Church had created a calendar chock full of feast days and holy days of obligation. Protestants thought this approach un-Biblical and wasteful and tried to blot most of the holidays off the calendar. Puritans were even anti-Christmas.
Yet people want and need holidays, it turns out. So the Reformers proposed most church holidays be replaced by specially declared days of fasting and days of thanksgiving. Whether this applied to the Pilgrims' first Thanksgiving(s) is arguable, but the idea is not, and it played a prominent role in American history.
As general and as president, George Washington called for national days of Thanksgiving to God. According to the Mt. Vernon estate website, President Washington observed the first truly national American Thanksgiving on November 26, 1789, "by attending services at St. Paul's Chapel in New York City" -- the country's temporary capital -- "and by donating beer and food to imprisoned debtors in the city."
In the middle of his country's rancorous Civil War, President Abraham Lincoln looked to George Washington's example. The sixteenth president declared a national day of Thanksgiving to God, to be held on the last Thursday of November.
In spite of the war, Lincoln observed, the skies had been "healthful" and the fields "fruitful." And to these bounties "which are so constantly enjoyed that we are prone to forget the source from which they come, others have been added," Lincoln argued, "which are of so extraordinary a nature that they cannot fail to penetrate and soften even the heart which is habitually insensible to the ever watchful providence of Almighty God."
In some ways, Lincoln was only recognizing a holiday that many Americans were already celebrating. He had already declared a few local Thanksgivings and folks in much of the country had started writing about and celebrating the First Thanksgiving at Plymouth.
As an added bonus, the story had in it many attractive elements for someone trying to bind the country back together: particularly the Indians and the Pilgrims coming together, putting their great differences aside, sitting down at the table of brotherhood, and rendering themselves temporarily peaceful through gluttony.
In response to pressure from struggling retailers, President Franklin Roosevelt tried to bump Thanksgiving up a week in 1939, from the last Thursday in November to the next-to-last. This move, explains About.com, proved wildly unpopular: "Atlantic City's mayor derogatorily called November 23 'Franksgiving.'"
Many people refused to go along. As a result, "The country became split on which Thanksgiving they should observe." Initially, only 23 states followed FDR's lead. This created so many scheduling headaches that Congress eventually rebuffed the president, fixing the holiday on the fourth Thursday in November.
Most Americans know that every year a president undertakes the cheesy ceremony of pardoning a turkey. What they probably don't know is that he pardons two of them: the official Thanksgiving Turkey and an alternate, in case something should happen to the First Gobbler.
The second turkey represents the vice president. As presidential traditions go, this one is about as harmless and diverting as anything we could ever conceive. Plus it has the potential to take some feathers out of the politicians, by designating as their fowl representatives a couple of turkeys.
Still, it might be best, just this once, to extend the pardoning power to the Vice President. Just imagine watching the speech Joe Biden would deliver if given this august responsibility.
That presidentially pardoned turkey is a goner, sadly.
National Journal recently ran the article, "Soon, President Obama Will Pardon a Thanksgiving Turkey. Then, It Will Die." The Journal pointed out that all eight turkeys Barack Obama had pardoned to date had died. In fact, "Only one turkey pardoned by the president has lived to see a second Thanksgiving."
George W. Bush's turkeys didn't fare better. Bush sent pardoned gobblers to a Virginia farm to live out their days. Asked for comment about the pardonees' longevity, the farmer unsentimentally said "we usually just find 'em and they're dead."
Domestic turkeys "are so fat that without human intervention, [they] would go extinct." The Journal explained that the orotund birds are "physically incapable" of reproducing on their own.
The modern Thanksgiving dinner is "remarkably consistent in its elements: the turkey, the stuffing, the sweet potatoes, the cranberry sauce," says Yankee Magazine. In fact, "Barring ethical, health, or religious objections, it is pretty much the same meal for everyone, across latitudes and longitudes, and through the years of their lives. We stick with the basics and simply change the seasonings."
The best argument for a more syncretistic approach to Thanksgiving is the food. Imagine a truly inclusive Thanksgiving with lobster and mussels and bean soup and perhaps some Tex-Mex. It would beat the heck out of most boring American turkey day dinners.
But even a more authentic Pilgrim Thanksgiving would be a major improvement. Yankee says that in that alleged First Feast, "venison was a major ingredient, as well as fowl, but that likely included pheasants, geese, and duck," more so than the dreaded dry turkey.
Other likely ingredients include onions and herbs, cranberries, currants, watercress, walnuts, chestnuts, beechnuts, sunchokes, shellfish, beans, pumpkins, squashes and corn "served in the form of bread or porridge."
We'd say more on this score but the Pavlovian reflex is threatening to short out the keyboard.
Our neighbor above the 49th parallel has its own Thanksgiving. It's held over a month earlier than the American holiday because late-November weather in the Great White North can make travel treacherous. Who knew?

Jeremy Lott is an editor of Rare and author, most recently, of William F. Buckley.



The Paleo Diet: Unprovable and Probably Wrong


Editor's Note: This article was originally published at The Conversation.

We still hear and read a lot about how a diet based on what our Stone Age ancestors ate may be a cure-all for modern ills. But can we really run the clock backwards and find the optimal way to eat? It’s a largely impossible dream based on a set of fallacies about our ancestors.
There are a lot of guides and books on the palaeolithic diet, the origins of which have already been questioned.

It’s all based on an idea that’s been around for decades in anthropology and nutritional science; namely that we might ascribe many of the problems faced by modern society to the shift by our hunter-gatherer ancestors to farming roughly 10,000 years ago.
Many advocates of the palaeolithic diet even claim it’s the only diet compatible with human genetics and contains all the nutrients our bodies apparently evolved to thrive on.
While it has real appeal, when we dig a little deeper into the science behind it, we find that the prescription for a palaeolithic diet is little more than a fad, and one that might be dangerous to our health.
Mismatched to the modern world
The basic argument goes something like this: over millions of years natural selection designed humans to live as hunter-gatherers, so we are genetically “mismatched” for the modern urbanised lifestyle, which is very different to how our pre-agricultural ancestors lived.
The idea that our genome isn’t suited to our modern way of life began with a highly influential article by Eaton and Konner published in the New England Journal of Medicine in 1985.
Advocates of the palaeolithic diet, traceable back to Eaton and Konner’s work, have uncritically assumed a gene-culture mismatch has led to an epidemic in “diseases of civilisation”.
Humans are, it’s argued, genetically hunter-gatherers and evolution has been unable to keep pace with the rapid cultural change experienced over the last 10,000 years.
These assumptions are difficult to test and, in some cases, outright wrong.
What did our Stone Age ancestors eat?
Proponents of the palaeolithic diet mostly claim that science has a good understanding of what our hunter-gatherer ancestors ate.
Let me disabuse you of this myth straight away – we don’t – and the further back in time we go, the less we know.
What we think we know is based on a mixture of ethnographic studies of recent (historical) foraging groups, reconstructions based on the archaeological and fossil records and more recently, genetic investigations.
We need to be careful because in many cases these historical foragers lived in “marginal” environments that were not of interest to farmers. Some represent people who were farmers but returned to a hunter-gatherer economy while others had a “mixed” economy based on wild-caught foods supplemented by bought (even manufactured) foods.
The archaeological and fossil records are strongly biased towards things that will preserve or fossilise and in places where they will remain buried and undisturbed for thousands of years.
What this all means is we know little about the plant foods and only a little bit more about some of the animals eaten by our Stone Age ancestors.
Many variations in Stone Age lifestyle
Life was tough in the Stone Age, with high infant and maternal mortality and short lifespans. Seasonal shortages in food would have meant that starvation was common and may have been an annual event.
People were very much at the mercy of the natural environment. During the Ice Age, massive climate changes would have resulted in regular dislocations of people and the extinction of whole tribes periodically.
Strict cultural rules would have made very clear the role played by individuals in society, and each group was different according to traditions and their natural environment.
This included gender-specific roles and even rules about what foods you could and couldn’t eat, regardless of their nutritional content or availability.
For advocates of the palaeolithic lifestyle, life at this time is portrayed as a kind of biological paradise, with people living as evolution had designed them to: as genetically predetermined hunter-gatherers fit for their environment.
But when ethnographic records and archaeological sites are studied we find a great deal of variation in the diet and behaviour, including activity levels, of recent foragers.
Our ancestors – and even more recent hunter-gatherers in Australia – exploited foods as they became available each week and every season. They ate a vast range of foods throughout the year.
They were seasonally mobile to take advantage of this: recent foraging groups moved camps on average 16 times a year, but within a wide range of two to 60 times a year.
There seems to have been one universal, though: all people ate animal foods. How much depended on where on the planet you lived: rainforests provided few mammal resources, while the arctic region provided very little else.
Studies show on average about 40% of their diet comprised hunted foods, excluding foods gathered or fished. If we add fishing, it rises to 60%.
Even among Arctic people such as the Inuit, whose diet at certain times consisted entirely of animal foods, geneticists have failed to find any mutations enhancing people’s capacity to survive on such an extreme diet.
Research from anthropology, nutritional science, genetics and even psychology now also shows that our food preferences are partly determined in utero and are mostly established during childhood from cultural preferences within our environment.
The picture is rapidly emerging that genetics play a pretty minor role in determining the specifics of our diet. Our physical and cultural environment mostly determines what we eat.
Evolution didn’t end at the Stone Age
One of the central themes in any palaeolithic diet is the argument that our bodies have not evolved much over the past 10,000 years to adapt to agriculture-based food sources. This is nonsense.
There is now abundant evidence for widespread genetic change that occurred during the Neolithic or with the beginnings of agriculture.
Large-scale genomic studies have found that more than 70% of protein-coding gene variants and around 90% of disease-causing variants in living people whose ancestors were agriculturalists arose in the past 5,000 years or so.
Textbook examples include genes associated with lactose tolerance, starch digestion, alcohol metabolism, detoxification of plant food compounds and the metabolism of protein and carbohydrates: all mutations associated with a change in diet.
The regular handling of domesticated animals, and crowded living conditions that eventually exposed people to disease-bearing insects and rodents, led to an assault on our immune system.
It has even been suggested that the light hair, eye and skin colour seen in Europeans may have resulted from a diet poor in vitamin D among early farmers, and the need to produce more of it through increased UV light exposure and absorption.
So again, extensive evidence has emerged that humans have evolved significantly since the Stone Age and continue to do so, despite some uninformed commentators still questioning whether evolution in humans has stalled.
A difficult choice
In the end, the choices we make about what to eat should be based on good science, not some fantasy about a lost Stone Age paradise.
In other words, like other areas of preventative medicine, our diet and lifestyle choices should be based on scientific evidence not the latest, and perhaps even harmful, commercial fad.
If there is one clear message from ethnographic studies of recent hunter-gatherers it’s that variation – in lifestyle and diet – was the norm.
There is no single lifestyle or diet that fits all people today or in the past, let alone the genome of our whole species.