
POSTED: Monday, July 2, 2012, 4:19 PM

The cover review of the New York Times Book Review featured a journalist/philosopher writing on a new book by another journalist/philosopher. I’m biased here because the author of America the Philosophical is Carlin Romano, who wrote for many years for the Inquirer and whose features are much missed. The reviewer is Anthony Gottlieb, who was my editor when I was an intern at the Economist and who is also a friend. His book, The Dream of Reason, is one I’m constantly consulting. That’s how it ended up in the picture above.

Both authors have backgrounds in philosophy but are journalists at heart. Both see philosophy as something that the general public can understand without any so-called dumbing down.
There are some places where the reviewer and the author disagree, but both seem to share a faith in the American intellect. And so they agree that reports of our intellectual demise have been greatly exaggerated:

 “Idiot America: How Stupidity Became a Virtue in the Land of the Free,” “Unscientific America: How Scientific Illiteracy Threatens Our Future” and “The Age of American Unreason” are just three of the books from American writers in the past five years that belabor religious fundamentalism, conservative talk shows, scientific illiteracy or the many available flavors of junk food for thought. The fallacy of such books, as Romano argues, is that they take some rotten parts for the largely nutritious whole. It’s not so much that they compare American apples with foreign oranges, but that they fail to acknowledge that the United States is an enormous fruit bowl. Everything is to be found in it, usually in abundance, including a vibrant intellectual life. Rather like that of India — which has over a third of the planet’s illiterate adults but also one of the largest university systems in the world — the intellectual stature of America eludes simple generalizations.

POSTED: Saturday, June 30, 2012, 6:09 PM
Melting ice floats away from polar icecap. (Credit: The Nature Conservancy)

Below is a slightly expanded version of the column that will run in Monday's Health and Science section of the Philadelphia Inquirer:

Several of the regular readers of this column have told me that since I’ve been brave enough to tell the truth about evolution, I should do the same for climate change and expose it as a hoax. In one case I replied that in my stories I always strive to reflect the truth to the best of my abilities. He wrote that he was “disappointed.” These evolution-accepting climate change “skeptics” are an interesting breed, revealing some key differences in the ways they and creationists approach science.

Self-described climate skeptics are much more scattered in their views than are creationists, but they are better organized and together speak with a louder, angrier voice.

POSTED: Friday, June 29, 2012, 12:53 PM

For some of our ancient relatives, that sentence should be read with “eats” as the verb and everything else as food items, at least according to a new paper published in Nature this week.

Actually, it was fruits, leaves, bark, and other forest vegetation that apparently made up the diet of a possible human ancestor known as Australopithecus sediba. Researchers reached that conclusion by examining 2-million-year-old teeth.

My colleagues and I discussed this paper earlier in the week and we offered several alternative hypotheses for the presence of bark in those teeth. One person thought A. sediba was using it for toothpicks. I suggested that the bark was hallucinogenic.
In this story for the BBC, one of the researchers says this bark might have tasted like maple syrup.

POSTED: Friday, June 29, 2012, 12:20 PM

Yesterday I started wondering if there’s some evolutionary root behind the desire to learn the news a few seconds before the next person. I can’t see any practical benefit to knowing the Supreme Court’s ruling on Obamacare one minute or even five minutes before someone else. Maybe those people who vowed to leave the country could save a few dollars on airfare – but that’s a stretch.


And yet reporters put their reputations on the line in a tweeting race that came down to seconds. Some got it wrong and had to post a hasty retraction. This story in today’s Inquirer captured the frenzy:


…a storm front of anticipation had built around the decision two years in the making. Twitterverse, blogosphere, radio, TV - all were pressing to be first to know and tell.
The next 20 minutes were some of the craziest in media history. They left a lot of reputations in the road.
The Associated Press, the New York Times, Bloomberg News, and the blog SCOTUSblog.com got it right. (The Times didn't rush; magisterially, it told the world to wait while it digested the info.) Fox News and CNN got it wrong. (Following CNN, other news venues, including Philly.com, posted incorrect tweets and, minutes later, deleted or corrected them.)
"With the way the media world has been trending, we're in a situation where misinformation thrives," said Victor Pickard, associate professor at the Annenberg School for Communication at the University of Pennsylvania.

POSTED: Thursday, June 28, 2012, 3:03 PM

These days everyone is a self-described skeptic or skep-something, but the Wharton School’s Uri Simonsohn looks like the real deal.

The journal Science has identified him as a “whistle-blower” who smelled something fishy in the work of Dutch marketing professor Dirk Smeesters. Smeesters had authored a number of papers with headline-grabbing, counterintuitive findings on things like the effects of seeing the colors red and blue on stereotyping. Now it looks like those findings were counterintuitive because they weren’t true.

Simonsohn’s analysis led to an internal investigation and the Dutch researcher’s resignation.

POSTED: Thursday, June 28, 2012, 12:50 PM

The news broke today that Philadelphia’s own medical pioneer Douglas Wallace will win the $500,000 Gruber Prize for genetics. He’s enlightened the medical world about the importance of a cellular structure that powers cells and makes complex animal life possible. That structure is the mitochondrion, and it’s known to have its own DNA. When that DNA is mutated or damaged, the effects can be deadly.


Wallace holds joint appointments at CHOP, where he directs the Center for Mitochondrial and Epigenomic Medicine, and at Penn. I met him in 1997, when the concept of mitochondrial medicine was just starting to emerge. Here’s an excerpt from my story back then:


Though the first case of a mitochondrial disease was diagnosed in 1962, it was not until the mid-1980s that doctors began to recognize more cases in children. Experts now suspect that many patients are wrongly diagnosed as having cerebral palsy, epilepsy, multiple sclerosis, or “failure to thrive.” Mitochondrial disease may even account for instances of sudden infant death syndrome.
“We need to convince [doctors] these disorders are common and important,” said Douglas Wallace, an Emory University biologist attending an international meeting on mitochondrial diseases held this month in Philadelphia.
“Until recently, it was considered stupid to look at the mitochondria” as a cause of disease, says Wallace. The change in attitude took a revolution in scientists’ understanding of these tiny cellular power plants.
High school textbooks have long depicted the mitochondrion as a folded ribbon surrounded by an oval-shaped membrane. The snapshot was accurate but the standard definition - an organelle, or tiny organ, with the job of making energy - didn't capture its strangely independent nature.
In 1963 scientists discovered that the mitochondria carried their own set of genes, made from their own DNA. (Before that, scientists thought all human DNA was contained in the 23 pairs of chromosomes inside the cell's nucleus.)
The genes in the mitochondria pass only from mother to offspring - egg cells carry mitochondria, while sperm cells do not. Stranger still, some scientists have come to see the mitochondria not as a standard part of our bodies but as a life form in itself - a benevolent parasite.
The way Wallace explains it, about a billion and a half years ago, a slender, thread-like bacterium slithered inside a larger one-celled organism. Both life forms benefited from the invasion. The bacterium gained the protection and mobility of its much larger host, and the host benefited by absorbing some of the energy that the invader pumped out. The invader used an efficient, oxygen-burning process that the host cell had not evolved.
This mutually beneficial - or symbiotic - partnership worked so well that the two evolved together into fungi, plants and animals. The discovery of the mitochondria's own set of genes backed this scenario of an independent origin, especially after analysis showed that the mitochondria's closest relative is a free-living bacterium.
Wallace argues that, as composite beings, we can die one of two ways - either the body's cells die, or the mitochondria within them die.

POSTED: Wednesday, June 27, 2012, 5:57 PM

According to a new National Geographic Channel poll reported in USA Today, more than a third of Americans believe in UFOs. This is perfectly understandable. There’s a lot of stuff up there and if you’re not an amateur astronomer, it’s likely that on a clear night you would see at least one thing you couldn’t identify. UFO, after all, just stands for unidentified flying object. I tend to assume that if I can’t identify something, well, someone in the Delaware Valley Amateur Astronomers probably could. But that’s just me.


But the most interesting part of the survey was this: 

"•Nearly 65% think Barack Obama would be better suited than Mitt Romney to handle an alien invasion.
Extraterrestrial beings could not be reached for comment."

POSTED: Wednesday, June 27, 2012, 11:45 AM

A reader kindly sent us this video, which was allegedly created to improve the image of women in science. The reader sent it with this note, “Got your marching (strutting) orders.”  I have little to say except that the shoes are not an ideal choice for any kind of field work. Higgs, however, has some observations:

Higgs: There is nothing here to be ashamed of, people. What you’re seeing is a somewhat exaggerated display of fertility. I engaged in similar behavior myself before my contraceptive procedure. I would strut down the alley with my tail as upright as a flagpole, showing all the world my impressive gonads.


I wasn’t any less intelligent back then, but my behavior was more powerfully influenced by sex hormones. Again – nothing to be ashamed of. Now that I’ve been separated from my gonads I live a more cerebral and contemplative life, much like Peter Abelard. There is no connection between my prowess as a scientific thinker and my fertility.


About this blog
Faye Flam - writer
In pursuit of her stories, writer Faye Flam has weathered storms in Greenland, gotten frostnip at the South Pole, and floated weightless aboard NASA’s zero-g plane. She has a degree in geophysics from the California Institute of Technology and started her writing career with the Economist. She later took on the particle physics and cosmology beat at Science magazine before coming to the Inquirer in 1995. Her previous science column, “Carnal Knowledge,” ran from 2005 to 2008. Her new column and blog, Planet of the Apes, explores the topic of evolution and runs here and in the Inquirer’s health section each Monday. Email Faye at fflam@phillynews.com.
