 SCIAM OBSERVATIONS
Opinions, arguments and analyses from the editors of Scientific American

Comments or questions? Send them to editorsblog@sciam.com
October 19, 2007

10:00:34 am, Categories: Ethics and Science, Life Sciences, Medicine, 182 words

Nobel scientist apologizes

Geneticist James Watson apologized for his inflammatory remarks linking race and intelligence, telling an audience at the Royal Society of London: "To all those who have drawn the inference from my words that Africa, as a continent, is somehow genetically inferior, I can only apologize unreservedly. That is not what I meant. More importantly, there is no scientific basis for such a belief."

Times Online of London reports that Watson said he was flummoxed by an interview that appeared in The Sunday Times in which he was quoted as saying that Africans are intellectually inferior. "I cannot understand how I could have said what I am quoted as having said," he said. "I can certainly understand why people reading those words reacted in the ways they have." The paper stood by the interview.

Watson's apology came in the wake of an uproar sparked by his comments. In Britain to peddle his upcoming book "Avoid Boring People: Lessons from a Life in Science," the controversial scientist was banned from appearing at the prestigious Science Museum in London, Times Online reports.


Posted by Lisa Stein · 14 comments   Permanent



03:16:37 am, Categories: Life Sciences, Politics and Science, 144 words

Cold Spring Harbor Laboratory has suspended James Watson

Just to be clear: They didn't fire him, they didn't take away his position as chancellor, and they probably won't ever take his name off of their graduate school (The Watson School of Biological Sciences), but the board of trustees at CSHL has decided to "suspend the administrative responsibilities" of James D. Watson.

Full press release after the jump.

Read more


Posted by Christopher Mims · 7 comments   Permanent

October 18, 2007

05:14:32 pm, Categories: Ethics and Science, Education, 590 words

Won a Nobel? Go nuts!

As a long-time science journalist, I have learned to take what James Watson says with a grain of salt. Even so, I was caught off guard by the outrageousness of his latest words. Watson gets all the kudos for his genetics work, and his discovery with Francis Crick of the double-helical structure of DNA unquestionably deserved the Nobel Prize. But maybe that's what's wrong.

For most research scientists, winning the Nobel Prize stands as the pinnacle of success, the ultimate goal that takes intelligence, dedication, luck and ambition (and don't be fat, Watson would say, because fat folks are not ambitious). Once the King of Sweden drapes that medal around your neck, life is good--people want to hear you speak, offer you prestigious positions, and are more inclined to give you what you want.

To their credit, some scientists take the opportunity to tackle very "out there" research. Soon after he won his 1995 Nobel in physics, Martin Perl launched a project to find "free quarks." Conventional thinking says there can be no such things--quarks must remain bound inside the particles they make up--but some scientists speculate that free quarks might have been left over from the big bang. Perl recognized the long-shot odds of finding free quarks, but it was a project he could do because of his Nobel. A graduate student would be committing career suicide.

Other researchers run from the glory. Brian Josephson, who discovered the quantum effect in which superconducting electrons can tunnel across a narrow barrier, went off to study mysticism and psychic phenomena. (His problems, though, may run deeper; not many people would choose Taco Bell for a (free) lunch meeting.)

After the wacky things James Watson has uttered over the past decades--on women, homosexuals and the obese, to name a few--now comes his decision to join hands with the transistor-developing, eugenics-advocating, sperm-donating William Shockley, who, I recall, blamed his wife's genes for his kids being less than genius. As a geneticist, Watson arguably has better credentials to rant about race and IQ than Shockley did. But that still doesn't make him an expert on IQ studies. It's true that blacks have historically scored 15 points lower than whites on IQ tests. What's been debated endlessly is how much of that gap is tied to heredity and how much to environment. Intelligence researchers such as James Flynn have found that IQ scores can change over time, suggesting a strong environmental influence. Others, such as Philippe Rushton and Linda Gottfredson, say the data are at least as consistent with hereditarian arguments as with environmental ones. I don't want to get into a whole discussion about IQ again--we've covered a lot in this magazine (see, for instance, "Unsettled Scores" and our Intelligence special issue). But while I'm at it, one question I have for the hereditarians: how do you separate genetic explanations from womb conditions--a crucial environmental factor?

Without having mucked around in the morass that is IQ research, Watson can at best be only a casual observer. He's reportedly gloomy about the prospects of Africa because of Africans' lower intelligence test scores. What, cultural conflicts, religious attitudes and greed are less important? That's hard to justify when you look at what's going on in different parts of the world right now.

Yet because of his Nobel and past accomplishments as a geneticist, Watson's words take on added meaning and weight beyond what they deserve. Winning the Nobel grants a great deal of power. Too bad Watson didn't channel Stan Lee and recognize that, with great power comes great responsibility.


Posted by Philip Yam · 9 comments   Permanent



05:06:10 pm, Categories: Ethics and Science, 496 words

James Watson and eugenics

Cold Spring Harbor Laboratory issued a statement from its board of trustees addressing remarks by James Watson, reported in The Sunday Times U.K., in which he claimed that blacks are inferior in intelligence to whites.

The statement reads, in part:

The comments attributed to Dr. James Watson that first appeared in the October 14, 2007 edition of The Sunday Times U.K. are his own personal statements and in no way reflect the mission, goals, or principles of Cold Spring Harbor Laboratory's Board, administration or faculty. Dr. Watson is not the President of Cold Spring Harbor Laboratory and was not speaking on behalf of the institution.

The Board of Trustees, administration and faculty vehemently disagree with these statements and are bewildered and saddened if he indeed made such comments. Cold Spring Harbor Laboratory does not engage in any research that could even form the basis of the statements attributed to Dr. Watson.

All of the above is true. But it's not the whole story. The 79-year-old Watson is listed under laboratory administration on its Web site as "chancellor," right below Bruce W. Stillman, the president. (Watson was lab director from 1968 until 1994 and president from that year until 2003.) Watson deserves his larger-than-life reputation around Cold Spring Harbor as co-discoverer of the double helix structure of DNA, as a dynamic former laboratory president and as a leader in the Human Genome Project. [A few hours after this blog entry was posted, Cold Spring Harbor Laboratory issued a press release that stated the board's decision to suspend Watson from his duties as chancellor.]

But there are other reasons, for the sake of the laboratory he did so much for as an administrator, that the man should keep his mouth shut about the social implications of genetics. What is today known as Cold Spring Harbor Laboratory was once at the center of the American eugenics movement: it was home to the Eugenics Record Office from 1910 to 1939, at first under the direction of the infamous eugenicists Charles Benedict Davenport and Harry Laughlin.

The Eugenics Record Office gathered "pedigrees" of families, noting traits such as allergies, feeble-mindedness, civic leadership and immoral behavior. The University of Virginia Health System's eugenics historical collection gives this description of Davenport and Laughlin's perspective: "Both men were members of the American Breeders Association. Their view of eugenics, as applied to human populations, drew from the agricultural model of breeding the strongest and most capable members of a species while making certain that the weakest members do not reproduce."

Laughlin drafted model legislation in 1914, adopted by nearly 20 states, that led to the forced sterilization of thousands of men and women thought to be mentally or physically unfit. The Laughlin model law even influenced the framing of the Nazis' 1933 sterilization laws. It is unfortunate that Watson, one of the most famous scientists of the twentieth century, should end his career by tacitly invoking a tawdry side of the stellar institution he helped to build.


Posted by Gary Stix · 11 comments   Permanent



01:41:11 pm, Categories: Ethics and Science, Life Sciences, 385 words

James Watson's greatest hits

Lest anyone imagine that the recent comments regarding race by James Watson, Nobel Prize-winning co-discoverer of the structure of DNA, not to mention board member of Seed Media Group, publisher of Seed magazine and ScienceBlogs, are in any way uncharacteristic of this particular scientist's descent into senescence:

I've compiled this helpful guide to Dr. Watson's past utterances, all of which come from the one fact-checkable source I could find: a 2000 lecture delivered at the University of California, Berkeley. (Nature Medicine article on the same subject here.)

* After showing images of women in bikinis and veiled Muslim women, he suggested that there is a link between exposure to sunlight and libido. Then he said, "That's why you have Latin lovers. You've never heard of an English lover. Only an English patient."

* After showing a picture of Kate Moss, he asserted that thin people are unhappy and therefore ambitious. "Whenever you interview fat people, you feel bad, because you know you're not going to hire them," he added.

* Fat people may also be more sexual, Watson asserted, because their bloodstreams contain higher levels of leptin.

Update: People keep pointing out other incendiary things Watson has said in the past:

"If you are really stupid, I would call that a disease," says Watson, now president of the Cold Spring Harbour Laboratory, New York. "The lower 10 per cent who really have difficulty, even in elementary school, what's the cause of it? A lot of people would like to say, 'Well, poverty, things like that.' It probably isn't. So I'd like to get rid of that, to help the lower 10 per cent."

No doubt this is only the beginning of Watson's controversial utterances. He's about to go on tour to support his new book, Avoid Boring People: Lessons from a Life in Science, after all. As a Nobelist turned pundit, he's bound to be the darling of talk radio, if they'll even let him on.

The only real question is why he's been given a pass for this long.

I wonder what his various employers and associates will do in the wake of this embarrassment?

(Update: Cold Spring Harbor has begun distancing itself from his comments -- see its press release on the subject. Ditto the Federation of American Scientists.)

Razib over at Gene Expression has an interesting take on the whole mess.


Posted by Christopher Mims · 15 comments   Permanent

October 17, 2007

04:33:00 pm, Categories: Ethics and Science, Life Sciences, Medicine, Politics and Science, 730 words

World-renowned geneticist draws fire for claims that Africans are intellectually inferior

Nothing like stirring the pot--not to mention selling books--with an incendiary claim, in this case that one race is intellectually superior to another. Dr. Watson, we presume? Bingo. World-renowned geneticist James Watson, 79, who shared the 1962 Nobel Prize in Physiology or Medicine for the discovery of DNA's double-helix structure, is being widely criticized after telling The Sunday Times of London that he's "inherently gloomy about the prospect of Africa" because "all our social policies are based on the fact that their intelligence is the same as ours--whereas all the testing says not really." He went on, reports the newspaper, to say that "people who have to deal with black employees find...it is not true" that all humans are equal.

The Independent reports today that Watson is currently on a swing through Britain to publicize his latest book, Avoid Boring People: Lessons from a Life in Science, due out next week, in which he apparently makes similar claims. "There is no firm reason to anticipate that the intellectual capacities of peoples geographically separated in their evolution should prove to have evolved identically," the London newspaper quotes from the upcoming book. "Our wanting to reserve equal powers of reason as some universal heritage of humanity will not be enough to make it so."

The comments and ensuing firestorm are reminiscent of the explosive debate whipped up a decade or so ago by the 1994 book The Bell Curve, in which psychologist Richard Herrnstein and political scientist Charles Murray suggested that IQ is largely genetic and that some races are inherently brighter. That claim, like this one, was roundly denounced as "scientific racism" by the scientific community. (Another Nobel example: 1956 physics laureate William Shockley, who shared his prize as a co-inventor of the transistor. Considered a founding father of Silicon Valley, he later applied his genius to other fields, namely developing various eugenic theories. These included the idea that higher rates of reproduction among the less intelligent were lowering the quality of the human race, and that this was happening to a greater degree among blacks. He even recommended that individuals with IQs below 100 be paid to undergo voluntary sterilization.)

Watson is recognized for his research at the University of Cambridge in the 1950s and 1960s and as part of the team that discovered DNA's structure. But the prestigious scientist, who joined Cold Spring Harbor Laboratory on Long Island in 1968 and has led it as director, president and now chancellor, is no stranger to controversy. In a 1990 profile, Science wrote: "To many in the scientific community, Watson has long been something of a wild man, and his colleagues tend to hold their collective breath whenever he veers from the script." In 1997, reports The Independent, Watson reportedly told a newspaper that a woman should have the right to abort her unborn child if tests could determine it would be homosexual. He has also reportedly argued for genetic screening and engineering as a potential cure for "stupidity" (if only) and once was quoted as saying that "People say it would be terrible if we made all girls pretty. I think it would be great."

"This is Watson at his most scandalous," Steven Rose, a professor of biological sciences at the Open University and a founding member of the Society for Social Responsibility in Science, told The Independent about the scientist's most recent comments. "If he knew the literature in the subject, he would know he was out of his depth scientifically, quite apart from socially and politically."

UPDATE (10/18/07): In the wake of the controversy, the Federation of American Scientists (FAS) today issued a searing statement denouncing Watson's comments:

"At a time when the scientific community is feeling threatened by political forces seeking to undermine its credibility, it is tragic that one of the icons of modern science has cast such dishonor on the profession," FAS president Henry Kelly said in the statement.

"The scientific enterprise is based on the promotion and proof of new ideas through evidence, however controversial," the statement continues, "but Dr. Watson chose to use his unique stature to promote personal prejudices that are racist, vicious and unsupported by science."

"While we honor the extraordinary contributions that Dr. Watson has made to science in the past, his comments show that he has lost his way. He has failed us in the worst possible way," said Kelly. "It is a sad and revolting way to end a remarkable career."


Posted by Lisa Stein · 30 comments   Permanent



11:52:29 am, Categories: Life Sciences, Technology, Public Policy, Chemistry, 625 words

Last year there were two trillion stalks of corn in Iowa, and most of it went to corn syrup


In the year 2000, the average American consumed 73 pounds of corn syrup.

King Corn, which, depending on where you live, is coming to a theater near you sometime this fall, is the story of two guys who decided to find out what would happen if they moved to Iowa, grew an acre of corn, and traced its path through the giant metabolic engine that is the American food system.

Unsurprisingly, the plot resembles the path that Michael Pollan traced in his seminal doorstop The Omnivore's Dilemma, with two important differences:

1. King Corn is a movie, so it's relatively short and accessible

2. King Corn is surprisingly funny

I don't know if this film is going to get as wide a distribution as Morgan Spurlock's Super Size Me, but it certainly deserves to.

In fact, this is probably one of those movies that should be required viewing in just about every classroom in America.

Spoiler alert: Here's what you'll learn from King Corn.

(All this data comes straight from the movie, so take it with a grain of salt; I haven't had time to fact-check it.)

* Those amber waves of grain (or corn) you see rushing past your window on roadtrips? You wouldn't want to eat that corn. The overwhelming majority of corn grown in this country is so foul to the human palate that one scene in King Corn has our protagonists taking bites of the fruits of their labor and then spitting them out in disgust. That's because modern feed corn has been bred for two things, and delectability isn't one of them: to tolerate being planted very close to its neighbors (to increase yield) and to produce as much starch as possible (to increase yield).

* Most of that corn goes to one of two places: the vast corn-syrup factories that produce the one sweetener found in just about every junk food (and even non-junk-food) item in the grocery store, or feedlots, where it's fed to cattle. Feeding corn to cattle makes them fat and sick, by the way, but also delicious, since it raises the saturated fat content of their flesh.

* If you were to pluck one of your hairs and analyze the origins of its carbon, you'd find that most of it came from corn. That's right -- you're mostly corn! That's because all the chicken and beef you've ever eaten (or almost all of it) was corn-fed. And those snacks and sodas? All corn syrup. Even the fries we eat are up to 50% corn by caloric content if they're fried in corn oil.

* Yield per acre for corn farming is 4-5 times what it was before the introduction of artificial fertilizer. This is part of the reason that Americans now spend, on average, only 16% of their take-home pay on food. Thanks, corn! (But boo to your making us obese!)
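The "you're mostly corn" claim above rests on stable-isotope analysis: corn is a C4 plant whose tissues carry a distinctive carbon-13 signature, so the fraction of corn-derived carbon in a hair sample can be estimated with a simple two-source mixing equation. Here is a minimal sketch; the delta-13C endmember values are typical textbook figures and the sample value is hypothetical, not a measurement from the film:

```python
# Two-source isotope mixing sketch: estimate what fraction of a hair
# sample's carbon came from C4 plants like corn. The delta-13C values
# below are typical textbook figures, used here only for illustration.
DELTA_C3 = -27.0   # typical C3-plant-based diet endmember, per mil
DELTA_C4 = -12.0   # typical C4 (corn) endmember, per mil

def corn_fraction(delta_sample: float) -> float:
    """Linear mixing: fraction of carbon from the C4 (corn) endmember."""
    return (delta_sample - DELTA_C3) / (DELTA_C4 - DELTA_C3)

# A hypothetical hair sample measuring -17 per mil would imply that
# roughly two-thirds of its carbon traces back to corn.
print(f"{corn_fraction(-17.0):.2f}")  # prints 0.67
```

The closer a sample's signature sits to the corn endmember, the more of its carbon passed through the corn economy on its way to you.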

Here's something I'll add that I stole from Michael Pollan, which doesn't really get addressed in King Corn: all that artificial fertilizer is basically the product of fossil fuels. So, more or less, if we're made out of corn, and our food system is dependent on corn, and corn is dependent on fossil fuels, then, in a manner of speaking, we're all petroleum by-products.

Last weekend I spent some time on an actual farm, and I asked the farmer, who was typical in that he had a giant farm that hardly resembled the family farms of yore, where his fertilizer came from. Russia, he said. Because in Russia, they use natural gas to produce fertilizer -- whereas in the U.S., we use it to produce energy, on account of it burning cleaner than coal.

Now you know where your Big Mac ultimately came from -- the Precambrian, by way of St. Petersburg.


Posted by Christopher Mims · 8 comments   Permanent

October 16, 2007

10:55:40 am, Categories: Life Sciences, Mind Matters, 1328 words

Attention! How your brain manages its need to heed
Welcome to

Mind Matters

where top researchers in neuroscience, psychology, and psychiatry explain and discuss the findings and theories driving their fields. Readers can join them. We hope you will.

This week:

Modules, Networks, and the Brain's Need to Heed

_____


Introduction

by David Dobbs

Editor, Mind Matters

Two perennial polarities beloved by brain geeks -- networks versus modules and top-down versus bottom-up attention -- get linked in this week's essay, in which UC Berkeley's Mark D'Esposito reviews a study that recorded neural activity while monkeys directed their attention to visual targets. The results, suggests D'Esposito, add threads to vital strands of neuroscientific thought.

_____


Attention!

How the Brain Coordinates its Efforts to Pay Heed

Mark D'Esposito

Helen Wills Neuroscience Institute

University of California, Berkeley

How does the brain organize its work? And how does it heed what it needs to heed? Theories of brain organization focus on two distinct but complementary principles: modularity, the existence of brain regions with specialized functions, and network connectivity, the integration of information from various brain regions that results in organized behavior. In the study under review here, the modular and network models appear to play specialized roles in directing the attention of monkeys seeking certain visual targets through either "top-down" or "bottom-up" attentional strategies.

Modules versus networks

In the modules-versus-network debate, modularity is probably the simpler brain model to understand. Clinical observation of individuals with brain damage, as well as brain-imaging studies (functional MRIs, or fMRIs) of healthy individuals, demonstrate that certain brain regions control specific cognitive processes, such as the ability to produce speech. For instance, in patients with nonfluent aphasia, which creates a selective inability to speak, comprehension of spoken language remains intact. In 1861 Paul Broca observed that damage to the left frontal lobe in an autopsied brain had produced nonfluent aphasia. Modern brain-imaging studies of patients with strokes to this area (now known as "Broca's area") confirmed Broca's theory. Moreover, fMRIs of healthy individuals reveal that the left frontal lobe is activated when subjects generate speech.

Of course, that some brain areas specialize in certain functions does not exclude the possibility that those areas are also part of larger networks of brain regions communicating with one another. Although the modular model may accurately describe many cognitive functions, it is insufficient to explain complex cognitive processes that cannot be localized to isolated brain regions. It is unlikely that our ability to get the gist of a conversation, for instance, is the work of a single specialized brain module. Such complex behavior more likely arises from interactions between brain regions through network connectivity.

In his 1995 book Memory in the Cerebral Cortex: An Empirical Approach to Neural Networks in the Human and Nonhuman Primate, UCLA neurologist Joaquin Fuster began an argument he extended in his 2002 work Cortex and Mind: Unifying Cognition. According to Fuster, new studies of brain networks have led to a "revolution in contemporary neuroscience." He contends that the empirical shift from a reductionist modular model to a holistic network model offers promise of accomplishing our long-term goal of resolving the mind-brain question. Fuster's conception of a network model of brain function includes several key notions:

(1) Cognitive information is represented in wide, overlapping and interactive brain networks.

(2) Such networks develop on a core of organized modules of elementary sensory and motor functions, to which they remain connected.

(3) The cognitive code is a relational code, based on connectivity between discrete brain regions.

(4) The code's diversity and specificity derive from the myriad possible combinations of those brain regions.

(5) Any brain region can be part of many networks and thus of many percepts, memories, items of experience or personal knowledge.

(6) A given brain network can serve several cognitive functions.

(7) Cognitive functions consist of functional interactions within and between brain networks.

A division of labor ...

In "Top-Down Versus Bottom-Up Control of Attention in the Prefrontal and Posterior Parietal Cortices," published in Science this year, Timothy Buschman and Earl Miller, both of MIT, add to this model by exploring the neural mechanisms underlying both functional specialization and functional integration. Buschman and Miller investigated how the brain allows us to give volitional attention to something in our environment, such as looking for our keys (top-down attention), versus automatically attending to something salient (or attention-grabbing), such as a fire alarm (bottom-up attention). To study these processes, the researchers focused on two brain regions known to be involved in attentional processes, the frontal and parietal lobes.

Using monkeys, Buschman and Miller recorded from neurons in these brain regions while the monkeys located a visual target on a computer screen. The target was always randomly located in an array of four stimuli, but there were two sets of conditions: In the first, the "pop-out" task, the target object would be clearly different from the nontarget stimuli (for example, it was not only a different color but also a different orientation), making the target more conspicuous. In the second test, the "search" task, the target stimulus would match some of the nontarget stimuli in at least a few dimensions (for instance, it might have the same color but not the same orientation). Because the target stimulus in this latter test was not salient, the monkeys had to rely on their memory of the desired target's appearance as they looked for it. As expected, it took longer for the monkeys to find this second type of target.
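The logic of the two conditions can be sketched concretely. The following is a hypothetical reconstruction -- the colors, orientations and array size are my illustration, not the study's actual stimulus parameters -- of what distinguishes a pop-out trial from a search trial:

```python
import random

COLORS = ["red", "green", "blue"]
ORIENTATIONS = [0, 45, 90, 135]

def make_trial(pop_out):
    """Build a 4-item stimulus array; returns (items, target_index)."""
    target = (random.choice(COLORS), random.choice(ORIENTATIONS))
    other_colors = [c for c in COLORS if c != target[0]]
    other_orients = [o for o in ORIENTATIONS if o != target[1]]
    if pop_out:
        # Pop-out: every distractor differs from the target in BOTH
        # dimensions, so the target is salient (bottom-up attention).
        distractors = [(random.choice(other_colors), random.choice(other_orients))
                       for _ in range(3)]
    else:
        # Search: each distractor shares the target's color OR orientation,
        # so the target must be matched against a remembered template
        # (top-down attention) -- which takes longer.
        distractors = [(target[0], random.choice(other_orients))
                       if random.random() < 0.5
                       else (random.choice(other_colors), target[1])
                       for _ in range(3)]
    idx = random.randrange(4)
    items = distractors[:idx] + [target] + distractors[idx:]
    return items, idx

def find_by_template(items, template):
    """Top-down search: scan the array for the remembered target."""
    return next(i for i, item in enumerate(items) if item == template)
```

In the pop-out condition the target could in principle be found by a simple salience detector; only the search condition forces item-by-item comparison against memory.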

Buschman and Miller found that frontal-lobe neurons were first to find the target during the search task, whereas parietal lobe neurons were first to find the target during the pop-out task. In other words, selection of a more obscure target (top-down attention) may be mediated by the frontal lobes, whereas fast selection of a highly salient target (bottom-up attention) may be mediated by the parietal lobes. These findings suggest that the frontal and parietal lobes may mediate different cognitive processes consistent with functional specialization.

... and cooperative ventures

The researchers also investigated how these two brain regions communicated with each other during the two attention tasks by measuring the degree of synchrony between neuronal activity in each region. (Such synchrony -- a rough alignment of electrical wave patterns emitted from different brain areas -- is thought to facilitate or indicate communication and cooperation among brain regions. See Gyorgy Buzsaki's Rhythms of the Brain.) Synchrony between frontal and parietal regions was stronger at lower frequencies during the search task and at higher frequencies during the pop-out task, suggesting that synchronous activity may increase the effectiveness of communication between regions, and that different modes of attention (top-down as opposed to bottom-up) may emphasize synchrony at different frequencies. Buschman and Miller proposed that the increase in low-frequency synchrony during the search task could reflect a "broadcast" of top-down signals across the entire brain, whereas higher-frequency synchrony may support more local interactions between brain regions.
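For readers curious what "synchrony at different frequencies" means in practice, here is a minimal illustrative sketch -- not the study's actual analysis pipeline -- in which two simulated signals sharing a common 10 Hz rhythm show high spectral coherence in that band and little elsewhere:

```python
# Illustrative sketch of frequency-resolved synchrony: two simulated
# "frontal" and "parietal" traces share a common 10 Hz drive, so their
# spectral coherence peaks near 10 Hz and stays low at other frequencies.
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs = 1000                      # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)   # 10 s of data

shared = np.sin(2 * np.pi * 10 * t)           # common 10 Hz rhythm
frontal = shared + rng.normal(0, 1, t.size)   # region = shared rhythm + noise
parietal = shared + rng.normal(0, 1, t.size)

f, Cxy = coherence(frontal, parietal, fs=fs, nperseg=1024)

# Band-averaged coherence is one standard proxy for synchrony.
low_band = Cxy[(f >= 8) & (f <= 12)].mean()    # around the shared rhythm
high_band = Cxy[(f >= 40) & (f <= 60)].mean()  # gamma range, noise only here
print(f"10 Hz band coherence: {low_band:.2f}, gamma band: {high_band:.2f}")
```

Swap which band carries the shared drive and the coherence peak moves with it -- the toy analogue of search-task synchrony living at low frequencies and pop-out synchrony at high ones.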

A methodological advance

These results add detail to the picture neuroscience is getting of how brain networks operate. And the study's methodological advance -- the ability to record simultaneously from two distant brain regions in an awake and active monkey -- offers a significant new tool for studying brain network connectivity. (In humans, functional magnetic resonance imaging, or fMRI, can also explore interactions between brain regions, as it simultaneously records correlates of neural activity throughout the entire functioning brain.) Over the next decade, hundreds of roughly similar monkey neurophysiology and human imaging studies will be performed in neuroscience laboratories around the world. From these studies, the pieces of the puzzle will continue to fall into place, allowing us to make significant strides toward answering the ultimate question: How does the brain work?

Mark D'Esposito is Professor of Neuroscience and Psychology at the University of California, Berkeley, where he is the Director of the Henry H. Wheeler Jr. Brain Imaging Center at the Helen Wills Neuroscience Institute. He studies working memory, frontal lobe function, and cognitive neuroscience. He also maintains a practice in clinical neurology.


Posted by David Dobbs · 1 comment   Permanent

October 15, 2007

03:36:13 pm, Categories: Global Warming and Climate Change, Politics and Science, Public Policy, 296 words

UK judge finds Al Gore made 9 errors in An Inconvenient Truth; Germany rushes to his defense

The UK has its share of climate change skeptics, such as school principal Stewart Dimmock of Kent, who was attempting to use the English courts to have An Inconvenient Truth banned from schools there on the grounds that it contains a number of errors (earlier this year the government sent copies of the film to every school in the UK).

The BBC reports that these errors include the assertion that a sea level rise of up to 20 feet would be caused by melting of ice in Antarctica or Greenland "in the near future." (The judge was certainly right on this count: "in the near future" sounds like "ten to fifty years from now" to most reasonable people -- which is not the time scale over which we're going to see that kind of sea level rise.)
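The 20-foot figure itself is about magnitude, not timing: it is roughly what the complete melting of the Greenland ice sheet alone would eventually produce. A back-of-envelope sketch, using rounded published figures (the constants below are my approximations, not numbers from the film or the ruling):

```python
# Back-of-envelope check: sea level rise from a complete melt of the
# Greenland ice sheet. All constants are rounded approximations.
GREENLAND_ICE_KM3 = 2.9e6   # approximate ice volume, km^3
OCEAN_AREA_KM2 = 3.6e8      # approximate global ocean surface area, km^2
ICE_DENSITY = 0.917         # g/cm^3
SEAWATER_DENSITY = 1.027    # g/cm^3

# Convert ice volume to equivalent seawater volume, spread it over the
# oceans, and convert the resulting depth from km to m and feet.
water_km3 = GREENLAND_ICE_KM3 * ICE_DENSITY / SEAWATER_DENSITY
rise_m = water_km3 / OCEAN_AREA_KM2 * 1000
rise_ft = rise_m * 3.281
print(f"~{rise_m:.1f} m (~{rise_ft:.0f} ft)")  # prints ~7.2 m (~24 ft)
```

So the eventual rise is in the ballpark Gore cites; the judge's complaint was about the implied timescale, which glaciologists put at centuries to millennia, not decades.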

Of course, none of this is new territory -- it's been known since the film came out that An Inconvenient Truth contains a few errors; check out this New York Times review of the film, in which a University of Washington geochemist acknowledges them:

"The small errors don't detract from Gore's main point, which is that we in the United States have the technological and institutional ability to have a significant impact on the future trajectory of climate change."

Interestingly, the Germans, who are also apparently fond of showing An Inconvenient Truth in classrooms, are unfazed by the UK judge's ruling.

'Inconvenient Truth' To Continue Airing in Schools

The German government has come out in defense of former US Vice President Al Gore, who was named a 2007 Nobel Peace Prize winner on Friday for his work on climate change education. Germany's Environment Ministry says a few errors in the film are no reason not to show it in schools.


Posted by Christopher Mims · 19 comments   Permanent



12:37:11 pm, Categories: Technology, 118 words

The cars of the 1907 automobile show

This post is a photographic supplement to an entry in the "50, 100 & 150 Years Ago" section of the November 2007 issue of Scientific American. The excerpt, from a November 1907 issue, is titled "Perfect Cars" and notes that the "buggy-type machine and the two-engine automobile" were the dominant styles of car at the time.

Below are some samples of the exhibits on display 100 years ago. Click the links to see a larger image:

The Star--A Novel Two-Cycle Runabout with Friction Disk Transmission:

Read more


Posted by Nikhil Swaminathan · 4 comments   Permanent

October 12, 2007

07:10:09 pm, Categories: Life Sciences, Science and the Arts, 186 words

Man implants "ear" in arm

posted on behalf of news editor Lisa Stein:

Well isn't this special. Britain's Daily Mail reports that so-called performance artist Stelios Arcadious, a Cypriot-born Australian, had an ear implanted into his arm. And why, pray tell, did he do that? Seems he believes that art "should be more than simply illustrating." Toward that end, the newspaper says, Arcadious, 61, a former research fellow at Nottingham Trent University's Digital Research Unit, had a surgeon implant a human ear (grown from cells in a lab) into his left forearm. The next step in his living exhibit? "I hope to have a tiny microphone implanted [in]to it that will connect with a Bluetooth transmitter," he said. "That way, you can listen to what my ear is hearing."

Art? Maybe. Perhaps there's a scientific use for this, too. Like a human version of the Vacanti Mouse, maybe Mr. Arcadious could offer himself up for research in cellular regeneration and grow all sorts of cartilaginous structures off his body, not to mention help save countless lab mice from such a grotesque fate.

So what do you think? Art? Science? Insanity?

LS


Posted by JR Minkel · 3 comments   Permanent



05:56:31 pm, Categories: Space and Cosmology, 193 words

New image of Saturn's moon Iapetus... as a desktop background.

Being into science means never having to look at a boring desktop background, because NASA is always pumping out amazing images worth staring at every time you close a window and need a moment of respite.

Today I thought I'd share the fruits of my occasional efforts at turning these into images sized for desktops. Check your monitor size and download the corresponding resolution. Want another resolution? Leave your request in the comments. All images are maximum quality jpegs.

1680x1050
1152x864
1280x1024
800x600
1024x768
new: 1920x1200
new: 1440x900
new: 1280x800
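For the curious, the basic recipe behind those files is simple: crop the source image to the target aspect ratio, then downscale. Here's a sketch of my own (the function name and the 3000x2250 source size are made up for illustration; any image tool will do the actual cropping) that works out the centered crop box for a given target resolution:

```python
def crop_box(src_w, src_h, target_w, target_h):
    """Return (left, top, right, bottom) for the largest centered crop
    of a src_w x src_h image that matches the target aspect ratio."""
    target_ratio = target_w / target_h
    if src_w / src_h > target_ratio:
        # Source is wider than the target: trim the sides.
        new_w = round(src_h * target_ratio)
        left = (src_w - new_w) // 2
        return (left, 0, left + new_w, src_h)
    else:
        # Source is taller than the target: trim top and bottom.
        new_h = round(src_w / target_ratio)
        top = (src_h - new_h) // 2
        return (0, top, src_w, top + new_h)

# Crop boxes for a hypothetical 3000x2250 source at two of the sizes above:
for w, h in [(1680, 1050), (1024, 768)]:
    print((w, h), crop_box(3000, 2250, w, h))
```

Feed the resulting box to your image editor's crop tool (or something like Pillow's Image.crop followed by Image.resize), then save at maximum JPEG quality.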

Original image available here, courtesy of NASA, JPL, and the Space Science Institute. According to my colleague JR Minkel, writing in our weekly Gallery:

The first high-resolution images of the bright side of Saturn's two-toned moon Iapetus, shown here in a false-color mosaic, reveal a pair of overlapping impact craters and a dark fuzz of presumed hydrocarbons and other compounds (green) pooled on crater floors and equator-facing slopes. The Cassini spacecraft captured the images with its narrow-angle camera on a September 10, 2007, flyby from a distance of about 45,000 miles (73,000 kilometers). NASA said the images should help decipher the origin of the moon's twin tones.


Posted by Christopher Mims · 10 comments   Permanent



02:48:25 pm, Categories: Global Warming and Climate Change, Politics and Science, Public Policy, 672 words

Nobel committee to climate change deniers: "Your mother was a hamster and your father smelled of elderberries."*


There's no use pretending that today's announcement that Al Gore and the IPCC are to share the Nobel Peace Prize for their work on climate change is not in some way political.

But then again, the Nobel Peace Prize always is.

Elie Wiesel, the Dalai Lama, Doctors Without Borders, the International Campaign to Ban Landmines -- all past winners of the Prize, all organizations and individuals who have devoted themselves, collectively and individually, to publicizing and preventing the worst atrocities of the past century.

Giving the IPCC and Al Gore the Prize is an acknowledgment both of the importance of the problem of climate change and the fact that the battle -- like the battle against prejudice, genocide and war -- is far from over.

We might even conclude that, like the Middle East peace process for which Yasser Arafat, Shimon Peres and Yitzhak Rabin were given the Prize (some might argue, prematurely), climate change is one of those thorny and impossibly complex issues which may never be resolved.

That is, climate change will probably always persist as a consequence of our failure to realize the breakthrough carbon-emission-averting technologies upon which pundits like economist Bjorn Lomborg are counting, or our failure to cap carbon emissions as urged by Al Gore and many others, or simply our stubborn refusal to admit that climate change is even an issue, as some stalwarts in the Denialism camp have continued to do. (Did I miss any factions in that debate? Feel free to let me know in the comments.)

So, for its most subjective award, an international body has (in an era when international, and especially European, bodies often stand in opposition to an American administration which, not coincidentally, is helmed by the one-time political opponent of one of the Prizewinners) granted a prize for work on an issue that, whatever your feelings pro or con, is second only to international terrorism in terms of global mindshare -- this should not surprise anyone.

With the hindsight that will be afforded to future generations as the only remotely impartial judge of the outcome of the debates swirling around this issue, let our grandchildren at least note that Scientific American did not shrink from taking a stand, whatever the consequences among this millennium's version of Flat Earthers known as climate change denialists (who, in their dogmatic opposition to a growing mountain of scientific evidence, are not to be confused with climate change skeptics, who are perhaps doing a service to us all by prompting good science and spirited debate even as they serve those who would delay action).

* For his work on informing the public about the dangers of climate change, we declared Al Gore to be one of the Scientific American 50 -- our policy leader of the year.

* In August of this year, we published The Physical Science behind Climate Change, in which leading climate scientists explained why they and so many of their peers are so confident that human activities are dangerously warming the Earth.

* Years before the subject of climate change had exploded into the mass consciousness, scientists like Thomas Karl, director of the National Oceanic and Atmospheric Administration's National Climatic Data Center, and Kevin Trenberth, head of the Climate Analysis Section at the National Center for Atmospheric Research, used the pages of Scientific American to alert the world to the fact that

...various projections suggest that the degree of change will become dramatic by the middle of the 21st century, exceeding anything seen in nature during the past 10,000 years. Although some regions may benefit for a time, overall the alterations are expected to be disruptive or even severe.

The Human Impact on Climate; December 1999; Scientific American Magazine

It's refreshing that the debate has mostly moved away from the causes of climate change and toward the best way to cope with it. Carbon cap and trade? Investment in breakthrough technology? Simply learning to live with a warmer earth?

What do YOU think about the committee's decision to award the Prize to Al Gore and the IPCC?


Posted by Christopher Mims · 24 comments   Permanent

October 11, 2007

10:41:39 am, Categories: Ethics and Science, Medicine, Politics and Science, Public Policy, 1007 words

Experimental drugs or clinical research methods on trial?

[Editor's Note: The October issue contains an article entitled "Experimental Drugs on Trial" (not free) that discusses the pros and cons of approving experimental drugs for individual use in certain cases. The following are some additional thoughts from Dr. Richard Miller, CEO of cancer-drug developer Pharmacyclics, who has had his own struggles with the issue.]

When I read Beryl Lieff Benderly's article entitled "Experimental Drugs on Trial" (not free) about the Abigail Alliance lawsuit, a topic I've been following for some time, I was glad to see it included a broad discussion of clinical trial methods and policies.

Much of the current debate on access to experimental drugs has been dominated by two points of view:

Individual dying patients have a constitutional right to early access

VERSUS

Drugs should be made available only after they have been proven to be safe and effective in definitive controlled clinical trials

Unfortunately, each of these extremes fails to address the most important issue in this debate: the need to rethink and reengineer clinical research methods and FDA policy for drug approvals.

Read more


Posted by David Biello · 2 comments   Permanent

October 10, 2007

08:24:43 pm, Categories: Medicine, 551 words

Last year, it was knockdown; this year, it's "knockout" for the Nobel medicine or physiology prize

Hot on the heels of the 2006 Nobel for physiology or medicine for the technology behind RNAi--a procedure for at least partially blocking the translation of a gene into a functional protein--the Nobel Foundation handed out its 2007 prize for the discovery of a procedure for knocking out a specific gene altogether.

The University of Utah's Mario Capecchi, Sir Martin Evans of Cardiff University in Wales and Oliver Smithies of the University of North Carolina at Chapel Hill split the Nobel for their contributions to the designer mouse-model world we now live in.

As a journalist (one who writes a lot about genetics, to boot), I find studies involving knockout mice as ubiquitous as box scores in the sports pages. So the awarding of this prize strikes me as a no-brainer.

Oddly enough, it didn't to a Wired blogger (who, full disclosure, is a friend and former classmate of mine) who stepped into a small world of hurt when he adopted a "What have knockout mice done for me lately?" sort of attitude. (He's since softened his position.)

A press statement from the Nobel Assembly at the Karolinska Institute supports my initial impression:

To date, more than ten thousand mouse genes (approximately half of the genes in the mammalian genome) have been knocked out. Ongoing international efforts will make "knockout mice" for all genes available within the near future.

Thus far, genes have been silenced to create models of everything from cancer to mental retardation.

Anyway, I'll try to describe what makes these men deserving of their medals: In the late 1970s and early 1980s, Capecchi and Smithies concurrently sought a technique to target and replace a specific gene in a cell's genome. They determined that if they inserted a similar (but inactivated) sequence of DNA into a cell, they could trick the cell into incorporating it during homologous recombination. (Homologous recombination occurs naturally when two matching stretches of DNA cross over and swap genetic material during the formation of sex cells, like sperm and eggs.)

However, Capecchi and Smithies' insights into gene targeting needed to be paired with a delivery system, which was developed primarily by Evans, who discovered embryonic stem cells. The pluripotent cells were repurposed as the ideal vehicle for these mutated genes: The inactive genes are placed in stem cells, which are injected into a blastocyst (the clump of cells that becomes the embryo), which then develops into a mouse with some cells that carry the inactive gene and others that do not. When a "mosaic mouse" mates with a normal mouse, some of the offspring will inherit the inactive gene, while others will not. Those lacking the working gene are the "knockout mice."

Then you watch the little knockout guys develop and see what's gone awry. The Reuters story on our site (available for a limited time) regarding the Nobel announcement has a choice quote from Capecchi: "If for example, you see a little finger disappear [in the knockout mice], then you know that gene is important for making little fingers."

Brilliant. Nobel-worthy. Trust me.

More Resources:
SciAm senior writer Gary Stix wrote an excellent profile of Capecchi in 1999. It has been freed from behind a paywall for you to download. And Capecchi contributed a piece to the magazine in 1994 about targeted gene replacement. (This one, folks, will require you to shell out some pesos.)


Posted by Nikhil Swaminathan · Leave a comment   Permanent





© 1996-2007 Scientific American, Inc. All rights reserved. Reproduction in whole or in part without permission is prohibited.
